Abstract: Skepticism of the building block hypothesis has previously been expressed on account of the weak theoretical foundations of this hypothesis and anomalies in the empirical record of the simple genetic algorithm. In this paper we focus on a more fundamental cause for skepticism—the extraordinary strength of some of the assumptions undergirding the building block hypothesis. As many of these assumptions have been embraced by the designers of so-called “competent” genetic algorithms, our critique is relevant to an appraisal of such algorithms. We argue that these assumptions are too strong to be acceptable without additional evidence. We then point out weaknesses in the arguments that have been provided in lieu of such evidence.
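For readers unfamiliar with the algorithm the abstract refers to, here is a minimal sketch of a simple genetic algorithm with fitness-proportionate selection, one-point crossover, and bit-flip mutation. The fitness function (OneMax), population size, and rates are illustrative assumptions, not details taken from the paper; the building block hypothesis concerns how crossover in an algorithm of this kind recombines short, highly fit schemata.

```python
# Sketch of the simple genetic algorithm (illustrative only).
# Fitness function, population size, and rates are assumptions.
import random

def fitness(bits):
    # OneMax: number of 1-bits, a stand-in fitness function
    return sum(bits)

def one_point_crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:], b[:point] + a[point:]

def simple_ga(length=32, pop_size=50, generations=100, p_mut=0.01):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # fitness-proportionate selection of parents
        weights = [fitness(ind) for ind in pop]
        parents = random.choices(pop, weights=weights, k=pop_size)
        next_pop = []
        for i in range(0, pop_size, 2):
            c1, c2 = one_point_crossover(parents[i], parents[i + 1])
            next_pop += [c1, c2]
        # independent bit-flip mutation
        pop = [[b ^ (random.random() < p_mut) for b in ind] for ind in next_pop]
    return max(pop, key=fitness)

print(fitness(simple_ga()))
```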
Overview
Optimization, Adaptation, Machine Learning and Evolutionary Computation
From the introduction of a manuscript I recently submitted for review:
The practice of machine learning research can be characterized as the effective, semi-principled reduction of learning problems to problems for which robust and efficient solution techniques exist – ideally ones with provable bounds on their use of time and space. In a recent paper, Bennett and Parrado-Hernández (2006) describe the synergistic relationship between the fields of machine learning (ML) and mathematical programming (MP). They remark:
“Optimization lies at the heart of machine learning. Most machine learning problems reduce to optimization problems. Consider the machine learning analyst in action solving a problem for some set of data. The modeler formulates the problem by selecting an appropriate family of models and massages the data into a format amenable to modeling. Then the model is typically trained by solving a core optimization problem that […]”
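To make the quoted reduction concrete, here is a minimal sketch (not taken from Bennett and Parrado-Hernández) of a learning problem posed as a core optimization problem: fitting a linear model by gradient descent on mean squared error. The synthetic data, step size, and iteration count are illustrative assumptions.

```python
# Sketch: training a model = solving an optimization problem (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                  # features
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)    # noisy targets

w = np.zeros(3)                                # model parameters
lr = 0.1                                       # assumed step size
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)      # gradient of the MSE loss
    w -= lr * grad                             # core optimization step

print(w)  # should approximate true_w
```

In practice the modeler would hand this minimization to a robust, well-analyzed solver rather than hand-rolled gradient descent, which is exactly the kind of reduction to an existing solution technique that the quotation describes.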