Q4: How many EAs exist? Which?

The All Stars

There are currently three main paradigms in EA research: GENETIC ALGORITHMs, EVOLUTIONARY PROGRAMMING, and EVOLUTION STRATEGIEs. CLASSIFIER SYSTEMs and GENETIC PROGRAMMING are OFFSPRING of the GA community. Besides this leading crop, there are numerous other approaches and hybrid experiments, i.e. pieces of software residing in some researchers' computers that have been described in conference proceedings and may someday prove useful for certain tasks. To stay in EA slang, we should think of these evolving strands as BUILDING BLOCKs that, when recombined someday, will produce new offspring and give birth to new EA paradigm(s).

One such interesting offspring is the MEMETIC ALGORITHM: a hybrid evolutionary algorithm that makes use of local search operators. For details, see http://www.densis.fee.unicamp.br/~moscato/memetic_home.html
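
The idea is simply a GA-style loop in which each offspring is refined by a local search step before it competes for survival. The following Python sketch illustrates that structure; the function names, parameter defaults and the coordinate-wise hill climber are illustrative choices, not taken from Moscato's code.

    import random

    def memetic_minimize(f, dim, pop_size=20, generations=100,
                         bounds=(-5.0, 5.0), step=0.1):
        # Minimal memetic algorithm sketch: a plain GA whose offspring are
        # refined by a simple local search step before selection.
        lo, hi = bounds
        pop = [[random.uniform(lo, hi) for _ in range(dim)]
               for _ in range(pop_size)]

        def local_search(x):
            # one pass of coordinate-wise hill climbing (the local operator)
            best = list(x)
            for i in range(dim):
                for delta in (-step, step):
                    cand = list(best)
                    cand[i] = min(hi, max(lo, cand[i] + delta))
                    if f(cand) < f(best):
                        best = cand
            return best

        for _ in range(generations):
            pop.sort(key=f)                    # rank by fitness (minimisation)
            parents = pop[:pop_size // 2]      # truncation selection
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                child = [random.choice(g) for g in zip(a, b)]    # uniform crossover
                child = [min(hi, max(lo, g + random.gauss(0, step)))
                         for g in child]                         # Gaussian mutation
                children.append(local_search(child))             # memetic refinement
            pop = parents + children
        return min(pop, key=f)

    # usage: minimise the 5-dimensional sphere function
    print(memetic_minimize(lambda x: sum(g * g for g in x), dim=5))
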

Promising Rookies

As far as "solving complex function and COMBINATORIAL OPTIMIZATION tasks" is concerned, Davis' work on real-valued representations and adaptive operators should be mentioned (Davis 89). Moreover Whitley's Genitor system incorporating ranking and "steady state" mechanism (Whitley 89), Goldberg's "messy GAs", involves adaptive representations (Goldberg 91), and Eshelman's CHC algorithm (Eshelman 91). For real FUNCTION OPTIMIZATION, Differential EVOLUTION seems hard to beat in terms of convergence speed as well as simplicity: With just three control variables, tuning is particularly easy to do.

For "the design of robust learning systems", i.e. the field characterized by CFS, Holland's (1986) CLASSIFIER SYSTEM, with its state-of-the-art implementation CFS-C (Riolo 88), we should note developments in SAMUEL (Grefenstette 89), GABIL (De Jong & Spears 91), and GIL (Janikow 91).

References

Davis, L. (1989) "Adapting operator probabilities in genetic algorithms", [ICGA89], 60-69.

De Jong K.A. & Spears W. (1991) "Learning concept classification rules using genetic algorithms". Proc. 12th IJCAI, 651-656, Sydney, Australia: Morgan Kaufmann.

Dorigo, M. & Sirtori, E. (1991) "ALECSYS: A Parallel Laboratory for Learning Classifier Systems", [ICGA91], 296-302.

Dorigo, M. (1995) "ALECSYS and the AutonoMouse: Learning to Control a Real Robot by Distributed Classifier Systems". Machine Learning, 19, 3, 209-240.

Eshelman, L.J. et al. (1991) "Preventing premature convergence in genetic algorithms by preventing incest", [ICGA91], 115-122.

Goldberg, D. et al. (1991) "Don't worry, be messy", [ICGA91], 24-30.

Grefenstette, J.J. (1989) "A system for learning control strategies with genetic algorithms", [ICGA89], 183-190.

Holland, J.H. (1986) "Escaping brittleness: The possibilities of general-purpose learning algorithms applied to parallel rule-based systems". In R. Michalski, J. Carbonell, T. Mitchell (eds), Machine Learning: An Artificial Intelligence Approach. Los Altos: Morgan Kaufmann.

Janikow C. (1991) "Inductive learning of decision rules from attribute-based examples: A knowledge-intensive Genetic Algorithm approach". TR91-030, The University of North Carolina at Chapel Hill, Dept. of Computer Science, Chapel Hill, NC.

Riolo, R.L. (1988) "CFS-C: A package of domain independent subroutines for implementing classifier systems in arbitrary, user-defined environments". Logic of computers group, Division of computer science and engineering, University of Michigan.

Whitley, D. et al. (1989) "The GENITOR algorithm and selection pressure: why rank-based allocation of reproductive trials is best", [ICGA89], 116-121.


