Evolutionary algorithm

In artificial intelligence, an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions (see also loss function). Evolution of the population then takes place through the repeated application of the above operators. Artificial evolution (AE) describes a process involving individual evolutionary algorithms; EAs are individual components that participate in an AE.

Evolutionary algorithms often perform well at approximating solutions to all types of problems because they ideally make no assumption about the underlying fitness landscape; this generality is shown by successes in fields as diverse as engineering, art, biology, economics, marketing, genetics, operations research, robotics, social sciences, physics, politics and chemistry.

Techniques from evolutionary algorithms applied to the modeling of biological evolution are generally limited to explorations of microevolutionary processes and planning models based upon cellular processes. The computer simulations Tierra and Avida attempt to model macroevolutionary dynamics.

In most real applications of EAs, computational complexity is a prohibiting factor, and this complexity is dominated by fitness function evaluation. Fitness approximation is one way to overcome this difficulty. However, a seemingly simple EA can often solve quite complex problems; there may therefore be no direct link between algorithm complexity and problem complexity.
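The idea of fitness approximation can be sketched in Python. This is a minimal illustration, assuming the expensive evaluation is a slow simulation: a cheap surrogate (here, a cache with nearest-neighbour lookup) answers most queries, and the true function is only called when no sufficiently close evaluated point exists. All names and the tolerance value are illustrative assumptions, not a standard technique or API.

```python
def expensive_fitness(x):
    # Stand-in for a costly simulation or physical experiment.
    return -(x - 3.0) ** 2

class SurrogateFitness:
    def __init__(self, tolerance=0.05):
        self.evaluated = {}          # x -> true fitness value
        self.tolerance = tolerance
        self.true_calls = 0

    def __call__(self, x):
        # Reuse the nearest already-evaluated point if it is close enough.
        if self.evaluated:
            nearest = min(self.evaluated, key=lambda e: abs(e - x))
            if abs(nearest - x) <= self.tolerance:
                return self.evaluated[nearest]
        # Otherwise fall back to the expensive true evaluation.
        self.true_calls += 1
        value = expensive_fitness(x)
        self.evaluated[x] = value
        return value

surrogate = SurrogateFitness()
values = [surrogate(x / 100) for x in range(0, 601, 2)]  # 301 queries
# surrogate.true_calls is far smaller than the number of queries.
```

In practice the surrogate is usually a learned regression model (e.g. a Gaussian process or neural network) rather than a lookup, but the principle is the same: spend the real evaluation budget only where the approximation is unreliable.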

A possible limitation of many evolutionary algorithms is their lack of a clear genotype-phenotype distinction. In nature, the fertilized egg cell undergoes a complex process known as embryogenesis to become a mature phenotype. This indirect encoding is believed to make the genetic search more robust (i.e. to reduce the probability of fatal mutations), and it may also improve the evolvability of the organism.[1][2] Such indirect (also known as generative or developmental) encodings also enable evolution to exploit the regularity in the environment.[3] Recent work in the field of artificial embryogeny, or artificial developmental systems, seeks to address these concerns. Gene expression programming successfully explores a genotype-phenotype system, where the genotype consists of linear multigenic chromosomes of fixed length and the phenotype consists of multiple expression trees or computer programs of different sizes and shapes.[4]
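The genotype-phenotype distinction can be made concrete with a toy indirect encoding. In this sketch (an illustrative assumption, not drawn from any of the cited systems), a short genotype of three real-valued "genes" is developmentally expanded into a much longer phenotype, so a single gene change reshapes many phenotypic traits coherently rather than corrupting one trait at random:

```python
import math

def develop(genotype, phenotype_length=50):
    # One compact rule generates every phenotypic trait, analogous to
    # reusing a developmental program across many body segments.
    amplitude, frequency, phase = genotype
    return [amplitude * math.sin(frequency * i + phase)
            for i in range(phenotype_length)]

genotype = (1.0, 0.3, 0.0)        # three genes
phenotype = develop(genotype)     # fifty traits
# Mutating `frequency` alone changes all fifty traits in a correlated
# way -- the kind of environmental regularity indirect encodings exploit.
```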

Implementation of biological processes

Step One: Generate the initial population of individuals randomly. (First generation)

Step Two: Evaluate the fitness of each individual in that population.

Step Three: Repeat the following regenerational steps until termination (time limit, sufficient fitness achieved, etc.):

  1. Select the best-fit individuals for reproduction. (Parents)
  2. Breed new individuals through crossover and mutation operations to give birth to offspring.
  3. Evaluate the individual fitness of new individuals.
  4. Replace the least-fit individuals of the population with new individuals.
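The steps above can be sketched as a minimal EA in Python. The bit-string genome, "OneMax" fitness (count of 1-bits), tournament selection, one-point crossover, and all parameter values are illustrative assumptions chosen to keep the example self-contained, not part of any standard implementation:

```python
import random

GENOME_LENGTH = 20
POP_SIZE = 30
MUTATION_RATE = 0.05
GENERATIONS = 100

def fitness(individual):
    # "OneMax": quality is simply the number of 1-bits.
    return sum(individual)

def random_individual():
    return [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

def select_parent(population):
    # Tournament selection: the fitter of two random individuals wins.
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # One-point crossover (recombination).
    point = random.randint(1, GENOME_LENGTH - 1)
    return p1[:point] + p2[point:]

def mutate(individual):
    # Flip each bit independently with probability MUTATION_RATE.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in individual]

def evolve():
    # Step One: generate the initial population randomly.
    population = [random_individual() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Steps 1-3: select parents, breed offspring, evaluate them.
        offspring = [mutate(crossover(select_parent(population),
                                      select_parent(population)))
                     for _ in range(POP_SIZE)]
        # Step 4: replace the least fit, keeping the best POP_SIZE
        # of parents and offspring combined.
        population = sorted(population + offspring,
                            key=fitness, reverse=True)[:POP_SIZE]
    return max(population, key=fitness)

best = evolve()
```

Because replacement here keeps the best individuals from both generations (elitism), the best fitness in the population never decreases; on this toy problem the loop reliably approaches the all-ones string.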

Evolutionary algorithm types

Similar techniques differ in genetic representation, other implementation details, and the nature of the particular problem to which they are applied.

Related techniques

Swarm algorithms

Other population-based metaheuristic methods

Gallery[10][11]

References

  1. G.S. Hornby and J.B. Pollack. Creating high-level components with a generative representation for body-brain evolution. Artificial Life, 8(3):223–246, 2002.
  2. Jeff Clune, Benjamin Beckmann, Charles Ofria, and Robert Pennock. "Evolving Coordinated Quadruped Gaits with the HyperNEAT Generative Encoding". Proceedings of the IEEE Congress on Evolutionary Computing Special Section on Evolutionary Robotics, 2009. Trondheim, Norway.
  3. J. Clune, C. Ofria, and R. T. Pennock, "How a generative encoding fares as problem-regularity decreases," in PPSN (G. Rudolph, T. Jansen, S. M. Lucas, C. Poloni, and N. Beume, eds.), vol. 5199 of Lecture Notes in Computer Science, pp. 358–367, Springer, 2008.
  4. Ferreira, C., 2001. Gene Expression Programming: A New Adaptive Algorithm for Solving Problems. Complex Systems, Vol. 13, issue 2: 87–129.
  5. Wayward World, by Jon Roland. Novel that uses fetura to select candidates for public office.
  6. F. Merrikh-Bayat, The runner-root algorithm: A metaheuristic for solving unimodal and multimodal optimization problems inspired by runners and roots of plants in nature, Applied Soft Computing, Vol. 33, pp. 292–303, 2015
  7. R. Oftadeh et al. (2010), A novel meta-heuristic optimization algorithm inspired by group hunting of animals: Hunting search, 60, 2087–2098.
  8. A. Agharghor and M.E. Riffi (2017), First Adaptation of Hunting Search Algorithm for the Quadratic Assignment Problem, 520, 263–267. doi:10.1007/978-3-319-46568-5_27
  9. Hasançebi, O., Kazemzadeh Azad, S. (2015), Adaptive Dimensional Search: A New Metaheuristic Algorithm for Discrete Truss Sizing Optimization, Computers and Structures, 154, 1–16.
  10. Simionescu, P.A.; Dozier, G.V.; Wainwright, R.L. (2006), A Two-Population Evolutionary Algorithm for Constrained Optimization Problems, Proc 2006 IEEE International Conference on Evolutionary Computation, Portland, OR, pp. 1647–1653, doi:10.1109/CEC.2006.1688506
  11. Simionescu, P.A. (2014). Computer Aided Graphing and Simulation Tools for AutoCAD Users (1st ed.). Boca Raton, FL: CRC Press. ISBN 978-1-4822-5290-3.

This article is issued from Wikipedia (version of 12/4/2016). The text is available under the Creative Commons Attribution/ShareAlike license, but additional terms may apply for the media files.