An Overview of Genetic Algorithms: Part 1, Fundamentals

David Beasley, Department of Computing Mathematics, University of Cardiff, Cardiff, CF2 4YN, UK
David R. Bull, Department of Electrical and Electronic Engineering, University of Bristol, Bristol, BS8 1TR, UK
Ralph R. Martin, Department of Computing Mathematics, University of Cardiff, Cardiff, CF2 4YN, UK

University Computing, 1993, 15(2) 58-69. © Inter-University Committee on Computing. All rights reserved. No part of this article may be reproduced for commercial purposes.

1 Introduction

Genetic Algorithms (GAs) are adaptive methods which may be used to solve search and optimisation problems. They are based on the genetic processes of biological organisms. Over many generations, natural populations evolve according to the principles of natural selection and "survival of the fittest", first clearly stated by Charles Darwin in The Origin of Species. By mimicking this process, genetic algorithms are able to "evolve" solutions to real world problems, if they have been suitably encoded. For example, GAs can be used to design bridge structures for maximum strength/weight ratio, or to determine the least wasteful layout for cutting shapes from cloth. They can also be used for online process control, such as in a chemical plant, or for load balancing on a multi-processor computer system.

In nature, individuals in a population compete with each other for resources such as food, water and shelter. Also, members of the same species often compete to attract a mate. Those individuals which are most successful in surviving and attracting mates will have relatively larger numbers of offspring. Poorly performing individuals will produce few or even no offspring at all. This means that the genes from the highly adapted, or "fit", individuals will spread to an increasing number of individuals in each successive generation.
The combination of good characteristics from different ancestors can sometimes produce "superfit" offspring, whose fitness is greater than that of either parent. In this way, species evolve to become better and better suited to their environment.

GAs use a direct analogy of natural behaviour. They work with a population of "individuals", each representing a possible solution to a given problem. Each individual is assigned a "fitness score" according to how good a solution to the problem it is. For example, the fitness score might be the strength/weight ratio for a given bridge design. (In nature, this is equivalent to assessing how effective an organism is at competing for resources.) The highly fit individuals are given opportunities to "reproduce", by "cross breeding" with other individuals in the population. This produces new individuals as "offspring", which share some features taken from each "parent". The least fit members of the population are less likely to be selected for reproduction, and so "die out".

[Author email addresses: David.Beasley@cm.cf.ac.uk (Beasley); Dave.Bull@bristol.ac.uk (Bull); Ralph.Martin@cm.cf.ac.uk (Martin)]

A whole new population of possible solutions is thus produced by selecting the best individuals from the current "generation", and mating them to produce a new set of individuals. This new generation contains a higher proportion of the characteristics possessed by the good members of the previous generation. In this way, over many generations, good characteristics are spread throughout the population, being mixed and exchanged with other good characteristics as they go. By favouring the mating of the fitter individuals, the most promising areas of the search space are explored. If the GA has been designed well, the population will converge to an optimal solution to the problem.

GAs are not the only algorithms based on an analogy with nature. Neural networks are based on the behaviour of neurons in the brain.
They can be used for a variety of classification tasks, such as pattern recognition, machine learning, image processing and expert systems. Their area of application partly overlaps that of GAs. The use of GAs for the design of neural networks is a current research area [HS91]. Simulated annealing is a search technique based on physical, rather than biological, processes; it is described in Section 3.4.

The power of GAs comes from the fact that the technique is robust: it can deal successfully with a wide range of problem areas, including those which are difficult for other methods to solve. GAs are not guaranteed to find the global optimum solution to a problem, but they are generally good at finding "acceptably good" solutions "acceptably quickly". Where specialised techniques exist for solving particular problems, they are likely to out-perform GAs in both speed and accuracy of the final result. The main ground for GAs, then, is in difficult areas where no such techniques exist. Even where existing techniques work well, improvements have been made by hybridising them with a GA.
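The generational cycle described above (evaluate fitness, select fit parents, cross-breed them, mutate the offspring, and replace the population) can be sketched in a few lines of Python. This is a minimal illustration rather than anything from the paper itself: the OneMax fitness function (counting 1-bits in a binary string), tournament selection, single-point crossover, and all parameter values are assumptions chosen for brevity.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible; the choice is arbitrary

GENOME_LEN = 20     # bits per individual
POP_SIZE = 30
GENERATIONS = 40
MUTATION_RATE = 0.01  # per-bit flip probability

def fitness(genome):
    # OneMax: count of 1-bits. Stands in for a real score
    # such as a bridge design's strength/weight ratio.
    return sum(genome)

def tournament(pop, k=3):
    # Select the fittest of k randomly chosen individuals:
    # fitter individuals get more chances to reproduce.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # Single-point crossover: the offspring takes some
    # features from each parent.
    point = random.randint(1, GENOME_LEN - 1)
    return a[:point] + b[point:]

def mutate(genome):
    # Occasionally flip a bit, introducing variation.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genome]

# Initial population of random candidate solutions.
pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]

# Each new generation is bred entirely from tournament winners,
# so good characteristics spread through the population.
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
print(fitness(best))
```

Because every individual in the next generation is an offspring of tournament winners, the population drifts towards high-fitness regions of the search space, exactly as the text describes. A common refinement (elitism) copies the best individual into the next generation unchanged so it is never lost to crossover or mutation.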