
Genetic Algorithm

What is a genetic algorithm?

GAs belong to evolutionary computation, a branch of the broader soft computing (aka "computational intelligence") paradigm

First introduced by Holland (1975)
Inspired by the possibility of problem solving through a process of evolution

What is a GA? (contd)


A GA mimics biological evolution to generate better solutions from existing solutions through survival of the fittest, crossbreeding, and mutation

What is a GA? (contd)


A GA is capable of finding solutions for many problems for which no usable algorithmic solutions exist
The GA methodology is particularly suited for optimization
Optimization searches a solution space consisting of a large number of possible solutions
A GA reduces the search space through evolution of solutions, conceived as individuals in a population
In nature, evolution operates on populations of organisms, ensuring by natural selection that characteristics that serve the members well tend to be passed on to the next generation, while those that don't tend to die out

Evolution as Optimisation
Evolution can be seen as a process leading to the optimisation of a population's ability to survive, and thus reproduce, in a specific environment
Evolutionary fitness - the measure of the ability to respond adequately to the environment - is the quantity that is actually optimised in natural life
Consider a normal population of rabbits: some rabbits are naturally faster than others
Any characteristic has a range of variation that is due to i) sexual reproduction and ii) mutation

How GAs work


A population of candidate solutions - mathematical representations - is repeatedly altered until an optimal solution is found

The GA evolutionary cycle:
1. Starts with a randomly generated initial population of solutions (1st generation)
2. Selects a population of better solutions (next generation) by using a measure of 'goodness' (a fitness evaluation function)
3. Alters the new-generation population through crossbreeding and mutation
The processes of selection (step 2) and alteration (step 3) lead to a population with a higher proportion of better solutions

How GAs work (contd)

The GA evolutionary cycle continues until an acceptable solution is found in the current population, or some control parameter such as the maximum number of generations is exceeded

How solutions are represented


A series of genes, known as a chromosome, represents one possible solution
Each gene in the chromosome represents one component of the solution pattern
Each gene can have one of a number of possible values, known as alleles
The process of converting a solution from its original form into a chromosome is known as coding
The most common form of representing a solution as a chromosome is a string of binary digits (aka a binary vector), eg 1010110101001
Each bit in this string is a gene with two alleles: 0 and 1
Other forms of representation are also used, eg integer vectors
Solution bit strings are decoded to enable them to be evaluated using a fitness measure

How solutions are represented (contd)


If each design variable x_i, i = 1, 2, ..., n is coded in a string of length q, a design vector is represented using a string of total length nq. For example, with q = 5, the vector (x1 = 18, x2 = 3, x3 = 1, x4 = 4) is coded as the 20-bit string 10010 00011 00001 00100.
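A minimal sketch of this coding step, assuming q = 5 bits per design variable (enough to hold the largest value, 18); the function names are illustrative:

# Coding an integer-valued design vector into a binary chromosome (a sketch).

def encode_gene(value, q):
    """Code one integer design variable as a list of q bits (most significant bit first)."""
    return [(value >> k) & 1 for k in reversed(range(q))]

def encode_chromosome(values, q):
    """Concatenate the coded genes of all design variables into one chromosome of length nq."""
    chromosome = []
    for v in values:
        chromosome.extend(encode_gene(v, q))
    return chromosome

print(encode_chromosome([18, 3, 1, 4], q=5))
# -> [1,0,0,1,0, 0,0,0,1,1, 0,0,0,0,1, 0,0,1,0,0]   (total length nq = 4 * 5 = 20)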

In general, if a binary number is given by b_q b_{q-1} ... b_2 b_1 b_0, where b_k = 0 or 1, k = 0, 1, 2, ..., q, then its equivalent decimal number y (integer) is given by

y = sum over k = 0, ..., q of 2^k b_k

If the bounds of a variable x are given by x^(l) and x^(u), its decoded (real) value can be computed as

x = x^(l) + (x^(u) - x^(l)) * y / (2^q - 1)
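These two decoding steps can be sketched as follows; the function names are illustrative, and the bounds in the usage example are arbitrary:

# Decoding a q-bit gene to an integer y, then mapping y onto the bounds [x_l, x_u] (a sketch).

def decode_bits(bits):
    """Equivalent decimal integer y of a bit string b_q ... b_1 b_0 (most significant bit first)."""
    y = 0
    for b in bits:
        y = 2 * y + b
    return y

def decode_variable(bits, x_l, x_u):
    """Map the integer y linearly onto the bounds: x = x_l + (x_u - x_l) * y / (2^q - 1)."""
    q = len(bits)
    y = decode_bits(bits)
    return x_l + (x_u - x_l) * y / (2 ** q - 1)

print(decode_bits([1, 0, 0, 1, 0]))                  # 18
print(decode_variable([1, 0, 0, 1, 0], 0.0, 31.0))   # 18.0 for these example bounds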

GA Selection or Reproduction
Selection in a GA is based on a process analogous to that of biological evolution: only the fittest survive and contribute to the gene pool of the next generation
Fitness-proportional selection: each chromosome's likelihood of being selected is proportional to its fitness value
Solutions failing selection are bad, and are discarded
Crossover: splicing two chromosomes at a crossover point and swapping the spliced parts
A better chromosome may be created by combining genes with good characteristics from one chromosome with some good genes from the other chromosome
Crossover is carried out with a probability of typically 0.7; chromosomes not crossed over are cloned

GA Selection or Reproduction
If F_i denotes the fitness of the ith string in the population of size n, the probability of selecting the ith string for the mating pool (p_i) is given by

p_i = F_i / (F_1 + F_2 + ... + F_n)

Roulette-wheel selection scheme
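A minimal sketch of this roulette-wheel (fitness-proportional) scheme, assuming non-negative fitness values; the function and variable names are illustrative:

import random

def roulette_wheel_select(population, fitnesses):
    """Pick one chromosome with probability p_i = F_i / sum(F_j)."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)    # a random position on the wheel
    running = 0.0
    for chromosome, f in zip(population, fitnesses):
        running += f
        if running >= pick:
            return chromosome
    return population[-1]              # guard against floating-point round-off

# Building a mating pool of the same size as the population:
# mating_pool = [roulette_wheel_select(population, fitnesses) for _ in population]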

Crossover and Mutation

Mutation: a random adjustment in the genetic composition
Can be useful for introducing new characteristics into a population
May be counterproductive
Probability kept low: typically 0.001 to 0.01
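A minimal sketch of single-point crossover and bit-flip mutation on bit-string chromosomes, using the probabilities mentioned above (the function names and defaults are illustrative):

import random

def crossover(parent1, parent2, p_c=0.7):
    """With probability p_c, splice the parents at a random point and swap the spliced parts;
    otherwise clone the parents unchanged."""
    if random.random() < p_c and len(parent1) > 1:
        point = random.randint(1, len(parent1) - 1)   # crossover point
        return (parent1[:point] + parent2[point:],
                parent2[:point] + parent1[point:])
    return parent1[:], parent2[:]

def mutate(chromosome, p_m=0.01):
    """Flip each bit independently with a small probability p_m."""
    return [1 - bit if random.random() < p_m else bit for bit in chromosome]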

Genetic Algorithm process


1. Represent the solution as a chromosome of fixed length; choose the population size N, crossover probability pc and mutation probability pm.
2. Define a fitness function f for measuring the fitness of chromosomes.
3. Create an initial solution population of size N randomly: x1, x2, ..., xN.
4. Use the fitness function f to evaluate the fitness value of each solution in the current generation: f(x1), f(x2), ..., f(xN).
5. Select good solutions based on fitness value; discard the rest of the solutions.
6. If an acceptable solution is found in the current generation, or the maximum number of generations is exceeded, then stop.
7. Alter the solution population using crossover and mutation to create a new generation of solutions with population size N.
8. Go to step 4.
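A compact, self-contained sketch of this cycle on a toy "count the ones" problem, where the fitness of a bit string is simply the number of 1s; the parameter values and the fitness function are illustrative choices:

import random

CHROM_LEN, N, P_C, P_M, MAX_GEN = 30, 40, 0.7, 0.01, 100

def fitness(chrom):                        # step 2: fitness function
    return sum(chrom)

def select(population, fits):              # step 5: roulette-wheel selection
    pick, running = random.uniform(0, sum(fits)), 0.0
    for chrom, f in zip(population, fits):
        running += f
        if running >= pick:
            return chrom
    return population[-1]

def crossover(p1, p2):                     # step 7: single-point crossover
    if random.random() < P_C:
        cut = random.randint(1, CHROM_LEN - 1)
        return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
    return p1[:], p2[:]

def mutate(chrom):                         # step 7: bit-flip mutation
    return [1 - b if random.random() < P_M else b for b in chrom]

# step 3: random initial population
population = [[random.randint(0, 1) for _ in range(CHROM_LEN)] for _ in range(N)]

for generation in range(MAX_GEN):          # steps 4-8: the evolutionary cycle
    fits = [fitness(c) for c in population]
    best = max(fits)
    if best == CHROM_LEN:                  # step 6: acceptable solution found
        break
    next_gen = []
    while len(next_gen) < N:
        c1, c2 = crossover(select(population, fits), select(population, fits))
        next_gen += [mutate(c1), mutate(c2)]
    population = next_gen[:N]

# best fitness seen in the last evaluated generation
print("best fitness after", generation + 1, "generations:", best)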


Advantages of GA systems
Useful when no algorithms or heuristics are available for solving a problem
No formulation of the solution is required - only "recognition" of a good solution
A GA system can be built as long as a solution representation and an evaluation scheme can be worked out, so minimal domain-expert access is required
GA can act as an alternative to Expert Systems if the number of rules is too large or the nature of the knowledge base is too dynamic
GA can also act as an alternative to traditional optimization techniques if constraints and objective functions are non-linear and/or discontinuous

Advantages of GA systems
GA does not guarantee optimal solutions, but produces near-optimal solutions which are likely to be very good
Solution time with GA is highly predictable, determined by:
the size of the population,
the time taken to decode and evaluate a solution, and
the number of generations of the population

GA uses simple operations to solve problems that are computationally prohibitive otherwise
Because of this simplicity, GA software is:
reasonably sized and self-contained, and
easier to embed as a module in another system

GA can also aid in developing intelligent business systems that use other methodologies, eg:
building the rule base of an expert system
finding optimal neural networks

Some issues related to GA based systems


Level of explainability
Scalability
Data requirements
Local maxima
Local maxima are regions that hold good solutions relative to the regions around them, but which do not necessarily contain the best overall solutions
The region(s) that contain the best solutions are called global maxima
GAs are less prone to being trapped in local maxima because of the use of mutation and crossover

Premature convergence
A GA is said to have converged prematurely if it explores a local maximum extensively
The most significant factor leading to such convergence is a mutation rate which is too low
Mutation interference, caused by a mutation rate that is too high, is an effect opposite to that of premature convergence

Simulated Annealing
Simulates the process of slow cooling of molten metal to achieve the absolute minimum energy state; the temperature needs to be reduced at a slow rate
The cooling phenomenon is simulated by controlling a temperature-like parameter introduced with the concept of the Boltzmann probability distribution
A system in thermal equilibrium at a temperature T has its energy distributed probabilistically according to

P(E) = exp(-E / kT)

where k = Boltzmann constant

Algorithm
Step 1: Choose an initial point and a termination criterion; set T to a high temperature and fix the number of iterations to be performed at each T.
Step 2: Calculate a neighbouring point in the vicinity of the current point.
Step 3: If dE = E(X_{i+1}) - E(X_i) < 0, accept the point and go to the next iteration. Else use the Metropolis criterion to accept or reject the point: create a random number r in the range (0, 1); if r <= exp(-dE / T), accept the point and go to the next iteration, else go to step 2.
Step 4: If the number of iterations at the current T has reached the maximum, terminate the iterations at this T.
Step 5: If T is small, terminate. Else lower T according to a cooling schedule (new temperature = 0.5 * T) and go to step 2.
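A minimal sketch of these steps for minimising a simple one-dimensional energy function E(x) = x^2; the neighbourhood step size, starting temperature, iteration count and cooling factor are illustrative assumptions:

import math
import random

def energy(x):
    return x * x

def simulated_annealing(x0, T=100.0, T_min=1e-3, iters_per_T=50, step=1.0):
    x = x0
    while T > T_min:                                  # step 5: stop when T is small
        for _ in range(iters_per_T):                  # step 4: fixed iterations at each T
            x_new = x + random.uniform(-step, step)   # step 2: neighbouring point
            dE = energy(x_new) - energy(x)
            # step 3: accept downhill moves outright, uphill moves with
            # Metropolis probability exp(-dE / T)
            if dE < 0 or random.random() <= math.exp(-dE / T):
                x = x_new
        T *= 0.5                                      # cooling schedule: new T = 0.5 * T
    return x

print(simulated_annealing(x0=10.0))                   # ends near the minimum at x = 0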
