GAs are part of the broader soft computing (aka "computational intelligence") paradigm known as evolutionary computation. First introduced by Holland (1975), they were inspired by the possibility of problem solving through a process of evolution.
Evolution as Optimisation
Evolution can be seen as a process leading to the optimisation of a population's ability to survive, and thus reproduce, in a specific environment. Evolutionary fitness - the measure of the ability to respond adequately to the environment - is the quantity that is actually optimised in natural life. Consider a normal population of rabbits: some rabbits are naturally faster than others. Any characteristic has a range of variation that is due to i) sexual reproduction and ii) mutation.
The GA evolutionary cycle: 1) starts with a randomly generated initial population of solutions (the first generation); 2) selects a population of better solutions (the next generation) by using a measure of 'goodness' (a fitness evaluation function);
3) alters the new generation's population through crossbreeding and mutation. The processes of selection (step 2) and alteration (step 3) lead to a population with a higher proportion of better solutions.
The GA evolutionary cycle continues until an acceptable solution is found in the current population, or some control parameter such as the maximum number of generations is exceeded
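The cycle above can be sketched in Python. The 8-bit encoding, the toy fitness function f(x) = x², and all parameter values below are illustrative assumptions, not part of the lecture:

```python
import random

BITS = 8  # chromosome length (an illustrative choice)

def fitness(chrom):
    """Decode the bit string to an integer x and score it as f(x) = x^2."""
    x = int("".join(map(str, chrom)), 2)
    return x * x

def select(pop):
    """Fitness-proportional selection of one parent."""
    weights = [fitness(c) for c in pop]
    if sum(weights) == 0:              # degenerate all-zero population
        weights = [1] * len(pop)
    return random.choices(pop, weights=weights, k=1)[0]

def crossover(a, b, p_c=0.7):
    """Single-point crossover with probability p_c; otherwise clone."""
    if random.random() >= p_c:
        return a[:], b[:]
    cut = random.randrange(1, BITS)
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def mutate(chrom, p_m=0.01):
    """Flip each gene with a small probability p_m."""
    return [1 - g if random.random() < p_m else g for g in chrom]

def run_ga(pop_size=20, max_generations=50):
    # Step 1: random initial population (the first generation).
    pop = [[random.randint(0, 1) for _ in range(BITS)]
           for _ in range(pop_size)]
    for _ in range(max_generations):   # control parameter on generations
        new_pop = []
        while len(new_pop) < pop_size:  # steps 2 and 3: select, then alter
            c1, c2 = crossover(select(pop), select(pop))
            new_pop += [mutate(c1), mutate(c2)]
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

best = run_ga()
```

The loop stops after a fixed number of generations; a real system would also stop as soon as an acceptable solution appears in the current population.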
In general, if a binary number is given by b_q b_{q-1} ... b_2 b_1 b_0, where b_k = 0 or 1, k = 0, 1, 2, . . . , q, then its equivalent decimal number y (an integer) is given by

y = \sum_{k=0}^{q} 2^k b_k
If the bounds of a variable x are given by x^{(l)} and x^{(u)}, its decoded value can be computed as

x = x^{(l)} + \frac{x^{(u)} - x^{(l)}}{2^{q+1} - 1} \, y
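A minimal sketch of this decoding, assuming the standard linear mapping with the bits given most-significant first:

```python
def decode(bits, x_lo, x_hi):
    """Map a bit string b_q ... b_1 b_0 (most-significant bit first)
    onto the interval [x_lo, x_hi]."""
    y = sum(b << k for k, b in enumerate(reversed(bits)))  # y = sum 2^k b_k
    return x_lo + (x_hi - x_lo) * y / (2 ** len(bits) - 1)

print(decode([0, 0, 0, 0], -1.0, 1.0))  # lower bound: -1.0
print(decode([1, 1, 1, 1], -1.0, 1.0))  # upper bound: 1.0
```

The all-zeros string maps to x^{(l)} and the all-ones string to x^{(u)}, with 2^{q+1} evenly spaced values in between.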
GA Selection or Reproduction
Selection in a GA is based on a process analogous to that of biological evolution: only the fittest survive and contribute to the gene pool of the next generation. Under fitness-proportional selection, each chromosome's likelihood of being selected is proportional to its fitness value; solutions failing selection are considered bad and are discarded. Crossover is performed by splicing two chromosomes at a crossover point and swapping the spliced parts. A better chromosome may be created by combining genes with good characteristics from one chromosome with some good genes from the other chromosome. Crossover is carried out with a probability of typically 0.7; chromosomes not crossed over are cloned.
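Single-point crossover as described above can be sketched as follows; the chromosome length and the demo values are illustrative:

```python
import random

def crossover(parent_a, parent_b, p_c=0.7):
    """Splice both chromosomes at a random crossover point and swap
    the spliced parts; with probability 1 - p_c, clone instead."""
    if random.random() >= p_c:
        return parent_a[:], parent_b[:]           # clones
    cut = random.randrange(1, len(parent_a))      # crossover point
    return (parent_a[:cut] + parent_b[cut:],
            parent_b[:cut] + parent_a[cut:])

a, b = [1, 1, 1, 1, 1], [0, 0, 0, 0, 0]
c1, c2 = crossover(a, b, p_c=1.0)  # force a crossover for the demo
```

Because the tails are swapped rather than copied, the two children together contain exactly the same genes as the two parents.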
If F_i denotes the fitness of the i-th string in the population of size n, the probability of selecting the i-th string for the mating pool (p_i) is given by

p_i = \frac{F_i}{\sum_{j=1}^{n} F_j}
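The selection probabilities follow directly from this formula; the fitness values below are made up for illustration:

```python
def selection_probabilities(fitnesses):
    """p_i = F_i / sum_{j=1..n} F_j (fitness-proportional selection)."""
    total = sum(fitnesses)
    return [f / total for f in fitnesses]

probs = selection_probabilities([10.0, 30.0, 60.0])
print(probs)  # [0.1, 0.3, 0.6]
```

Sampling a mating pool from these probabilities can then be done with `random.choices(population, weights=fitnesses, k=n)`, which normalises the weights internally.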
Mutation is a random adjustment in the genetic composition. It can be useful for introducing new characteristics into a population, but may also be counterproductive, so its probability is kept low: typically 0.001 to 0.01.
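A bit-flip mutation operator matching this description; the default probability is the typical value quoted above, and the extreme probabilities below are used only to make the demo deterministic:

```python
import random

def mutate(chrom, p_m=0.01):
    """Flip each gene independently with small probability p_m
    (typically 0.001 to 0.01)."""
    return [1 - g if random.random() < p_m else g for g in chrom]

unchanged = mutate([0, 1, 1, 0], p_m=0.0)  # no gene ever flips
flipped = mutate([0, 1, 1, 0], p_m=1.0)    # every gene flips
```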
Advantages of GA systems
GAs are useful when no algorithms or heuristics are available for solving a problem. No formulation of the solution is required - only "recognition" of a good solution: a GA system can be built as long as a solution representation and an evaluation scheme can be worked out, so minimal domain-expert access is required. A GA can act as an alternative to an expert system if the number of rules is too large or the nature of the knowledge base is too dynamic. It can also replace traditional optimisation techniques when the constraints and objective functions are non-linear and/or discontinuous.
A GA does not guarantee optimal solutions, but produces near-optimal solutions which are likely to be very good. Solution time with a GA is highly predictable, determined by: the size of the population, the time taken to decode and evaluate a solution, and the number of generations of the population.
A GA uses simple operations to solve problems that are computationally prohibitive otherwise. Because of this simplicity, GA software is reasonably sized and self-contained, which makes it easier to embed as a module in another system.
GAs can also aid in developing intelligent business systems that use other methodologies, e.g., building the rule base of an expert system or finding optimal neural networks.
Premature convergence
A GA is said to have converged prematurely if it explores a local maximum extensively. The most significant factor leading to such convergence is a mutation rate that is too low. Mutation interference is the opposite effect: a mutation rate that is too high disrupts good genes before they can propagate.
Simulated Annealing
Simulated annealing simulates the process of slow cooling of molten metal to achieve the absolute minimum energy state; the temperature needs to be reduced at a slow rate. The cooling phenomenon is simulated by controlling a temperature-like parameter introduced via the concept of the Boltzmann probability distribution. A system in thermal equilibrium at a temperature T has its energy distributed probabilistically according to

P(E) = e^{-E/(kT)}

where k is the Boltzmann constant.
Algorithm
Step 1: Choose an initial point and a termination criterion; set T to a high temperature and fix the number of iterations at each T.
Step 2: Calculate a neighbouring point in the vicinity of the current point.
Step 3: If ΔE = E(X_{i+1}) - E(X_i) < 0, accept the point and go to the next iteration. Else use the Metropolis criterion to decide whether to accept it: create a random number r in the range (0, 1); if r ≤ exp(-ΔE/T), accept the point and go to the next iteration, else go to Step 2.
Step 4: If the number of iterations at T has reached its maximum, stop iterating at this temperature.
Step 5: If T is small, terminate. Else lower T according to a cooling schedule (new temperature = 0.5 × T) and go to Step 2.
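The steps above can be sketched as follows. The quadratic demo energy function, the ±0.5 neighbourhood width, and the starting temperature are illustrative assumptions:

```python
import math
import random

def simulated_annealing(energy, neighbour, x0,
                        t_start=100.0, t_min=1e-3, iters_per_t=100):
    """Minimise `energy`: at each temperature T, accept downhill moves
    outright and uphill moves with the Metropolis probability
    exp(-dE / T); then halve T until it is small."""
    x, e, t = x0, energy(x0), t_start            # Step 1
    while t > t_min:                             # Step 5: stop when T is small
        for _ in range(iters_per_t):             # Step 4: iterations at each T
            x_new = neighbour(x)                 # Step 2: neighbouring point
            e_new = energy(x_new)
            d_e = e_new - e                      # Step 3
            if d_e < 0 or random.random() <= math.exp(-d_e / t):
                x, e = x_new, e_new              # accept (Metropolis)
        t *= 0.5                                 # cooling: new T = 0.5 * T
    return x

# Demo: minimise E(x) = (x - 3)^2 with neighbours drawn within +/- 0.5.
best = simulated_annealing(lambda x: (x - 3.0) ** 2,
                           lambda x: x + random.uniform(-0.5, 0.5),
                           x0=-10.0)
```

At high T almost every move is accepted, so the search wanders freely; as T shrinks, uphill moves become exponentially unlikely and the search settles into a minimum.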