A Gentle Introduction to Evolutionary Computation


Xin Yao (x.yao@cs.bham.ac.uk)

1. What Is Evolutionary Computation
2. Different Evolutionary Algorithms
3. Major Areas Within Evolutionary Computation


What Is Evolutionary Computation


1. It is the study of computational systems which use ideas and get inspiration from natural evolution.
2. One of the principles borrowed is survival of the fittest.
3. Evolutionary computation (EC) techniques can be used in optimisation, learning and design.
4. EC techniques do not require rich domain knowledge to use. However, domain knowledge can be incorporated into EC techniques.


A Simple Evolutionary Algorithm


1. Generate the initial population P(0) at random, and set i ← 0;
2. REPEAT
   (a) Evaluate the fitness of each individual in P(i);
   (b) Select parents from P(i) based on their fitness in P(i);
   (c) Generate offspring from the parents using crossover and mutation to form P(i + 1);
   (d) i ← i + 1;
3. UNTIL halting criteria are satisfied
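The loop translates almost line for line into code. Below is a minimal sketch in Python, assuming a bit-string representation and a user-supplied fitness function; the concrete operator choices (roulette-wheel selection, one-point crossover, bit-flip mutation) anticipate later slides and are illustrative, not prescribed by the algorithm itself.

```python
import random

def simple_ea(fitness, n_bits=5, pop_size=4, generations=50,
              crossover_rate=0.7, mutation_rate=0.02):
    # 1. Generate the initial population P(0) at random.
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):   # 3. halting criterion: fixed generation count
        # (a) Evaluate the fitness of each individual in P(i).
        fits = [fitness(ind) for ind in pop]
        total = sum(fits)

        # (b) Roulette-wheel selection: probability proportional to fitness.
        def select():
            r, acc = random.uniform(0, total), 0.0
            for ind, f in zip(pop, fits):
                acc += f
                if acc >= r:
                    return ind
            return pop[-1]

        # (c) Crossover and mutation generate offspring to form P(i + 1).
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            if random.random() < crossover_rate:   # one-point crossover
                cut = random.randint(1, n_bits - 1)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                nxt.append([1 - b if random.random() < mutation_rate else b
                            for b in child])       # bit-flip mutation
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# Usage on the x^2 example from the next slide:
best = simple_ea(lambda bits: int("".join(map(str, bits)), 2) ** 2)
```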


EA as Population-Based Generate-and-Test
Generate: mutate and/or recombine individuals in a population.
Test: select the next generation from the parents and offspring.


How Does the Simple EA Work


Let's use the simple EA to maximise the function f(x) = x² with x in the integer interval [0, 31], i.e., x = 0, 1, ..., 30, 31. The first step of EA applications is encoding (i.e., the representation of chromosomes). We adopt binary representation for integers. Five bits are used to represent integers up to 31. Assume that the population size is 4.

1. Generate the initial population at random, e.g., 01101, 11000, 01000, 10011. These are chromosomes or genotypes.
2. Calculate the fitness value for each individual.
   (a) Decode each individual into an integer (called the phenotype): 01101 → 13, 11000 → 24, 01000 → 8, 10011 → 19;

   (b) Evaluate the fitness according to f(x) = x²: 13 → 169, 24 → 576, 8 → 64, 19 → 361.
3. Select two individuals for crossover based on their fitness. If roulette-wheel selection is used, then

      p_i = f_i / Σ_j f_j

   Two offspring are often produced and added to an intermediate population. Repeat this step until the intermediate population is filled. In our example:

      p1(13) = 169/1170 ≈ 0.14    p2(24) = 576/1170 ≈ 0.49
      p3(8)  = 64/1170  ≈ 0.05    p4(19) = 361/1170 ≈ 0.31

   Assume we have crossover(01101, 11000) and crossover(10011, 11000). We may obtain offspring 01100 and 11001 from crossover(01101, 11000) by choosing a random crossover point at 4, and obtain 10000 and 11011 from crossover(10011, 11000) by choosing a random crossover point at 2. Now the intermediate population is 01100, 11001, 10000, 11011.
4. Apply mutation to individuals in the intermediate population with a small probability. A simple mutation is bit-flipping. For example, we may have the following new population P(1) after random mutation: 01101, 11001, 00000, 11011.
5. Go to Step 2 if the halting criteria are not satisfied.
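These numbers are easy to verify. A short sketch, assuming the same initial population and the same crossover points as above:

```python
pop = ["01101", "11000", "01000", "10011"]

xs = [int(bits, 2) for bits in pop]      # phenotypes: [13, 24, 8, 19]
fits = [x * x for x in xs]               # f(x) = x^2: [169, 576, 64, 361]
total = sum(fits)                        # 1170
probs = [f / total for f in fits]        # [0.144..., 0.492..., 0.054..., 0.308...]

def crossover(a, b, point):
    """One-point crossover of two bit strings at the given cut point."""
    return a[:point] + b[point:], b[:point] + a[point:]

print(crossover("01101", "11000", 4))    # ('01100', '11001')
print(crossover("10011", "11000", 2))    # ('10000', '11011')
```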


Recombination/Crossover
1. One-point crossover
2. k-point crossover (k > 1), uniform vs. non-uniform
3. Uniform crossover

Crossover rate: the probability of applying crossover.
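Illustrative sketches of the one-point, k-point and uniform operators on bit strings (parents are equal-length lists); the function names are mine, not from any particular library.

```python
import random

def one_point(p1, p2):
    """One-point crossover: swap tails after a random cut point."""
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def k_point(p1, p2, k):
    """k-point crossover: alternate segments between k sorted cut points."""
    cuts = sorted(random.sample(range(1, len(p1)), k)) + [len(p1)]
    c1, c2, prev, swap = [], [], 0, False
    for cut in cuts:
        seg1, seg2 = p1[prev:cut], p2[prev:cut]
        c1 += seg2 if swap else seg1
        c2 += seg1 if swap else seg2
        prev, swap = cut, not swap
    return c1, c2

def uniform(p1, p2):
    """Uniform crossover: each gene comes from either parent with p = 0.5."""
    c1, c2 = [], []
    for a, b in zip(p1, p2):
        if random.random() < 0.5:
            a, b = b, a
        c1.append(a)
        c2.append(b)
    return c1, c2
```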


Mutation
1. Bit-flipping
2. Random bit assignment
3. Multiple-bit mutations

Mutation rate: note the difference between per-bit (gene) and per-chromosome (individual) mutation rates.
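Sketches of the three listed mutations, distinguishing a per-bit rate (applied independently to each gene) from a per-chromosome variant that flips a fixed number of bits; names and signatures are illustrative.

```python
import random

def bit_flip(chrom, rate):
    """Flip each bit independently with the per-bit mutation rate."""
    return [1 - b if random.random() < rate else b for b in chrom]

def random_assign(chrom, rate):
    """Assign a fresh random value to each mutated position."""
    return [random.randint(0, 1) if random.random() < rate else b
            for b in chrom]

def flip_k(chrom, k):
    """Per-chromosome variant: flip exactly k distinct bits."""
    child = chrom[:]
    for i in random.sample(range(len(chrom)), k):
        child[i] = 1 - child[i]
    return child
```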


Search Bias
When a search operator is used, some offspring tend to be more likely to be generated than others. We call such a tendency a bias.

Crossover bias, e.g., 1-point crossover vs. uniform crossover.
Mutation bias, e.g., flipping 1 bit vs. flipping k bits.


Dierent Evolutionary Algorithms


There are several well-known EAs with different historical backgrounds, representations, variation operators, and selection schemes. In fact, EAs refer to a whole family of algorithms, not a single algorithm.


Genetic Algorithms (GAs)


1. First formulated by Holland for adaptive search and by his students for optimisation from the mid 1960s to mid 1970s.
2. Binary strings have been used extensively as individuals (chromosomes).
3. Simulate Darwinian evolution.
4. Search operators are only applied to the genotypic representation (chromosome) of individuals.
5. Emphasise the role of recombination (crossover). Mutation is only used as a background operator.
6. Often use roulette-wheel selection.


Evolutionary Programming (EP)


1. First proposed by Fogel et al. in the mid 1960s for simulating intelligence.
2. Finite state machines (FSMs) were used to represent individuals, although real-valued vectors have always been used in numerical optimisation.
3. It is closer to Lamarckian evolution.
4. Search operators (mutation only) are applied to the phenotypic representation of individuals.
5. It does not use any recombination.
6. Usually uses tournament selection.


Evolution Strategies (ES)


1. First proposed by Rechenberg and Schwefel in the mid 1960s for numerical optimisation.
2. Real-valued vectors are used to represent individuals.
3. They are closer to Lamarckian evolution.
4. They do have recombination.
5. They use self-adaptive mutations.


Genetic Programming (GP)


1. First used by de Garis to indicate the evolution of artificial neural networks, but used by Koza to indicate the application of GAs to the evolution of computer programs.
2. Trees (especially Lisp expression trees) are often used to represent individuals.
3. Both crossover and mutation are used.


Preferred Term: Evolutionary Algorithms


EAs face the same fundamental issues as classical AI, i.e., representation and search. Although GAs, EP, ES, and GP are different, they are all variants of population-based generate-and-test algorithms. They share more similarities than differences! A better and more general term to use is evolutionary algorithms (EAs).


Summary (I)
1. EC is a field of study that includes EAs and other areas. EAs include many different types of algorithms.
2. Search operators enable us to generate new offspring from parents. There is no limitation on what operators can and cannot be used.
3. Search operators are applied to individuals. It is very important to realise the interdependency between operators and the representation of individuals.


Selection and Reproduction


Selection is not normally regarded as a search operator, although it does influence search significantly. Selection can be used either before or after search operators. When selection is used before search operators, the process of choosing the next generation from the union of all parents and offspring is sometimes called reproduction. The generational gap of an EA refers to the overlap (i.e., individuals that did not go through any search operators) between the old and new generations. The two extremes are generational EAs and steady-state EAs. Elitism keeps the best individual.


Fitness Proportional Selection


Also known as roulette-wheel selection. Uses raw fitness in computing selection probabilities, and hence does not allow negative fitness values.

Weakness: domination of super individuals in early generations, and slow convergence in later generations. Fitness scaling was often used in the early days to combat these two problems.


Fitness Scaling
Simple scaling: the i-th individual's fitness is defined as

   f_scaled(t) = f_original(t) − f_worst(t),

where t is the generation number and f_worst(t) is the fitness of the worst individual seen so far.

Sigma scaling: the i-th individual's fitness is defined as

   f_scaled(t) = f_original(t) − (f̄(t) − c·σ_f(t))   if f_scaled(t) > 0,
   f_scaled(t) = 0                                     otherwise,

where c is a constant (e.g., 2), f̄(t) is the average fitness in the current population, and σ_f(t) is the standard deviation of the fitness in the current population.


Power scaling: the i-th individual's fitness is defined as

   f_scaled(t) = (f_original(t))^k,

where k > 0.

Exponential scaling: the i-th individual's fitness is defined as

   f_scaled(t) = exp(f_original(t)/T),

where T > 0 is the temperature, approaching zero over the course of the run.
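A sketch of the four scaling schemes, assuming fits holds the raw fitness values of the current population and fworst the worst fitness seen so far; the default constants are illustrative.

```python
import math
import statistics

def simple_scaling(fits, fworst):
    """Shift all fitness values by the worst seen so far."""
    return [f - fworst for f in fits]

def sigma_scaling(fits, c=2.0):
    """Subtract (mean - c * std); clamp negative results to zero."""
    mean, sd = statistics.mean(fits), statistics.pstdev(fits)
    return [max(f - (mean - c * sd), 0.0) for f in fits]

def power_scaling(fits, k=1.005):
    """Raise each raw fitness to the power k > 0."""
    return [f ** k for f in fits]

def exponential_scaling(fits, T=1.0):
    """T is the temperature, lowered towards zero over the run."""
    return [math.exp(f / T) for f in fits]
```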


Ranking
Linear ranking: assume that the population size is λ, rank 0 indicates the worst individual and rank λ − 1 the best. The probability of selecting the i-th individual is

   P_linear(i) = (η⁻ + [rank(i)/(λ − 1)](η⁺ − η⁻)) / λ,

where η⁻ (η⁺) indicates the expected number of offspring produced by the worst (best) individual. Note that Σ_{i=0}^{λ−1} P_linear(i) = 1. Hence η⁻ + η⁺ = 2, so 1 ≤ η⁺ ≤ 2 and η⁻ = 2 − η⁺.

Power ranking: C below is a normalising factor, and 0 < η⁻ < η⁺.

   P_power(i) = (η⁻ + [rank(i)/(λ − 1)]^k (η⁺ − η⁻)) / C


Geometric ranking:

   P_geom(i) = α(1 − α)^{λ − 1 − rank(i)} / C

Exponential ranking:

   P_exp(i) = (1 − e^{−rank(i)}) / C

My ranking:

   P_mine(i) = (i + 1) / Σ_{j=0}^{λ−1} (j + 1)
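For example, a sketch of the linear ranking probabilities, assuming individuals are indexed by rank (0 = worst, λ − 1 = best):

```python
def linear_ranking(lam, eta_plus=1.5):
    """Selection probabilities for linear ranking; eta_plus is the
    expected offspring count of the best individual."""
    eta_minus = 2.0 - eta_plus           # constraint: eta- + eta+ = 2
    return [(eta_minus + (eta_plus - eta_minus) * rank / (lam - 1)) / lam
            for rank in range(lam)]

probs = linear_ranking(4)                # [0.125, 0.208..., 0.291..., 0.375]
assert abs(sum(probs) - 1.0) < 1e-12     # probabilities sum to 1
```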


Recombination for Real-valued Representation


Discrete recombination does not change actual (gene) values. Very similar to the crossover operators on binary strings.

Intermediate recombination does change actual (gene) values. Usually based on some kind of average/mixture among multiple parents.


Discrete Recombination
Multi-point recombination: similar to that for the binary representation.

Global discrete recombination: similar to uniform crossover for the binary representation.

Geometric explanation.


Intermediate Recombination
With two parents: given x₁ and x₂,

   x′_i = α x_{1i} + (1 − α) x_{2i},   where α ∈ [0, 1].

With more parents: given x₁, x₂, ...,

   x′_i = α₁ x_{1i} + α₂ x_{2i} + ...,   where Σ_j α_j = 1.
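A sketch of both forms, assuming real-valued parents of equal length; whether α is drawn once per offspring or once per gene is a design choice.

```python
import random

def intermediate_two(x1, x2):
    """Two-parent form: x'_i = alpha * x1_i + (1 - alpha) * x2_i."""
    alpha = random.uniform(0.0, 1.0)     # could also be drawn per gene
    return [alpha * a + (1 - alpha) * b for a, b in zip(x1, x2)]

def intermediate_multi(parents, weights):
    """Multi-parent form: a weighted average with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return [sum(w * p[i] for w, p in zip(weights, parents))
            for i in range(len(parents[0]))]
```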


Other Recombination
Heuristic recombination: assume x₂ is no worse than x₁. Then

   x′ = u(x₂ − x₁) + x₂,

where u is a uniformly distributed random number in [0, 1].

Geometric recombination: can be generalised to multiple parents.

   x′ = (√(x₁₁x₂₁), √(x₁₂x₂₂), ...)
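Sketches of both operators; geometric recombination as written takes the component-wise geometric mean of the two parents and therefore assumes non-negative gene values.

```python
import math
import random

def heuristic(x1, x2):
    """Extrapolate beyond the better parent x2, away from x1."""
    u = random.uniform(0.0, 1.0)
    return [u * (b - a) + b for a, b in zip(x1, x2)]

def geometric(x1, x2):
    """Component-wise geometric mean; assumes non-negative genes."""
    return [math.sqrt(a * b) for a, b in zip(x1, x2)]
```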


Quadratic Recombination
Let x_{i,j} be the j-th component of the vector x_i, i ∈ {1, 2, 3}, j ∈ {1, ..., n}, where n is the dimensionality. We approximate the position of P4 using the quadratic interpolation method as follows:

   x_{4,j} = (1/2) · [(x²_{2,j} − x²_{3,j}) f(x₁) + (x²_{3,j} − x²_{1,j}) f(x₂) + (x²_{1,j} − x²_{2,j}) f(x₃)]
                   / [(x_{2,j} − x_{3,j}) f(x₁) + (x_{3,j} − x_{1,j}) f(x₂) + (x_{1,j} − x_{2,j}) f(x₃)]

One offspring is generated from three parents.
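A direct transcription of the formula, assuming the three parents' fitness values are already computed; the denominator vanishes when the three points are collinear in a dimension, so a caller would need to guard that case.

```python
def quadratic_offspring(x1, x2, x3, f1, f2, f3):
    """Per dimension, fit a parabola through the three parents and
    take its stationary point as the offspring component."""
    child = []
    for a, b, c in zip(x1, x2, x3):
        num = (b * b - c * c) * f1 + (c * c - a * a) * f2 + (a * a - b * b) * f3
        den = (b - c) * f1 + (c - a) * f2 + (a - b) * f3
        child.append(0.5 * num / den)    # undefined if den == 0 (collinear)
    return child
```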


What Does Quadratic Recombination Mean


[Figure: a one-dimensional quadratic fitness curve with three parents P1, P2, P3 and the interpolated offspring P4 near the minimum of the fitted parabola.]

Note that we are minimising fitness here.


A Hybrid EA With Local Search


1. Initialise individuals at random.
2. Perform local search from each individual.
3. REPEAT
   (a) Generate 3 points P1, P2, P3 by global discrete recombination.
   (b) Perform a quadratic approximation using P1, P2, P3 to produce a point P4.
   (c) Perform a local search from P4 and update P4 with the search result. (Sounds Lamarckian.)
   (d) Place P1, P2, P3, P4 into the population and do a (μ + 4) truncation selection.
4. UNTIL termination criteria are met.
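A compact sketch of this loop, assuming minimisation, a user-supplied local_search procedure, and the quadratic_offspring function from the earlier sketch; global discrete recombination is written inline.

```python
import random

def global_discrete_recombination(pop):
    # each gene is copied from a parent drawn uniformly from the population
    return [random.choice(pop)[j] for j in range(len(pop[0]))]

def hybrid_ea(fitness, local_search, pop, mu, generations=100):
    pop = [local_search(ind) for ind in pop]           # steps 1-2
    for _ in range(generations):                       # step 3
        # (a) three points by global discrete recombination
        p1, p2, p3 = (global_discrete_recombination(pop) for _ in range(3))
        # (b) quadratic approximation produces P4 (previous slide)
        p4 = quadratic_offspring(p1, p2, p3,
                                 fitness(p1), fitness(p2), fitness(p3))
        # (c) Lamarckian step: replace P4 with its locally improved version
        p4 = local_search(p4)
        # (d) (mu + 4) truncation selection, minimising fitness
        pop = sorted(pop + [p1, p2, p3, p4], key=fitness)[:mu]
    return pop[0]                                      # best solution found
```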


Local Search With Random Memorising


Store the best solutions in a memory. Retrieve a random one (the old best) when a new best solution is found. Search along the direction old → new, i.e., the direction of

   s := (x_new − x_old) / ‖x_new − x_old‖
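A small sketch of that direction computation; the function name is mine, and the degenerate case x_new = x_old is left unguarded.

```python
import math

def memorised_direction(x_old, x_new):
    """Unit vector s pointing from the remembered old best to the new best."""
    diff = [a - b for a, b in zip(x_new, x_old)]
    norm = math.sqrt(sum(d * d for d in diff))
    return [d / norm for d in diff]      # undefined when x_new == x_old
```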


Summary (II)
1. Different problems require different search operators and selection schemes. There is no universally best one.
2. Using real vectors is usually more appropriate than binary strings for function optimisation.
3. Many search operators are heuristics-based. Domain knowledge can often be incorporated into search operators and representation.


References
1. T. Bäck, D. B. Fogel, and Z. Michalewicz (eds.), Handbook of Evolutionary Computation, IOP Publ. Co. & Oxford University Press, 1997. Part C. (In the School library.)
2. K.-H. Liang, X. Yao and C. S. Newton, "Combining landscape approximation and local search in global optimization," Proc. of the 1999 Congress on Evolutionary Computation, Vol. 2, IEEE Press, Piscataway, NJ, USA, pp. 1514-1520, July 1999. (Downloadable from my web page.)


Major Areas in Evolutionary Computation


1. Optimisation
2. Learning
3. Design
4. Theory


Evolutionary Optimisation
1. Numerical (global) optimisation.
2. Combinatorial optimisation (of NP-hard problems).
3. Mixed optimisation.
4. Constrained optimisation.
5. Multiobjective optimisation.
6. Optimisation in a dynamic environment (with a dynamic fitness function).


Evolutionary Learning
Evolutionary learning can be used in supervised, unsupervised and reinforcement learning.

1. Learning classifier systems (rule-based systems).
2. Evolutionary artificial neural networks.
3. Evolutionary fuzzy logic systems.
4. Co-evolutionary learning.
5. Automatic modularisation of machine learning systems by speciation and niching.


Evolutionary Design
EC techniques are particularly good at exploring unconventional designs which are very difficult to obtain by hand.

1. Evolutionary design of artificial neural networks.
2. Evolutionary design of electronic circuits.
3. Evolvable hardware.
4. Evolutionary design of (building) architectures.


Theoretical Foundations
Convergence: does the EA converge?
Convergence rate: how fast does it converge (potentially to a local optimum only)?
Computational time complexity: how well does it scale to large problems?


Summary (III)
1. Evolutionary algorithms can be regarded as population-based generate-and-test algorithms.
2. Evolutionary computation techniques can be used in optimisation, learning and design.
3. Evolutionary computation techniques are flexible and robust.
4. Evolutionary computation techniques are definitely useful tools in your toolbox, but there are problems for which other techniques might be more suitable.


References
1. J. He and X. Yao, "Towards an Analytic Framework for Analysing the Computation Time of Evolutionary Algorithms," Artificial Intelligence, 145(1-2):59-97, April 2003.
2. J. He and X. Yao, "From an Individual to a Population: An Analysis of the First Hitting Time of Population-Based Evolutionary Algorithms," IEEE Transactions on Evolutionary Computation, 6(5):495-511, October 2002.
3. J. He and X. Yao, "Drift Analysis and Average Time Complexity of Evolutionary Algorithms," Artificial Intelligence, 127(1):57-85, March 2001. (Erratum in 140(1):245-248, September 2002.)
4. X. Yao, "Evolutionary computation: A gentle introduction," in Evolutionary Optimization, R. Sarker, M. Mohammadian and X. Yao (eds.), Chapter 2, pp. 27-53, Kluwer Academic Publishers, Boston, 2002. (ISBN 0-7923-7654-4)
5. K.-H. Liang, X. Yao and C. Newton, "Evolutionary search of approximated N-dimensional landscapes," International Journal of Knowledge-Based Intelligent Engineering Systems, 4(3):172-183, July 2000.
