Xin Yao
EA as Population-Based Generate-and-Test
Generate: mutate and/or recombine individuals in the population. Test: select the next generation from the parents and offspring.
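This generate-and-test view can be sketched as a short loop. The following is a minimal illustrative skeleton, not a definitive implementation; the toy problem (maximising the number of 1-bits), the operator names and the parameter values are my own, not from the slides:

```python
import random

def evolve(init_pop, fitness, variate, select, generations):
    """Generic population-based generate-and-test loop.

    variate: produces offspring from the current population (generate).
    select:  picks the next generation from parents + offspring (test).
    """
    pop = init_pop
    for _ in range(generations):
        offspring = variate(pop)                               # generate
        pop = select(pop + offspring, len(init_pop), fitness)  # test
    return pop

# Toy instantiation: maximise the number of 1-bits in a 5-bit string.
def variate(pop):
    # Mutation-only variation: flip each bit with probability 0.1.
    return [[1 - g if random.random() < 0.1 else g for g in ind] for ind in pop]

def select(candidates, n, fitness):
    # Truncation selection: keep the n fittest of parents + offspring.
    return sorted(candidates, key=fitness, reverse=True)[:n]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(5)] for _ in range(4)]
best = max(evolve(pop, sum, variate, select, 50), key=sum)
```

Because the selection here keeps the best individuals from parents and offspring combined, the best fitness in the population can never decrease.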
(b) Evaluate the fitness according to f(x) = x^2: f(13) = 169, f(24) = 576, f(8) = 64, f(19) = 361.

3. Select two individuals for crossover based on their fitness. If roulette-wheel selection is used, then

   p_i = f_i / Σ_j f_j

Two offspring are often produced and added to an intermediate population. Repeat this step until the intermediate population is filled. In our example:

   p_1(13) = 169/1170 = 0.14    p_2(24) = 576/1170 = 0.49
   p_3(8)  = 64/1170  = 0.06    p_4(19) = 361/1170 = 0.31

Assume we have crossover(01101, 11000) and crossover(10011, 11000). We may obtain offspring 01100 and 11001 from crossover(01101, 11000) by choosing a random crossover point at 4, and obtain 10000 and 11011 from crossover(10011, 11000) by choosing a random crossover point at 2. Now the intermediate population is

   01100, 11001, 10000, 11011

4. Apply mutation to individuals in the intermediate population with a small probability. A simple mutation is bit-flipping. For example, we may have the following new population P(1) after random mutation:

   01101, 11001, 00000, 11011

5. Go to Step 2 if not stopping.
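The worked example above can be reproduced in a few lines. This is a sketch of the pieces used in the example (roulette-wheel selection, one-point crossover, bit-flipping); the helper names are my own:

```python
import random

def decode(bits):
    # Binary string -> integer, e.g. '01101' -> 13.
    return int(bits, 2)

def roulette(pop, fitness):
    # Select one parent with probability f_i / sum_j f_j.
    f = [fitness(ind) for ind in pop]
    r = random.uniform(0, sum(f))
    acc = 0.0
    for ind, fi in zip(pop, f):
        acc += fi
        if r <= acc:
            return ind
    return pop[-1]

def one_point_crossover(a, b, point):
    # Exchange the tails of the two parents after `point`.
    return a[:point] + b[point:], b[:point] + a[point:]

pop = ['01101', '11000', '01000', '10011']   # 13, 24, 8, 19
fit = lambda ind: decode(ind) ** 2           # f(x) = x^2

# Selection probabilities from the example: 169/1170, 576/1170, 64/1170, 361/1170.
probs = [fit(p) / sum(fit(q) for q in pop) for p in pop]

# Crossover at point 4 reproduces the offspring from the text.
c1, c2 = one_point_crossover('01101', '11000', 4)   # '01100', '11001'
```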
Recombination/Crossover
1. One-point crossover
2. k-point crossover (k > 1), uniform vs. non-uniform
3. Uniform crossover

Crossover rate: the probability of applying crossover.
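The three crossover operators can be sketched as follows; this is an illustrative implementation with names of my own choosing, not a definitive one:

```python
import random

def one_point(a, b, point):
    # Exchange the tails of the parents after the cut point.
    return a[:point] + b[point:], b[:point] + a[point:]

def k_point(a, b, points):
    # Alternate segments between the parents at each cut point.
    c1, c2 = [], []
    prev, swap = 0, False
    for p in list(points) + [len(a)]:
        seg1, seg2 = (b, a) if swap else (a, b)
        c1 += seg1[prev:p]
        c2 += seg2[prev:p]
        prev, swap = p, not swap
    return c1, c2

def uniform(a, b):
    # Each gene comes from either parent with probability 0.5.
    pairs = [(x, y) if random.random() < 0.5 else (y, x) for x, y in zip(a, b)]
    return [p[0] for p in pairs], [p[1] for p in pairs]

c1, c2 = k_point(list('00000'), list('11111'), [1, 3])
# Segments: [0,1) from a, [1,3) from b, [3,5) from a  ->  c1 = 01100.
```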
Mutation
1. Bit-flipping
2. Random bit assignment
3. Multiple-bit mutations

Mutation rate: note the difference between per-bit (gene) and per-chromosome (individual) mutation rates.
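The per-bit vs. per-chromosome distinction can be made concrete with a small sketch (my own illustrative interpretation: a per-chromosome rate is taken here to mean flipping one randomly chosen gene with that probability):

```python
import random

def mutate_per_bit(ind, p_bit):
    # Per-bit (per-gene) rate: each gene is flipped independently.
    return [1 - g if random.random() < p_bit else g for g in ind]

def mutate_per_chromosome(ind, p_chrom):
    # Per-chromosome (per-individual) rate: with probability p_chrom,
    # flip exactly one randomly chosen gene.
    ind = list(ind)
    if random.random() < p_chrom:
        i = random.randrange(len(ind))
        ind[i] = 1 - ind[i]
    return ind

# Expected flips per application: n * p_bit vs. at most p_chrom.
random.seed(1)
flips = sum(sum(mutate_per_bit([0] * 10, 0.1)) for _ in range(1000))
```

With a per-bit rate of 0.1 on a 10-bit string, each application flips one bit on average, so roughly 1000 flips are expected over 1000 applications.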
Search Bias
When a search operator is applied, some offspring are more likely to be generated than others. We call such a tendency a bias. Crossover bias, e.g., 1-point crossover vs. uniform crossover. Mutation bias, e.g., flipping 1 bit vs. flipping k bits.
Summary (I)
1. EC is a field of study that includes EAs and other areas. EAs include many different types of algorithms.
2. Search operators enable us to generate new offspring from parents. There is no limitation on what operators can and cannot be used.
3. Search operators are applied to individuals. It is very important to realise the interdependency between operators and the representation of individuals.
Fitness Scaling
Simple scaling
The ith individual's fitness is defined as

   f_scaled(t) = f_original(t) - f_worst(t),

where t is the generation number and f_worst(t) is the fitness of the worst individual so far.

Sigma scaling
The ith individual's fitness is defined as

   f_scaled(t) = f_original(t) - (f_bar(t) - c * sigma_f(t)),  if this is > 0
   f_scaled(t) = 0,                                            otherwise

where c is a constant, e.g., 2, f_bar(t) is the average fitness in the current population, and sigma_f(t) is the standard deviation of the fitness in the current population.
Power scaling
The ith individual's fitness is defined as

   f_scaled(t) = (f_original(t))^k,

where k > 0.

Exponential scaling
The ith individual's fitness is defined as

   f_scaled(t) = exp(f_original(t) / T),

where T > 0 is the temperature, which approaches zero over the run.
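The four scaling schemes can be written as small functions. This is a sketch under my reading of the definitions above (in particular, sigma scaling is clamped at zero, and the population standard deviation is used):

```python
import math
import statistics

def simple_scaling(f, f_worst):
    # Shift by the worst fitness seen so far.
    return f - f_worst

def sigma_scaling(f, population_f, c=2.0):
    # Subtract (mean - c * std) of the current population; clamp at 0.
    mean = statistics.mean(population_f)
    sd = statistics.pstdev(population_f)
    return max(f - (mean - c * sd), 0.0)

def power_scaling(f, k):
    # Raise the raw fitness to the power k > 0.
    return f ** k

def exponential_scaling(f, T):
    # T > 0 is a temperature, lowered towards zero over the run;
    # smaller T sharpens the differences between individuals.
    return math.exp(f / T)
```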
Ranking
Linear ranking
Assume that the population size is λ, rank 0 indicates the worst individual and rank λ-1 the best. The probability of selecting the ith individual is

   P_linear(i) = (α + [rank(i)/(λ-1)] (β - α)) / λ,

where α (β) indicates the expected number of offspring produced by the worst (best) individual. Note that Σ_{i=0}^{λ-1} P_linear(i) = 1. Hence α + β = 2. So 1 ≤ β ≤ 2 and α = 2 - β.

Power ranking
C below is a normalising factor, and 0 < α < β.

   P_power(i) = (α + [rank(i)/(λ-1)]^k (β - α)) / C
Geometric ranking

   P_geom(i) = α (1 - α)^(λ-1-rank(i)) / C

Exponential ranking

   P_exp(i) = (1 - e^(-rank(i))) / C

My ranking

   P_mine(i) = (i + 1) / Σ_{j=0}^{λ-1} (j + 1)
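Linear ranking is easy to check numerically: because the expected offspring counts of the worst and best individuals sum to 2, the probabilities always sum to 1. A sketch (parameter names are mine):

```python
def linear_ranking_probs(lam, beta):
    """Linear ranking selection probabilities.

    lam  : population size; rank 0 is the worst, rank lam-1 the best.
    beta : expected offspring count of the best individual, 1 <= beta <= 2;
           the worst gets alpha = 2 - beta, which makes the list sum to 1.
    """
    alpha = 2.0 - beta
    return [(alpha + (rank / (lam - 1)) * (beta - alpha)) / lam
            for rank in range(lam)]

probs = linear_ranking_probs(4, 1.5)
# Probabilities increase linearly from the worst to the best individual.
```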
Discrete Recombination
Multi-point Recombination Similar to that for the binary representation.
Global Discrete Recombination Similar to uniform crossover for the binary representation.
Geometric explanation.
Intermediate Recombination
With two parents
Given x1 and x2:

   x_i = α x1_i + (1 - α) x2_i,    α ∈ [0, 1]

With more parents
Given x1, x2, ...:

   x_i = α_1 x1_i + α_2 x2_i + ...,    where α_i ∈ [0, 1] and Σ_i α_i = 1.
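Intermediate recombination is just a component-wise weighted average. A minimal sketch (function names are my own):

```python
def intermediate_two(x1, x2, alpha):
    # Component-wise weighted average of two parents, alpha in [0, 1].
    return [alpha * a + (1 - alpha) * b for a, b in zip(x1, x2)]

def intermediate_multi(parents, weights):
    # Weights alpha_i in [0, 1] with sum(alpha_i) = 1.
    assert abs(sum(weights) - 1.0) < 1e-9
    n = len(parents[0])
    return [sum(w * p[j] for w, p in zip(weights, parents)) for j in range(n)]

child = intermediate_two([0.0, 0.0], [2.0, 4.0], 0.5)   # midpoint of the parents
```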
Other Recombination
Heuristic Recombination
Assume x2 is no worse than x1. Then

   x = u (x2 - x1) + x2,

where u is a uniformly distributed random number in [0, 1].

Geometric Recombination
Can be generalised to multiple parents.

   x = (√(x11 x21), √(x12 x22), ...)
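Both operators are short one-liners in code. A sketch (geometric recombination assumes non-negative components, and the function names are my own):

```python
import math
import random

def heuristic_recombination(x1, x2, u=None):
    # Assumes x2 is no worse than x1; steps beyond the better parent,
    # in the direction from x1 to x2.
    if u is None:
        u = random.random()          # uniform in [0, 1]
    return [u * (b - a) + b for a, b in zip(x1, x2)]

def geometric_recombination(x1, x2):
    # Component-wise geometric mean; assumes non-negative components.
    return [math.sqrt(a * b) for a, b in zip(x1, x2)]

child = geometric_recombination([1.0, 4.0], [4.0, 9.0])   # [2.0, 6.0]
```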
Quadratic Recombination
Let x_{i,j} be the j-th component of the vector x_i, i ∈ {1, 2, 3}, j ∈ {1, ..., n}, where n is the dimensionality. We approximate the position of P4 using the quadratic interpolation method as follows:

   x_{4,j} = (1/2) [ (x_{2,j}^2 - x_{3,j}^2) f(x1) + (x_{3,j}^2 - x_{1,j}^2) f(x2) + (x_{1,j}^2 - x_{2,j}^2) f(x3) ]
                 / [ (x_{2,j} - x_{3,j}) f(x1) + (x_{3,j} - x_{1,j}) f(x2) + (x_{1,j} - x_{2,j}) f(x3) ]
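This formula transcribes directly into code. A sketch, assuming the three component values are distinct enough that the denominator is non-zero:

```python
def quadratic_recombination(x1, x2, x3, f1, f2, f3):
    """Component-wise quadratic interpolation: x4_j is the apex of the
    parabola through (x1_j, f1), (x2_j, f2), (x3_j, f3)."""
    child = []
    for a, b, c in zip(x1, x2, x3):
        num = (b * b - c * c) * f1 + (c * c - a * a) * f2 + (a * a - b * b) * f3
        den = (b - c) * f1 + (c - a) * f2 + (a - b) * f3
        child.append(0.5 * num / den)   # assumes den != 0
    return child

# For f(x) = x^2, three samples pinpoint the true minimum at 0.
x4 = quadratic_recombination([-2.0], [1.0], [3.0], 4.0, 1.0, 9.0)
```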
[Figure: a parabola fitted through the points P1, P2 and P3, with the interpolated point P4 near its minimum.]
Summary (II)
1. Different problems require different search operators and selection schemes. There is no universally best one.
2. Using real vectors is usually more appropriate than binary strings for function optimisation.
3. Many search operators are heuristics-based. Domain knowledge can often be incorporated into search operators and representations.
References
1. T. Bäck, D. B. Fogel, and Z. Michalewicz (eds.), Handbook of Evolutionary Computation, IOP Publishing Co. & Oxford University Press, 1997. Part C. (In the School library.)
2. K.-H. Liang, X. Yao and C. S. Newton, Combining landscape approximation and local search in global optimization, Proc. of the 1999 Congress on Evolutionary Computation, Vol. 2, IEEE Press, Piscataway, NJ, USA, pp. 1514-1520, July 1999. (Downloadable from my web page.)
Evolutionary Optimisation
1. Numerical (global) optimisation.
2. Combinatorial optimisation (of NP-hard problems).
3. Mixed optimisation.
4. Constrained optimisation.
5. Multiobjective optimisation.
6. Optimisation in a dynamic environment (with a dynamic fitness function).
Evolutionary Learning
Evolutionary learning can be used in supervised, unsupervised and reinforcement learning.

1. Learning classifier systems (rule-based systems).
2. Evolutionary artificial neural networks.
3. Evolutionary fuzzy logic systems.
4. Co-evolutionary learning.
5. Automatic modularisation of machine learning systems by speciation and niching.
Evolutionary Design
EC techniques are particularly good at exploring unconventional designs which are very difficult to obtain by hand.

1. Evolutionary design of artificial neural networks.
2. Evolutionary design of electronic circuits.
3. Evolvable hardware.
4. Evolutionary design of (building) architectures.
Theoretical Foundations
Convergence: Does the EA converge?

Convergence rate: How fast does it converge (potentially to a local optimum only)?

Computational time complexity: How well does it scale to large problems?
Summary (III)
1. Evolutionary algorithms can be regarded as population-based generate-and-test algorithms.
2. Evolutionary computation techniques can be used in optimisation, learning and design.
3. Evolutionary computation techniques are flexible and robust.
4. Evolutionary computation techniques are definitely useful tools in your toolbox, but there are problems for which other techniques might be more suitable.
References
1. J. He and X. Yao, Towards an Analytic Framework for Analysing the Computation Time of Evolutionary Algorithms, Artificial Intelligence, 145(1-2):59-97, April 2003.
2. J. He and X. Yao, From an Individual to a Population: An Analysis of the First Hitting Time of Population-Based Evolutionary Algorithms, IEEE Transactions on Evolutionary Computation, 6(5):495-511, October 2002.
3. J. He and X. Yao, Drift Analysis and Average Time Complexity of Evolutionary Algorithms, Artificial Intelligence, 127(1):57-85, March 2001. (Erratum in 140(1):245-248, September 2002.)
4. X. Yao, Evolutionary computation: A gentle introduction, in
Evolutionary Optimization, R. Sarker, M. Mohammadian and X. Yao (eds.), Chapter 2, pp. 27-53, Kluwer Academic Publishers, Boston, 2002. (ISBN 0-7923-7654-4)
5. K.-H. Liang, X. Yao and C. Newton, Evolutionary search of approximated N-dimensional landscapes, International Journal of Knowledge-Based Intelligent Engineering Systems, 4(3):172-183, July 2000.