Documenti di Didattica
Documenti di Professioni
Documenti di Cultura
ISSN 2079-8407
http://www.cisjournal.org
ABSTRACT
This work investigates the performance of two Evolutionary Algorithms Genetic Algorithm and Memetic Algorithm for
Constrained Optimization Problem. In particular, a knapsack problem was solved using the two algorithms and their results
were compared. Two selection techniques were used for both algorithms. The results of comparative analysis show that
Roulette-Wheel selection method outperforms Ranking and scaling method by 4.1% in term of the accuracy of the optimal
results obtained. Furthermore, Memetic Algorithm converges faster than Genetic Algorithm even as it also produces more
optimal results than Genetic Algorithm produces by a factor of 4.9% when the results obtained from Roulette Wheel
selection were compared for both algorithms. It is however pertinent to state that the time taken by an iteration in Genetic
Algorithm is 35.9% less than the time taken by an iteration in Memetic Algorithm.
Keywords: Optimization, Accuracy, Convergence, Evolutionary Algorithm, Iteration
498
ISSN 2079-8407
http://www.cisjournal.org
1.0 INTRODUCTION
At some point in time, it is entirely necessary to
use some mathematical models to help make decisions
from a list of possible alternate solutions subject to a
constraint. When we use such mathematical models to
select the best alternative out of a large number of
possibilities, then we call such a model a constrained
optimization model. Constrained Optimization is
performed by the operating system kernel in scheduling
policies such as the Shortest Job First, Round Robin,
Highest Response Ratio Next, Shortest Job Next etc
where the trade off is to reduce starving of jobs with
smaller processing time.
Optimization algorithm can be deterministic or
probabilistic. Deterministic optimization algorithms are
most often used if a clear relation between the
characteristics of the possible solution and their utility
for a given problem exists. This includes state space
search, branch and bound, algebraic geometry.
Probabilistic methods, on the other hand, may only
consider those elements of the search space in further
computations that have been selected by the heuristic.
Global optimization is a branch of applied
mathematics and numerical analysis. The goal of global
optimization is to find the best possible elements x from
a set X according to a set of criteria F = {f1, f2, ..., fn}
called objective functions. An abstract formulation of
such problem is to maximize/minimize f(x), x
S R,
iWi
M and 0 Xi
Qi.
The problem is an example of NP (Non
deterministic Polynomial) hard problem in which no
known algorithms that are guaranteed to run in a
polynomial time. Examples include: Travelling salesman
problem, Hamilton Circuit Problem, Bin Packing
Problem, Clique Problem and the knapsack Problem:
Techniques such as Greedy algorithm, backtracking
technique, dynamic programming technique etc. are not
best suited for such problems.
Evolutionary Algorithms have shown to be well
suited for high-quality solutions to larger NP problems
and currently they are the most efficient methods for
finding approximate optimal solution for optimization
problems. They do not involve extensive search
algorithms and do not try to find the best solution, but
they simply generate candidate solution, check in
polynomial time whether it is a solution or not and how
good a solution it is.
In handling constraints in this kind of
optimization problem, different strategies are available
for use. They include: Rejecting Strategy, Repairing
strategy and Penalizing strategy.
Rejecting Strategy discards all infeasible
chromosomes created throughout an evolutionary
process. This is a popular option in many GA. The
method may work reasonably well when the feasible
search space is convex and it constitutes a reasonable
part of the whole search space. However, such an
approach has serious limitations. For example, for many
constrained optimization problems where the initial
population consists of infeasible chromosomes only, it
might be essential to improve them.
In Repairing Strategy, Repairing a chromosome
means to take an infeasible chromosome and generate a
feasible one through some repairing procedure. It
depends on the existence of a deterministic repair
procedure to converting an infeasible offspring into a
feasible one. The weakness of the method is in its
problem dependence. For each particular problem, a
specific repair algorithm is designed to repair infeasible
solution.
499
ISSN 2079-8407
http://www.cisjournal.org
OPERATORS IN GA AND MA
SELECTION
The selection process is based on fitness.
Chromosomes that are evaluated with higher values
(fitter) will most likely be selected to reproduce,
whereas, those with lower values will be discarded. The
fittest chromosomes may be selected several times,
however, the number of chromosomes selected to
reproduce is equal to the population size, therefore,
keeping the size constant for every generation. This
phase has an element of randomness just like the
survival of organisms in nature. The selection methods
implemented and compared in this paper are:
1.
2.
CROSSOVER
Crossover is the process of combining the bits
of one chromosome with those of another. This is to
create an offspring for the next generation that inherits
traits of both parents. Crossover randomly chooses a
locus and exchanges the subsequences before and after
that locus between two chromosomes to create two
offspring.
For example, consider the following parents
and a single crossover point at position 3
Parent 1
100|0111
Parent 2
111|1000
Offspring1
100 1000
Offspring 2
111 0111
The crossover methods implemented and compared in
this paper is the Random Point Crossover.
MUTATION
Mutation is performed after crossover to
prevent all solutions in the population to fall into a local
optimum of solved problem. Mutation changes the new
offspring by flipping bits from 1 to 0 or from 0 to 1.
Mutation can occur at each bit position in the string with
some probability, usually very small (e.g. 0.001). For
example, consider the following chromosome with
mutation point at position 2:
Not mutated chromosome: 1 0 0 0 1 1 1
Mutated: 1 1 0 0 1 1 1
The 0 at position 2 flips to 1 after mutation.
Mutation probability is the percentage of total
number of genes in the population. The mutation
probability controls the probability with which new
genes or memes are introduced into the population for
trial. If the probability is too low, many genes that would
have been useful are never tried out, while if it is too
high, there will be much random perturbation, the
offspring will start losing their resemblance to the
parents and the algorithm will lose the ability to learn
from the history of the search.
However, several mutation operators for integer
encoding have been proposed which includes: inversion
mutation, insertion mutation and displacement mutation.
The operator implemented in this paper is the insertion
mutation.
500
ISSN 2079-8407
http://www.cisjournal.org
V.
LOCAL SEARCH
A local search is a heuristic for solving
optimization problems. There are two common forms of
genetic local search. One features Lamarckian evolution
and the other features the Baldwin effect [1]. They both
use the metaphor that an individual learns (hillclimbs)
during its lifetime (generation).
VI.
VII.
II.
III.
CHROMOSOME
The
chromosomes
in
genetic
algorithms represent the space of candidate
solutions. Possible chromosomes encodings are
binary, permutation, value, and tree encodings.
In binary encoding of the chromosomes, the
chromosomes of composed are BITS (Binary
Digits): 0s and 1s. Problems that can be
formulated using this encoding scheme are
often referred to as 0-1 problems. Value
encoding scheme are problems that take the
value of the parameters being encoded. An
example of optimization problem using the
value encoding is the time table scheduling
using genetic algorithm.
PHENOTYPE
Phenotype is a decoded string. It is the
direct representation of the problem. The
phenotype of a problem is in most cases the
input to the algorithm and also the output of the
algorithm.
GENOTYPE
The elements g
IX.
X.
F.
PROBLEM SPACE
The problem space X of an
optimization problem is the set containing all
elements x which could be its solution. The
problem space is often restricted by logical
constraints and physical constraints.
CANDIDATE SOLUTION
A solution candidate x is an element of
the problem space X of a certain optimization
problem. In evolutionary algorithm a candidate
solution is sometimes called a phenotype.
SOLUTION SPACE
The union of all solutions of an
optimization problem is solution
space S.X*
S
GENE
The
distinguishable
units
of
information in a genotype that encode the
properties of a phenotype are called genes.
R). The
G of the search
ALLELE
An allele is a value of specific gene.
LOCUS
The locus is the position where a
specific gene can be found in a genotype.
OBJECTIVE FUNCTION
An objective function f: X Y with Y
XI.
SEARCH SPACE
The search space G of an optimization
problem is the set of all elements g which can
501
ISSN 2079-8407
http://www.cisjournal.org
XII.
2.3
MEMETIC ALGORITHM
502
ISSN 2079-8407
http://www.cisjournal.org
the population.
b. Local search: search for the best
chromosomes
c. Crossover: Perform crossover on the
2 chromosomes selected.
d. Local search: search for the best
chromosomes
e. Mutation: Perform mutation on the
chromosomes obtained with small
probability.
4. Replace: Replace the current population with
the new population.
5. Test: Test whether the termination condition is
satisfied. If so, stop. If not, return the best
solution in current population and go to Step
2.
Algorithm 2.4: Memetic Algorithm for solving
knapsack problem.
iWi
503
ISSN 2079-8407
http://www.cisjournal.org
REFERENCES
5.0 CONCLUSION
Two Evolutionary Algorithm techniques, that is, Genetic
Algorithm and Memetic Algorithm have been applied to
solve knapsack problem. Their performances were
measured against metrics such as the optimal value of
the objective functions, convergence rate, and accuracy.
504
ISSN 2079-8407
http://www.cisjournal.org
Weight
32
46
20
Benefit
76
90
139
60
Weight
30
45
45
60
30
Benefit
60
70
30
60
50
Weight
54
59
40
67
31
35
Benefit
92
93
80
25.1
71
57
Weight
80
76
82
53
10
40
63
Benefit
129
100
63
131
18
100
150
505
ISSN 2079-8407
http://www.cisjournal.org
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
11
226
3,4
100
28
12
15
226
1,2,4
91.7
16
48
226
1,2,4
93.8
No of
Max
Items
C/R
Time
Acc
Gen.
Fit.
Chosen
atio
88
226
1,2,4
100
16
100
18
100
226
1,2,4
100
16
100
18
100
13
226
1,2,4
100
16
100
Table 4.2.2: Result Obtained from test data on table 3.2 with GA
ROULETTE-WHEEL SELECTION
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
19
130
1,2
100
13
16
25
120
1,2
93.8
32
13
130
1,2
93.8
No of
Max
Items
C/R
Time
Acc
Gen.
Fit.
Chosen
atio
100
130
1,2
100
11
100
16
92
130
1,2
100
15
100
38
100
38
130
1,2
95
35
100
Table 4.2.3 Result Obtained from test data on table 3.3 with GA
ROULETTE-WHEEL SELECTION
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
12
180
229
1,2,5
91.7
16
16
684
229
1,2,5
93.8
24
1710
256
1,2,5
92.7
No of
Max
Items
C/R
Time
Acc
Gen.
Fit.
Chosen
atio
89
23
229
1,2,5
93
25
89
16
89
28
229
1,2,5
100
25
89
21
100
35
229
1,2,5
100
42
89
Table 4.2.4: Result Obtained from test data on table 3.4 with GA
ROULETTE-WHEEL SELECTION
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
14
136
300
4,5,7
93
15
No of
Max
Items
C/R
Gen.
Fit.
Chosen
atio
300
4,5,7
93
Time
Acc
18
100
506
ISSN 2079-8407
http://www.cisjournal.org
18
2216
300
4,5,7
100
17
100
46
300
4,5,7
100
24
100
21
161
300
4,5,7
90.5
22
100
21
300
4,5,7
93
28
100
Pop
No of
Max
Items
C/R
Time
Acc(
No of
Max
Items
C/R
Size
Gen
Fit.
Chosen
atio
(ms)
%)
Gen.
Fit.
Chosen
atio
Time
Acc
11
226
1,2,4
100
10
100
226
1,2,4
100
16
100
12
226
1,2,4
100
20
100
226
1,2,4
100
26
100
16
15
226
1,2,4
93.8
22
100
10
226
1,2,4
93.8
36
100
Table 4.3.2: Result Obtained from test data on table 3.2 with MA
ROULETTE-WHEEL SELECTION
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
(ms)
No of
Max
Items
C/R
Gen.
Fit.
Chosen
atio
Time
Acc
130
1,2
100
100
130
1,2
100
14
100
16
25
130
1,2
100
13
100
11
130
1,2
100
15
100
32
74
130
1,2
93.8
40
100
28
130
1,2
96.9
45
100
Table 4.3.3: Result Obtained from test data on table 3.3 with MA
ROULETTE-WHEEL SELECTION
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
12
30
256
1,2,5
93.8
24
16
217
256
1,2,5
91.6
24
174
256
1,2,5
90.6
No of
Max
Items
C/R
Time
Acc
Gen.
Fit.
Chosen
atio
100
256
1,2,5
100
26
100
26
100
22
256
1,2,5
100
23
100
43
100
39
256
1,2,5
92.3
44
100
Table 4.3.4: Result Obtained from test data on table 3.4 with MA
ROULETTE-WHEEL SELECTION
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
(ms)
14
4,5,7
100
18
300
No of
Max
Items
C/R
Gen.
Fit.
Chosen
atio
300
4,5,7
100
Time
29
Acc
100
507
ISSN 2079-8407
http://www.cisjournal.org
18
29
300
4,5,7
95
23
100
22
300
4,5,7
96
34
100
21
118
300
4,5,7
93.3
28
100
81
300
4,5,7
96
39
100
Table 4.4.1: Result Obtained from test data on table 3.1 with GA and MA
GENETIC ALGORITHM
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
11
226
3,4
100
18
12
15
226
1,2,4
91.7
16
48
226
1,2,4
93.8
MEMETIC ALGORITHM
Acc
No of
Max
Items
C/R
Time
Acc
Gen.
Fit.
Chosen
atio
88
226
1,2,4
100
16
100
18
100
226
1,2,4
100
26
100
18
100
10
226
1,2,4
93.8
36
100
Table 4.4.3: Result Obtained from test data on table 3.3 with GA and MA
GENETIC ALGORITHM
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
19
130
1,2
100
13
16
25
120
1,2
93.8
32
13
130
1,2
93.8
MEMETIC ALGORITHM
Acc
No of
Max
Items
C/R
Gen.
Fit.
Chosen
atio
100
130
1,2
100
100
16
92
25
130
1,2
100
34
100
38
100
74
130
1,2
93.8
40
100
GENETIC ALGORITHM
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
16
180
229
1,2,5
91.7
16
24
684
229
1,2,5
93.8
32
1710
256
1,2,5
92.7
Time
Acc
MEMETIC ALGORITHM
Acc
No of
Max
Items
C/R
Time
Acc
Gen.
Fit.
Chosen
atio
89
256
1,2,5
100
26
100
16
89
22
256
1,2,5
100
23
100
21
100
39
256
1,2,5
92.3
44
100
Table 4.4.4: Result Obtained from test data on table 3.4 with GA and MA
GENETIC ALGORITHM
Pop
No of
Max
Items
C/R
Time
Size
Gen
Fit.
Chosen
atio
Used
10
136
300
4,5,7
93
15
20
2216
300
4,5,7
100
30
161
300
4,5,7
90.5
MEMETIC ALGORITHM
Acc
No of
Max
Items
C/R
Time
Acc
Gen.
Fit.
Chosen
atio
100
300
4,5,7
100
29
100
17
100
22
300
4,5,7
96
34
100
22
100
81
300
4,5,7
96
39
100
508