
VOL. 2, NO. 10, October 2011    ISSN 2079-8407
Journal of Emerging Trends in Computing and Information Sciences
© 2009-2011 CIS Journal. All rights reserved.
http://www.cisjournal.org

Comparative Assessment of Genetic and Memetic Algorithms


H. A. Sanusi, A. Zubair, R. O. Oladele
Department of Computer Science
University of Ilorin, P. M. B 1515
Ilorin, NIGERIA
ahmedsanusi@gmail.com, ahmzu2@yahoo.co.uk, roladele@yahoo.com

ABSTRACT
This work investigates the performance of two Evolutionary Algorithms, the Genetic Algorithm and the Memetic Algorithm, on a constrained optimization problem. In particular, a knapsack problem was solved using the two algorithms and their results were compared. Two selection techniques were used for both algorithms. The results of the comparative analysis show that the Roulette-Wheel selection method outperforms the Ranking and Scaling method by 4.1% in terms of the accuracy of the optimal results obtained. Furthermore, the Memetic Algorithm converges faster than the Genetic Algorithm, and it also produces results that are more optimal than those of the Genetic Algorithm by a factor of 4.9% when the Roulette-Wheel results of the two algorithms are compared. It is, however, pertinent to state that an iteration of the Genetic Algorithm takes 35.9% less time than an iteration of the Memetic Algorithm.
Keywords: Optimization, Accuracy, Convergence, Evolutionary Algorithm, Iteration



1.0 INTRODUCTION
It is often necessary to use mathematical models to help make decisions from a list of possible alternative solutions subject to a constraint. When such a model is used to select the best alternative out of a large number of possibilities, it is called a constrained optimization model. Constrained optimization is performed by the operating system kernel in scheduling policies such as Shortest Job First, Round Robin and Highest Response Ratio Next, where the trade-off is to reduce starvation of jobs with smaller processing times.
Optimization algorithms can be deterministic or probabilistic. Deterministic optimization algorithms are most often used when a clear relation exists between the characteristics of a possible solution and its utility for the given problem. These include state space search, branch and bound, and algebraic geometry. Probabilistic methods, on the other hand, may only consider those elements of the search space in further computations that have been selected by a heuristic.
Global optimization is a branch of applied mathematics and numerical analysis. The goal of global optimization is to find the best possible elements x from a set X according to a set of criteria F = {f1, f2, ..., fn} called objective functions. An abstract formulation of such a problem is to maximize/minimize f(x), x ∈ S ⊆ R^n, where R^n denotes the space of real n-component vectors x, and f is a real-valued function defined on S. If S consists only of vectors whose elements are integers, then the problem is one of integer programming. One of the common examples of an integer programming problem is the Knapsack Problem.
The knapsack problem is concerned with a knapsack that has a positive integer weight (or capacity) M. There are n distinct items that may potentially be placed in the knapsack. Item i has a positive integer weight Wi and a positive integer benefit Bi. In addition, there are Qi copies of item i available, where the quantity Qi is a positive integer satisfying Qi ≥ 1. Let Xi determine how many copies of item i are to be placed into the knapsack. The goal is to:

Maximize  Σ BiXi

subject to the constraints

Σ WiXi ≤ M  and  0 ≤ Xi ≤ Qi.

Where the value of Xi is restricted to either 0 or 1, the problem is termed the 0/1 knapsack problem.
The problem is an example of an NP-hard (Non-deterministic Polynomial-time hard) problem, for which no known algorithm is guaranteed to run in polynomial time. Examples include the Travelling Salesman Problem, the Hamiltonian Circuit Problem, the Bin Packing Problem, the Clique Problem and the Knapsack Problem. Techniques such as the greedy algorithm, backtracking and dynamic programming are not best suited for such problems.
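The 0/1 formulation above can be made concrete with a short evaluation routine. The paper's implementation is in Java; the following is only an illustrative sketch (the class and method names are ours, not the paper's code), using the instance of Table 3.2 (five items, capacity 95 kg):

```java
// Sketch of 0/1 knapsack chromosome evaluation; names are illustrative.
public class Knapsack {
    // Total benefit of chromosome x, or 0 if the capacity constraint is violated.
    public static int fitness(int[] x, int[] weight, int[] benefit, int capacity) {
        int totalWeight = 0, totalBenefit = 0;
        for (int i = 0; i < x.length; i++) {
            totalWeight += x[i] * weight[i];
            totalBenefit += x[i] * benefit[i];
        }
        return totalWeight <= capacity ? totalBenefit : 0;
    }

    public static void main(String[] args) {
        int[] w = {30, 45, 45, 60, 30};   // weights from Table 3.2 (kg)
        int[] b = {60, 70, 30, 60, 50};   // benefits from Table 3.2 (points)
        // Chromosome 11000 selects items 1 and 2: weight 75 kg, benefit 130.
        System.out.println(fitness(new int[]{1, 1, 0, 0, 0}, w, b, 95)); // prints 130
    }
}
```

The chromosome 11000 evaluates to 130, which matches the expected optimal value of the objective function for Table 3.2.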
Evolutionary Algorithms have been shown to be well suited for finding high-quality solutions to larger NP problems, and they are currently among the most efficient methods for finding approximately optimal solutions to optimization problems. They do not involve extensive search and do not try to enumerate the best solution: they simply generate candidate solutions, check in polynomial time whether each is a solution, and evaluate how good a solution it is.
In handling constraints in this kind of
optimization problem, different strategies are available
for use. They include: Rejecting Strategy, Repairing
strategy and Penalizing strategy.
The Rejecting Strategy discards all infeasible chromosomes created throughout the evolutionary process. This is a popular option in many GAs. The method may work reasonably well when the feasible search space is convex and constitutes a reasonable part of the whole search space. However, such an approach has serious limitations. For example, for many constrained optimization problems where the initial population consists of infeasible chromosomes only, it might be essential to improve them.
In the Repairing Strategy, repairing a chromosome means taking an infeasible chromosome and generating a feasible one through some repairing procedure. The strategy depends on the existence of a deterministic repair procedure for converting an infeasible offspring into a feasible one. The weakness of the method is its problem dependence: for each particular problem, a specific repair algorithm must be designed to repair infeasible solutions.


The strategies explained above have the advantage that they never generate infeasible solutions, but the disadvantage that they consider no points outside the feasible regions. For a highly constrained problem, infeasible solutions may take up a relatively big portion of the population. In such a case, feasible solutions may be difficult to find if the genetic search is confined to the feasible regions.
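As a hypothetical illustration of the repairing strategy (this particular procedure is our sketch, not taken from the paper), an infeasible 0/1 chromosome can be repaired by greedily dropping the selected item with the worst benefit-to-weight ratio until the capacity constraint holds:

```java
// Illustrative greedy repair of an infeasible 0/1 knapsack chromosome.
public class Repair {
    public static int[] repair(int[] x, int[] weight, int[] benefit, int capacity) {
        int[] y = x.clone();
        int total = 0;
        for (int i = 0; i < y.length; i++) total += y[i] * weight[i];
        while (total > capacity) {
            // Find the selected item with the lowest benefit/weight ratio.
            int worst = -1;
            double worstRatio = Double.POSITIVE_INFINITY;
            for (int i = 0; i < y.length; i++) {
                double ratio = (double) benefit[i] / weight[i];
                if (y[i] == 1 && ratio < worstRatio) {
                    worstRatio = ratio;
                    worst = i;
                }
            }
            y[worst] = 0;             // drop it and update the running weight
            total -= weight[worst];
        }
        return y;
    }
}
```

On the Table 3.2 instance, the infeasible chromosome 11100 (weight 120 kg) is repaired to the feasible 11000 by dropping item 3, which has the lowest benefit-to-weight ratio.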

2.0 METHODS USED

2.1 OPERATORS IN GA AND MA

SELECTION
The selection process is based on fitness. Chromosomes that are evaluated with higher values (fitter) will most likely be selected to reproduce, whereas those with lower values will be discarded. The fittest chromosomes may be selected several times; however, the number of chromosomes selected to reproduce is equal to the population size, keeping the size constant for every generation. This phase has an element of randomness, just like the survival of organisms in nature. The selection methods implemented and compared in this paper are:

1. Roulette Wheel Selection
Proposed by Holland, the basic idea is to determine a selection (or survival) probability for each chromosome proportional to its fitness value. A model roulette wheel can then be made displaying these probabilities. The selection process is based on spinning the wheel a number of times equal to the population size, each spin selecting a single chromosome for the new population.

2. Ranking and Scaling Selection
These methods were proposed to mitigate the problems of fitness-proportional selection. The scaling method maps raw objective function values to positive real values, and the survival probability for each chromosome is determined according to these values. Fitness scaling has a twofold intention:
- To maintain a reasonable differential between relative fitness ratings of chromosomes.
- To prevent too-rapid takeover by some super-chromosomes, in order to limit competition early but stimulate it later.
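The spinning-wheel idea can be sketched as follows (a minimal illustration, not the paper's implementation):

```java
import java.util.Random;

// Sketch of roulette-wheel selection: chromosome i is chosen with
// probability fitness[i] / (sum of all fitness values).
public class RouletteWheel {
    public static int select(double[] fitness, Random rng) {
        double total = 0;
        for (double f : fitness) total += f;
        double spin = rng.nextDouble() * total;   // one spin of the wheel
        double cumulative = 0;
        for (int i = 0; i < fitness.length; i++) {
            cumulative += fitness[i];
            if (spin < cumulative) return i;
        }
        return fitness.length - 1;                // guard against rounding error
    }
}
```

Spinning the wheel a population-size number of times yields the new population; a chromosome holding 90% of the total fitness is expected to be drawn in roughly 90% of the spins.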

CROSSOVER
Crossover is the process of combining the bits of one chromosome with those of another, to create offspring for the next generation that inherit traits of both parents. Crossover randomly chooses a locus and exchanges the subsequences before and after that locus between two chromosomes to create two offspring.
For example, consider the following parents and a single crossover point at position 3:

Parent 1:    100|0111
Parent 2:    111|1000
Offspring 1: 100 1000
Offspring 2: 111 0111

The crossover method implemented and compared in this paper is the Random Point Crossover.
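The example above corresponds directly to a one-point crossover routine (an illustrative sketch):

```java
// Sketch of single-point crossover on bit-string chromosomes:
// the tails after the cut point are exchanged between the two parents.
public class Crossover {
    public static String[] crossover(String parent1, String parent2, int point) {
        String child1 = parent1.substring(0, point) + parent2.substring(point);
        String child2 = parent2.substring(0, point) + parent1.substring(point);
        return new String[]{child1, child2};
    }
}
```

With the parents above, `crossover("1000111", "1111000", 3)` yields the offspring `1001000` and `1110111`.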
MUTATION
Mutation is performed after crossover to prevent all solutions in the population from falling into a local optimum of the solved problem. Mutation changes the new offspring by flipping bits from 1 to 0 or from 0 to 1. Mutation can occur at each bit position in the string with some probability, usually very small (e.g. 0.001). For example, consider the following chromosome with a mutation point at position 2:

Unmutated chromosome: 1 0 0 0 1 1 1
Mutated chromosome:   1 1 0 0 1 1 1

The 0 at position 2 flips to 1 after mutation.
The mutation probability is expressed as a percentage of the total number of genes in the population. It controls the probability with which new genes or memes are introduced into the population for trial. If the probability is too low, many genes that would have been useful are never tried out; if it is too high, there will be much random perturbation, the offspring will start losing their resemblance to the parents, and the algorithm will lose the ability to learn from the history of the search.
Several mutation operators for integer encoding have been proposed, including inversion mutation, insertion mutation and displacement mutation. The operator implemented in this paper is the insertion mutation.
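Bit-flip mutation at a given per-bit probability can be sketched as follows (illustrative only; the paper's implemented operator for integer encoding is insertion mutation):

```java
import java.util.Random;

// Sketch of bit-flip mutation: each bit flips independently with probability p.
public class Mutation {
    public static char[] mutate(char[] chromosome, double p, Random rng) {
        char[] result = chromosome.clone();
        for (int i = 0; i < result.length; i++)
            if (rng.nextDouble() < p)
                result[i] = (result[i] == '0') ? '1' : '0';
        return result;
    }
}
```

With p = 0 the chromosome is returned unchanged; with p = 1 every bit is flipped, which illustrates why very small probabilities such as 0.001 are used in practice.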

500

VOL. 2, NO. 10, October 2011

ISSN 2079-8407

Journal of Emerging Trends in Computing and Information Sciences


2009-2011 CIS Journal. All rights reserved.

http://www.cisjournal.org

LOCAL SEARCH
A local search is a heuristic for solving optimization problems. There are two common forms of genetic local search. One features Lamarckian evolution and the other features the Baldwin effect [1]. They both use the metaphor that an individual learns (hill-climbs) during its lifetime (generation).

2.2 GA AND MA VOCABULARY

The vocabulary comprises the basic elements of GA and MA, representing the smallest units of information to which genetic operators are applied. The elements include:

I. CHROMOSOME
The chromosomes in genetic algorithms represent the space of candidate solutions. Possible chromosome encodings are binary, permutation, value, and tree encodings. In binary encoding, the chromosomes are composed of BITS (Binary Digits): 0s and 1s. Problems that can be formulated using this encoding scheme are often referred to as 0-1 problems. In value encoding, the chromosomes take the values of the parameters being encoded. An example of an optimization problem using value encoding is timetable scheduling with a genetic algorithm.

II. PHENOTYPE
A phenotype is a decoded string. It is the direct representation of the problem. The phenotype of a problem is in most cases both the input to the algorithm and the output of the algorithm.

III. GENOTYPE
The elements g ∈ G of the search space G of a given optimization problem are called the genotypes. The genotype is an encoding of the problem from its phenotypic representation, also known as a coded string. The algorithm processes the genotypic representation of the problem.

IV. ALLELE
An allele is a value of a specific gene.

V. LOCUS
The locus is the position where a specific gene can be found in a genotype.

VI. GENE
The distinguishable units of information in a genotype that encode the properties of a phenotype are called genes.

VII. OBJECTIVE FUNCTION
An objective function f: X → Y with Y ⊆ R is a mathematical function which is subject to optimization. The codomain Y of an objective function, as well as its range, must be a subset of the real numbers (Y ⊆ R). The domain X of f is called the problem space and can represent any type of elements such as numbers, lists, construction plans, and so on. It is chosen according to the problem to be solved with the optimization process. Objective functions are not necessarily mere mathematical expressions, but can be complex algorithms that, for example, involve multiple simulations. Global optimization comprises all techniques that can be used to find the best elements x in X with respect to such criteria f ∈ F.

VIII. PROBLEM SPACE
The problem space X of an optimization problem is the set containing all elements x which could be its solution. The problem space is often restricted by logical constraints and physical constraints.

IX. CANDIDATE SOLUTION
A candidate solution x is an element of the problem space X of a certain optimization problem. In evolutionary algorithms a candidate solution is sometimes called a phenotype.

X. SOLUTION SPACE
The union of all solutions of an optimization problem is the solution space S: X* ⊆ S ⊆ X, where X* is the global optimal set and S contains the candidate solutions in the problem space.

XI. SEARCH SPACE
The search space G of an optimization problem is the set of all elements g which can be processed by the search operations. In Genetic Algorithms the search space is often referred to as the Genome. "Genome" was coined by the German biologist Winkler as a portmanteau of the words gene and chromosome. The genome is the whole hereditary information of an organism.

XII. SEARCH OPERATIONS
The search operations are used by optimization algorithms in order to explore the search space G.

2.3 GENETIC ALGORITHM

Genetic algorithms (GAs) are a subclass of


evolutionary algorithms introduced by John Holland at
the University of Michigan, based on Darwin's theory of
natural selection where the elements of the search space
G are binary strings (G = B*) or arrays of other
elementary types. GA is a stochastic global search
method that simulates the metaphor of natural biological
evolution. GA operates on a population of potential
solutions applying the principle of survival of the fittest
to produce (hopefully) better approximations to a
solution. At each generation, a new set of approximations
is created by the process of selecting individuals
according to their level of fitness in the problem domain
and breeding them together using operators (crossover
and mutation) borrowed from natural genetics. This
process leads to the evolution of population of individuals
that are better suited to their environment than the
individuals that they were created from, just as in natural
adaptation.
GA is used to perform global exploration among
a population while heuristic methods are used to perform
local exploitation around the chromosomes. The
behaviors of GA are characterized by the balance
between exploitation and exploration in the search space.
The balance is strongly affected by the strategy
parameters such as population size, maximum generation,
crossover probability and mutation probability.
In an iteration or generation of the genetic algorithm, the selection, crossover and mutation operators operate on the population to produce a new population that constitutes the new generation.
An outline of the algorithm is given below:
1. Start: Randomly generate a population of N
chromosomes.
2. Fitness: Calculate the fitness of all
chromosomes.

3. Create a new population:


a. Selection: According to the selection
method implemented, select 2
chromosomes from the population.
b. Crossover: Perform crossover on the
2 chromosomes selected.
c. Mutation: Perform mutation on the
chromosomes obtained.
4. Replace: Replace the current population with
the new population.
5. Test: Test whether the termination condition is satisfied. If so, stop and return the best solution in the current population; if not, go to Step 2.
Algorithm 2.3: Genetic Algorithm for Solving
Knapsack Problem.
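Steps 1-5 of Algorithm 2.3 can be sketched compactly for the 0/1 knapsack. This is only an illustrative sketch: the population size, mutation rate, random seed, and the zero-fitness treatment of infeasible chromosomes are our assumptions, not the paper's exact settings.

```java
import java.util.Random;

// Compact sketch of Algorithm 2.3 (GA for the 0/1 knapsack).
public class SimpleGA {
    static final Random RNG = new Random(7);

    static int fitness(int[] x, int[] w, int[] b, int cap) {
        int wt = 0, val = 0;
        for (int i = 0; i < x.length; i++) { wt += x[i] * w[i]; val += x[i] * b[i]; }
        return wt <= cap ? val : 0;   // infeasible chromosomes score zero
    }

    static int[] roulette(int[][] pop, int[] fit) {
        int total = 0;
        for (int f : fit) total += f;
        if (total == 0) return pop[RNG.nextInt(pop.length)]; // all infeasible
        int spin = RNG.nextInt(total), acc = 0;
        for (int i = 0; i < pop.length; i++) {
            acc += fit[i];
            if (spin < acc) return pop[i];
        }
        return pop[pop.length - 1];
    }

    public static int[] run(int[] w, int[] b, int cap, int popSize, int gens) {
        int n = w.length;
        int[][] pop = new int[popSize][n];
        for (int[] c : pop)                                   // Step 1: random start
            for (int i = 0; i < n; i++) c[i] = RNG.nextInt(2);
        for (int g = 0; g < gens; g++) {
            int[] fit = new int[popSize];                     // Step 2: fitness
            for (int i = 0; i < popSize; i++) fit[i] = fitness(pop[i], w, b, cap);
            int[][] next = new int[popSize][];
            for (int k = 0; k < popSize; k++) {               // Step 3: selection,
                int[] p1 = roulette(pop, fit);                //   crossover, mutation
                int[] p2 = roulette(pop, fit);
                int cut = 1 + RNG.nextInt(n - 1);
                int[] child = new int[n];
                for (int i = 0; i < n; i++) child[i] = (i < cut) ? p1[i] : p2[i];
                for (int i = 0; i < n; i++)
                    if (RNG.nextDouble() < 0.01) child[i] = 1 - child[i];
                next[k] = child;
            }
            pop = next;                                       // Step 4: replace
        }
        int[] best = pop[0];                                  // Step 5: best found
        for (int[] c : pop)
            if (fitness(c, w, b, cap) > fitness(best, w, b, cap)) best = c;
        return best;
    }
}
```

Here infeasible chromosomes are handled by the rejecting idea in its simplest form (fitness zero); the repairing and penalizing strategies of Section 1.0 would replace that single line.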
2.4

MEMETIC ALGORITHM

Memetic algorithm (MA) is motivated


by Dawkins's notion of a meme as a unit of information
that reproduces itself as people exchange ideas [2]. A key
difference exists between genes and memes. Before a
meme is passed on, it is typically adapted by the person
who transmits it as that person thinks, understands and
processes the meme, whereas genes get passed on whole.
Moscato and Norman linked this thinking to local
refinement, and therefore promoted the term memetic
algorithm to describe genetic algorithms that use local
search heavily [3]. Radcliffe and Surry gave a formal
description of memetic algorithms [4], which provided a
homogeneous formal framework for considering memetic
and genetic algorithms. According to Radcliffe and Surry, if a local
optimizer is added to a GA and applied to every child
before it is inserted into the population, then a memetic
algorithm can be thought of simply as a special kind of
genetic search over the subspace of local optima.
Recombination and mutation will usually produce
solutions that are outside this space of local optima, but a
local optimizer can then repair such solutions to produce
final children that lie within this subspace.
An outline of memetic algorithm is given below:
1. Start: Randomly generate a population of N
chromosomes.
2. Fitness: Calculate the fitness of all
chromosomes.
3. Create a new population:
a. Selection: According to the selection
method, select 2 chromosomes from


the population.
b. Local search: Improve the selected chromosomes by local search.
c. Crossover: Perform crossover on the 2 chromosomes selected.
d. Local search: Improve the offspring by local search.
e. Mutation: Perform mutation on the chromosomes obtained, with small probability.
4. Replace: Replace the current population with the new population.
5. Test: Test whether the termination condition is satisfied. If so, stop and return the best solution in the current population; if not, go to Step 2.
Algorithm 2.4: Memetic Algorithm for solving
knapsack problem.
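The local-search steps (3b and 3d) can be illustrated with a simple bit-flip hill-climber. This particular procedure is our sketch, not necessarily the local search used in the paper:

```java
// Illustrative bit-flip hill-climbing local search for the 0/1 knapsack:
// keep flipping any single bit that improves fitness until none does.
public class LocalSearch {
    static int fitness(int[] x, int[] weight, int[] benefit, int capacity) {
        int wt = 0, val = 0;
        for (int i = 0; i < x.length; i++) {
            wt += x[i] * weight[i];
            val += x[i] * benefit[i];
        }
        return wt <= capacity ? val : 0;   // infeasible chromosomes score zero
    }

    public static int[] improve(int[] x, int[] weight, int[] benefit, int capacity) {
        int[] current = x.clone();
        int best = fitness(current, weight, benefit, capacity);
        boolean improved = true;
        while (improved) {
            improved = false;
            for (int i = 0; i < current.length; i++) {
                current[i] = 1 - current[i];               // try flipping bit i
                int f = fitness(current, weight, benefit, capacity);
                if (f > best) { best = f; improved = true; }
                else current[i] = 1 - current[i];          // revert: no improvement
            }
        }
        return current;
    }
}
```

On the Table 3.2 instance, hill-climbing from the empty knapsack 00000 reaches 11000 with fitness 130, the expected optimum. Embedding such a step after selection and crossover is what distinguishes this MA sketch from the plain GA.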

3.0 EXPERIMENT DESCRIPTION


This section discusses the experimental setup
that is used to assess the performance of the two
algorithms as they are applied to the knapsack problem.
The algorithms implemented (Algorithms 2.3 and 2.4) are as explained in Section 2 of this paper.
The algorithms were implemented using
different test cases and test data. The population sizes
are properly selected in order to explore and exploit the
search space.
In order to obtain representative computing times, it is necessary to generate many problem instances for a fixed n, obtain computing times for these instances, and compute their average.
The problem instances are designed for 4, 5, 6 and 7 items, and the size of the initial population is carefully selected. The instances are shown in Tables 3.1 to 3.4 below, representing 4, 5, 6 and 7 items respectively. In each problem instance, every object has two associated parameters, weight and benefit. The weight is measured in kilograms while the benefit is measured in points.
The performance parameters used for assessing
the performance of the two algorithms include the
capacity of the knapsack, the optimal value of the

objective function, the value of the Constrained function


(i.e. the size of the knapsack filled) and other parameters
as explained below:
TIME
This is the average time a generation takes to
execute the genetic or memetic operators within the
defined population size. It is measured in milliseconds
(ms).
CONVERGENCY RATIO
This is the number of converged chromosomes multiplied by one hundred and divided by the total number of chromosomes in the population. It is expressed as a percentage (%).
MAXIMUM FITNESS FOUND
This is the phenotype equivalent of a particular
chromosome that gives the optimal solution at a
particular generation.
NUMBER OF GENERATION
This is the number of iterations the algorithm
takes before it converges at the termination condition.
ACCURACY OF RESULT
The accuracy of the results obtained is a measure of the variation between the observed optimal result and the expected optimal result. Mathematically, it is calculated as the observed result divided by the expected result, multiplied by 100, and expressed as a percentage (%).
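The two ratio metrics defined above reduce to simple percentages; a hypothetical helper makes the definitions explicit:

```java
// Hypothetical helpers for the two percentage metrics defined above.
public class Metrics {
    // Converged chromosomes as a percentage of the population size.
    public static double convergencyRatio(int converged, int populationSize) {
        return 100.0 * converged / populationSize;
    }
    // Observed optimal result as a percentage of the expected optimal result.
    public static double accuracy(double observed, double expected) {
        return 100.0 * observed / expected;
    }
}
```

For example, 15 converged chromosomes in a population of 16 give a convergency ratio of 93.75%, and an observed optimum of 226 against an expected optimum of 226 gives 100% accuracy.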
The two algorithms were written in the JAVA programming language and run on a Toshiba computer system with the Windows 7 operating system.

4.0 DISCUSSION OF RESULTS


The results of the experiments are shown in Tables 4.2.1 through 4.4.4, representing the results obtained from the problem instances given in Tables 3.1 to 3.4 of Section 3.0.
The main objective of the experiment was to measure the performance of each algorithm for a varying data set n, in terms of the optimal value of the objective function (the maximum profit obtained, Σ BiXi), the constrained function value (the maximum capacity of knapsack filled, Σ WiXi), and the average time taken to complete an iteration, where Bi and Wi are the benefits and weights respectively. The tables of results are given below.


The results of the experiment show that the Ranking and Scaling method converges after a smaller number of iterations than the Roulette-Wheel method, for both the memetic and the genetic algorithms. But when timing is the bottleneck, Roulette-Wheel selection takes less time to complete an iteration than Ranking and Scaling.

The results of the experiments also show that Roulette-Wheel selection outperforms Ranking and Scaling selection by 4.1% in terms of the accuracy of the optimal result, for both GA and MA. The results further show that the memetic algorithm is more efficient than the genetic algorithm in terms of accuracy and the optimal value of the objective function. Consequently, the memetic algorithm is better suited to handling NP-hard problems.

Since we are more interested in getting an


optimal value of the objective function within a
reasonable amount of time, then the Roulette wheel
selection algorithm is much more efficient than the
Ranking and Scaling algorithm.

In the comparative analysis between the genetic and memetic algorithms, the Roulette-Wheel results of the two algorithms are compared, since they produce the best optimal solutions. The justification below explains the conclusions drawn from the two comparisons.
The two algorithms (genetic and memetic)
converge to the optimal results with 100% accuracy
when the population size is about 2.5 times the number
of items available in the knapsack. When the population
size is greater than 2.5 times the number of items, the
accuracy of the optimal results obtained becomes stable
at the global optimum but this requires more iterations
since the size of the search space will increase for both
algorithms.
Also, a major difference between the algorithms is that the time it takes the memetic algorithm to complete an iteration is much more than the time it takes the genetic algorithm to complete the same iteration. This is because of the local search that is embedded in the memetic algorithm.
However, it is safe to state that the memetic algorithm is more efficient than the genetic algorithm, since it is guaranteed to converge to an optimal result within a reasonable number of iterations regardless of the initial population size.

5.0 CONCLUSION
Two Evolutionary Algorithm techniques, the Genetic Algorithm and the Memetic Algorithm, have been applied to solve the knapsack problem. Their performances were measured against metrics such as the optimal value of the objective function, convergence rate, and accuracy.

REFERENCES

[1] P.B. Shola (2003), Logic and Program Verification and Development, p. 82.
[2] Belanger, N., Desaulniers, G., Soumis, F. &
Desrosiers, J. (2006). Periodic airline fleet
assignment with time windows, spacing constraints,
and time dependent revenues, European Journal of
Operational Research, 175(3), 1754-1766.
[3] Belanger, N., Desaulniers, G., Soumis, F.,
Desrosiers, J. & Lavigne, J. (2006). Weekly airline
fleet assignment with homogeneity, Transportation
Research Part B: Methodological, 40(4), 306-318.
[4] Stojkovic, M.,& Soumis, F. (2001). An
OptimizationModel for the Simultaneous
Operational Flight and Pilot Scheduling Problem,
Management Science, 47(9), 1290-1305
[5] Darrell, W, A Genetic Algorithm Tutorial,
Computer Science Department, Colorado State
University.
[6] Deitel P.J. and H.M. (2007), JAVA How to
Program, Seventh Edition. Pearson International
Edition.
[7] Edemenang E. and Ahmed M. (1994), Algorithm
Design Techniques: Case Study the Knapsack
Problem. The Journal of Computer Science and its
Applications (JCSA). 5(1):57-65.
[8] Francis A. (2007), Charles Darwin and the Origin of
Species, pp 2-94.
[9] Hristakeva M. and Shrestha D., Solving the 0-1
Knapsack Problem with Genetic Algorithms
Computer Science Department, Simpson College
[10] Kim, K. H. & Kim, H. (1998). The optimal determination of the space requirement & the number of transfer cranes for import containers, Computers & Industrial Engineering, 35, 427-430.
[11] Mitsuo G., Runwei C., Lin L. (2008), Network
Models and Optimization Multiobjective Genetic
Algorithm Approach, pp 1-44.
[12] Thomas Weise (2009), Global Optimization Algorithms: Theory and Application, pp 1-88.


Table 3.1: 4 Items table

Item:        1    2    3    4
Weight (kg): 32   46   -    20
Benefit:     76   90   139  60

Knapsack Capacity = 100 Kilograms
Optimal Result Expected = 1101
Result Interpretation = Items 1, 2 and 4
Constraint Function Value = 98 Kilograms
Optimal Value of the Objective Function = 226

Table 3.2: 5 Items table

Item:        1    2    3    4    5
Weight (kg): 30   45   45   60   30
Benefit:     60   70   30   60   50

Knapsack Capacity = 95 Kilograms
Optimal Result Expected = 11000
Result Interpretation = Items 1 and 2
Constraint Function Value = 75 Kilograms
Optimal Value of the Objective Function = 130

Table 3.3: 6 Items table

Item:        1    2    3    4     5    6
Weight (kg): 54   59   40   67    31   35
Benefit:     92   93   80   25.1  71   57

Knapsack Capacity = 145 Kilograms
Optimal Result Expected = 110010
Result Interpretation = Items 1, 2 and 5
Constraint Function Value = 144 Kilograms
Optimal Value of the Objective Function = 256

Table 3.4: 7 Items table

Item:        1    2    3    4    5    6    7
Weight (kg): 80   76   82   53   10   40   63
Benefit:     129  100  63   131  18   100  150

Knapsack Capacity = 130 Kilograms
Optimal Result Expected = 0001101
Result Interpretation = Items 4, 5 and 7
Constraint Function Value = 126 Kilograms
Optimal Value of the Objective Function = 299
Table 4.2.1: Result Obtained from test data on table 3.1 with GA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
11       | -         | 226      | 3,4          | 100         | 28        | 88
12       | 15        | 226      | 1,2,4        | 91.7        | 18        | 100
16       | 48        | 226      | 1,2,4        | 93.8        | 18        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
11       | -         | 226      | 1,2,4        | 100         | 16        | 100
12       | -         | 226      | 1,2,4        | 100         | 16        | 100
16       | 13        | 226      | 1,2,4        | 100         | 16        | 100

Table 4.2.2: Result Obtained from test data on table 3.2 with GA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
-        | 19        | 130      | 1,2          | 100         | 13        | 100
16       | 25        | 120      | 1,2          | 93.8        | 16        | 92
32       | 13        | 130      | 1,2          | 93.8        | 38        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
-        | -         | 130      | 1,2          | 100         | 11        | 100
16       | -         | 130      | 1,2          | 100         | 15        | 100
32       | 38        | 130      | 1,2          | 95          | 35        | 100

Table 4.2.3: Result Obtained from test data on table 3.3 with GA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
12       | 180       | 229      | 1,2,5        | 91.7        | 16        | 89
16       | 684       | 229      | 1,2,5        | 93.8        | 16        | 89
24       | 1710      | 256      | 1,2,5        | 92.7        | 21        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
12       | 23        | 229      | 1,2,5        | 93          | 25        | 89
16       | 28        | 229      | 1,2,5        | 100         | 25        | 89
24       | 35        | 229      | 1,2,5        | 100         | 42        | 89

Table 4.2.4: Result Obtained from test data on table 3.4 with GA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
14       | 136       | 300      | 4,5,7        | 93          | 15        | 100
18       | 2216      | 300      | 4,5,7        | 100         | 17        | 100
21       | 161       | 300      | 4,5,7        | 90.5        | 22        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
14       | -         | 300      | 4,5,7        | 93          | 18        | 100
18       | 46        | 300      | 4,5,7        | 100         | 24        | 100
21       | 21        | 300      | 4,5,7        | 93          | 28        | 100

4.3 RESULTS FROM MA IMPLEMENTATION

The results obtained when the memetic algorithm is applied to the same type of knapsack problem are depicted in the tables below:

Table 4.3.1: Result Obtained from test data on table 3.1 with MA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
11       | -         | 226      | 1,2,4        | 100         | 10        | 100
12       | -         | 226      | 1,2,4        | 100         | 20        | 100
16       | 15        | 226      | 1,2,4        | 93.8        | 22        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
11       | -         | 226      | 1,2,4        | 100         | 16        | 100
12       | -         | 226      | 1,2,4        | 100         | 26        | 100
16       | 10        | 226      | 1,2,4        | 93.8        | 36        | 100

Table 4.3.2: Result Obtained from test data on table 3.2 with MA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
-        | -         | 130      | 1,2          | 100         | -         | 100
16       | 25        | 130      | 1,2          | 100         | 13        | 100
32       | 74        | 130      | 1,2          | 93.8        | 40        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
-        | -         | 130      | 1,2          | 100         | 14        | 100
16       | 11        | 130      | 1,2          | 100         | 15        | 100
32       | 28        | 130      | 1,2          | 96.9        | 45        | 100

Table 4.3.3: Result Obtained from test data on table 3.3 with MA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
12       | 30        | 256      | 1,2,5        | 93.8        | 24        | 100
16       | 217       | 256      | 1,2,5        | 91.6        | 26        | 100
24       | 174       | 256      | 1,2,5        | 90.6        | 43        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
12       | -         | 256      | 1,2,5        | 100         | 26        | 100
16       | 22        | 256      | 1,2,5        | 100         | 23        | 100
24       | 39        | 256      | 1,2,5        | 92.3        | 44        | 100

Table 4.3.4: Result Obtained from test data on table 3.4 with MA

ROULETTE-WHEEL SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
14       | -         | 300      | 4,5,7        | 100         | 18        | 100
18       | 29        | 300      | 4,5,7        | 95          | 23        | 100
21       | 118       | 300      | 4,5,7        | 93.3        | 28        | 100

RANKING AND SCALING SELECTION
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
14       | -         | 300      | 4,5,7        | 100         | 29        | 100
18       | 22        | 300      | 4,5,7        | 96          | 34        | 100
21       | 81        | 300      | 4,5,7        | 96          | 39        | 100

Table 4.4.1: Result Obtained from test data on table 3.1 with GA and MA

GENETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
11       | -         | 226      | 3,4          | 100         | 18        | 88
12       | 15        | 226      | 1,2,4        | 91.7        | 18        | 100
16       | 48        | 226      | 1,2,4        | 93.8        | 18        | 100

MEMETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
11       | -         | 226      | 1,2,4        | 100         | 16        | 100
12       | -         | 226      | 1,2,4        | 100         | 26        | 100
16       | 10        | 226      | 1,2,4        | 93.8        | 36        | 100

Table 4.4.2: Result Obtained from test data on table 3.2 with GA and MA

GENETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
-        | 19        | 130      | 1,2          | 100         | 13        | 100
16       | 25        | 120      | 1,2          | 93.8        | 16        | 92
32       | 13        | 130      | 1,2          | 93.8        | 38        | 100

MEMETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
-        | -         | 130      | 1,2          | 100         | -         | 100
16       | 25        | 130      | 1,2          | 100         | 34        | 100
32       | 74        | 130      | 1,2          | 93.8        | 40        | 100

Table 4.4.3: Result Obtained from test data on table 3.3 with GA and MA

GENETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
16       | 180       | 229      | 1,2,5        | 91.7        | 16        | 89
24       | 684       | 229      | 1,2,5        | 93.8        | 16        | 89
32       | 1710      | 256      | 1,2,5        | 92.7        | 21        | 100

MEMETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
16       | -         | 256      | 1,2,5        | 100         | 26        | 100
24       | 22        | 256      | 1,2,5        | 100         | 23        | 100
32       | 39        | 256      | 1,2,5        | 92.3        | 44        | 100

Table 4.4.4: Result Obtained from test data on table 3.4 with GA and MA

GENETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
10       | 136       | 300      | 4,5,7        | 93          | 15        | 100
20       | 2216      | 300      | 4,5,7        | 100         | 17        | 100
30       | 161       | 300      | 4,5,7        | 90.5        | 22        | 100

MEMETIC ALGORITHM
Pop Size | No of Gen | Max Fit. | Items Chosen | C/Ratio (%) | Time (ms) | Acc (%)
10       | -         | 300      | 4,5,7        | 100         | 29        | 100
20       | 22        | 300      | 4,5,7        | 96          | 34        | 100
30       | 81        | 300      | 4,5,7        | 96          | 39        | 100
