
Applied Mathematics and Computation 192 (2007) 413–421

www.elsevier.com/locate/amc

Randomized gravitational emulation search algorithm for symmetric traveling salesman problem

S. Raja Balachandar *, K. Kannan
Department of Mathematics, SASTRA University, Thanjavur 613 402, India

Abstract

This paper presents a new heuristic method, the randomized gravitational emulation search (RGES) algorithm, for solving the symmetric traveling salesman problem (STSP). The algorithm is founded on introducing a randomization concept, together with two of the four primary physical parameters, 'velocity' and 'gravity', into the existing local search algorithm GELS through group-wise swapping driven by random numbers, in order to avoid local minima and thus reach the global minimum of the STSP. To validate the proposed method, numerous simulations were conducted on a range of STSP benchmark problems to compare the quality of its solutions with those of other existing algorithms such as the genetic algorithm (GA), simulated annealing (SA) and hill climbing (HC). According to the simulation results, the performance of RGES is significantly enhanced, and it provides optimal solutions for almost all test problems of sizes up to 76. A comparative computational study of 11 chosen benchmark problems from the TSPLIB library further shows that the RGES algorithm is an efficient tool for solving the STSP and that this heuristic is competitive with other heuristics.

© 2007 Elsevier Inc. All rights reserved.

Keywords: Symmetric traveling salesman problem; Velocity; Gravitational force; Newton’s law; Swapping

1. Scope and purpose

The applications of the STSP are many and varied. To name a few, scheduling problems, vehicle routing problems [9], mixed Chinese postman problems, the integrated circuit board chip insertion problem [4], printed circuit board punching sequence problems [12] and a wing-nozzle design problem in aircraft design are worth mentioning. Although several heuristic procedures and branch and bound algorithms exist for it, this problem is still a difficult NP-complete problem. The main purpose of this paper is to enhance the GELS algorithm into a global algorithm by introducing a randomization concept into the available parameters of GELS and to show its effectiveness. A justification is also given for the improvement over GELS through the results obtained by solving STSP instances of various sizes from 5 to 76. The improvement of GELS through randomization has also been

* Corresponding author.
E-mail addresses: srbala@maths.sastra.edu (S. Raja Balachandar), kkannan_1960@yahoo.com (K. Kannan).

0096-3003/$ - see front matter © 2007 Elsevier Inc. All rights reserved.
doi:10.1016/j.amc.2007.03.019

established by making a comparative study with other available heuristic algorithms. The table showing the results of 11 benchmark problems chosen (for brevity) from the TSPLIB library confirms the efficiency of RGES.

2. Introduction

2.1. NP complete problems

There is a class of problems that have not been theoretically proved to require exponential time, yet for which no polynomial algorithms have been designed. Some of these problems (known as NP complete problems) are: the traveling salesman problem, the Hamiltonian path problem, the knapsack problem and the optimal graph coloring problem. If a polynomial-time solution could be found for any of these problems, then all NP problems would have polynomial-time solutions. NP complete problems are described in more detail in [8,10,11,13].

2.2. A brief survey of algorithms for solving traveling salesman problem

The traveling salesman problem is a well-known combinatorial optimization problem in which the salesman wants to minimize the total distance traveled to visit all cities on his visiting list.
Direct search algorithms of exponential complexity, which give optimal solutions but are applicable only to a small number of nodes, are available in the literature. Heuristic algorithms of polynomial complexity, which give near-optimal solutions and are applicable to a large number of nodes, are also found in the literature.
The symmetric version of the traveling salesman problem, in which bi-directional distances between a pair of
cities are identical, has been studied by many researchers and the understanding of the polyhedral structure of
the problem has made it possible to optimally solve large size problems [10]. Nonetheless, efforts to develop
good heuristic procedures have continued because of their practical importance in quickly obtaining approx-
imate solutions of large size problems. Heuristic methods such as the K-opt heuristic by Lin and Kernighan
[11] and the Or-opt heuristic by Or [13] are well known for the symmetric traveling salesman problem. There
also have been a few attempts to solve it by various meta-heuristics such as tabu searches [8,19] and genetic
algorithms [3,5,14,15,17]. Interestingly, Brest and Zerovnik [2] report that a heuristic algorithm based on 'arbitrary insertion' performs remarkably well on benchmark test problems in terms of both solution quality and computation time. Karp [7] also presents a heuristic patching algorithm and shows that the heuristic solution asymptotically converges to the optimal solution of the asymmetric TSP.

2.3. Optimum solution algorithm and heuristic methods

Backtracking can search all possible tours. This method has small memory requirements but a very large execution time; for example, for a 20-node graph the execution time would be several millennia. Other well-known exact approaches available in the literature are dynamic programming and the branch and bound method. A heuristic based on Prim's or Kruskal's minimum spanning tree algorithm gives solutions to the STSP whose length is not more than twice that of the optimal solution; its applicability is, however, restricted to the Euclidean (metric) TSP, and it works in polynomial time. A sketch of this MST-based heuristic is given below.
This paper is organized as follows. In Section 3, the mathematical description of the STSP is discussed briefly. In Section 4, a brief description of the GELS algorithm proposed by Webster is provided. In Section 5, the RGES algorithm, based on GELS with the concept of randomization, is described and a rationale for its anticipated good performance is provided. In Section 6, the results of computational experiments are given: the solution quality is tabulated, the efficiency of the algorithm is established through a comparative study, the parameters used in the various algorithms are tabulated and the algorithm features are elaborated. Finally, Section 7 includes merits and concluding remarks.

3. Description of a STSP

Suppose a salesman has to visit n cities. He wishes to start from a particular city, visit each city only once and then return to his starting point. Minimizing the total travelling distance by selecting a sequence of all n cities, where the distance between cities i and j is the same as that between j and i, is known as the symmetric travelling salesman problem.

Consider a symmetric traveling salesman problem with n cities to visit. Let $c_{ij}$ be the distance from city i to city j ($c_{ij} = c_{ji}$). Moreover, let $x_{ij}$ be 1 if the salesman travels from city i to city j and 0 otherwise. Then, the symmetric traveling salesman problem can be formulated as follows:
\min \sum_{i=1}^{n} \sum_{j=1}^{n} c_{ij} x_{ij}

such that

\sum_{i=1}^{n} x_{ij} = 1, \quad j = 1, \ldots, n,   (1)

\sum_{j=1}^{n} x_{ij} = 1, \quad i = 1, \ldots, n,   (2)

\sum_{i \in P} \sum_{j \in P,\, i \neq j} x_{ij} \le |P| - 1, \quad \forall P \subset \{1, 2, \ldots, n\}, \; 2 \le |P| \le n - 2   (3)

(subtour elimination constraint)

x_{ij} \in \{0, 1\}, \quad i = 1, \ldots, n, \; j = 1, \ldots, n.   (4)
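To make the objective concrete, the following is a minimal C++ sketch (illustrative, not taken from the paper) that evaluates the length of a closed tour for a given visiting order over a symmetric distance matrix; this is the objective function value that GELS and RGES work with. The names `dist`, `tour` and `tourLength` are assumptions.

```cpp
#include <vector>

// Length of a closed tour over a symmetric distance matrix:
// sum of consecutive legs plus the return leg to the starting city.
double tourLength(const std::vector<std::vector<double>>& dist,
                  const std::vector<int>& tour) {
    double total = 0.0;
    int n = tour.size();
    for (int k = 0; k + 1 < n; ++k)
        total += dist[tour[k]][tour[k + 1]];
    total += dist[tour[n - 1]][tour[0]];  // close the tour
    return total;
}
```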

4. Gravitational emulation local search (GELS) algorithm

4.1. Preliminary work done with GELS

For the study of GELS, a conceptual framework was first developed by Voudouris and Tsang [20], and that preliminary algorithm was called the gravitational local search algorithm (GLSA). As the abbreviation GLS is used in a number of places in the literature to refer to guided local search [20], the name was later changed to GELS. Two separate versions of GLSA were implemented in the C language. They differ in two key respects. The first version, dubbed GLSA1, used as its heuristic Newton's equation for the gravitational force between two objects, and the pointer object moved through the search space one position at a time, while the second, dubbed GLSA2, used as its heuristic Newton's method for gravitational field calculation, and the pointer object was allowed to move multiple positions at a time. These issues are elaborated precisely in [1].
Both procedures had operational parameters that the user could set to tune their performance. The parameters used were density (DENS), drag (DRAG) [18], friction (FRIC), gravity (GRAV) [18], initial velocity (IVEL), iteration limit (ITER), mass (MASS), radius (RADI), silhouette (SILH) and threshold (THRE), among which GELS uses only four parameters, namely velocity, iteration limit, radius and direction of movement. Barry Lynn Webster [1] designed the robust GELS algorithm, which overcame the following difficulties: (i) the increased number of iterations caused by the hypothetical search space and (ii) poor solutions caused by randomly determined objective function values of neighboring solutions. Webster treated the objective function values of neighboring solutions as functions of earlier solutions instead of determining them randomly; because of this, the objective function values of invalid solutions of the problem were tactically avoided from the beginning. He also avoided the sensitive and redundant parameters of GLSA, because finding the points of equilibrium with these parameters was extremely difficult.

4.2. GELS algorithm

4.2.1. Parameters used in GELS algorithm

(a) Max velocity: Defines the maximum value that any element within the velocity vector can have; used to prevent velocities that become too large to use.
(b) Radius: Sets the radius value in the gravitational force formula; used to determine how quickly the gravitational force can increase or decrease.
(c) Iterations: Defines the number of iterations of the algorithm that will be allowed to complete before it is automatically terminated (used to ensure that the algorithm will terminate).
(d) Pointer: Used to identify the direction of movement of the elements in the vectors (an illustrative grouping of these four parameters is sketched below).
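For concreteness, the four GELS parameters listed above can be collected in a small configuration structure. This is an illustrative C++ sketch only; the field names and the `Direction` type are assumptions, not Webster's code.

```cpp
// Illustrative grouping of the four GELS parameters described above.
enum class Direction { Forward, Backward };  // assumed encoding of the movement pointer

struct GelsParameters {
    double maxVelocity;    // cap on any element of the velocity vector
    double radius;         // R in the gravitational force formula
    int    maxIterations;  // iteration limit guaranteeing termination
    Direction pointer;     // direction of movement of elements in the vectors
};
```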

4.2.2. Gravitational force (GF)


GELS algorithm uses the formula for the gravitational force between the two solutions as

F = \frac{G\,(CU - CA)}{R^2},
where

G = 6.672 (Universal constant of gravitation) [18]


CU = objective function value of the current solution
CA = objective function value of the candidate solution
R = value of radius parameter.
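The force computation itself is a one-liner; the following hedged C++ sketch shows it together with the constant used by GELS (the function and variable names are illustrative assumptions):

```cpp
// Gravitational force between the current and a candidate solution,
// following the GELS formula F = G * (CU - CA) / R^2 with G = 6.672.
double gravitationalForce(double currentObjective,    // CU
                          double candidateObjective,  // CA
                          double radius) {            // R
    const double G = 6.672;
    return G * (currentObjective - candidateObjective) / (radius * radius);
}
```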

4.2.3. Webster’s algorithm


B.L. Webster designed the GELS algorithms by combining two methods with two stepping modes, as listed below (a compact enumeration of the four combinations is sketched after the list).

GELS 11:
(i) Find the GF between a single solution and the current solution.
(ii) Movement within the current local search.

GELS 12:
(i) Find the GF between a single solution and the current solution.
(ii) Movement to outside of the neighbourhood.

GELS 21:
(i) Find the GF between each solution within the neighbourhood.
(ii) Movement within the current local search.

GELS 22:
(i) Find the GF between each solution within the neighbourhood.
(ii) Movement to outside of the neighbourhood.
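The four variants differ only in which force computation and which stepping mode they pair, so they can be summarized by two small enumerations. This is an illustrative sketch, not Webster's code; the type and constant names are assumptions.

```cpp
// Force-calculation method: against a single candidate, or over the whole neighbourhood.
enum class ForceMethod  { SingleSolution, WholeNeighbourhood };
// Stepping mode: stay within the current local search, or allow moves outside it.
enum class SteppingMode { WithinNeighbourhood, OutsideNeighbourhood };

struct GelsVariant {
    ForceMethod  method;
    SteppingMode stepping;
};

// GELS 11, 12, 21 and 22 as combinations of the two choices.
const GelsVariant GELS11{ForceMethod::SingleSolution,     SteppingMode::WithinNeighbourhood};
const GelsVariant GELS12{ForceMethod::SingleSolution,     SteppingMode::OutsideNeighbourhood};
const GelsVariant GELS21{ForceMethod::WholeNeighbourhood, SteppingMode::WithinNeighbourhood};
const GelsVariant GELS22{ForceMethod::WholeNeighbourhood, SteppingMode::OutsideNeighbourhood};
```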

The GELS algorithms have various features distinguishing them from other algorithms such as GA, SA and HC, in terms of search space, number of iterations, etc.
During the development of GELS, the settings of the aforesaid parameters were arrived at through trial and error: some settings caused repetitive visits within a neighbourhood, while others produced very large values, causing the algorithm to behave erratically. After a number of tests, Webster settled on 10 for the maximum velocity, 4 for the radius and 10,000 for the iteration limit. After a number of tests, we have settled the RGES algorithm on 7 for the maximum velocity, 7 for the radius and 1000 for the iteration limit; these settings are illustrated below. The RGES algorithm and its explanation are given in Section 5.
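Using the illustrative `GelsParameters` structure from Section 4.2.1, the two tuned settings reported above could be written as follows. This is a sketch only; the direction pointer value is an arbitrary assumption.

```cpp
// Settings reported in the text (direction pointer chosen arbitrarily here).
GelsParameters gelsSettings{10.0, 4.0, 10000, Direction::Forward};  // Webster's GELS
GelsParameters rgesSettings{ 7.0, 7.0,  1000, Direction::Forward};  // RGES (this paper)
```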

4.3. Distinguishing properties of GELS algorithm

1. Introduction of the velocity vector and relative gravitational force in each direction of movement emulated
the acceleration effect of the process towards its goal.
2. Multiple stepping procedures helped the solution pointer to show the next direction leading to the other
solutions.

3. The algorithm is designed in such a way that it terminates on either of two conditions: all the elements of the velocity vector have gone to zero, or the maximum allowable number of iterations has been completed.

5. Proposed ‘RGES’ algorithm

The RGES search technique starts with a guess of one likely candidate, chosen at random in the search space. This candidate is called the current solution (CU). The RGES algorithm then generates a neighbourhood around it; the procedure is given in Fig. 1. The basic mechanism is similar to that of GELS in the construction of the neighbourhood.
For an n-city STSP, the total set of cities is divided into l groups by means of the parameter called the radius (R), and a random number $r_i$ is assigned to each of the l groups such that $r_i \le R$, i = 1 to l. The remaining $(R - r_i)$ cities of group i do not undergo any swapping; the number of cities in which swapping takes place in group i is $r_i$. The total number of candidate solutions in the neighbourhood is $\prod_i r_i$. During swapping within the groups a number of solutions are generated, and the objective values of the corresponding tours are also computed.
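The following is a hedged C++ sketch of this group-wise randomized swapping, our interpretation of the description above rather than the authors' code: the tour is cut into groups of size R, a random $r_i \le R$ is drawn per group, and $r_i$ randomly chosen positions inside the group are shuffled to produce one candidate neighbour. Repeating the call produces the candidate set. The function and variable names are assumptions.

```cpp
#include <vector>
#include <random>
#include <algorithm>

// One candidate neighbour of `current`: within each group of size R,
// pick r_i <= R positions at random and shuffle them (group-wise swapping).
std::vector<int> randomGroupSwapNeighbour(const std::vector<int>& current,
                                          int R, std::mt19937& rng) {
    std::vector<int> candidate = current;
    int n = current.size();
    std::uniform_int_distribution<int> pickR(1, R);

    for (int start = 0; start < n; start += R) {
        int groupSize = std::min(R, n - start);
        int ri = std::min(pickR(rng), groupSize);  // random number for this group

        // choose r_i distinct positions inside the group ...
        std::vector<int> pos(groupSize);
        for (int k = 0; k < groupSize; ++k) pos[k] = start + k;
        std::shuffle(pos.begin(), pos.end(), rng);
        pos.resize(ri);

        // ... and permute the cities occupying those positions
        std::vector<int> cities;
        for (int p : pos) cities.push_back(candidate[p]);
        std::shuffle(cities.begin(), cities.end(), rng);
        for (size_t k = 0; k < pos.size(); ++k) candidate[pos[k]] = cities[k];
    }
    return candidate;
}
```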
The current solution for the next iteration is obtained by updating the objective function value as follows. Find the lowest objective value in the neighbourhood; this procedure is given in Fig. 2. Update the velocity of each group of the current solution with the GF computed between the lowest objective value function (LOVF) and the objective value of the current solution (CS). If the LOVF found in the iteration is less than the objective value of the CS, the corresponding solution is taken as the CS for the next iteration; if not, one candidate solution is chosen at random as the CS from the $\prod_i r_i$ candidate solutions discussed earlier. The updating procedure is given in Fig. 3. The process is repeated until either the velocity vanishes or the number of iterations exceeds the assigned value. The complete algorithm for RGES is given in Fig. 4; a compact sketch of this main loop follows below.

Fig. 1. Neighboring points generation.

Fig. 2. Best candidate solution.

Fig. 3. Parameters and current solution updating.

Fig. 4. Complete RGES algorithm.
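Under the same assumptions as the neighbourhood sketch above, a condensed C++ outline of the RGES main loop (our interpretation of Figs. 1-4, not the published code) could look as follows. `tourLength`, `gravitationalForce` and `randomGroupSwapNeighbour` are the illustrative helpers sketched earlier; the use of a single scalar velocity and a fixed candidate count per iteration are simplifying assumptions.

```cpp
#include <vector>
#include <random>
#include <algorithm>
#include <limits>
// (plus the earlier helper sketches: tourLength, gravitationalForce, randomGroupSwapNeighbour)

// Condensed RGES loop: generate a randomized neighbourhood, pick the best
// candidate, update the velocity with the gravitational force, and move.
std::vector<int> rges(const std::vector<std::vector<double>>& dist,
                      std::vector<int> current, int R, double maxVelocity,
                      int maxIterations, int candidatesPerIteration,
                      std::mt19937& rng) {
    double velocity = maxVelocity;
    double cs = tourLength(dist, current);

    for (int it = 0; it < maxIterations && velocity > 0.0; ++it) {
        std::vector<int> best;
        double lovf = std::numeric_limits<double>::max();
        std::vector<std::vector<int>> candidates;

        for (int c = 0; c < candidatesPerIteration; ++c) {
            std::vector<int> cand = randomGroupSwapNeighbour(current, R, rng);
            candidates.push_back(cand);
            double val = tourLength(dist, cand);
            if (val < lovf) { lovf = val; best = cand; }
        }

        // Velocity update driven by the GF between LOVF and CS, capped at maxVelocity.
        velocity = std::min(maxVelocity,
                            velocity + gravitationalForce(cs, lovf, R));

        if (lovf < cs) {   // accept the best neighbour
            current = best;
            cs = lovf;
        } else {           // otherwise move to a randomly chosen candidate
            std::uniform_int_distribution<int> pick(0, static_cast<int>(candidates.size()) - 1);
            current = candidates[pick(rng)];
            cs = tourLength(dist, current);
        }
    }
    return current;
}
```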

6. Computational results

6.1. Experimental design

Fischetti et al. [6] describe a branch-and-cut algorithm to solve the symmetric generalised TSP, applying a partitioning method to 46 standard TSP instances from the TSPLIB library [16]; they provide an adequate description of the partitioning method and report optimal objective values for each of the problems. Here, we applied our heuristic to eleven selected problems in order to assess the robustness of the proposed algorithm and to compare the results with the GELS algorithms proposed by Barry Webster. The results for the problems of sizes 5, 6, 10, 11, 15, 20, 21, 24, 51 and 76 are shown in Table 1; solutions of problems of other sizes are not shown here for brevity. The randomization procedure introduced into the GELS algorithm ensures the efficiency of the algorithm in handling STSP instances of sizes up to 76. For each problem we ran RGES 5 times to examine the algorithm's performance and its variation from trial to trial. The algorithm was implemented in C++ and tested on a Pentium IV with a 3.2 GHz processor and 256 MB RAM running under Windows XP.

Table 1
RGES results (objective values; No. of runs = 5, max. no. of iterations = 1000)

Problem       Opt obj   #opt  #best   Mean            Minimum         Maximum
              value                   Value  Pct (%)  Value  Pct (%)  Value  Pct (%)
5sample5          21      5     5       21    0.0       21    0.0       21    0.0
6sample6          56      5     5       56    0.0       56    0.0       56    0.0
10sample10       323      5     5      323    0.0      323    0.0      323    0.0
10s_ample10      725      5     5      725    0.0      725    0.0      725    0.0
11eil51          174      5     5      174    0.0      174    0.0      174    0.0
16eil76          209      5     5      209    0.0      209    0.0      209    0.0
20kroa100       9711      5     5     9711    0.0     9711    0.0     9711    0.0
Gr21            2707      5     5     2707    0.0     2707    0.0     2707    0.0
Gr24            1272      5     5     1272    0.0     1272    0.0     1272    0.0
Eil51            426      4     4      426    0.0      426    0.0      434    1.8
Eil76            538      3     4      538    0.0      538    0.0      560    4.1

6.2. Solution quality

Table 1 summarizes the results for each of the solved problems. The columns in the table are as follows:
Problem: The name of the test problem; the digits at the beginning of the name give the number of clusters (m) and those at the end give the number of nodes (n).
Opt obj value: The optimal objective values for the problems reported here are taken from TSPLIB [16]; some of them have been obtained through a direct search method.
#Opt: The number of trials out of 5 in which RGES found the optimum solution.
#Best: The number of trials out of 5 in which RGES found its best solution (for problems in which the optimal solution was found, this is equal to the #opt column).
Mean, Min, Max: The mean, minimum and maximum objective values returned in the 5 trials (the value columns) and the respective percentages above the optimal value (the pct columns).
RGES found optimal solutions in at least one of the 5 trials for all 11 tested problems. For 9 (81%) of the problems, RGES found the optimal solution in every trial. RGES solved all 11 problems, and in only one case, Eil76, did it return a solution more than 4% above the optimal. The heuristic tends to return consistent solutions for smaller problems; for larger problems it returns solutions that vary slightly above the best but remain close to each other in objective value.

6.3. Comparison with other algorithms

In order to bring out the efficiency of the proposed RGES algorithm, the same set of 11 benchmark problems has been solved by several other algorithms (SA, HC, GA, GELS11, GELS12, GELS21, GELS22) reported in the literature; these algorithms have been coded in the C language. The results for these problems under the various categories are shown in Table 3. The values of the parameters used in each of the algorithms to solve the above problems are presented in Table 2.
To compare the results of the RGES algorithm with those of the other algorithms and to test its efficiency, the number of iterations is fixed at 1000, the number of trials at 5, and only the best solutions out of the 5 are taken into account. Based on the data from the experiment and the solutions in Table 3, a number of observations can be made. For sample

Table 2
Parameters set for various methods

Methods  Parameters
GA       Population size = 10, crossover (1-point) = 0.9, mutation (1-point) = 0.05, iterations = 1000
SA       Temperature = 2000, reconfiguration interval = 10, attempt interval = 10, annealing rate = 0.01, threshold = 0.01
GELS     Radius = 10, velocity = 4, iterations = 1000
RGES     Radius = 7, velocity = 7, iterations = 1000

Table 3
RGES versus other algorithms

Problem      Optimum     Simulated  Hill      Genetic    GELS11  GELS12  GELS21  GELS22  RGES
             objective   annealing  climbing  algorithm
             value
5sample1        21          21        21         21         21      21      21      21     21
6sample2        56          56        56         56         56      56      56      56     56
10sample3      323         324       323        324        359     343     357     329    323
10sample4      725         742       725        761        805     754     786     725    725
11eil51        174         177       195        227        195     176     193     183    174
16eil76        209         255       247        255        265     265     255     255    209
20kroa100     9711       11792     11723      11792      11792   11792   11723   11723   9711
Gr21          2707        3333      3303       3333       3333    3333    3303    3303   2707
Gr24          1272        1553      1423       1528       1553    1553    1472    1472   1272
Eil51          426         497       495        497        497     497     496     496    426
Eil76          538         661       651        661        657     661     657     657    538

No. of iterations = 1000; no. of trials = 5; best solutions out of 5.

problems of sizes 5 and 6, all algorithms perform alike. When the size of the problem is 10, the efficiencies of the algorithms differ, and the hill climbing algorithm still gives the best, optimal solutions. When the size of the problem is greater than 11, the other algorithms provide only good or near-optimal solutions. However, the RGES algorithm maintains both optimality and the quality of its solutions.
Hence the introduction of the randomization concept into the GELS algorithm is found to be advantageous in two ways.

1. Randomization enables the algorithm to escape from the local search and paves a way leading to a global search with the help of the established GELS.
2. In all the problems up to size 76, RGES provides the best and optimum solutions.

6.4. Contribution of algorithm features

When examining the relative contribution of the features of the RGES algorithm to the heuristic's overall performance, we find that group-wise swapping using random numbers is effectively utilized in turning the local search into a global search, whereas GELS, with its two-by-two swapping, gets entangled in the neighbourhood of the current solution and is unable to escape from the local domain. We also find that RGES exhibits an excellent performance over GELS in terms of obtaining optimum solutions, even for larger problems, at the same computational cost as GELS.
In our algorithm, if R is constant then there are O(n/R) groups. Hence it appears that the product $\prod_i r_i$ is in $O(R^{n/R})$ and so grows exponentially. But since R is chosen as $a \cdot n$ for finite n, $O(R^{n/R})$ becomes $O(R^{1/a})$ for some finite a; hence RGES becomes a polynomial time algorithm. A one-line restatement of this bound is given below.
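For completeness, the bound can be spelled out as follows (our reading of the argument above, with each $r_i \le R$ and n/R groups):

\prod_{i=1}^{n/R} r_i \;\le\; R^{\,n/R} \;=\; R^{\,n/(a n)} \;=\; R^{1/a} \quad \text{when } R = a\,n,

which, with R itself proportional to n, grows only polynomially in n rather than exponentially.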

7. Conclusion

Webster used two stepping modes and two methods to enhance the gravitational emulation properties of a local search algorithm so that it could compete with other algorithms available in the literature in providing good solutions, though without focusing on optimality. The newly proposed algorithm is capable of producing optimal solutions without affecting the quality of GELS, and it also paves the way for a global search through randomization. In our experiments we have solved 11 benchmark problems up to a problem of size 76. Attempts at designing RGES-like algorithms for solving the asymmetric TSP are ongoing.

Acknowledgement

The authors express their gratitude to Barry Lynn Webster, Florida Institute of Technology, Melbourne, USA, for his guidance and help.

References

[1] Barry Lynn Webster, Solving Combinatorial Optimization Problems Using a New Algorithm Based on Gravitational Attraction,
Ph.D., thesis, Melbourne, Florida Institute of Technology, May, 2004.
[2] J. Brest, J. Zerovnik, An approximation algorithm for the asymmetric traveling salesman problem, Ricerca Operativa 28 (1998) 59–67.
[3] T. Bui, B. Moon, A new genetic approach for the traveling salesman problem, Proceedings of the First IEEE Conference on Evolutionary Computation 2 (1994) 7–13.
[4] D. Chan, D. Mercier, IC insertion: an application of the traveling salesman problem, International Journal of Production Research 27
(1989) 1837–1841.
[5] S. Chatterjee, C. Carrera, L. Lynch, Genetic algorithms and traveling salesman problems, European Journal of Operational Research
93 (1996) 490–510.
[6] M. Fischetti, J.J. Salazar-González, P. Toth, A branch-and-cut algorithm for the symmetric generalized traveling salesman problem, Operations Research 45 (3) (1997) 378–394.
[7] R. Karp, A patching algorithm for the non symmetric traveling-salesman problem, SIAM Journal Computing 8 (1979) 561–573.
[8] J. Knox, Tabu search performance on the symmetric traveling salesman problem, Computers and Operations Research 21 (1994) 867–
876.
[9] G. Laporte, The vehicle routing problem: an overview of exact and approximate algorithms, European Journal of Operational
Research 59 (1992) 345–358.
[10] E. Lawler, J. Lenstra, A. Rinnooy Kan, D. Shmoys, The traveling salesman problem: a guided tour of combinatorial optimization,
Wiley, New York, 1985.
[11] S. Lin, B. Kernighan, An effective heuristic algorithm for the traveling salesman problem, Operations Research 21 (1973) 498–516.
[12] J. Litke, An improved solution to the traveling salesman problem with thousands of nodes, Communications of the ACM 27 (1984)
1227–1236.
[13] I. Or, Traveling Salesman-type Combinatorial Problems and their Relation to the Logistics of Regional Blood Banking, Ph.D. thesis,
Northwestern University, Evanston, IL, 1976.
[14] P. Poon, J. Carter, Genetic algorithm crossover operations for ordering applications, Computers and Operations Research 22 (1995)
135–147.
[15] J. Potvin, Genetic algorithm for the traveling salesman problem, Annals of Operations Research 63 (1996) 339–370.
[16] G. Reinelt, TSPLIB – a traveling salesman problem library, ORSA Journal on Computing 4 (1996) 134–143.
[17] L. Schmitt, M. Amini, Performance characteristics of alternative genetic algorithmic approaches to the traveling salesman problem
using path representation: an empirical study, European Journal of Operational Research 108 (1998) 551–570.
[18] F.W. Sears, M.W. Zemansky, H.D. Young, University Physics, seventh ed., Addison-Wesley, Reading, MA, 1987.
[19] S. Tsubakitani, J. Evans, Optimizing tabu list size for the traveling salesman problem, Computers and Operations Research 25 (1998)
91–97.
[20] C. Voudouris, E. Tsang, Guided Local Search, Technical Report CSM-247, Department of Computer Science, University of Essex, UK, August 1995.
