
A hybrid sequential constructive sampling algorithm for the bottleneck traveling salesman problem.

Introduction

The bottleneck traveling salesman problem (BTSP) is a variation of the usual traveling salesman problem (TSP). In the BTSP, the objective is to find a Hamiltonian circuit in a graph that minimizes the largest cost among its arcs. It can be defined as follows:

A network with n nodes (or cities), with node 1 (say) as the headquarters, and a cost (or distance, or time, etc.) matrix C = [c_ij] of order n associated with ordered node pairs (i, j) is given. Let (1 = α_0, α_1, α_2, ..., α_{n-1}, α_n = 1) ≡ (1 → α_1 → α_2 → ... → α_{n-1} → 1) be a tour, representing irreducible permutations interpreted as simple cycles. The tour value is defined as max{c_{α_i, α_{i+1}} : i = 0, 1, ..., n-1}. The objective is to choose a tour with minimum tour value.

Both the TSP and the BTSP are well-known NP-hard combinatorial optimization problems. The TSP remains an active area of research, and polynomial time algorithms have been developed for some special cases [4]. For the BTSP, however, it can be shown that unless P = NP, it is impossible to construct a polynomial time (2 - ε)-approximation algorithm for the problem, even if the network is a complete graph and its associated costs satisfy the triangle inequality [13]. Vairaktarakis [18] considered a polynomially solvable TSP and showed that the corresponding BTSP is strongly NP-complete. The BTSP finds applications in workforce planning [18] and in minimizing makespan in a two-machine flowshop with no-wait-in-process [12].

Gilmore and Gomory [8] introduced the BTSP and discussed a specific case of the problem. The general BTSP was first considered by Gabovic et al. [6]. A detailed study of the BTSP is available in a book chapter by Kabadi and Punnen [11]. Parker and Rardin [13] presented an approximation algorithm, but did not report any computational experiments. Exact algorithms based on lexisearch have been developed [3, 16], as have algorithms based on branch and bound [5, 7]. A linear time algorithm has been proposed for the problem on a special class of instances, namely Halin graphs [14]. Ahmed et al. [2] treated this problem as the min-max TSP and developed a genetic algorithm to solve it efficiently. An approximation algorithm that exploits the underlying geometry of the special distance matrix of Gilmore and Gomory [8] has also been proposed [12]. Recently, an efficient heuristic algorithm based on a binary search threshold has been developed to find heuristic solutions for general BTSP instances [15].

In this paper, we consider the general BTSP rather than any special case. A sequential constructive sampling algorithm has been developed to obtain heuristic solutions to the problem. To improve the quality of the solution, we then apply a combined mutation operator that combines the insertion, inversion and swap mutation operators. The hybrid algorithm incorporates the combined mutation operator into the sequential constructive sampling for the problem. The efficiency of the hybrid sequential constructive sampling algorithm for the BTSP, as against the algorithm of Ramakrishnan et al. [15], is examined on some TSPLIB instances.

This paper is organized as follows: Section II presents the sequential constructive sampling algorithm for the BTSP. The hybrid algorithm that combines the sequential constructive sampling and a combined mutation is presented in Section III. Computational experiments for the hybrid algorithm are presented in Section IV. Finally, Section V presents comments and concluding remarks.

The Sequential Constructive Sampling

The sequential constructive sampling algorithm is an algorithm to 'guess' or 'estimate' the best tour. It can be described as a statistical version of the lexisearch algorithm: it combines lexisearch with a sequential sampling procedure for estimating the minimum of a statistical population through adaptive sampling, with a reasonable stopping rule [1].

It is not a simple random generation of tours. The 'alphabet table' of the lexisearch algorithm is used, but instead of visiting the first 'legitimate' node (a node not yet visited) in a row of the 'alphabet table', one of the 'legitimate' nodes is visited probabilistically. For this, the probability of visiting each 'legitimate' node is assigned so that the first legitimate node gets a higher probability than the second one, and so on. The probability of visiting each 'legitimate' node is assigned as follows.

Suppose the number of 'legitimate' nodes in a row is k. The probability of visiting the ith 'legitimate' node is

p_i = 2(k - i + 1) / (k(k + 1))     (1)

Thereafter, the cumulative probability P_i of each node being visited is calculated. A random number between 0 and 1 is then generated, and the node in whose cumulative probability range the random number falls is accepted. A partial tour constructed in this way is called a block leader. Next, a block leader bound on the value of the objective function over the block is computed and compared with the 'best solution value' found so far. If no tour in the block can be better than the 'best solution value', we jump out of the block to its super-block; for example, the super-block of the block (α_0, α_1, α_2) is the block (α_0, α_1). However, if the bound indicates a possibility of better solutions in the block, we continue constructing the tour.
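As an illustrative sketch (in Python, though the paper's implementation was in Visual C++), equation (1) and the roulette-wheel acceptance over the cumulative probabilities can be written as follows; the function names are our own:

```python
import random

def rank_probabilities(k):
    """p_i = 2(k - i + 1) / (k(k + 1)) for i = 1..k, as in equation (1):
    earlier (cheaper) legitimate nodes get larger probabilities."""
    return [2 * (k - i + 1) / (k * (k + 1)) for i in range(1, k + 1)]

def pick_node(legitimate, rng=random):
    """Pick one legitimate node by roulette-wheel selection on the
    cumulative probabilities P_i."""
    probs = rank_probabilities(len(legitimate))
    r, cum = rng.random(), 0.0
    for node, p in zip(legitimate, probs):
        cum += p
        if r <= cum:
            return node
    return legitimate[-1]  # guard against floating-point round-off

# Row 1 of Table 2 has six legitimate nodes 7, 4, 5, 6, 2, 3 (k = 6)
print([round(p, 3) for p in rank_probabilities(6)])
```

With k = 6 the cumulative probabilities come to 0.286, 0.524, 0.714, 0.857, 0.952 and 1.000, matching the worked example in the Illustration subsection.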

A. Alphabet table

The alphabet matrix, A = [a(i,j)], is a square matrix of order n formed by the positions of the elements of the cost matrix C = [c_ij] of order n. The ith row of A consists of the positions of the elements in the ith row of C when they are arranged in non-decreasing order of their values [3]. The alphabet table "[a(i,j) - c_{i,a(i,j)}]" is the combination of the elements of the alphabet matrix and their values. For example, for the cost matrix shown in Table 1, the corresponding 'alphabet table' is shown in Table 2.
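The construction can be sketched as follows (Python, with 0-based indexing, so node 1 of the paper is index 0; the helper name is ours):

```python
def alphabet_matrix(C):
    """Row i of A holds the column indices of row i of C, sorted in
    non-decreasing order of cost (ties keep ascending index, since
    Python's sorted() is stable)."""
    return [sorted(range(len(row)), key=row.__getitem__) for row in C]

# Cost matrix of Table 1 (999 marks the forbidden diagonal)
C = [
    [999,  75,  99,   9,  35,  63,   8],
    [ 75, 999,  86,  46,  88,  29,  20],
    [ 99,  86, 999,  16,  28,  35,  28],
    [  9,  46,  16, 999,  59,  53,  49],
    [ 35,  88,  28,  59, 999,  76,  72],
    [ 63,  29,  35,  53,  76, 999,  52],
    [  8,  20,  28,  49,  72,  52, 999],
]
A = alphabet_matrix(C)
print([j + 1 for j in A[0]])  # first row of Table 2: [7, 4, 5, 6, 2, 3, 1]
```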

B. Lower bound

Block leader bound

The objective of a lower bound is to skip as many subproblems in the search procedure as possible. A subproblem is skipped if its lower bound is not less than the 'best solution value' found so far in the process. The higher the lower bound, the larger the set of subproblems that are skipped. In this study we set a lower bound for each block leader on the value of the objective function for the instance, as follows:

Suppose the partial tour is (1 = α_0, α_1, α_2) and node α_3 is selected for concatenation. Before concatenation, we check the bound for the leader (α_0, α_1, α_2, α_3). For that, we start the computation from the 2nd row of the 'alphabet table' and traverse up to the nth row, excluding the α_1-th and α_2-th rows, checking the value of the first 'legitimate' node (with node 1 counted as legitimate) in each row. The maximum among these first 'legitimate' values and the leader value is the lower bound for the leader (α_0, α_1, α_2, α_3).
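This bound can be sketched as follows (Python, 0-based indexing so node 1 is index 0; the function name and the Table 1 data layout are ours):

```python
def block_leader_bound(C, A, leader):
    """Lower bound for a block leader: the larger of the leader's own
    bottleneck value and the cheapest remaining exit of every row that
    still needs one (all rows except node 1's and the leader's interior)."""
    leader_value = max(C[a][b] for a, b in zip(leader, leader[1:]))
    interior = set(leader[1:-1])
    visited = set(leader) - {0}   # node 1 stays 'legitimate' (the tour returns to it)
    bound = leader_value
    for i in range(1, len(C)):    # rows 2 .. n of the paper
        if i in interior:
            continue
        first = next(C[i][j] for j in A[i] if j not in visited)
        bound = max(bound, first)
    return bound

# Cost matrix of Table 1 and its alphabet matrix
C = [
    [999,  75,  99,   9,  35,  63,   8],
    [ 75, 999,  86,  46,  88,  29,  20],
    [ 99,  86, 999,  16,  28,  35,  28],
    [  9,  46,  16, 999,  59,  53,  49],
    [ 35,  88,  28,  59, 999,  76,  72],
    [ 63,  29,  35,  53,  76, 999,  52],
    [  8,  20,  28,  49,  72,  52, 999],
]
A = [sorted(range(len(r)), key=r.__getitem__) for r in C]

# Partial tour (1, 5) extended by node 4: leader (1, 5, 4)
print(block_leader_bound(C, A, [0, 4, 3]))  # 59, the leader's own max edge
```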

Overall lower bound

We also use an overall lower bound, a modification of the 2-max lower bound [15], as a termination condition. We have seen that the 2-max lower bound exceeds the optimal solution value for at least one symmetric TSPLIB instance, namely dantzig42. Suppose A = [a(i,j)] is the alphabet matrix of order n for the cost matrix C = [c_ij] of order n. We define the lower bound B = max{c_{i,a(i,2)} : i = 1, 2, ..., n} for symmetric instances. For the cost matrix given in Table 1, the overall lower bound is B = 35.
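In code this is a one-liner over the cost matrix of Table 1 (the helper name is ours):

```python
def overall_lower_bound(C):
    """B = max_i c_{i,a(i,2)}. In a symmetric instance every node i has
    two distinct tour edges with costs from row i, so the costlier one
    is at least the second-smallest row entry; hence any tour's
    bottleneck value is at least B."""
    return max(sorted(row)[1] for row in C)

# Cost matrix of Table 1
C = [
    [999,  75,  99,   9,  35,  63,   8],
    [ 75, 999,  86,  46,  88,  29,  20],
    [ 99,  86, 999,  16,  28,  35,  28],
    [  9,  46,  16, 999,  59,  53,  49],
    [ 35,  88,  28,  59, 999,  76,  72],
    [ 63,  29,  35,  53,  76, 999,  52],
    [  8,  20,  28,  49,  72,  52, 999],
]
print(overall_lower_bound(C))  # 35, as stated in the text
```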

C. The algorithm

A preliminary version of the algorithm was reported in [1]. Our algorithm may be summarized as follows:

Step 0: Construct the 'alphabet table' based on the given cost matrix. Consider a large sample size S. Initialize the 'best solution value' to a large number L and the current sample size to m = 1. Calculate the overall lower bound for the instance as discussed above.

Step 1: Since 'node 1' is the starting node, we start the computation from the 1st row of the 'alphabet table'; so, initialize p = 1 and go to step 2.

Step 2: Go to the pth row of the 'alphabet table', and visit probabilistically, by using equation (1), any legitimate node (say, node q) of the row such that value of the node is less than 'best solution value' found so far. If no legitimate node of that row has value less than the 'best solution value', go to step 9, else, go to step 3.

Step 3: Compute the block leader bound as discussed above, and go to step 4.

Step 4: If the block leader bound is greater than or equal to the 'best solution value', then go to step 5, else accept the 'node q', compute the present 'tour value' and go to step 6.

Step 5: Repeat steps 2 to 5 at most k times, where k is the number of 'legitimate' nodes. If no improvement is possible within this many trials, go to step 1 and try to generate another tour; else, go to step 6.

Step 6: If all nodes of the network are visited, add an edge connecting the 'node q' to 'node 1', compute the complete 'tour value' and go to step 7, else go to step 8.

Step 7: If the 'tour value' is less than the 'best solution value', then replace the 'best solution value' by the 'tour value'. If the 'best solution value' is equal to the overall lower bound, then stop; else go to step 9.

Step 8: Rename the 'node q' as 'node p' and go to step 2.

Step 9: Increment m by 1 (one). If (m > S), then accept the 'best solution value' as the optimal solution value and corresponding tour as optimal tour, and stop, else, go to step 1.

In this algorithm, the motivation for using ranks to bias the probabilities is to favour the nodes that have smaller values. In the present study we have considered a sample of maximum size S = 10^6, i.e., the 'best solution value' obtained within these 10^6 iterations is considered as the optimal solution value of the instance. Also, based on extensive experiments, instead of considering all the 'legitimate' nodes in a row of the 'alphabet table', we consider at most 7 legitimate nodes to be explored, i.e., k ≤ 7.

D. Illustration

Let us illustrate the algorithm through the example of the seven-node instance given in Table 1, with the 'alphabet table' as constructed in Table 2. Initialize L = 9999. We start from the 1st row of the 'alphabet table'. The number of 'legitimate' nodes in the 1st row that have value less than L is 6; they are 7, 4, 5, 6, 2 and 3, with cumulative probabilities 0.286, 0.524, 0.714, 0.857, 0.952 and 1.000 respectively. Suppose the random number 0.572 is generated; then node 5 is accepted for concatenation, and the partial tour becomes (1, 5). Next we go to the 5th row of the 'alphabet table' and probabilistically select another node, and so on. Proceeding in this way may lead to the complete tour (1, 5, 4, 3, 6, 7, 2) with tour value 75. Since 75 < L, we set L = 75. This completes one iteration; we repeat the whole process 10^6 times, and whatever 'best solution value' is found is taken as the optimal solution value for the BTSP instance.
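Putting the pieces together, one sampling iteration (without the bound pruning and the k-trial retry of steps 3-5, which are omitted here for brevity) can be sketched as follows on the Table 1 instance; all function names are ours:

```python
import random

# Cost matrix of Table 1
C = [
    [999,  75,  99,   9,  35,  63,   8],
    [ 75, 999,  86,  46,  88,  29,  20],
    [ 99,  86, 999,  16,  28,  35,  28],
    [  9,  46,  16, 999,  59,  53,  49],
    [ 35,  88,  28,  59, 999,  76,  72],
    [ 63,  29,  35,  53,  76, 999,  52],
    [  8,  20,  28,  49,  72,  52, 999],
]

def tour_value(C, tour):
    """Bottleneck objective: the largest edge cost on the closed tour."""
    n = len(tour)
    return max(C[tour[i]][tour[(i + 1) % n]] for i in range(n))

def sample_tour(C, rng, k_max=7):
    """Build one tour from node 1 (index 0), choosing each next node by
    rank-biased roulette over at most k_max cheapest unvisited nodes."""
    n = len(C)
    A = [sorted(range(n), key=row.__getitem__) for row in C]
    tour, visited = [0], {0}
    while len(tour) < n:
        legit = [j for j in A[tour[-1]] if j not in visited][:k_max]
        k = len(legit)
        r, cum, q = rng.random(), 0.0, legit[-1]
        for i, j in enumerate(legit, start=1):
            cum += 2 * (k - i + 1) / (k * (k + 1))   # equation (1)
            if r <= cum:
                q = j
                break
        tour.append(q)
        visited.add(q)
    return tour

def scs(C, samples=2000, seed=1):
    """Keep the best of `samples` sampled tours."""
    rng = random.Random(seed)
    best, best_value = None, float("inf")
    for _ in range(samples):
        t = sample_tour(C, rng)
        v = tour_value(C, t)
        if v < best_value:
            best, best_value = t, v
    return best, best_value

tour, value = scs(C)
print([x + 1 for x in tour], value)
```

On this small instance a few thousand samples suffice; the full algorithm additionally prunes with the block leader bound and stops early when the overall lower bound is reached.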

The Hybrid Algorithm

Our hybrid sequential constructive sampling algorithm incorporates a combined mutation operator into the sequential constructive sampling algorithm for the BTSP. In genetic algorithms, the mutation operator is used to diversify the population of chromosomes [9]. The most commonly used mutation operators for the TSP are the insertion operator (which selects a node and inserts it in a random place), the inversion operator (which selects two points along the length of the chromosome and reverses the substring between these points) and the swap operator (which selects two nodes randomly and swaps them). Applying any of these operators with 100% probability often makes the solution diverge from the exact solution, so the mutation probability is generally set to a very small value. Here, we combine these operators, each applied with 100% probability, in such a way that the result never diverges from the exact solution. This combined mutation operator is a modification of the hybrid mutation operator proposed in [19]. There, the hybrid mutation was applied to the current best solution found by an ant colony system, with very good results for the usual TSP. We follow a similar approach: after obtaining a better solution by the sequential constructive sampling (in step 7), we apply the combined mutation to that solution. Suppose the present best tour is (1 = β_0, β_1, β_2, ..., β_{n-1}); then the combined mutation operator can be stated as follows:

Step 1: For I := 1 to n-2 do the following steps.

Step 2: For J := I+1 to n-1 do the following steps.

Step 3: If inserting node β_I after node β_J reduces the present tour value, then insert node β_I after node β_J. In any case go to step 4.

Step 4: If inverting the substring between nodes β_I and β_J reduces the present tour value, then invert the substring. In any case go to step 5.

Step 5: If swapping nodes β_I and β_J reduces the present tour value, then swap them.
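A sketch of the combined mutation (Python, 0-based, with β_0 = node 1 fixed at position 0). Moves are evaluated here by the bottleneck tour value, which is our reading of the improvement test for the BTSP; the function names are ours:

```python
def tour_value(C, tour):
    """Bottleneck objective: the largest edge cost on the closed tour."""
    n = len(tour)
    return max(C[tour[i]][tour[(i + 1) % n]] for i in range(n))

def combined_mutation(C, tour):
    """For every pair of positions (I, J), try insertion, inversion and
    swap in turn, keeping each move only if it strictly improves the tour."""
    t = list(tour)
    n = len(t)
    for i in range(1, n - 1):                  # beta_I (node 1 stays put)
        for j in range(i + 1, n):              # beta_J
            # Step 3: insert t[i] just after t[j]
            cand = t[:i] + t[i + 1:j + 1] + [t[i]] + t[j + 1:]
            if tour_value(C, cand) < tour_value(C, t):
                t = cand
            # Step 4: invert the substring t[i..j]
            cand = t[:i] + t[i:j + 1][::-1] + t[j + 1:]
            if tour_value(C, cand) < tour_value(C, t):
                t = cand
            # Step 5: swap t[i] and t[j]
            cand = list(t)
            cand[i], cand[j] = cand[j], cand[i]
            if tour_value(C, cand) < tour_value(C, t):
                t = cand
    return t

# Cost matrix of Table 1
C = [
    [999,  75,  99,   9,  35,  63,   8],
    [ 75, 999,  86,  46,  88,  29,  20],
    [ 99,  86, 999,  16,  28,  35,  28],
    [  9,  46,  16, 999,  59,  53,  49],
    [ 35,  88,  28,  59, 999,  76,  72],
    [ 63,  29,  35,  53,  76, 999,  52],
    [  8,  20,  28,  49,  72,  52, 999],
]
start = [0, 4, 3, 2, 5, 6, 1]   # the tour (1, 5, 4, 3, 6, 7, 2), value 75
better = combined_mutation(C, start)
print([x + 1 for x in better], tour_value(C, better))
```

Because each move is accepted only when it strictly improves the tour, the operator can never make the current best solution worse.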

Computational Experiments

Our hybrid sequential constructive sampling (HSCS) algorithm has been encoded in Visual C++ on a Pentium IV personal computer (3 GHz, 448 MB RAM) under the MS Windows XP operating system, and tested on some TSPLIB instances. To check the consistency of the results, HSCS is run ten times, and average solution values and average computational times (in seconds) are reported. Recently, Ramakrishnan et al. [15] developed a heuristic algorithm called the binary search threshold (BST) algorithm for the BTSP and reported results for some symmetric TSPLIB instances using an IBM PC. Unfortunately, we could not compare with that algorithm on the same machine, since, to the best of our knowledge, no source code is available online. So, we report the computational times (in seconds) as they were reported on the IBM PC. Our Pentium IV computer is approximately 5 times faster than the computer used by Ramakrishnan et al. [15]; see the machine speed comparison by Johnson [10]. We consider moderate-sized TSPLIB instances with up to 318 nodes only.

Table 3 summarizes the results for the instances for which the lower bound and the optimal solution value are equal. The solution quality and computational times of HSCS are found to be insensitive to the number of runs. HSCS finds the optimal solution of each of the twenty instances within 59.67 seconds on average over ten runs, whereas BST finds the optimal solution within 1423 seconds on the IBM machine. In terms of computational time, which is the most important factor, HSCS is found to be far better than BST for these instances. These instances are found to be very easy.

Table 4 summarizes the results for the instances for which the lower bound and the optimal solution value differ. For these instances also, the solution quality and computational times of HSCS are found to be insensitive to the number of runs. HSCS finds the optimal solution of all fifteen instances on average over ten runs, and finds a better solution than BST for dantzig42. HSCS and BST find the optimal solutions of these instances within 808.19 seconds (on average) and 8613.00 seconds (on the IBM machine), respectively. For these instances also, in terms of computational time, HSCS is found to be better than BST. We also report the average times over ten runs at which the final solutions are first seen by HSCS. On average, HSCS first sees the optimal solution within 34% of the complete computational time. So, if one could find an overall lower bound equal to the reported solution value for these instances, then applying it in HSCS would terminate the algorithm very quickly, within 34% of the present computational times.

Table 5 summarizes the results for the instances that are found to be difficult. HSCS could not find the optimal solutions (as obtained by BST) of these twenty-one instances on average over ten runs. The best solutions obtained within ten runs are also reported. Out of the twenty-one instances, HSCS finds the optimal solution for seven instances at least once in ten runs. The other fourteen instances are found to be hard, especially the instances named with 'kro'. On average, the solution obtained by HSCS is 20% away from the solution obtained by BST. On the basis of solution quality, BST is found to be better than HSCS; however, on the basis of computational time, HSCS is found to be better than BST.

Conclusion

We presented a hybrid sequential constructive sampling algorithm for the bottleneck traveling salesman problem and tested it on some TSPLIB instances. We also reported solution values for the instances obtained by the heuristic algorithm of Ramakrishnan et al. [15], and then showed the efficiency of our hybrid sequential constructive sampling algorithm over the binary search threshold algorithm [15].

In the present study, it is very difficult to say how large n can be before our algorithm fails to find an optimal solution: for example, the instance rat99 of size 99 could not be solved optimally on average over ten runs, whereas the instance pr264 of size 264 was solved optimally in all ten runs. A sophisticated statistical version of the lexisearch algorithm may obtain better solutions very quickly, especially for the hard instances reported in Table 5; this is under our investigation.

Acknowledgements

The author wishes to acknowledge Prof. S.N. Narahari Pandit, Hyderabad, India for his valuable suggestions and moral support.

References

[1] Z.H. Ahmed. A Sequential Constructive Sampling and Related Approaches to Combinatorial Optimization, Ph.D. thesis, Tezpur University, India, 2000.

[2] Z.H. Ahmed, S.N.N. Pandit, M. Borah. "Genetic Algorithms for the Min-Max Travelling Salesman Problem". In Proceedings of the Annual Technical Session, Assam Science Society, India, pp. 64-71, 1999.

[3] Z.H. Ahmed. "A lexisearch algorithm for the bottleneck traveling salesman problem", International Journal of Computer Science and Security 3, pp. 569-577, 2010.

[4] R.E. Burkard, V.G. Deineko, R. van Dal, J.A.A. van der Veen, G.J. Woeginger. "Well-solvable special cases of the traveling salesman problem: A survey", SIAM Review 40, pp. 195-206, 1998.

[5] G. Carpaneto, S. Martello, P. Toth. "An algorithm for the bottleneck traveling salesman problem", Operations Research 2, pp. 380-389, 1984.

[6] E. Gabovic, A. Ciz, A. Jalas. "The bottleneck traveling salesman problem (Russian)", Trudy Vychisl. Centra Tartu Gos. Univ. 22, pp. 3-24, 1971.

[7] R.S. Garfinkel, K.C. Gilbert. "The bottleneck traveling salesman problem: Algorithms and probabilistic analysis", Journal of ACM 25, pp. 435-448, 1978.

[8] P.C. Gilmore, R.E. Gomory. "Sequencing a one state-variable machine: a solvable case of the traveling salesman problem", Operations Research 12, pp. 655-679, 1964.

[9] D.E. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, New York, 1989.

[10] D.S. Johnson. Machine comparison site, http://public.research.att.com/~dsj/chtsp/speeds.html, 2006.

[11] S. Kabadi, A.P. Punnen. "The bottleneck TSP", in The Traveling Salesman Problem and Its Variants, G. Gutin and A.P. Punnen (eds.), Kluwer Academic Publishers, Secaucus, NJ, USA, 2002.

[12] M.-Y. Kao, M. Sanghi. "An approximation algorithm for a bottleneck traveling salesman problem", Journal of Discrete Algorithms, doi: 10.1016/j.jda.2008.11.007, 2009.

[13] R.G. Parker, R.L. Rardin. "Guaranteed performance heuristics for the bottleneck traveling salesman problem", Operations Research Letters 2, pp. 269-272, 1984.

[14] J.M. Phillips, A.P. Punnen, S.N. Kabadi. "A linear time algorithm for the bottleneck traveling salesman problem on a Halin graph", Information Processing Letters 67, pp. 105-110, 1998.

[15] R. Ramakrishnan, P. Sharma, A.P. Punnen. "An efficient heuristic algorithm for the bottleneck traveling salesman problem", Opsearch 46, pp. 275-288, 2009.

[16] M. Ramesh. A Lexisearch Approach to Some Combinatorial Programming Problems, Ph.D. Thesis, University of Hyderabad, India, 1997.

[17] TSPLIB, http://www.iwr.uni-heidelberg.de/iwr/comopt/software/TSPLIB95/

[18] G.L. Vairaktarakis. "On Gilmore-Gomory's open question for the bottleneck TSP", Operations Research Letters 31, pp. 483-491, 2003.

[19] C.-X. Wang, D.-W. Cui, Z.-R. Wang, D. Chen. "A novel ant colony system based on minimum 1-tree and hybrid mutation for TSP", in International Conference on Natural Computation, LNCS 3611, Springer-Verlag, pp. 1269-1278, 2005.

Zakir H. Ahmed

Department of Computer Science, Al-Imam Muhammad Ibn Saud Islamic University, PO. Box 5701, Riyadh 11432, Kingdom of Saudi Arabia

E-mail: zhahmed@gmail.com
Table 1: The cost matrix.

Node 1 2 3 4 5 6 7

1 999 75 99 9 35 63 8
2 75 999 86 46 88 29 20
3 99 86 999 16 28 35 28
4 9 46 16 999 59 53 49
5 35 88 28 59 999 76 72
6 63 29 35 53 76 999 52
7 8 20 28 49 72 52 999

Table 2: The alphabet table.

Node N-V N-V N-V N-V N-V N-V N-V (N: node, V: cost value)

1 7-8 4-9 5-35 6-63 2-75 3-99 1-999
2 7-20 6-29 4-46 1-75 3-86 5-88 2-999
3 4-16 5-28 7-28 6-35 2-86 1-99 3-999
4 1-9 3-16 2-46 7-49 6-53 5-59 4-999
5 3-28 1-35 4-59 7-72 6-76 2-88 5-999
6 2-29 3-35 7-52 4-53 1-63 5-76 6-999
7 1-8 2-20 3-28 4-49 6-52 5-72 7-999

Table 3: TSPLIB instances for which lower bound and optimal solutions
are equal.

Instance n Opt. Sol. BST Time HSCS Time

burma14 14 418 NA 0.00
ulysses16 16 1504 37 0.00
gr17 17 282 13 0.04
gr21 21 355 49 0.00
ulysses22 22 1504 69 0.00
fri26 26 93 51 0.00
bayg29 29 111 35 0.01
bays29 29 154 46 0.04
swiss42 42 67 122 0.18
gr48 48 227 141 0.72
hk48 48 534 270 0.20
eil51 51 13 117 1.08
berlin52 52 475 272 0.05
st70 70 24 272 2.35
eil76 76 16 293 2.79
gr96 96 2807 356 0.18
eil101 101 13 396 9.48
gr120 120 220 673 59.67
bier127 127 7486 443 0.35
gr229 229 4027 1423 17.17

Average 267.26 4.72

Table 4: TSPLIB instances for which lower bound and optimal solutions
are different.

Instance n Lower Bound BST-Sol BST-Time HSCS-Sol HSCS-Time HSCS-First-Time

gr24 24 84 108 176 108 10.75 0.01
dantzig42 42 28 35 167 32 23.34 3.27
att48 48 504 519 446 519 25.33 2.31
brazil58 58 1686 2149 166 2149 42.03 0.06
lin105 105 308 487 3529 487 153.26 105.69
pr107 107 400 7050 665 7050 58.56 0.09
pr124 124 1492 3302 1429 3302 105.38 2.33
pr136 136 1094 2976 1414 2976 94.39 6.63
gr137 137 1738 NA NA 2132 124.66 83.95
pr152 152 650 5553 1801 5553 63.56 15.80
brg180 180 20 30 565 30 32.66 10.52
d198 198 1177 1380 2628 1380 260.62 2.56
gr202 202 2136 2230 2118 2230 295.08 0.69
pr226 226 2365 3250 6274 3250 808.19 562.16
pr264 264 696 4701 8613 4701 265.53 2.28
Average 2142.21 157.56 53.22

Table 5: TSPLIB instances which are found to be difficult.

Instance n Lower Bound BST-Sol BST-Time HSCS-Best-Sol HSCS-Avg-Sol HSCS-Time

rat99 99 20 20 406 20 22.87 77.48
kroA100 100 400 475 1348 479 524.32 95.33
kroB100 100 499 530 980 535 576.26 102.66
kroC100 100 498 498 475 505 515.34 80.34
kroD100 100 441 491 1576 499 526.16 113.03
kroE100 100 490 490 497 495 521.78 81.09
rd100 100 161 221 2617 221 228.12 108.19
pr144 144 1219 2570 1911 2570 2630.56 285.30
ch150 150 90 93 2465 98 115.46 237.00
kroA150 150 367 392 2405 410 466.46 312
u159 159 800 800 3768 825 878.12 145.74
si175 175 166 177 1376 179 200.52 212.56
rat195 195 21 21 2049 21 28.32 235.83
kroA200 200 341 408 12247 412 515.16 286.95
kroB200 200 323 344 2602 352 436.88 270.32
ts225 225 500 1000 1776 1500 1500.00 752.05
tsp225 225 32 36 8065 41 57.26 765.95
gil262 262 22 23 7844 27 38.00 812.32
a280 280 20 20 2196 20 23.54 431.66
pr299 299 401 498 17061 498 602.88 1044.12
lin318 318 308 487 8186 487 498.22 1232.06
Average 3897.62 365.81
COPYRIGHT 2010 Research India Publications

Article Details
Author:Ahmed, Zakir H.
Publication:International Journal of Computational Intelligence Research
Date:Jul 1, 2010
