Transcript Document

Traveling Salesman Problem:
Improvement of Heuristic Algorithms
[Figure: example network with node labels and edge costs used throughout the lecture]
Heuristic Algorithms
• The TSP is an NP-hard problem
– In many cases, we rely on heuristic algorithms to find a
near-optimal solution
– A heuristic algorithm applies simple rules based on
"common sense"
• Types of heuristic algorithms
– Construction heuristic algorithm
• To obtain a single solution
• Nearest neighbor, cheapest insertion
– Improvement heuristic algorithm
• To iteratively improve a known solution
• Local search algorithm
• Improved local search
– Simulated annealing, tabu search
Local Search
• Local search improves a known solution by
searching the neighborhood of the current solution
– As in solving a nonlinear programming problem
– A locally optimal solution is usually obtained
• Example: min F(x) = 4x^4 + 3x^3 + 2x^2 + x
– Try x1 = 1: F(x1) = 10, F'(x1) = 30 > 0
– Next step: try x2 < x1 or x2 > x1?
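As a quick numeric check of the direction question, here is a Python sketch (the helper names and the 0.1 step size are illustrative assumptions, not from the slides): since F'(x1) = 30 > 0, F is increasing at x1, so the point worth trying is x2 < x1.

```python
def F(x):
    return 4 * x**4 + 3 * x**3 + 2 * x**2 + x

def F_prime(x):
    return 16 * x**3 + 9 * x**2 + 4 * x + 1

x1 = 1.0
print(F(x1), F_prime(x1))    # 10.0 30.0 -> the slope at x1 is positive
# Since F'(x1) > 0, F increases to the right of x1, so local search
# should try a neighboring point to the left: x2 < x1.
x2 = x1 - 0.1                # illustrative step size (an assumption)
print(F(x2), F(x2) < F(x1))  # about 7.33, True -> the move improves F
```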
Local Search Algorithm
• General steps
– Initialization: find any feasible solution, denoted by S, as
the initial solution
– Iterations
• For the current solution, find all neighborhood solutions under a
specific definition of neighborhood
• Choose the best neighborhood solution, denoted by S1
• If S1 is better than S, update S by setting S = S1 and repeat the iterations
• If S1 is not better than S, stop (a code sketch of these steps follows the neighborhood definitions below)
• The simplex method for solving an LP is a local search
– It finds an optimal solution, since for an LP a local optimum is also globally optimal
• Definitions of neighborhood for TSP
– Sub-tour reversal
– Insertion
– Swap
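A minimal Python sketch of the general local-search loop described above, assuming a tour stored as a list of nodes and a distance matrix `dist` (both assumptions for illustration). The `neighbors` argument can be any of the neighborhood definitions that follow; infeasible arcs can simply be given a very large cost.

```python
def tour_cost(tour, dist):
    """Cost of visiting the nodes in order and returning to the start node.
    Missing (infeasible) arcs can be modeled with a very large cost."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def local_search(initial_tour, dist, neighbors):
    """Greedy local search: move to the best neighbor until no improvement."""
    s = list(initial_tour)
    cost_s = tour_cost(s, dist)
    while True:
        # Find the best neighborhood solution S1 under the given neighborhood
        s1 = min(neighbors(s), key=lambda t: tour_cost(t, dist))
        cost_s1 = tour_cost(s1, dist)
        if cost_s1 < cost_s:      # S1 better than S: set S = S1 and repeat
            s, cost_s = s1, cost_s1
        else:                     # S1 not better than S: stop (local optimum)
            return s, cost_s
```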
Sub-tour Reversal
• A neighborhood solution can be obtained by
reversing a sub-tour
– Choose a sub-tour, and reverse the sequence of nodes
– For example, for solution (1,2,3,4,5,6,7) with cost=69, if
we choose to reverse (3,4), we have a new solution
(1,2,4,3,5,6,7) with cost=65.
– In general, a sub-tour i_1, i_2, i_3, i_4, i_5 will be reversed to i_5, i_4, i_3, i_2, i_1
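A sketch of the sub-tour reversal neighborhood in Python (the list-of-nodes tour encoding is an assumption); it can be passed as the `neighbors` argument of the local-search sketch above.

```python
def reversal_neighbors(tour):
    """All tours obtained by reversing one contiguous sub-tour tour[i..j].
    There are n*(n-1)/2 such reversals, i.e. O(n^2) neighbors."""
    n = len(tour)
    for i in range(n - 1):
        for j in range(i + 1, n):
            yield tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

# Example from the slide: reversing the sub-tour (3,4) in (1,2,3,4,5,6,7)
# yields (1,2,4,3,5,6,7).
print(any(t == [1, 2, 4, 3, 5, 6, 7]
          for t in reversal_neighbors([1, 2, 3, 4, 5, 6, 7])))  # True
```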
[Figures: the example network shown with different tours, illustrating sub-tour reversal]
Example for Sub-tour Reversal
• Initialization S=(1,2,3,4,5,6,7) with cost=69
• Neighborhood solutions
– (2,1,3,4,5,6,7): infeasible (or very large cost)
– (3,2,1,4,5,6,7): infeasible (or very large cost)
– (4,3,2,1,5,6,7): infeasible
– (5,4,3,2,1,6,7): infeasible
– (6,5,4,3,2,1,7): cost=69 (same tour as S on the undirected network)
– (7,6,5,4,3,2,1): cost=69 (same tour as S on the undirected network)
– (1,3,2,4,5,6,7): cost=68
– (1,4,3,2,5,6,7): infeasible
– (1,5,4,3,2,6,7): infeasible
– (1,6,5,4,3,2,7): infeasible
– (1,7,6,5,4,3,2): cost=69 (same tour as S on the undirected network)
– (1,2,4,3,5,6,7): cost=65
– ……
Question: how many neighborhood solutions need to be considered?
• It can be verified that (1,2,4,3,5,6,7) is the best neighborhood
solution
– Tie with (1,2,3,5,4,6,7)
Example for Sub-tour Reversal
• Now S=(1,2,4,3,5,6,7) with cost=65
• New neighborhood solutions
– There is only one neighborhood solution that
improves S: (1,2,4,6,5,3,7) with cost = 64
• Now S=(1,2,4,6,5,3,7) with cost=64
• New neighborhood solutions
– No neighborhood solution can improve S
– Stop (with a local optimum)
• Optimal tour is (1,2,4,6,7,5,3) with cost = 63, which
cannot be reached by such a local search
Insertion Neighborhood
• For a solution S, a neighborhood solution can
be obtained by removing one node from the
sequence and inserting it at another position
– S = (i_1, i_2, …, i_{k-1}, i_k, i_{k+1}, …, i_{l-1}, i_l, i_{l+1}, …, i_n)
– A neighborhood solution obtained by inserting the k-th node
before the l-th node:
(i_1, i_2, …, i_{k-1}, i_{k+1}, …, i_{l-1}, i_k, i_l, i_{l+1}, …, i_n)
– For example, from (1,2,4,6,5,3,7) we can get the
optimal tour (1,2,4,6,7,5,3) by inserting the 7th node
before the 5th node
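A sketch of the insertion move in Python (0-based list indices here, whereas the slide counts positions from 1; the helper name is an assumption):

```python
def insert_move(tour, k, l):
    """Remove the node at position k and insert it just before position l."""
    node = tour[k]
    rest = tour[:k] + tour[k + 1:]     # sequence with the k-th node removed
    pos = l if l < k else l - 1        # positions after k shift left by one
    return rest[:pos] + [node] + rest[pos:]

# Example from the slide: insert the 7th node (index 6) before the 5th node
# (index 4) of (1,2,4,6,5,3,7) to obtain the optimal tour (1,2,4,6,7,5,3).
print(insert_move([1, 2, 4, 6, 5, 3, 7], k=6, l=4))   # [1, 2, 4, 6, 7, 5, 3]
```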
Swap Neighborhood
• For a solution S, a neighborhood solution can
be obtained by swapping the positions of two
nodes
– S = (i_1, i_2, …, i_{k-1}, i_k, i_{k+1}, …, i_{l-1}, i_l, i_{l+1}, …, i_n)
– A neighborhood solution obtained by swapping the k-th
node and the l-th node:
(i_1, i_2, …, i_{k-1}, i_l, i_{k+1}, …, i_{l-1}, i_k, i_{l+1}, …, i_n)
– For example, from (1,2,4,6,5,3,7) we can get one
neighborhood solution (1,2,7,6,5,3,4) by swapping
the 3rd and 7th nodes
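A sketch of the swap move in Python (again 0-based indices; an illustrative helper, not from the lecture):

```python
def swap_move(tour, k, l):
    """Exchange the nodes at positions k and l."""
    new_tour = list(tour)
    new_tour[k], new_tour[l] = new_tour[l], new_tour[k]
    return new_tour

# Example from the slide: swapping the 3rd and 7th nodes (indices 2 and 6)
# of (1,2,4,6,5,3,7) gives (1,2,7,6,5,3,4).
print(swap_move([1, 2, 4, 6, 5, 3, 7], k=2, l=6))     # [1, 2, 7, 6, 5, 3, 4]
```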
Local Optimality
• For the TSP, a local search algorithm may end at a solution
that is only locally optimal, unless the neighborhood is
defined as the entire feasible region
– The larger the neighborhood, the higher the quality of the
solution we may obtain, but the more computation time is
needed
• Size of neighborhood
– Sub-tour reversal: O(n^2)
– Swap: O(n^2)
– Insertion: O(n^2)
• In order not to be trapped at a local optimum, better
heuristic algorithms are desired
– Ones that are able to jump out of a local optimum
Simulated Annealing Algorithm
• A worse neighborhood solution will be accepted with
a probability
• General steps
– Find an initial current solution
• Let the objective function value be Zc
– Repeat the following steps
• Randomly select a neighborhood solution with Zn as the objective
function value
• If Zn ≤ Zc, accept the neighborhood solution as the current solution
• If Zn > Zc, accept the neighborhood solution as the current solution with
probability e^x, where x = (Zc - Zn)/T
T, a positive parameter referred to as the temperature, controls the
probability of accepting a worse neighborhood solution (a code sketch of
these steps follows below)
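A minimal Python sketch of these steps. The `cost` and `neighbors` functions are passed in (for example, `tour_cost` and `reversal_neighbors` from the earlier sketches); the geometric cooling rate, the number of iterations per temperature, and the stopping temperature are illustrative assumptions, not values from the lecture.

```python
import math
import random

def simulated_annealing(initial_tour, cost, neighbors,
                        t0=10.0, cooling=0.99, iters_per_temp=100, t_min=1e-3):
    current = list(initial_tour)
    z_c = cost(current)                      # Zc: objective value of the current solution
    t = t0                                   # temperature
    while t > t_min:
        for _ in range(iters_per_temp):
            candidate = random.choice(list(neighbors(current)))
            z_n = cost(candidate)            # Zn: objective value of the neighbor
            if z_n <= z_c:
                current, z_c = candidate, z_n          # accept for sure
            elif random.random() < math.exp((z_c - z_n) / t):
                current, z_c = candidate, z_n          # accept a worse solution
        t *= cooling                         # gradually reduce the temperature
    return current, z_c
```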
Simulated Annealing Algorithm
• The probability of accepting a worse neighborhood solution
depends on (Zc – Zn)/T
– How much worse it is: a smaller Zn - Zc (not too much worse) gives a higher
probability
– The temperature T: a larger T gives a higher probability
• The annealing schedule
– T is set to be a large value at the beginning, then it will be gradually
reduced
– In early iterations, a worse neighborhood solution will be very likely
accepted so that the algorithm can explore different areas of the feasible
region
– In later iterations, the probability of accepting a worse neighborhood
solution becomes smaller so that the focus of the algorithm becomes on
improving the solution locally
• When the algorithm stops
– When the temperature T becomes a very small value, or
– When there is no solution improvement for a number of consecutive
iterations
Simulated Annealing Algorithm
• The algorithm is based on the analogy to a physical
annealing process
• The annealing process
– Melting a metal at a high temperature
• With high energy
• Correspondence in the SA algorithm: An initial solution with a high
cost
– Gradually cooling down the metal until it reaches a stable
state with desired property
• Low energy state
• Correspondence in the SA algorithm: An improved solution with a
low cost
– At any temperature, the energy level of the atoms fluctuates,
but tends to decrease
• Correspondence in the SA algorithm: A higher-cost solution may
be accepted during the process
Example of Simulated Annealing
• Initial solution
– Assumed to be (1,2,3,4,5,6,7) with Zc=69
• Neighborhood definition
– Assumed to be the sub-tour reversal
– Select a neighborhood solution
• Randomly select a starting node and an ending node as the sub-tour
• Temperature schedule
– T1 = 10
– T2 = 0.5 T1
– T3 = 0.5 T2
– T4 = 0.5 T3
– T5 = 0.5 T4
Note: in large-size problems, the temperature T may be reduced at a slower
pace, for example Tk+1 = 0.99 Tk. Usually it is a parameter that can be tuned
by experiments.
Example of Simulated Annealing
• The first iteration
– Randomly choose a neighborhood solution. Suppose it is (1,2,4,3,5,6,7)
with cost Zn=65
– Because Zn<Zc, accept the neighborhood solution for sure
• (1,2,4,3,5,6,7) becomes the current solution and Zc=65
• The second iteration
– Randomly choose a neighborhood solution. Suppose it is (1,2,4,6,5,3,7)
with cost Zn=64
– Because Zn<Zc, accept the neighborhood solution for sure
• (1,2,4,6,5,3,7) becomes the current solution and Zc=64
• The third iteration
– Randomly choose a neighborhood solution. Suppose it is (1,2,4,6,5,7,3)
with cost Zn=66
– Because Zn > Zc, accept the neighborhood solution with probability
e^((64-66)/10) = e^(-0.2) ≈ 0.819
– How to implement: Generate a random number x in (0,1). If x<0.819,
accept the neighborhood solution.
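The acceptance test described above, as a small Python check (the variable names are illustrative):

```python
import math
import random

z_c, z_n, t = 64, 66, 10
p = math.exp((z_c - z_n) / t)      # e^(-0.2) ≈ 0.819
accept = random.random() < p       # accept the worse solution with probability p
print(round(p, 3), accept)
```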
Example of Simulated Annealing
• The third iteration (Cont’d)
– If (1,2,4,6,5,7,3) is accepted in the above step, then
the next neighborhood solution will be the one
selected from the neighborhood of (1,2,4,6,5,7,3)
– If (1,2,4,6,5,7,3) is NOT accepted, then the next
neighborhood solution will be the one selected
from the neighborhood of (1,2,4,6,5,3,7)
• The fourth iteration
– Find a new neighborhood solution
– ……
Summary of Quiz 1
• Mean = 80.8
• Standard deviation = 12.3
• Distribution
– [90-99]: 13
– [80-89]: 26
– [70-79]: 15
– [60-69]: 4
– Below 60: 5