1. G5BAIM Artificial Intelligence Methods Andrew Parkes
2. Characteristics of SA (review) Random selection of a neighbouring solution
Probabilistic acceptance of non-improving solutions;
The best solution found is recorded;
No memory of the history of the search;
All the information found during the search is lost.
3. Tabu Search Proposed independently by Glover (1986) and Hansen (1986)
“a meta-heuristic superimposed on another heuristic. The overall approach is to avoid entrapment in cycles by forbidding or penalizing moves which take the solution, in the next iteration, to points in the solution space previously visited (hence tabu).”
4. Tabu Search (continued) Accepts non-improving solutions deterministically [no randomness]
in order to escape from local optima (where all the neighbouring solutions are non-improving)
by guiding a steepest descent local search (or steepest ascent hill climbing) algorithm
SA uses randomness to escape
TS uses forbidden areas
5. Tabu Search (continued) After evaluating a number of neighbouring solutions, we accept the best one, even if it is of low quality on the cost function.
Accept worse move
Memory is used in two ways:
to prevent the search from revisiting previously visited solutions;
to explore unvisited areas of the solution space;
for example, via the tabu moves described in the slides that follow.
6. Tabu Search (continued) Use past experiences to improve current decision making.
By using memory (a “tabu list”) to prohibit certain moves – “makes tabu search attempt to be a global optimizer rather than a local optimizer.”
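As a minimal sketch of this idea (Python; the tenure value of 7 and the notion of a "move" stored in the list are illustrative assumptions, not from the slides), a tabu list is often just a fixed-length FIFO of recently used moves:

from collections import deque

# A bounded FIFO: once full, appending a new entry drops the oldest one,
# so a move only stays tabu for the last `tenure` iterations.
tenure = 7                          # tabu tenure (list length), illustrative
tabu_list = deque(maxlen=tenure)

def make_tabu(move):
    tabu_list.append(move)

def is_tabu(move):
    return move in tabu_list

Everything more elaborate later in these slides (attribute-based tabus, tenure counters, aspiration) builds on this basic prohibition test.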
7. Tabu Search vs. Simulated Annealing Both can accept a worse move, but they differ in:
how a worse move comes to be accepted (probabilistically in SA, deterministically in TS)
how neighbouring solutions are selected (one at random in SA, the best of those evaluated in TS)
the use of memory (none in SA, a tabu list in TS)
8. Is memory useful during the search?
9. Uses of memory during the search? Intelligence needs memory!
Information on characteristics of good solutions (or bad solutions!)
10. Uses of memory during the search? Tabu move – what does it mean?
Not allowed to re-visit exactly the same state that we have visited before
Discouraging some patterns in a solution: e.g. in the TSP, tabu any state that lists the towns in the same order we have seen before.
If the problem is large, a lot of time is spent just checking whether we have visited a certain state before.
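A minimal sketch of this whole-state option (Python; representing a state as a tuple is an assumption of the sketch): even with a hash set, every visited state must be stored and hashed, which is why the check becomes costly for large problems.

# Record every visited state; a hash set makes each membership test fast,
# but memory grows with the number of states visited.
visited = set()

def record(state):
    visited.add(tuple(state))       # tuples are hashable, lists are not

def is_tabu(state):
    return tuple(state) in visited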
11. Uses of memory during the search? Tabu move – what does it mean?
Not allowed to return to the state that the search has just come from.
only one solution needs to be remembered
so the tabu list is a much smaller data structure
12. Uses of memory during the search? Tabu move – what does it mean?
Tabu a small part of the state
In the TSP, tabu the two towns just considered in the last move – the search is forced to consider other towns.
13. Uses of memory during the search? In Tabu Search we must decide:
what neighbourhood to use (as for SA)
what constitutes the tabu list
14. Exercise: Tabu for SAT Recall: (MAX-)SAT means
set of boolean variables: x, y, z, …
the constraints are many clauses, e.g. “x or y or not z”
goal: pick T or F for each variable so as to satisfy as many clauses as possible
SAT local move: flip one variable
Exercise:
What are our options for how to do a tabu list?
(Hardest part here is understanding what the question means and why it needs to be asked!)
15. Tabu for SAT SAT local move: flip a variable
Suppose we have just 3 variables:
states are just “( T/F, T/F, T/F )”
Option ONE:
Tabu an entire state: e.g. tabu ( T, F, F )
Option TWO:
Tabu a flip: e.g. tabu list { x } would mean that x cannot be flipped
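A minimal sketch of the two options (Python; representing a state as a dict from variable name to True/False is an assumption of this sketch):

state = {"x": True, "y": False, "z": False}

# Option ONE: tabu entire states - store complete assignments
tabu_states = {(True, False, False)}            # e.g. tabu ( T, F, F )
def state_is_tabu(s):
    return (s["x"], s["y"], s["z"]) in tabu_states

# Option TWO: tabu a flip - store variable names that may not be flipped
tabu_flips = {"x"}                              # x cannot be flipped
def flip_is_tabu(variable):
    return variable in tabu_flips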
16. Tabu for SAT Tabu an entire state
Tabu a flip
These are very different!!
“tabu an entire state” is rarely (never?) used
“tabu a flip” is very common
17. Tabu for SAT Suppose that x is on the tabu list, and currently x=T
then the entire space with x=F is tabu
this is half the search space!!
E.g. n=20
2^20 ≈ 10^6 states
Tabu-ing a single state will have a minuscule effect, but tabu-ing half of them is a strong enough restriction to aid the search
“x=F” is an example of an attribute
18. Tabu Design Lesson:
place tabu on attributes not just single states!
Design Issues for TS:
pick neighbourhood (as usual)
pick attributes used to build tabu lists
19. Dangers of memory Exhausting the available memory resources
Need to design efficient data structures so that the recorded data can be stored and accessed efficiently;
Hash table
Binary tree
Memorising information which should not be remembered
20. Dangers of memory Collecting more data than can be handled:
Clear understanding of which attributes of solutions are crucial;
Limited selection of attributes of solutions to be memorised;
Clear strategy on the use of information, and on its disposal when no longer needed;
21. Tabu Search algorithm Function TABU_SEARCH(Problem) returns a solution state
Inputs: Problem, a problem
Local Variables:
Current, a state
Next, a state
BestSolutionSeen, a state
H, a history of visited states
22. Tabu Search algorithm (cont.) Current = MAKE-NODE(INITIAL-STATE[Problem])
While termination condition not met do
  Next = a highest-valued successor of Current
  If (not Move_Tabu(H, Next)) or Aspiration(Next) then
    Current = Next
    Update BestSolutionSeen
    H = Recency(H + Current)
  Endif
End-While
Return BestSolutionSeen
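The pseudocode leaves Move_Tabu, Aspiration and Recency abstract. One minimal runnable reading of the same loop (Python; the helpers neighbours, value and move_attribute, and the tenure and iteration limit, are hypothetical placeholders a user would supply) picks the best admissible successor at each iteration:

from collections import deque

def tabu_search(initial, neighbours, value, move_attribute,
                tenure=7, max_iterations=1000):
    current = initial
    best = initial
    history = deque(maxlen=tenure)       # H: recency-limited tabu list (FIFO)

    for _ in range(max_iterations):      # "While termination condition not met"
        candidates = neighbours(current)
        # A neighbour is admissible if its move is not tabu, or if it passes the
        # aspiration test (it would beat the best solution seen so far).
        admissible = [n for n in candidates
                      if move_attribute(current, n) not in history
                      or value(n) > value(best)]
        if not admissible:
            break
        nxt = max(admissible, key=value)              # best admissible successor, even if worse than current
        history.append(move_attribute(current, nxt))  # H = Recency(H + Current)
        current = nxt
        if value(current) > value(best):
            best = current                            # update BestSolutionSeen
    return best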
23. Elements of TS: Recency Memory related – recency (how recently the solution was reached)
Tabu List (short term memory): to record a limited number of attributes of solutions (moves, selections, assignments, etc) to be discouraged in order to prevent revisiting a visited solution;
Tabu tenure (length of tabu list): number of iterations a tabu move is considered to remain tabu;
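Besides a fixed-length FIFO, tenure is often implemented by recording, for each tabu attribute, the iteration at which it stops being tabu; a minimal sketch (Python; the names and the tenure value are assumptions):

tabu_until = {}        # attribute -> first iteration at which it is no longer tabu
tenure = 7             # illustrative tabu tenure

def make_tabu(attribute, iteration):
    tabu_until[attribute] = iteration + tenure

def is_tabu(attribute, iteration):
    return tabu_until.get(attribute, -1) > iteration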
24. Elements of Tabu Search Memory related – recency (how recently the solution was reached)
Tabu tenure
The list of moves must not grow forever – that would restrict the search too much
Restrict the size of the list
FIFO
Other ways: e.g. a dynamically varying tenure
25. Elements of Tabu Search Part of TS framework, though not “tabu”
Long term memory: record attributes of elite solutions
e.g. in SAT it might be that all good solutions have x33=F
might want to observe, remember and exploit this during search
Memory related: frequency
observe frequency of selected attributes
26. Diversification vs. Intensification Many search issues involve a trade-off between:
Diversification:
Want to cause the search to consider new areas
Intensification:
want to intensely explore known good areas
27. Diversification vs. Intensification & TS Diversification by:
Discouraging attributes of elite solutions in selection functions
Intensification by:
Giving priority to attributes of a set of elite solutions (usually in some weighted probability manner)
Again, elements of this might well be incorporated into other search methods
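One common way to act on this memory (an illustration, not the only option; Python, with the names and weights as assumptions of the sketch) is to fold frequency counts into the move evaluation: penalising frequently used attributes diversifies, while rewarding attributes of elite solutions intensifies.

from collections import Counter

freq = Counter()            # long-term memory: how often each attribute has been used
elite_freq = Counter()      # how often each attribute appears in the elite solutions

def adjusted_value(raw_value, attribute,
                   diversify_weight=0.1, intensify_weight=0.1):
    # Penalise over-used attributes (diversification) and reward attributes that
    # are common in elite solutions (intensification); the weights are illustrative.
    return (raw_value
            - diversify_weight * freq[attribute]
            + intensify_weight * elite_freq[attribute])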
28. Elements of TS: Aspiration If a move is good, but it’s tabu-ed, do we still reject it?
Aspiration criteria: accepting an improving solution even if generated by a tabu move
Similar to SA in always accepting improving solutions, but accepting non-improving ones when there is no improving solution in the neighbourhood;
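The standard aspiration-by-objective test is tiny; a sketch (Python, names hypothetical):

def admissible(is_tabu, candidate_value, best_value):
    # Aspiration criterion: a tabu move is allowed anyway
    # if it would give a new best-so-far solution.
    return (not is_tabu) or (candidate_value > best_value)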
29. Example: Tabu for MAX-SAT MAX-SAT:
set of n boolean variables
set of clauses
find truth assignment that maximizes the number of satisfied clauses
30. Example: Tabu for MAX-SAT Neighbourhood
“flip”: change just one variable, T to F or F to T
Tabu List:
maintain a (short) list of variables not to be flipped
Recency:
remember when each variable was last flipped
bias search towards flipping most/least recently flipped variables
Aspiration: Ignore tabu/recency if it would give a new Best-So-Far solution
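Putting these ingredients together, a minimal MAX-SAT tabu search could look like the sketch below (Python; the clause encoding as (variable, is_positive) pairs, the tenure of 5 and the iteration limit are assumptions of this sketch, not part of the slides):

import random

def num_satisfied(assignment, clauses):
    # A clause is a list of (variable, is_positive) literals; it is satisfied
    # if at least one literal agrees with the current assignment.
    return sum(any(assignment[v] == positive for v, positive in clause)
               for clause in clauses)

def tabu_maxsat(variables, clauses, tenure=5, max_iterations=1000, seed=0):
    rng = random.Random(seed)
    assignment = {v: rng.choice([True, False]) for v in variables}
    best, best_score = dict(assignment), num_satisfied(assignment, clauses)
    tabu_until = {v: -1 for v in variables}          # flip-based tabu list (Option TWO)

    for it in range(max_iterations):
        chosen, chosen_score = None, -1
        for v in variables:                          # neighbourhood: flip one variable
            assignment[v] = not assignment[v]
            score = num_satisfied(assignment, clauses)
            assignment[v] = not assignment[v]        # undo the trial flip
            tabu = tabu_until[v] > it
            aspiration = score > best_score          # a new best-so-far overrides the tabu
            if (not tabu or aspiration) and score > chosen_score:
                chosen, chosen_score = v, score
        if chosen is None:                           # every flip is tabu and none aspirates
            continue
        assignment[chosen] = not assignment[chosen]
        tabu_until[chosen] = it + tenure             # chosen may not be flipped again for a while
        if chosen_score > best_score:
            best, best_score = dict(assignment), chosen_score
    return best, best_score

# Usage: three variables and clauses (x or y), (not x or z), (not y or not z)
clauses = [[("x", True), ("y", True)],
           [("x", False), ("z", True)],
           [("y", False), ("z", False)]]
print(tabu_maxsat(["x", "y", "z"], clauses))

This follows the flip-based tabu option from slide 15 plus aspiration by best-so-far; the recency bias mentioned above could be added by also recording when each variable was last flipped.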
31. Example: TSP using Tabu Search In our example of TSP:
Find the list of towns to be visited so that the travelling salesman will have the shortest route
Short term memory:
Maintain a list of t towns and prevent them from being selected for consideration of moves for a number of iterations;
After a number of iterations, release those towns by FIFO
32. Example: TSP using Tabu Search In our example of TSP:
Long term memory:
Maintain a list of t towns which have been considered in the last k best (worst) solutions
encourage (or discourage) their selections in future solutions
using, in our selection function, their frequency of appearance in the set of elite solutions and the quality of the solutions in which they appeared
33. Example: TSP using Tabu Search In our example of TSP:
Aspiration:
If the next move uses a move in the tabu list but generates a better solution than the current one
accept that solution anyway
and put it into the tabu list
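A minimal sketch of the bookkeeping in this TSP example (Python; the swap move, the distance representation and the FIFO length are assumptions of the sketch): the towns involved in the accepted move become tabu and are released again, FIFO-fashion, a few steps later.

from collections import deque

def tour_length(tour, dist):
    # dist is a matrix (or dict of dicts) of pairwise distances; the tour is cyclic
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tsp_tabu_step(tour, dist, tabu_towns):
    # tabu_towns is a bounded FIFO, e.g. deque(maxlen=6): towns drop off after a few steps
    current_len = tour_length(tour, dist)
    best_swap, best_len = None, float("inf")
    for i in range(len(tour)):
        for j in range(i + 1, len(tour)):
            candidate = tour[:]
            candidate[i], candidate[j] = candidate[j], candidate[i]
            length = tour_length(candidate, dist)
            tabu = tour[i] in tabu_towns or tour[j] in tabu_towns
            aspiration = length < current_len        # slide 33: better than the current solution
            if (not tabu or aspiration) and length < best_len:
                best_swap, best_len = (tour[i], tour[j], candidate), length
    if best_swap is None:                            # every swap is tabu and none improves on the current tour
        return tour
    town_a, town_b, new_tour = best_swap
    tabu_towns.append(town_a)                        # the two towns just moved become tabu
    tabu_towns.append(town_b)
    return new_tour

The long-term frequency memory of slide 32 would sit on top of this, e.g. via the adjusted evaluation sketched after slide 27.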
34. Tabu Search Pros & Cons Pros
Generally produces good solutions for optimisation problems compared with other AI methods
Cons
Tabu list construction is problem specific
No guarantee of global optimal solutions
35. Minimal Expectations Understand the meaning and use of “attributes”
Terminology of
tabu list, tenure
recency
aspiration, etc
Have an idea of how to apply TS to our standard domains: MAX-SAT, GCP, TSP, etc.
Good understanding of the issue of “intensification vs. diversification”
36. Summary Use of memory in search
Tabu Search algorithm
Elements of Tabu Search
Example of Tabu Search on TSP
37. SA vs. TS
38. References Glover, F. 1989. Tabu Search – Part I. ORSA Journal on Computing, Vol. 1, No. 3, pp 190-206.
Glover, F. 1990. Tabu Search – Part II. ORSA Journal on Computing, Vol. 2, No. 1, pp 4-32.
Glover, F., Laguna, M. 1998. Tabu Search. Kluwer Academic Publishers.
Rayward-Smith, V.J., Osman, I.H., Reeves, C.R., Smith, G.D. 1996. Modern Heuristic Search Methods. John Wiley & Sons.
Russell, S., Norvig, P. 1995. Artificial Intelligence: A Modern Approach. Prentice-Hall.
39. Other Issues Miscellaneous other issues
Mainly for self-study
40. Hybrid? If a hybrid makes sense then it is worth considering
probably someone has already tried it
E.g. SA & TS
We could use both
randomness of SA
tabu’d moves of TS
It is quite common that some kind of tabu list is added to other meta-heuristics
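As an illustration of one such combination (an assumption of this sketch, not a method prescribed by the slides; Python, with hypothetical helpers neighbours, value and move_attribute): keep SA's random proposal and probabilistic acceptance, but never propose a move whose attribute is currently tabu.

import math
import random

def sa_with_tabu_step(current, neighbours, value, move_attribute,
                      tabu_list, temperature, rng=random):
    # SA ingredient: pick a random neighbour; TS ingredient: tabu'd moves are never proposed.
    candidates = [n for n in neighbours(current)
                  if move_attribute(current, n) not in tabu_list]
    if not candidates:
        return current
    proposal = rng.choice(candidates)
    delta = value(proposal) - value(current)
    # SA acceptance: always take improvements, take worse moves with probability exp(delta/T).
    if delta >= 0 or rng.random() < math.exp(delta / temperature):
        tabu_list.append(move_attribute(current, proposal))   # remember the move, TS-style
        return proposal
    return current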
41. G5BAIM Artificial Intelligence Methods