
G5BAIM Artificial Intelligence Methods

Presentation Transcript


  1. G5BAIM Artificial Intelligence Methods Tabu Search

  2. Characteristics of SA (review) • Random selection of a neighbouring solution; • Probabilistic acceptance of non-improving solutions; • The best solution is recorded; • No memory of the history of the search; • All the information found during the search is lost;

  3. Tabu Search Proposed independently by Glover (1986) and Hansen (1986); • Tabu search is “a meta-heuristic superimposed on another heuristic. The overall approach is to avoid entrapment in cycles by forbidding or penalizing moves which take the solution, in the next iteration, to points in the solution space previously visited (hence tabu).”

  4. Tabu Search (continued) • Accepts non-improving solutions deterministically in order to escape from local optima (where all the neighbouring solutions are non-improving) by guiding a steepest descent local search (or steepest ascent hill climbing) algorithm; • Uses memory to: • prevent the search from revisiting previously visited solutions; • explore unvisited areas of the solution space;

  5. Is memory useful during the search?

  6. Uses of memory during the search • Intelligence needs memory! • Discouraging some patterns in solutions: e.g. in the clustering example, the selection of some stores as warehouses, or the assignment of some stores to a specific warehouse, may be considered tabu (forbidden); • Information on characteristics of good solutions (or bad solutions!);

  7. Dangers of memory • Exhausting memory resources: • Design of efficient data structures to record and access the recorded data efficiently; • Collecting more data than can be handled: • Clear understanding of which attributes of solutions are crucial; • Limited selection of attributes of solutions to be memorised; • Clear strategy on usage of the information, or its disposal when no longer needed; • Memorising information which should not be remembered: • Misguiding patterns in local optima which are very different from the global optimum;

  8. Tabu Search algorithm Function TABU_SEARCH(Problem) returns a solution state • Inputs: Problem, a problem • Local Variables: Current, a state • Next, a state • BestSolutionSeen, a state • H, a history of visited states

  9. Tabu Search algorithm (continued) • Current = MAKE-NODE(INITIAL-STATE[Problem]) • While not terminate • Next = a highest-valued successor of Current • If (not Move_Tabu(H, Next) or Aspiration(Next)) then • Current = Next • Update BestSolutionSeen • H = Recency(H + Current) • Endif • End-While • Return BestSolutionSeen
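The pseudocode above translates directly into a short program. Below is a minimal, runnable Python sketch, assuming a toy maximisation problem over bit strings; the objective function, the single-bit-flip neighbourhood and the fixed tabu-list length are illustrative assumptions, not part of the original pseudocode.

```python
from collections import deque
import random

def objective(x):
    # Toy objective: reward set bits, penalise adjacent pairs of set bits.
    return sum(x) - 2 * sum(a and b for a, b in zip(x, x[1:]))

def neighbours(x):
    # Single bit-flip neighbourhood.
    for i in range(len(x)):
        y = list(x)
        y[i] = 1 - y[i]
        yield tuple(y)

def tabu_search(n=20, iterations=100, tabu_size=7, seed=0):
    random.seed(seed)
    current = tuple(random.randint(0, 1) for _ in range(n))
    best = current
    tabu = deque([current], maxlen=tabu_size)   # recency-based history H

    for _ in range(iterations):
        # Admissible successors: not tabu, or allowed by aspiration
        # (a tabu neighbour that beats the best solution seen so far).
        candidates = [y for y in neighbours(current)
                      if y not in tabu or objective(y) > objective(best)]
        if not candidates:
            break
        current = max(candidates, key=objective)   # highest-valued successor
        tabu.append(current)
        if objective(current) > objective(best):
            best = current
    return best, objective(best)

if __name__ == "__main__":
    solution, value = tabu_search()
    print(value, solution)
```

Here the history H is a fixed-length queue of recently visited solutions (recency memory), and the best non-tabu successor is accepted even when it is worse than the current solution, which is how the search escapes local optima.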

  10. Elements of Tabu Search • Tabu list (short term memory): to record a limited number of attributes of solutions (moves, selections, assignments, etc.) to be discouraged in order to prevent revisiting a visited solution; • Tabu tenure: the number of iterations a tabu move remains tabu; • Aspiration criteria: accepting an improving solution even if it is generated by a tabu move; • Similar to SA in always accepting improving solutions, but accepting non-improving ones when there is no improving solution in the neighbourhood;
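One common way to implement a tabu list with tenure is a dictionary mapping a move attribute to the iteration at which its tabu status expires. The helpers below are a sketch under that assumption; the attribute used as the key and the default tenure of 7 are illustrative choices, not prescribed by the slides.

```python
def is_tabu(move, iteration, tabu_until):
    # A move is tabu while the current iteration is before its expiry.
    return tabu_until.get(move, -1) > iteration

def make_tabu(move, iteration, tabu_until, tenure=7):
    # Forbid this move (or its reverse) for the next `tenure` iterations.
    tabu_until[move] = iteration + tenure

def admissible(move, value, iteration, tabu_until, best_value):
    # Aspiration criterion: a tabu move is still allowed if it would
    # produce a solution better than the best one seen so far.
    return not is_tabu(move, iteration, tabu_until) or value > best_value
```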

  11. Elements of Tabu Search (continued) • Long term memory: to record attributes of elite solutions to be used in: • Intensification: giving priority to attributes of a set of elite solutions (usually in a weighted probabilistic manner); • Diversification: discouraging attributes of elite solutions in selection functions in order to diversify the search to other areas of the solution space;
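A simple long-term memory is a frequency count of how often each attribute appears in elite solutions, which can then bias the evaluation of candidate moves. The class below is an illustrative sketch; the single penalty/reward weight is an assumed parameter.

```python
from collections import Counter

class LongTermMemory:
    def __init__(self, weight=0.5):
        self.frequency = Counter()  # attribute -> count over elite solutions
        self.weight = weight

    def record(self, solution_attributes):
        # Called whenever a new elite (e.g. best-so-far) solution is found.
        self.frequency.update(solution_attributes)

    def diversified_value(self, value, attribute):
        # Diversification: penalise attributes frequent in elite solutions,
        # pushing the search towards unvisited regions of the space.
        return value - self.weight * self.frequency[attribute]

    def intensified_value(self, value, attribute):
        # Intensification: reward attributes frequent in elite solutions.
        return value + self.weight * self.frequency[attribute]
```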

  12. Example of use of memory In our example of selecting warehouses, the following information could be recorded: • Short term memory: • Maintain a list of t stores and prevent them from being selected as warehouses, or prevent their assignment to a specific selection of warehouses, for a number of iterations; • Long term memory: • Maintain a list of t stores which have been selected as warehouses in the last k best solutions, and encourage (or discourage) their selection in future solutions by using, in our selection function, their frequency of appearance in the set of elite solutions and the quality of the solutions in which they have appeared; • Which stores have been selected as warehouses in our elite solutions? • Which stores have been assigned to which warehouses in our elite solutions?
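The two memory structures for the warehouse example might be sketched as follows; the store identifiers, the tenure and the size of the elite pool are illustrative assumptions.

```python
from collections import Counter, deque

class WarehouseMemory:
    def __init__(self, tenure=5, elite_size=10):
        # Short term: stores that may not be (de)selected as warehouses
        # for the next `tenure` iterations.
        self.tabu_stores = deque(maxlen=tenure)
        # Long term: how often each store was a warehouse in the last
        # `elite_size` best solutions found so far.
        self.elite = deque(maxlen=elite_size)
        self.warehouse_counts = Counter()

    def forbid(self, store):
        self.tabu_stores.append(store)

    def is_tabu(self, store):
        return store in self.tabu_stores

    def record_elite(self, warehouses):
        # Keep the frequency counts in sync with the sliding elite pool.
        if len(self.elite) == self.elite.maxlen:
            self.warehouse_counts.subtract(self.elite[0])
        self.elite.append(set(warehouses))
        self.warehouse_counts.update(warehouses)
```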

  13. References • 1. Glover, F. 1989. Tabu Search – Part I. ORSA Journal on Computing, Vol. 1, No. 3, pp. 190-206. • 2. Glover, F. 1990. Tabu Search – Part II. ORSA Journal on Computing, Vol. 2, No. 1, pp. 4-32. • 3. Glover, F., Laguna, M. 1998. Tabu Search. Kluwer Academic Publishers. • 4. Rayward-Smith, V.J., Osman, I.H., Reeves, C.R., Smith, G.D. 1996. Modern Heuristic Search Methods. John Wiley & Sons. • 5. Russell, S., Norvig, P. 1995. Artificial Intelligence: A Modern Approach. Prentice-Hall.

  14. G5BAIM Artificial Intelligence Methods End of Tabu Search
