Heuristic Optimization Methods Scatter Search
Agenda • Scatter Search (SS) • For Local Search based Metaheuristics: • SA based on ideas from nature • TS based on problem-solving and learning • For population based Metaheuristics: • GA based on ideas from nature • SS based on problem-solving and learning • Nature works, but usually very slowly • Being clever is better than emulating nature?
The following is a presentation previously held at the conference ICS 2003. Scatter Search: Methodology and Applications Manuel Laguna (University of Colorado), Rafael Martí (University of Valencia)
Based on … Laguna, M. and R. Martí (2003) Scatter Search: Methodology and Implementations in C, Kluwer Academic Publishers, Boston.
Scatter Search Methodology
Metaheuristic • A metaheuristic refers to a master strategy that guides and modifies other heuristics to produce solutions beyond those that are normally generated in a quest for local optimality. • A metaheuristic is a procedure that has the ability to escape local optimality
Metaheuristic Classification • x/y/z Classification • x = A (adaptive memory) or M (memoryless) • y = N (systematic neighborhood search) or S (random sampling) • z = 1 (one current solution) or P (population of solutions) • Some Classifications • Tabu Search (A/N/1) • Genetic Algorithms (M/S/P) • Scatter Search (M/N/P)
Scatter Search (flow diagram) • Diversification Generation Method → Improvement Method, repeated until |P| = PSize • Reference Set Update Method → Subset Generation Method → Solution Combination Method → Improvement Method → back to the Reference Set Update Method • Stop when no more new solutions enter the RefSet
Scatter Search with Rebuilding (flow diagram) • Same loop as above, but when no more new solutions enter the RefSet, the Diversification Generation Method rebuilds P and the Reference Set Update Method rebuilds the RefSet • Stop when MaxIter is reached
Tutorial • Unconstrained Nonlinear Optimization Problem
Diversification Generation Method • The range [-10, +10] of each variable is divided into 4 subranges ([-10, -5), [-5, 0), [0, +5), [+5, +10]) • Probability of selecting a subrange is proportional to a frequency count
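A sketch of this frequency-driven generator for a single variable. Assumption (not spelled out on the slide): each subrange is weighted by 1/(1 + count), so less-used subranges become more likely, which is the usual diversification convention.

```python
import random

# Four subranges of [-10, +10], as on the slide.
SUBRANGES = [(-10, -5), (-5, 0), (0, 5), (5, 10)]

def diversified_value(freq, rng=random):
    """Pick a subrange with probability derived from frequency counts,
    update the counts, and sample a value inside it (illustrative sketch)."""
    weights = [1.0 / (1 + f) for f in freq]          # favor rarely used subranges
    i = rng.choices(range(len(SUBRANGES)), weights=weights)[0]
    freq[i] += 1                                     # update the frequency count
    lo, hi = SUBRANGES[i]
    return rng.uniform(lo, hi)
```

Calling it repeatedly with a shared `freq` list spreads the generated values across all four subranges.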
Improvement Method • The simplex search of Nelder and Mead (1965)
Reference Set Update Method (Initial RefSet) • b1 high-quality solutions (objective function value to measure quality) • b2 diverse solutions (min-max criterion and Euclidean distances to measure diversity) • RefSet of size b = b1 + b2
Initial RefSet (diagram) • b1 high-quality solutions plus b2 diverse solutions
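The initial RefSet construction can be sketched as follows; the function names are illustrative and minimization is assumed. The max-min step picks, one at a time, the candidate whose minimum Euclidean distance to the current RefSet is largest.

```python
import math

def euclid(x, y):
    return math.dist(x, y)  # Euclidean distance (Python 3.8+)

def initial_refset(P, obj, b1, b2):
    """Sketch: b1 best-quality solutions, then b2 solutions chosen by the
    max-min distance criterion described on the slides."""
    pool = sorted(P, key=obj)            # minimization assumed
    refset = pool[:b1]                   # quality tier
    rest = pool[b1:]
    for _ in range(b2):                  # diversity tier
        # candidate whose minimum distance to the RefSet is largest
        cand = max(rest, key=lambda s: min(euclid(s, r) for r in refset))
        refset.append(cand)
        rest.remove(cand)
    return refset
```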
Subset Generation Method • All pairs of reference solutions that include at least one new solution • The method generates (b² - b)/2 pairs from the initial RefSet
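The pair count is easy to check with the standard library: for b reference solutions there are (b² - b)/2 unordered pairs.

```python
from itertools import combinations

b = 10
refset = list(range(b))                 # stand-in reference solutions
pairs = list(combinations(refset, 2))   # all 2-element subsets
assert len(pairs) == (b * b - b) // 2   # 45 pairs for b = 10
```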
Reference Set Update Method (diagram) • The RefSet of size b is kept sorted by quality (1 = best, b = worst) • A new trial solution that is better than the worst member enters the RefSet, producing the updated RefSet
Static Update (diagram) • New trial solutions are collected in a pool • Updated RefSet = best b solutions from RefSet ∪ Pool (sorted by quality, 1 = best, b = worst)
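A minimal sketch of the static update, assuming a minimization objective `obj` and hashable solutions:

```python
def static_update(refset, pool, obj, b):
    """Static update: the new RefSet is the best b solutions from the
    union of the current RefSet and the pool of trial solutions."""
    merged = sorted(set(refset) | set(pool), key=obj)  # dedupe, sort by quality
    return merged[:b]
```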
Additional Strategies • Reference Set • Rebuilding • Multi-tier • Subset Generation • Subsets of size > 2 • Combination Method • Variable number of solutions
Rebuilding (diagram) • When the search stalls, the Diversification Generation Method creates new solutions and the Reference Set Update Method rebuilds the RefSet: the b1 high-quality solutions are kept and the b2 diverse solutions are replaced
2-Tier RefSet (diagram) • The Solution Combination Method and Improvement Method produce a new solution • Try to insert it into the quality tier b1 first; if it fails, then try the diversity tier b2
3-Tier RefSet (diagram) • As in the 2-tier design: try the quality tier b1 first; if it fails, then try the diversity tier b2 • A third tier b3 admits good departing solutions
Subset Generation • Subset Type 1: all 2-element subsets. • Subset Type 2: 3-element subsets derived from the 2-element subsets by augmenting each 2-element subset to include the best solution not in this subset. • Subset Type 3: 4-element subsets derived from the 3-element subsets by augmenting each 3-element subset to include the best solution not in this subset. • Subset Type 4: the subsets consisting of the best i elements, for i = 5 to b.
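The four subset types can be sketched as follows (illustrative code, assuming a minimization objective; duplicates among the augmented subsets are not filtered in this sketch):

```python
from itertools import combinations

def generate_subsets(refset, obj):
    """Sketch of subset types 1-4 from the slide (best = lowest obj)."""
    ranked = sorted(refset, key=obj)                  # best solution first
    t1 = [set(pair) for pair in combinations(ranked, 2)]          # type 1
    # type 2: add the best solution not already in each 2-element subset
    t2 = [s | {next(x for x in ranked if x not in s)} for s in t1]
    # type 3: add the best solution not already in each 3-element subset
    t3 = [s | {next(x for x in ranked if x not in s)} for s in t2]
    # type 4: the best i elements, for i = 5 to b
    t4 = [set(ranked[:i]) for i in range(5, len(ranked) + 1)]
    return t1 + t2 + t3 + t4
```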
Variable Number of Solutions (diagram) • The number of trial solutions created by the Combination Method depends on the quality of the subset: generate 5 solutions from the best reference solutions, 3 from mid-quality ones, and 1 from the worst (RefSet of size b, sorted by quality)
Hybrid Approaches • Use of Memory • Tabu Search mechanisms for intensification and diversification • GRASP Constructions • Combination Methods • GA Operators • Path Relinking
Multiobjective Scatter Search • This is a fruitful research area • Many multiobjective evolutionary approaches exist (Coello, et al. 2002) • SS can use similar techniques developed for MOEA (multiobjective evolutionary approaches)
Multiobjective EA Techniques • Independent Sampling • Search on f(x) = Σi wi fi(x) • Change weights and rerun • Criterion Selection • Divide reference set into k subsets • Admission to the ith subset is according to fi(x)
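Independent sampling reduces the multiobjective problem to a sequence of single-objective searches over the weighted sum. A sketch of the scalarization; the objective functions here are toy examples, not from the slides:

```python
def weighted_sum(fs, weights):
    """Scalarize multiple objectives: f(x) = sum_i w_i * f_i(x)."""
    return lambda x: sum(w * f(x) for f, w in zip(fs, weights))

# Toy objectives: each weight vector yields one single-objective search;
# changing the weights and rerunning traces out different trade-offs.
f1 = lambda x: x ** 2
f2 = lambda x: (x - 2) ** 2
g = weighted_sum([f1, f2], [0.5, 0.5])
```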
Advanced Designs • Reference Set Update • Dynamic / Static • 2 Tier / 3 Tier • Subset Generation • Use of Memory • Explicit Memory • Attributive Memory • Path Relinking
An Example: The Linear Ordering Problem • Given a matrix of weights E = {eij}m×m, the LOP consists of finding a permutation p of the columns (and rows) in order to maximize the sum of the weights in the upper triangle: maximize CE(p) = Σi<j ep(i)p(j) • Applications • Triangulation for Input-Output Economic Tables • Aggregation of individual preferences • Classifications in Sports
An Instance • A 4×4 weight matrix (shown on the slide) • p = (1,2,3,4): cE(p) = 12+5+3+2+6+9 = 37 • p* = (3,4,1,2): cE(p*) = 9+8+3+11+4+12 = 47
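Both objective values can be reproduced from the matrix entries visible in the sums above. The entries e21 and e43 do not appear in either permutation's upper triangle, so they are set to 0 as placeholders (their true values are not recoverable from the slide):

```python
def lop_value(E, p):
    """Sum of the weights in the upper triangle after ordering rows/columns by p."""
    m = len(p)
    return sum(E[p[i]][p[j]] for i in range(m) for j in range(i + 1, m))

# Entries recovered from the slide's sums (0-indexed); e21 and e43 are
# unknown from the example and set to 0 as placeholders.
E = [[ 0, 12, 5, 3],
     [ 0,  0, 2, 6],
     [ 8,  3, 0, 9],
     [11,  4, 0, 0]]
```

With 0-indexed permutations, (0,1,2,3) and (2,3,0,1) correspond to the slide's p and p*.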
Diversification Generator • Use problem structure to create methods that achieve a good balance between quality and diversity • Quality • Deterministic constructive method • Diversity • Random generator • Systematic generators (Glover, 1998) • GRASP constructions • The method randomly selects from a short list of the most attractive sectors • Use of Memory • Modify a measure of attractiveness proposed by Becker with a frequency-based memory measure that discourages sectors from occupying positions they have frequently occupied
Diversity vs. Quality • Compare the different generators • Create a set of 100 solutions with each one and plot standardized diversity (d) against standardized quality (C)
Improvement Method • INSERT_MOVE(pj, i) consists of deleting pj from its current position j and inserting it in position i • Apply a first-improvement strategy: scan the list of sectors in search of the first sector whose movement results in an improvement • MoveValue = CE(p') - CE(p) • Example: CE(p') = 78 + (1 - 4) + (6 - 0) + (2 - 6) + (13 - 4) = 78 + 8 = 86
Solution Combination Method • The method scans (from left to right) each reference permutation • Each reference permutation votes for its first element that is still not included in the combined permutation (the "incipient element") • The voting determines the next element to enter the first still-unassigned position of the combined permutation • The vote of a given reference solution is weighted according to the incipient element's position • Example: with (3,1,2,_,_) under construction, (3,1,4,2,5) votes for 4, (1,4,3,5,2) votes for 4, and (2,1,3,5,4) votes for 5, so 4 enters next: (3,1,2,4,_)
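A sketch of the voting scheme. The slide does not give the exact weighting rule, so as an assumption a vote here is weighted by m minus the incipient element's position (earlier positions carry more weight), with ties broken by insertion order; the real implementation may weight differently.

```python
def combine_by_voting(references, m):
    """Build a combined permutation of 0..m-1 (or any m labels) by letting
    each reference permutation vote for its incipient element."""
    combined = []
    while len(combined) < m:
        votes = {}
        for ref in references:
            # incipient element: first element of ref not yet in combined
            pos, elem = next((i, e) for i, e in enumerate(ref)
                             if e not in combined)
            # assumed weighting: earlier position -> larger weight
            votes[elem] = votes.get(elem, 0) + (m - pos)
        combined.append(max(votes, key=votes.get))   # winner takes the slot
    return combined
```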
Experiments with LOLIB • 49 Input-Output Economic Tables
Another Example: A commercial SS implementation • OptQuest Callable Library (by OptTek) • Like other context-independent methods, it separates the search method from the solution evaluation
OptQuest-based Applications (diagram) • The Solution Generator (the OptQuest engine) exchanges trial solutions and objective values with a user-written Solution Evaluator
Feasibility and Evaluation (diagram) • The OptQuest engine generates a new solution • The user implementation checks feasibility, evaluates the solution, and returns the result to OptQuest
Comparison with Genocop • Average on 28 hard nonlinear instances
Conclusions • The development of metaheuristics usually entails a fair amount of experimentation (“skill comes from practice”). • Code objectives: • Quick Start • Benchmark • Advanced Designs • Scatter Search provides a flexible “framework” to develop solving methods.
Some Classifications • Tabu Search: A/N/1 (systematic, local search) • Simulated Annealing: M/S/1 (randomized, local search) • Scatter Search: M/N/P (systematic, population) • Genetic Algorithm: M/S/P (randomized, population)
About the Classifications • Our four main methods (SA, TS, GA, SS) all lie far from the center (they are either very randomized or very systematic) • Other methods mix randomized and systematic behaviour • Most implementations mix the ingredients: population-based methods may include an element of local search (e.g., Memetic Algorithms), and systematic approaches may include randomness (such as a random tabu tenure in TS) • The classifications highlight the differences between methods, but there are also many similarities
GA vs. SS (1) • GA has a "long" history: proposed in the 1970s and immediately becoming popular • Not initially used for optimization • Gradually morphed into a methodology whose major concern is the solution of optimization problems • The concepts and principles of SS were also proposed early (in the 1970s), but were not popularized until the 1990s • The SS template most often used is from 1998 • SS was proposed to solve Integer Programming problems
GA vs. SS (2) • GA is based on natural processes (genetics, "survival of the fittest", imitation of nature) • SS is based on strategic ideas for how to use adaptive memory • Some TS concepts are critically linked with SS