Paper Review for ENGG6140: Memetic Algorithms • By: Jin Zeng, Shaun Wang • School of Engineering, University of Guelph • Mar. 18, 2002
Contents • Introduction • MA and GA • Basic MA • Examples • Conclusions
Introduction: History of MA • ‘Meme’: a word introduced by Richard Dawkins when he described cultural evolution in his best-selling book “The Selfish Gene” (‘76). • “Memetic Algorithms”: the meme plays a role analogous to the gene, but in the field of cultural evolution. ‘Memetic Algorithms’ were first proposed by P. Moscato (‘89). • MAs have been widely and successfully applied to optimization and to many NP-hard problems.
Introduction: What is a ‘Meme’? • A meme is the basic unit of cultural transmission, in analogy to the gene in genetic transmission. • A meme is replicated by imitation. • It can be changed by its owner for adaptation. • Examples: ideas, clothing fashions and the NBA. • A high degree of variation occurs in cultural transmission.
Introduction: Cultural Evolution • When a meme is passed between individuals, the receiving individual adapts the meme as it sees best. • Shared characteristics are not inherited through simple recombination of previous solutions; • Historical information and external logic are used to speed up the process.
Introduction: What is an MA? • An MA mimics the process of cultural evolution. • It characterizes evolutionary algorithms that hardly fit the GA metaphor: they have little or no relation with biology. • ‘Hybrid GAs’ are MAs. • ‘Scatter Search’ (Glover, ‘77) is an MA.
Introduction: Why MA? • In general, there are two ways of searching the solution space: • Exploration: investigate new and unknown areas of the search space; • Exploitation: make use of knowledge found so far to help find better solutions. • Both are necessary but conflicting when solving an optimization problem.
Introduction: Why MA? (cont.) • The limitations of earlier algorithms: • GA: uses a parallel search technique. • Good at avoiding local optima. • Not well suited for a finely tuned search. • LS: an improvement heuristic. • Finds local optima quickly. • Highly dependent on the starting point. • Hard to find a global optimum.
Introduction: Why MA? (cont.) • Combination of GA + Local Search → MA • GA: for exploration; • LS: for exploitation; • Result: higher efficiency and better solutions (see the sketch below).
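A minimal sketch of the GA + local-search combination described above, assuming a one-dimensional toy fitness function and a simple hill climb as the local search; the function names and parameter values here are illustrative, not the paper's implementation.

```python
import random

# Illustrative fitness landscape (assumption): maximize f(x) on the integers 0..31.
def fitness(x):
    return -(x - 13) ** 2 + 170

def hill_climb(x, lo=0, hi=31):
    """Exploitation step: move to the best neighbor until no improvement (full local search)."""
    while True:
        neighbors = [n for n in (x - 1, x + 1) if lo <= n <= hi]
        best = max(neighbors, key=fitness)
        if fitness(best) <= fitness(x):
            return x
        x = best

def memetic_algorithm(pop_size=10, generations=30, xover_rate=0.4, mut_rate=0.4):
    """Exploration by GA operators, exploitation by local search on each new individual."""
    pop = [random.randint(0, 31) for _ in range(pop_size)]
    pop = [hill_climb(x) for x in pop]                          # refine the initial population
    for _ in range(generations):
        offspring = []
        for _ in range(int(pop_size * xover_rate)):             # crossover: blend two parents
            a, b = random.sample(pop, 2)
            offspring.append(hill_climb((a + b) // 2))
        for _ in range(int(pop_size * mut_rate)):               # mutation: undirected random jump
            x = random.choice(pop)
            offspring.append(hill_climb(min(31, max(0, x + random.randint(-8, 8)))))
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]   # selection
    return max(pop, key=fitness)

print(memetic_algorithm())
```

Every new individual produced by crossover or mutation is immediately refined by the local search, which is the defining trait of the combination described on this slide.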
Introduction: Combination Methods • Two kinds of combinations:
MA and GA: Similarities • Both MA and GA model an evolutionary process. • Both MA and GA go through generation, recombination (crossover) and mutation; changes occur during this process. • Both MA and GA use a fitness function to evaluate these changes, and thus both are applied successfully in optimization.
Basic MA: Local Search • Full Local Search (FLS) and Partial Local Search (PLS) • Demo of FLS (a generic sketch contrasting the two follows below)
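A minimal sketch of the FLS/PLS distinction, assuming a neighborhood given by ±1 steps on an integer variable; the helper names are illustrative.

```python
def improve_once(x, fitness, neighbors):
    """One pass over the neighborhood: return the best improving neighbor, or x itself."""
    best = max(neighbors(x), key=fitness, default=x)
    return best if fitness(best) > fitness(x) else x

def partial_local_search(x, fitness, neighbors, steps=3):
    """PLS: apply the improvement step only a fixed, small number of times."""
    for _ in range(steps):
        x = improve_once(x, fitness, neighbors)
    return x

def full_local_search(x, fitness, neighbors):
    """FLS: repeat the improvement step until no neighbor is better (a local optimum)."""
    while True:
        nxt = improve_once(x, fitness, neighbors)
        if nxt == x:
            return x
        x = nxt

# Example neighborhood on an integer variable (assumption): one step left or right.
neighbors = lambda x: [x - 1, x + 1]
print(full_local_search(20, lambda x: -(x - 13) ** 2, neighbors))   # -> 13
```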
Basic MA: Demonstration of MA • Example problem: y = f(x); • Parameters of the MA: • Population: 5; • Xover rate: 0.4 (# of xovers: 5 x 0.4 = 2); • Mutation rate: 0.4 (# of mutations: 5 x 0.4 = 2); • Local search: full (a usage example follows below).
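Plugging the slide's parameters into the earlier `memetic_algorithm` sketch might look as follows; the fitness function is still the illustrative stand-in for y = f(x), and the generation count is an assumption.

```python
# Parameters from the slide: population 5, xover rate 0.4 (2 xovers), mutation rate 0.4 (2 mutations), full LS.
best = memetic_algorithm(pop_size=5, generations=20, xover_rate=0.4, mut_rate=0.4)
print("best x:", best, "f(x):", fitness(best))
```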
Basic MA: Effect of Crossover and Mutation • Both can be used to explore the search space by “jumping” to new regions from which to start a new local search; • Crossover: searches the region between two or more specified points; • Mutation: searches an undirected region randomly (see the operator sketch below).
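A sketch of the two "jump" operators on a bit-string encoding (an assumed representation): crossover produces a child "between" two parents, while mutation makes an undirected random move.

```python
import random

def one_point_crossover(a, b):
    """Explore the region 'between' two parents by splicing their bit strings."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def bit_flip_mutation(a, rate=0.1):
    """Undirected exploration: flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in a]

parent1 = [0, 0, 0, 0, 0, 0, 0, 0]
parent2 = [1, 1, 1, 1, 1, 1, 1, 1]
child = one_point_crossover(parent1, parent2)      # e.g. [0, 0, 0, 1, 1, 1, 1, 1]
mutant = bit_flip_mutation(child)                  # the child with a few random flips
```

Either result would then be handed to the local search as the starting point of a new exploitation phase.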
Basic MA: Advantages of MA • Combines the advantages of GA and LS while avoiding the disadvantages of both; • The GA ensures wide exploration of the solution space; • Through local search, the space of possible solutions is reduced to the subspace of local optima. • As the scale of the problem increases, the advantage becomes more pronounced.
Basic MA: Disadvantages of MA • The proportion of computation spent on exploration versus exploitation depends on the actual optimization problem. • It is hard to determine the best depth of local search.
MA Examples: Some Implementation Examples of MA • Quadratic Assignment Problem (QAP) • Traveling Salesman Problem (TSP) • Vehicle Routing • Graph Partitioning • Scheduling • The Knapsack Problem
MA Examples: Applying Local Search to MA in QAP • For any permutation solution being explored, the local search procedure is executed once or a few times –– partial local search (PLS). • The local search procedure is repeated until no further improvement is possible –– full local search (FLS). (A sketch of both appears below.)
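A sketch of the swap-based local search commonly used on QAP permutations, assuming a standard flow/distance cost function; the only difference between the two variants described above is whether the exchange pass runs a fixed number of times (PLS) or until no swap improves the solution (FLS). The names `qap_cost`, `swap_pass`, `partial_ls` and `full_ls` are illustrative.

```python
def qap_cost(perm, flow, dist):
    """Total cost of assigning facility i to location perm[i]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

def swap_pass(perm, flow, dist):
    """One pass of the exchange routine: try every pairwise swap, keep improving ones."""
    best = list(perm)
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            cand = list(best)
            cand[i], cand[j] = cand[j], cand[i]
            if qap_cost(cand, flow, dist) < qap_cost(best, flow, dist):
                best = cand
    return best

def partial_ls(perm, flow, dist, passes=1):
    """PLS: run the exchange routine a fixed number of times."""
    for _ in range(passes):
        perm = swap_pass(perm, flow, dist)
    return perm

def full_ls(perm, flow, dist):
    """FLS: repeat until no pairwise swap improves the solution."""
    while True:
        improved = swap_pass(perm, flow, dist)
        if improved == perm:
            return perm
        perm = improved

# Tiny illustrative instance (assumption): 3 facilities, 3 locations.
flow = [[0, 2, 3], [2, 0, 1], [3, 1, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
print(full_ls([0, 1, 2], flow, dist))
```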
MA Examples: Two Derived MAs for QAP • PGA –– starts with an initial population of randomly generated individuals. For each individual, a PLS is performed after xover and mutation. • FGA –– relies on FLS; full local search is carried out on all individuals at the beginning and at the end of an SGA run.
MA Examples: Brief Steps of the PGA • The steps for the PGA are the same as for the basic MA. • The local search procedure is executed only once or a few times after each xover and mutation.
MA Examples: Brief Steps of the FGA • 1. Randomly generate an initial population and perform FLS on each individual. • 2. While the terminating criterion is not reached, continue with the procedures as spelled out for the SGA. • 3. Perform FLS on the best solution and output the final solution. (A driver sketch for both variants follows below.)
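A sketch of how the two drivers differ, reusing `qap_cost`, `partial_ls` and `full_ls` from the earlier QAP sketch; `sga_generation` is a deliberately minimal stand-in (tournament selection plus a swap mutation), not the paper's SGA, and all names are illustrative.

```python
import random

def sga_generation(pop, flow, dist):
    """A minimal SGA step (assumption): tournament selection plus a swap mutation."""
    new_pop = []
    for _ in range(len(pop)):
        a, b = random.sample(pop, 2)
        child = list(min(a, b, key=lambda p: qap_cost(p, flow, dist)))
        i, j = random.sample(range(len(child)), 2)
        child[i], child[j] = child[j], child[i]                 # mutate by swapping two positions
        new_pop.append(child)
    return new_pop

def run_fga(init_pop, generations, flow, dist):
    """FGA: FLS on every individual at the start, plain SGA in the middle, FLS on the best at the end."""
    pop = [full_ls(p, flow, dist) for p in init_pop]            # step 1: FLS on each individual
    for _ in range(generations):                                # step 2: ordinary SGA generations
        pop = sga_generation(pop, flow, dist)
    best = min(pop, key=lambda p: qap_cost(p, flow, dist))
    return full_ls(best, flow, dist)                            # step 3: FLS on the best solution

def run_pga(init_pop, generations, flow, dist):
    """PGA: a short partial local search after every round of xover and mutation."""
    pop = list(init_pop)
    for _ in range(generations):
        pop = sga_generation(pop, flow, dist)
        pop = [partial_ls(p, flow, dist, passes=1) for p in pop]
    return min(pop, key=lambda p: qap_cost(p, flow, dist))
```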
MA Examples: Comparison of FGA and PGA • The effectiveness of FLS depends on the starting solution and the exchange routine. • Because PLS can be carried out more frequently, the algorithm is able to spread out the search by exploring many small, localized regions, reducing the likelihood of being trapped in a local optimum.
MA Examples: Comparison of FGA and PGA (cont.) • As the size of the problem scales up, it becomes difficult to carry out FLS freely because of its computational intensity. • Because PLS is carried out for almost all individuals in addition to the SGA's evolutionary mechanisms, the SGA's capability to evolve towards fitter individuals is greatly enhanced.
MA Examples: Comparison of FGA and PGA (cont.) • Because FLS limits the exploratory capability of the SGA, it reduces the chance of the FGA reaching the global optimum. • The PGA therefore has a greater chance of obtaining the global optimum than the FGA.
MA Examples: Comparison of a typical run on problem Els19 for the SGA, PGA and FGA
Conclusions • MA provides a more efficient and more robust approach to optimization problems. • MA combines global and local search, using an EA for exploration while a local search method performs exploitation. • MA can solve some typical optimization problems where other meta-heuristics have failed.