
Paper Review for ENGG6140 Memetic Algorithms








  1. Paper Review for ENGG6140: Memetic Algorithms By: Jin Zeng, Shaun Wang School of Engineering University of Guelph Mar. 18, 2002

  2. Contents • Introduction • MA and GA • Basic MA • Examples • Conclusions

  3. Introduction: History of MA • ‘Meme’: a word introduced by Richard Dawkins to describe the unit of cultural evolution in his best-selling book “The Selfish Gene” (’76). • “Memetic Algorithms”: the meme plays a role analogous to the gene, but in the field of cultural evolution. The term was first proposed by P. Moscato (’89). • MAs have been widely applied in optimization and have successfully solved many NP-hard problems.

  4. Introduction: What is a ‘Meme’? • A meme is the basic unit of cultural transmission, analogous to the gene in genetic transmission. • A meme is replicated by imitation. • It can be changed by its owner for adaptation. • Examples: ideas, clothing fashions, and the NBA. • A high degree of variation occurs in cultural transmission.

  5. Introduction: Cultural Evolution • When a meme is passed between individuals, each individual adapts the meme as it sees fit. • Shared characteristics are not inherited through simple recombination of previous solutions. • Historical information and external logic are used to speed up the process.

  6. Introduction: What is MA? • MA mimics the process of cultural evolution. • It characterizes evolutionary algorithms that hardly fit the GA metaphor: they have no, or only a small, relation to biology. • ‘Hybrid GAs’ → MAs • ‘Scatter Search’ (Glover, ’77) → MAs

  7. Introduction: Why MA? • In general, there are two ways of searching the solution space: • Exploration: investigate new and unknown areas of the search space; • Exploitation: make use of knowledge found so far to help find better solutions. • Both are necessary, but they are contradictory in solving an optimization problem.

  8. Introduction: Why MA? (cont.) • The limitations of earlier algorithms: • GA: uses a parallel search technique. • Good at avoiding local optima. • Not well suited for a finely tuned search. • LS: improvement heuristics. • Finds local optima quickly. • Highly dependent on the starting point. • Hard to find a global optimum.

  9. Introduction: Why MA? (cont.) Combination of GA + Local Search → MA • GA: for exploration; • LS: for exploitation; • Result: higher efficiency and better results.

  10. Introduction: Combination Methods • Two kinds of combinations:

  11. MA and GA: Similarities • Both MA and GA model an evolutionary process. • Both MA and GA have the processes of generation, recombination (crossover), and mutation; some changes occur in each process. • Both MA and GA use a fitness function to evaluate those changes; thus both have been successfully applied to optimization.

  12. MA and GA: Differences

  13. Basic MA: Flow Chart of the Process

  14. Basic MA: Pseudo Code of MA
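The pseudo code shown on this slide is an image; a minimal runnable sketch of the same loop (generation, crossover, mutation, local search, selection) is given below. The objective f, the 5-bit encoding, and all parameter values are illustrative assumptions, not the authors' code.

```python
import random

def f(x):
    # Illustrative objective (assumption): a unimodal function with maximum at x = 21.
    return -(x - 21) ** 2 + 100

def decode(bits):
    # Interpret a list of 0/1 bits as an integer.
    return int("".join(map(str, bits)), 2)

def local_search(bits):
    """Full local search: flip single bits until no flip strictly improves f."""
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            trial = bits[:]
            trial[i] ^= 1
            if f(decode(trial)) > f(decode(bits)):
                bits, improved = trial, True
    return bits

def memetic_algorithm(pop_size=5, bit_len=5, xover_rate=0.4,
                      mut_rate=0.4, generations=20):
    pop = [[random.randint(0, 1) for _ in range(bit_len)]
           for _ in range(pop_size)]
    pop = [local_search(ind) for ind in pop]      # refine the initial population
    for _ in range(generations):
        # Crossover: recombine two parents at a random cut point (exploration).
        for _ in range(int(pop_size * xover_rate)):
            a, b = random.sample(pop, 2)
            cut = random.randint(1, bit_len - 1)
            child = a[:cut] + b[cut:]
            pop.append(local_search(child))        # exploit after exploring
        # Mutation: flip one random bit of a random individual.
        for _ in range(int(pop_size * mut_rate)):
            m = random.choice(pop)[:]
            m[random.randrange(bit_len)] ^= 1
            pop.append(local_search(m))
        # Selection: keep the fittest pop_size individuals.
        pop.sort(key=lambda ind: f(decode(ind)), reverse=True)
        pop = pop[:pop_size]
    return decode(pop[0])
```

Because every individual entering the population is refined by local search, the returned solution is always a local optimum of f under single-bit flips.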

  15. Basic MA: Generation

  16. Basic MA: Crossover

  17. Basic MA: Mutation

  18. Basic MA: Local Search • Full Local Search and Partial Local Search • Demo of FLS
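The distinction between full and partial local search can be sketched as follows. The bit-flip neighbourhood and the OneMax-style objective are illustrative assumptions; the slide's own demo (an image) is not reproduced here.

```python
def fitness(bits):
    # Illustrative objective (assumption): number of ones in the string.
    return sum(bits)

def one_pass(bits):
    """One sweep of first-improvement single-bit flips."""
    for i in range(len(bits)):
        trial = bits[:]
        trial[i] ^= 1
        if fitness(trial) > fitness(bits):
            bits = trial
    return bits

def partial_local_search(bits, passes=1):
    """PLS: apply the improvement procedure a fixed number of times."""
    for _ in range(passes):
        bits = one_pass(bits)
    return bits

def full_local_search(bits):
    """FLS: repeat sweeps until no single flip yields any improvement."""
    while True:
        improved = one_pass(bits)
        if improved == bits:
            return bits
        bits = improved
```

PLS bounds the work per individual; FLS guarantees the result is a local optimum but may cost many sweeps.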

  19. Basic MA: Demonstration of MA • Example problem: y = f(x); • Parameters of the MA: • Population: 5; • Xover rate: 0.4 (# of Xovers: 5 × 0.4 = 2); • Mutation rate: 0.4 (# of Mutations: 5 × 0.4 = 2); • Local search: full

  20. Basic MA: Demonstration of MA (Continued)

  21. Basic MA: Demonstration of MA (Continued)

  22. Basic MA: Effect of Crossover and Mutation • Both can be used to explore the search space by “jumping” to new regions to start a new local search; • Crossover: searches the region between two or more specified points; • Mutation: searches an undirected region randomly.
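The two kinds of "jump" above can be made concrete on bit strings. This is an illustrative sketch (the 8-bit encoding and operator choices are assumptions): one-point crossover preserves every bit on which the parents agree, so the child lies "between" them, while mutation makes an undirected random change.

```python
import random

def crossover(a, b):
    """One-point crossover: child takes a prefix of a and a suffix of b,
    so any bit on which a and b agree is preserved in the child."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(ind):
    """Undirected jump: flip one randomly chosen bit of a copy."""
    ind = ind[:]
    ind[random.randrange(len(ind))] ^= 1
    return ind

a = [1, 0, 1, 1, 0, 0, 1, 0]
b = [1, 0, 0, 1, 1, 0, 1, 1]
child = crossover(a, b)
# Wherever the parents agree, the child agrees too: crossover searches
# only the region spanned by the disagreeing bits.
assert all(c == x for c, x, y in zip(child, a, b) if x == y)
```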

  23. Basic MA: Advantages of MA • Combines the advantages of GA and LS while avoiding the disadvantages of both; • GA ensures wide exploration of the solution space; • Through local search, the space of possible solutions can be reduced to the subspace of local optima; • As the scale of the problem increases, these advantages become remarkable.

  24. Basic MA: Disadvantages of MA • The proportion of computation spent on exploration versus exploitation depends on the actual optimization problem. • It is hard to determine the best depth of local search.

  25. MA Examples: Some Implementation Examples of MA • Quadratic Assignment Problem (QAP) • Traveling Salesman Problem (TSP) • Vehicle Routing • Graph Partitioning • Scheduling • The Knapsack Problem

  26. MA Examples: Applying Local Search to MA in QAP • For any permutation solution being explored, the local search procedure is executed once or a few times –– partial local search (PLS). • The local search procedure is repeated until no further improvement is possible –– full local search (FLS).
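For the QAP, the usual exchange routine swaps the locations of two facilities. A minimal sketch of PLS and FLS over that pairwise-exchange neighbourhood follows; the 3×3 flow and distance matrices are made-up illustrative data, not a benchmark instance.

```python
# QAP: solution p is a permutation assigning facility i to location p[i];
# cost is the sum of flow[i][j] * dist[p[i]][p[j]] over all pairs.
flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]  # illustrative data (assumption)
dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]

def cost(p):
    n = len(p)
    return sum(flow[i][j] * dist[p[i]][p[j]]
               for i in range(n) for j in range(n))

def pls(p, sweeps=1):
    """PLS: a fixed number of sweeps over all pairwise exchanges,
    accepting each strictly improving swap."""
    p = p[:]
    for _ in range(sweeps):
        for i in range(len(p)):
            for j in range(i + 1, len(p)):
                q = p[:]
                q[i], q[j] = q[j], q[i]
                if cost(q) < cost(p):
                    p = q
    return p

def fls(p):
    """FLS: repeat sweeps until no pairwise exchange improves the cost."""
    while True:
        q = pls(p, sweeps=1)
        if cost(q) >= cost(p):
            return p
        p = q
```

`fls` returns a permutation that no single pairwise exchange can improve; `pls` caps the work per call at a fixed number of sweeps.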

  27. MA Examples: Two Different MAs Derived for QAP • PGA –– starts with an initial population of randomly generated individuals; for each individual, after xover and mutation, a PLS is performed. • FGA –– relies on FLS; full local searches are carried out on all individuals at the beginning and at the end of an SGA run.

  28. MA Examples: Steps Involved in the PGA • The steps for the PGA are the same as for the basic MA. • The local search procedure is executed only once or a few times after each xover and mutation.

  29. MA Examples: Steps Involved in the FGA • 1. Randomly generate an initial population; perform FLS on each individual. • 2. While the terminating criterion is not reached, continue with the procedures as spelled out for the SGA. • 3. Perform FLS on the best solution and output the final solution.

  30. MA Examples: Comparison of FGA and PGA • The effectiveness of FLS depends on the starting solution and the exchange routine. • PLS can be carried out more frequently; the algorithm is therefore able to spread out the search by exploring many small localized regions, thus reducing the likelihood of being trapped in a local optimum.

  31. MA Examples: Comparison of FGA and PGA (cont.) • As the size of the problem scales up, it is difficult to carry out FLS freely due to its great computational intensity. • Because PLS is carried out for almost all individuals in addition to the SGA’s evolutionary mechanisms, the capability of the SGA to evolve towards fitter individuals is greatly enhanced.

  32. MA Examples: Comparison of FGA and PGA (cont.) • FLS limits the exploratory capability of the SGA, which reduces the chance of the FGA reaching the global optimum. • PGA therefore has a greater chance of obtaining the global optimum than FGA.

  33. MA Examples: Comparison of a typical run on problem Els19 for SGA, PGA, and FGA

  34. Conclusions • MA provides a more efficient and more robust approach to optimization problems. • MA combines global and local search, using an EA for exploration while a local search method performs exploitation. • MA can solve some typical optimization problems where other meta-heuristics have failed.

  35. Thank you!
