
Automated real-world problem solving EMPLOYING meta-heuristical black-box optimization





Presentation Transcript


  1. Automated real-world problem solving EMPLOYING meta-heuristical black-box optimization Daniel R. Tauritz, Ph.D. Guest Scientist, Los Alamos National Laboratory Director, Natural Computation Laboratory Department of Computer Science Missouri University of Science and Technology

  2. Black-Box Search Algorithms • Many complex real-world problems can be formulated as generate-and-test problems • Black-Box Search Algorithms (BBSAs) iteratively generate trial solutions employing solely the information gained from previous trial solutions, but no explicit problem knowledge
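To make the generate-and-test idea concrete, here is a minimal Python sketch (illustrative, not code from the talk): the `evaluate` callback is the algorithm's only window into the problem, and trial solutions are generated solely from feedback on previous trials.

```python
import random

def black_box_search(evaluate, dim, iters=1000, seed=0):
    """Minimal generate-and-test black-box search: perturb the best
    solution seen so far and keep the trial only if it tests better
    (minimization). No explicit problem knowledge is used."""
    rng = random.Random(seed)
    best = [rng.uniform(-5, 5) for _ in range(dim)]
    best_fit = evaluate(best)
    for _ in range(iters):
        trial = [x + rng.gauss(0, 0.1) for x in best]  # generate
        fit = evaluate(trial)                          # test
        if fit < best_fit:
            best, best_fit = trial, fit
    return best, best_fit

# Example: minimize the sphere function, treated as a black box
best, fit = black_box_search(lambda x: sum(v * v for v in x), dim=3)
```

Any BBSA, from hill climbers to evolutionary algorithms, instantiates this same generate-and-test contract with different generation and retention rules.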

  3. Practitioner’s Dilemma • How to decide, for a given real-world problem, whether it is beneficial to formulate it as a black-box search problem? • How to formulate a real-world problem as a black-box search problem? • How to select/create a BBSA? • How to configure the BBSA? • How to interpret the result? • All of the above are interdependent!

  4. Theory-Practice Gap • While BBSAs, including EAs, are steadily improving in scope and performance, their impact on routine real-world problem solving remains underwhelming • A scalable solution enabling domain-expert practitioners to routinely solve real-world problems with BBSAs is needed

  5. Two typical real-world problem categories • Solving a single-instance problem: automated BBSA selection • Repeatedly solving instances of a problem class: evolve custom BBSA

  6. Part I: Solving Single-Instance Problems Employing Automated BBSA Selection

  7. Requirements • Need diverse set of high-performance BBSAs • Need automated approach to select most appropriate BBSA from set for a given problem • Need automated approach to configure selected BBSA

  8. Automated BBSA Selection • Given a set of BBSAs, a priori evolve a set of benchmark functions which cluster the BBSAs by performance • Given a real-world problem, create a surrogate fitness function • Find the benchmark function most similar to the surrogate • Execute the corresponding BBSA on the real-world problem
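The matching step above can be sketched as follows; the sum-of-squared-differences similarity metric, the `benchmarks` data layout, and the function names are assumptions for illustration, not the talk's actual method.

```python
def select_bbsa(surrogate, benchmarks, sample_points):
    """Pick the BBSA paired with the benchmark function most similar
    to the surrogate. `benchmarks` maps a name to a pair
    (benchmark_function, bbsa); similarity here is measured as the
    sum of squared differences at the given sample points."""
    def distance(f, g):
        return sum((f(p) - g(p)) ** 2 for p in sample_points)
    best = min(benchmarks, key=lambda name: distance(surrogate, benchmarks[name][0]))
    return benchmarks[best][1]  # the BBSA clustered with that benchmark

# Toy example: a near-quadratic surrogate matches the quadratic benchmark
benchmarks = {
    "BP1": (lambda x: x * x, "BBSA1"),
    "BP2": (lambda x: abs(x), "BBSA2"),
}
chosen = select_bbsa(lambda x: x * x + 0.01, benchmarks, [-2, -1, 0, 1, 2])
# chosen is "BBSA1"
```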

  9. A Priori, Once Per BBSA Set [Diagram: a Benchmark Generator takes BBSA1, BBSA2, …, BBSAn and evolves matching benchmark problems BP1, BP2, …, BPn, one per BBSA]

  10. A Priori, Once Per Problem Class [Diagram: Real-World Problem → Sampling Mechanism → Surrogate Objective Function → match with the most “similar” BPk → identify the most appropriate BBSAk]

  11. Per Problem Instance [Diagram: Real-World Problem → Sampling Mechanism → Surrogate Objective Function → apply the a priori established most appropriate BBSAk]

  12. Requirements • Need diverse set of high-performance BBSAs • Need automated approach to select most appropriate BBSA from set for a given problem • Need automated approach to configure selected BBSA

  13. Static vs. dynamic parameters • Static parameters remain constant during evolution, dynamic parameters can change • Dynamic parameters require parameter control • The optimal values of the strategy parameters can change during evolution [1]
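A classic example of parameter control is Rechenberg's 1/5 success rule, shown here only to illustrate what a dynamic strategy parameter looks like, not as the specific mechanism referenced in [1].

```python
def one_fifth_rule(sigma, success_rate, c=0.9):
    """Rechenberg's 1/5 success rule: a dynamic-parameter control.
    Grow the mutation step size when more than 1/5 of recent mutations
    improved fitness, shrink it when fewer did."""
    if success_rate > 0.2:
        return sigma / c  # search is easy here: take bigger steps
    if success_rate < 0.2:
        return sigma * c  # search is hard here: take smaller steps
    return sigma
```

The point of the slide is exactly this: because the optimal step size changes as the population converges, no static setting can be right for the whole run.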

  14. Solution Create self-configuring BBSAs employing dynamic strategy parameters

  15. Supportive Coevolution [3]

  16. Self Adaptation • Dynamic Evolution of Parameters • Individuals Encode Parameter Values • Indirect Fitness • Poor Scaling

  17. Coevolution Basics Def.: In Coevolution, the fitness of an individual is dependent on other individuals (i.e., individuals are part of the environment) Def.: In Competitive Coevolution, the fitness of an individual is inversely proportional to the fitness of some or all other individuals

  18. Supportive Coevolution • Primary Population • Encodes problem solution • Support Population • Encodes configuration information Support Individual Primary Individual
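A minimal sketch of one supportive-coevolution generation, under the assumption that the support population encodes mutation step sizes and receives credit for the fitness improvement it produces; the credit scheme and all names are illustrative.

```python
import random

def suco_step(primary, support, evaluate, rng):
    """One simplified supportive-coevolution generation (minimization).
    `primary` is a list of (solution, fitness) pairs; `support` is a list
    of mutation step sizes. Each primary individual is mutated using a
    step size drawn from the support population, and that support
    individual is credited with any improvement it caused."""
    new_primary = []
    support_credit = [0.0] * len(support)
    for sol, fit in primary:
        k = rng.randrange(len(support))        # pair with a support individual
        sigma = support[k]                     # its encoded configuration
        child = [x + rng.gauss(0, sigma) for x in sol]
        child_fit = evaluate(child)
        support_credit[k] += max(0.0, fit - child_fit)  # indirect fitness signal
        new_primary.append(min((sol, fit), (child, child_fit), key=lambda p: p[1]))
    return new_primary, support_credit

# Toy run: one primary solution, two candidate step sizes
rng = random.Random(1)
primary = [([1.0, 1.0], 2.0)]
new_primary, credit = suco_step(primary, [0.1, 0.5], lambda s: sum(v * v for v in s), rng)
```

Evolving the support population on this credit signal is what distinguishes SuCo from self-adaptation, where the parameter rides along inside the solution's own genome.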

  19. Multiple Support Populations • Each Parameter Evolved Separately • Propagation Rate Separated

  20. Multiple Primary Populations

  21. Benchmark Functions • Rastrigin • Shifted Rastrigin • Randomly generated offset vector • Individuals offset before evaluation

  22. Fitness Results • All Results Statistically Significant according to t-test with α = 0.001

  23. Conclusions • Proof of Concept Successful • SuCo outperformed SA on all but one test problem • SuCo Mutation more successful • Adapts better to change in population fitness

  24. Self-Configuring Crossover [2]

  25. Motivation • Performance Sensitive to Crossover Selection • Identifying & Configuring Best Traditional Crossover is Time Consuming • Existing Operators May Be Suboptimal • Optimal Operator May Change During Evolution

  26. Some Possible Solutions • Meta-EA • Exceptionally time consuming • Self-Adaptive Algorithm Selection • Limited by algorithms it can choose from

  27. Self-Configuring Crossover (SCX) • Each Individual Encodes a Crossover Operator • Crossovers Encoded as a List of Primitives • Swap • Merge • Each Primitive has three parameters • Number, Random, or Inline • Example: Swap(3, 5, 2), Merge(1, r, 0.7), Swap(r, i, r)

  28. Applying an SCX • Concatenate Genes: Parent 1 genes (1.0, 2.0, 3.0, 4.0) followed by Parent 2 genes (5.0, 6.0, 7.0, 8.0)

  29. The Swap Primitive • Each Primitive has a type • Swap represents crossovers that move genetic material • First Two Parameters • Start Position • End Position • Third Parameter Primitive Dependent • Swaps use “Width” Swap(3, 5, 2)

  30. Applying an SCX • Swap(3, 5, 2) exchanges the width-2 blocks starting at positions 3 and 5, so genes 3.0, 4.0 trade places with 5.0, 6.0 in the concatenated list; Merge(1, r, 0.7) and Swap(r, i, r) remain to be applied

  31. The Merge Primitive • Third Parameter Primitive Dependent • Merges use “Weight” • Random Construct • All past primitive parameters used the Number construct • “r” marks a primitive using the Random Construct • Allows primitives to act stochastically Merge(1, r, 0.7)

  32. Applying an SCX • Merge(1, r, 0.7) blends gene 1 with a randomly chosen gene (here the one holding 6.0) using g(i) = α·g(i) + (1−α)·g(j): g(1) = 1.0·0.7 + 6.0·(1−0.7) = 2.5 and g(2) = 6.0·0.7 + 1.0·(1−0.7) = 4.5

  33. The Inline Construct • Only Usable by First Two Parameters • Denoted as “i” • Forces Primitive to Act on the Same Loci in Both Parents Swap(r, i, r)

  34. Applying an SCX • Swap(r, i, r) uses the Inline construct, acting on the same loci in both halves of the concatenated list (here the genes 2.0 and 4.0)

  35. Applying an SCX • Remove Excess Genes: truncate the concatenated list (2.5, 3.0, 4.0, 2.0, 5.0, 7.0, 4.5, 8.0) back to one parent’s length, yielding the Offspring Genes
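The two primitive types in the walkthrough above can be sketched in a few lines of Python; the 0-indexed positions and the symmetric update of both loci in Merge are assumptions consistent with the slides, not the authors' exact implementation.

```python
def apply_swap(genes, start, end, width):
    """Swap primitive: exchange two blocks of `width` genes
    (0-indexed sketch of Swap(start, end, width))."""
    for off in range(width):
        genes[start + off], genes[end + off] = genes[end + off], genes[start + off]

def apply_merge(genes, i, j, alpha):
    """Merge primitive: weighted blend of two loci, as on the slide:
    g(i) = alpha*g(i) + (1-alpha)*g(j), and symmetrically for g(j)."""
    gi, gj = genes[i], genes[j]
    genes[i] = alpha * gi + (1 - alpha) * gj
    genes[j] = alpha * gj + (1 - alpha) * gi

# Reproducing the slide's merge arithmetic: genes 1.0 and 6.0, weight 0.7
genes = [1.0, 6.0]
apply_merge(genes, 0, 1, 0.7)
# genes ≈ [2.5, 4.5]

# Swap(3, 5, 2) on the concatenated parents, in 0-indexed form
genes2 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
apply_swap(genes2, 2, 4, 2)
# blocks (3.0, 4.0) and (5.0, 6.0) have traded places
```

Because both primitives are constant-time list operations, this also illustrates why SCX adds no significant run-time overhead.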

  36. Evolving Crossovers Parent 2 Crossover Parent 1 Crossover Offspring Crossover Swap(r, 7, 3) Swap(3, 5, 2) Merge(i, 8, r) Swap(4, 2, r) Merge(1, r, 0.7) Merge(r, r, r) Swap(r, i, r)

  37. Empirical Quality Assessment • Compared Against • Arithmetic Crossover • N-Point Crossover • Uniform Crossover • On Problems • Rosenbrock • Rastrigin • Offset Rastrigin • NK-Landscapes • DTrap

  38. SCX Overhead • Requires No Additional Evaluation • Adds No Significant Increase in Run Time • All linear operations • Adds Initial Crossover Length Parameter • Testing showed results fairly insensitive to this parameter • Even worst settings tested achieved better results than comparison operators

  39. Conclusions • Remove Need to Select Crossover Algorithm • Better Fitness Without Significant Overhead • Benefits From Dynamically Changing Operator

  40. Current Work • Combining Support Coevolution and Self-Configuring Crossover by employing the latter as a support population in the former [4]

  41. Part II: Repeatedly Solving Problem Class Instances By Evolving Custom BBSAs Employing Meta-GP [5]

  42. Problem • Difficult Repeated Problems • Cell Tower Placement • Flight Routing • Which BBSA, such as Evolutionary Algorithms, Particle Swarm Optimization, or Simulated Annealing, to use?

  43. Possible Solutions • Automated Algorithm Selection • Self-Adaptive Algorithms • Meta-Algorithms • Evolving Parameters / Selecting Operators • Evolve the Algorithm

  44. Our Solution • Evolving BBSAs employing meta-Genetic Programming (GP) • Post-ordered parse tree • Evolve a repeated iteration

  45. [Flowchart: Initialization → Check for Termination → Iteration (loops back to the termination check) → Terminate]

  46. Our Solution • Genetic Programming • Post-ordered parse tree • Evolve a repeated iteration • High-level operations

  47. Parse Tree • Iteration • Sets of Solutions • Root returns ‘Last’ set

  48. Nodes: Variation Nodes • Mutate • Mutate w/o creating new solution (Tweak) • Uniform Recombination • Diagonal Recombination

  49. Nodes: Selection Nodes • k-Tournament • Truncation

  50. Nodes: Set Nodes • makeSet • Persistent Sets
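Putting the template and node types together, a hand-written stand-in for an evolved iteration might look like the sketch below; the specific node set, parameters, and names are illustrative, since in the actual system the iteration body is a parse tree evolved by meta-GP.

```python
import random

def mutate_node(pop, rng, sigma=0.2):
    """Variation node (Mutate): Gaussian perturbation creating new solutions."""
    return [[x + rng.gauss(0, sigma) for x in sol] for sol in pop]

def truncation_node(pop, evaluate, n):
    """Selection node (Truncation): keep the n best solutions (minimization)."""
    return sorted(pop, key=evaluate)[:n]

def example_iteration(pop, evaluate, rng):
    """A fixed stand-in for an evolved parse tree: mutate the current set,
    union it with the parents (a makeSet-style set node), then truncate."""
    return truncation_node(pop + mutate_node(pop, rng), evaluate, len(pop))

def run_evolved_bbsa(iteration, evaluate, pop_size=20, iters=100, dim=3, seed=0):
    """Template from the flowchart: initialize a set of solutions, repeatedly
    apply the evolved iteration until termination, and return the best
    member of the 'Last' set produced by the root."""
    rng = random.Random(seed)
    last = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):                     # check-for-termination loop
        last = iteration(last, evaluate, rng)  # one pass through the parse tree
    return min(last, key=evaluate)

best_sol = run_evolved_bbsa(example_iteration, lambda s: sum(v * v for v in s))
```

Meta-GP searches over trees built from such nodes, so swapping `example_iteration` for an evolved tree yields a custom BBSA tailored to the problem class.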
