Seminar at the University of Zaragoza, June 28 - 30, 1999
Using Probabilistic Search Methods for Model Optimization
Michael Syrjakow
Institute for Computer Design and Fault Tolerance (Prof. D. Schmid)
University of Karlsruhe, 76128 Karlsruhe, P.O. Box 6980, Germany
syrjakow@ira.uka.de
Outline
Monday
• Introduction to parameter optimization of simulation models
• Genetic Algorithms
Tuesday
• Simulated Annealing
• Pattern Search
• Hybrid optimization strategies I
Wednesday
• Hybrid optimization strategies II
• REMO (REsearch Model Optimization Package)
• Using Java for Simulation
Introduction to Parameter Optimization of Simulation Models
Outline
• Motivation and objectives
• Specification of the model function
• The procedure of model optimization
• Real parameter and combinatorial optimization
• Constraints and basic problems of model optimization
• Properties of the model function
• Requirements for the applied optimization strategies
• Classification of optimization methods
• Essential aspects of local and global optimization
Motivation and Objectives
• Primary goal of system design and system tuning: "optimal" systems
• Alternatives for system optimization:
  • experimental optimization of the real system (system → system*)
  • model optimization: abstraction of the system into a model, optimization of the model (system model → system model*), and interpretation of the results back for the real system
• Required: qualified methods for global optimization of simulation models
Involved Domains
• Optimal systems arise at the intersection of three domains: system design, modeling, and optimization
Specification of the Model Function
• Model building: the system and its environment (system/workload parameters) are abstracted into a model
• System behavior of interest: performance, reliability, resource consumption
• The abstract model maps the input parameters $x_1, x_2, \dots, x_n$ to the outputs $a_j = f_{M_j}(x_1, x_2, \dots, x_n)$, $j = 1, \dots, m$ (the goal function values)
• Model function: $f_M: D \subseteq \mathbb{R}^n \rightarrow \mathbb{R}^m$
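To make the black-box situation concrete, a minimal Python sketch of a model function $f_M$ follows; the parameter names and the queueing-style formula standing in for a simulation run are purely illustrative assumptions, not the seminar's model:

import random

def model_function(x, replications=10):
    # Black-box model function f_M: maps an input parameter vector x
    # to one goal function value, estimated from repeated runs.
    arrival_rate, service_rate = x
    samples = []
    for _ in range(replications):
        # stand-in for one expensive simulation run (illustrative formula)
        utilization = arrival_rate / service_rate
        noise = random.gauss(0.0, 0.05)
        samples.append(utilization / (1.0 - utilization) + noise)
    # averaging over replications damps the stochastic inaccuracies
    return sum(samples) / len(samples)

print(model_function((0.5, 1.0)))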
The Procedure of Model Optimization
• Direct optimization: the optimization strategy supplies the input parameters $x_1, x_2, \dots, x_n$ to the simulation model and observes only its output
• Required: a goal function that condenses the model output into a quality measure for the search
Real Parameter Optimization
• Let $F$ be the goal function and $L \subseteq \mathbb{R}^n$ the search space defined by the parameter restrictions
• Required: the global optimum point, i.e. a parameter vector $\vec{x}^* \in L$ with $F(\vec{x}^*) \leq F(\vec{x})$ for all $\vec{x} \in L$ (minimization; maximization analogously)
• Locating the global optimum is NP-hard in general
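As a baseline, here is a minimal sketch of pure random search (the Monte Carlo method listed later among the global strategies) over a box-constrained search space $L$; the bounds and the multimodal toy goal function are assumptions:

import math
import random

BOUNDS = [(-10.0, 10.0), (-10.0, 10.0)]   # hypothetical parameter restrictions

def goal(x):
    # toy multimodal goal function (not the seminar's model function)
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

best_x, best_f = None, float("inf")
for _ in range(10000):
    x = [random.uniform(lo, hi) for lo, hi in BOUNDS]   # sample a point in L
    f = goal(x)
    if f < best_f:
        best_x, best_f = x, f
print(best_x, best_f)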
Constraints and Basic Problems of Model Optimization • no additional analytical information available (black-box situation) • expensive goal function evaluation through simulation • stochastic inaccuracies • high-dimensional search space with complex parameter restrictions • multimodal goal function with many local and/or global optimum points
Properties of the Model Function
• Analytical model: additional analytical information available, high accuracy
• Simulation model: only goal function values available, stochastic inaccuracies
Parameter Optimization of Simulation Models
Requirements for the applied optimization strategies
• no usage of additional analytical information
• qualification for global search
• high convergence speed
• high approximation accuracy
• high efficiency
• robustness against stochastic inaccuracies
• suitability for many kinds of goal functions
• suitability for large numbers of model input parameters
• easy handling
• low implementation effort
Classification of Optimization Methods
• Methods with guaranteed success
  • complete enumeration
  • analytical methods (indirect)
• Methods without guaranteed success: heuristics
  • iterative with additional analytical information (hybrid)
  • iterative without additional analytical information (direct)
    • deterministic, e.g. greedy algorithms and Hill-Climbing (HC), typically local
    • quasi-deterministic and probabilistic, e.g. GA, ES, SA, typically global
Combinatorial Optimization
• Optimization problem $(L, F)$: a finite search space $L$ and a goal function $F: L \rightarrow \mathbb{R}$ that quantifies the quality of a solution, e.g. mapping the solutions $A, B, C, D, E \in L$ to the values $F(A), \dots, F(E)$
• Required: a problem solution $i^* \in L$, the global optimum (a globally optimal solution)
Direct Local Optimization
• Let
  • $(L, F)$ be a combinatorial optimization problem
  • $N: L \rightarrow 2^L$ be a neighborhood structure
  • $i_{start}$ be a starting solution
• Required
  • a problem solution $\hat{i} \in L$ with $F(j) \circ F(\hat{i})$ for all $j \in N(\hat{i})$, $\circ \in \{\leq, \geq\}$ (a local optimum)
• Main problems
  • definition of the neighborhood structure
  • choice of the starting solution
• Basic algorithm (a runnable sketch follows below):

procedure local_minimization;
begin
  choose_a_starting_solution(i_start ∈ L);
  i := i_start;
  repeat
    generate_a_neighbouring_solution(j ∈ N(i));
    if F(j) < F(i) then i := j;
  until F(j) ≥ F(i) for all j ∈ N(i);
end;
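A runnable Python version of the basic algorithm above; the one-bit-flip neighborhood structure and the toy goal function are assumptions for illustration:

import random

def local_minimization(goal, start, neighbors):
    # first-improvement descent, following the basic algorithm above
    i = start
    improved = True
    while improved:
        improved = False
        for j in neighbors(i):        # scan the neighborhood N(i)
            if goal(j) < goal(i):     # accept the improving neighbor
                i, improved = j, True
                break
    return i                          # stop: no j in N(i) improves on i

# toy instance: minimize the number of 1-bits; N(i) flips exactly one bit
def flip_neighbors(bits):
    return [bits[:k] + (1 - bits[k],) + bits[k + 1:] for k in range(len(bits))]

start = tuple(random.randint(0, 1) for _ in range(16))
print(local_minimization(sum, start, flip_neighbors))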
Direct Global Optimization
• Let $(L, F)$ be a combinatorial optimization problem
• Required: a problem solution $i^* \in L$ (the global optimum)
• Main problems
  • usually no a priori knowledge about the goal function available
  • no basic algorithm
  • no efficient criterion for the proof of global optimum points
• Optimization methods (complete enumeration is sketched below)
  • complete enumeration
  • heuristics, based on stochastic operators
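Complete enumeration is the only direct method with guaranteed success; this minimal sketch, with an assumed toy goal function on bit strings, also shows why it is infeasible for realistic models, since the cost grows as $2^{n}$ in the number of bits:

from itertools import product

def enumerate_global_optimum(goal, n_bits):
    # guaranteed success, but 2**n_bits goal function evaluations
    best, best_f = None, float("inf")
    for candidate in product((0, 1), repeat=n_bits):
        f = goal(candidate)
        if f < best_f:
            best, best_f = candidate, f
    return best, best_f

def toy_goal(bits):
    # penalize equal adjacent bits (an assumed toy problem)
    return sum(1 for a, b in zip(bits, bits[1:]) if a == b)

print(enumerate_global_optimum(toy_goal, 12))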
Direct Optimization Methods
• Local
  • Hill-Climbing Strategies (HC)
  • Strategy of Hooke and Jeeves (Pattern Search, PS)
• Global
  • population-based: Monte Carlo (MC), Genetic Algorithms (GA), Evolution Strategies (ES)
  • point-to-point: Simulated Annealing (SA)
Comparison of Direct Global and Local Optimization Methods
• Local optimization methods
  • Advantages: exact or at least ε-accurate localization of optimal solutions; high convergence speed; high efficiency
  • Disadvantage: no escape from sub-optimal regions of the search space (the optimization result is determined by the starting solution)
• Global optimization methods
  • Advantage: ability to escape from sub-optimal regions of the search space
  • Disadvantages: very low convergence speed, especially in the neighborhood of optimal solutions; high optimization effort; uncertain quality of the optimization results
• On their own, neither class is sufficiently qualified for the global optimization of simulation models
Example: Global Optimization
• Applied optimization method: Genetic Algorithms
• Optimization problem: Shekel function (locate the global optimum)
• Advantage: exploration of the search space
• Disadvantage: low convergence speed in the neighborhood of optimal solutions
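A Shekel-type test function of the kind used in this example; the number of minima and the coefficient values below are assumptions, not the seminar's actual instance:

def shekel(x, y):
    # each (ax, ay, c) triple places one local minimum at (ax, ay);
    # the smallest c gives the deepest (global) minimum
    minima = [(2.0, 2.0, 0.3), (6.0, 7.0, 0.5), (8.0, 3.0, 0.2)]
    return -sum(1.0 / ((x - ax) ** 2 + (y - ay) ** 2 + c)
                for ax, ay, c in minima)

print(shekel(8.0, 3.0))   # close to the global optimum point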
Example: Local Optimization
• Applied optimization method: Pattern Search
• Optimization problem: Shekel function
• Advantages
  • high convergence speed
  • ε-accurate localization (here ε = 0.01)
• Disadvantage
  • optimization success depends on the starting point
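A minimal sketch of the Hooke-Jeeves idea: exploratory moves along the coordinate axes with a shrinking step size. The accelerating pattern move of the full strategy is omitted here, and the toy quadratic goal function is an assumption:

def pattern_search(goal, start, step=1.0, eps=0.01):
    # the step size halves until it falls below eps, which yields
    # the eps-accurate localization mentioned above
    x = list(start)
    while step >= eps:
        improved = False
        for k in range(len(x)):
            for delta in (step, -step):        # probe both directions
                trial = list(x)
                trial[k] += delta
                if goal(trial) < goal(x):
                    x, improved = trial, True
                    break
        if not improved:
            step /= 2.0
    return x

# converges to the local minimum nearest the starting point
print(pattern_search(lambda v: (v[0] - 3.0) ** 2 + (v[1] + 1.0) ** 2,
                     start=[0.0, 0.0]))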
Components of an Atomic Direct Optimization Method
• A: optimization algorithm
• D: data structure
• T: termination condition
• P: control parameters
• Inputs: optimization problem $(L, F)$, set of starting solutions $L_{start}$, control parameter setting $k_P$
• Outputs: optimization result $o_E$, optimization trajectory $o_T$
• Basic structure: A iterates on D under the parameter setting P until the termination condition T holds (yes: return $o_E$ and $o_T$; no: continue iterating)
Genetic Algorithms (GA) Outline • Introduction • Basic structure of Genetic Algorithms • Basic properties of Genetic Algorithms • Genetic operators • A simple optimization example • Main difficulties of Genetic Algorithms • Case-study
Introduction to Genetic Algorithms
• Search technique based on the principle of evolution
• Invented in the early 1970s by John Holland
• GAs use two basic processes from evolution:
  • inheritance (passing of features from one generation to the next)
  • competition (survival of the fittest)
• Goal: evolution through the alternation of generations towards better and better regions of the search space
Basic Structure of Genetic Algorithms
• Data structure D: population $P(t) = \{a_1^t, a_2^t, \dots, a_n^t\}$, with generation counter $t$ and $n$ individuals; each individual $a_i^t$ ($i = 1, \dots, n$) is a fixed-length binary string (e.g. 100...1, 010...0, ..., 110...1) encoding the parameters $p_j$ ($j = 1, \dots, m$)
• Algorithm A (a runnable sketch follows below):

begin
  t := 0;
  initialize P(t);
  evaluate P(t);
  while (not termination_condition) do
  begin
    t := t + 1;
    select P(t) from P(t-1);    {selection}
    recombine P(t);             {crossover, mutation}
    evaluate P(t);
  end
end
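A runnable Python sketch of the loop above on bit strings; the population size, the crossover and mutation rates, and the toy fitness function are assumed control parameter settings:

import random

POP_SIZE, N_BITS, GENERATIONS = 20, 16, 50
P_CROSS, P_MUT = 0.6, 0.01      # hypothetical crossover/mutation rates

def fitness(a):
    return sum(a)               # toy goal: maximize the number of 1-bits

def select(pop):                # fitness-proportional selection
    weights = [fitness(a) for a in pop]
    return random.choices(pop, weights=weights, k=len(pop))

def recombine(pop):
    nxt = []
    for a, b in zip(pop[::2], pop[1::2]):
        if random.random() < P_CROSS:            # 1-point crossover
            cut = random.randrange(1, N_BITS)
            a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
        nxt += [a, b]
    # mutation: flip each bit with probability P_MUT
    return [[bit ^ (random.random() < P_MUT) for bit in a] for a in nxt]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]
for t in range(GENERATIONS):
    pop = recombine(select(pop))                 # selection, then recombination
print(max(fitness(a) for a in pop))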
Basic Properties of Genetic Algorithms
• GAs manipulate a coding of the optimized parameters
• GAs search from a population, not from a single point
• GAs use only goal function values to guide the search process (blind search)
• GAs use stochastic operators instead of deterministic rules
Genetic Operators: Selection
• Task: select $n$ individuals from the previous population
• Realization alternative: Roulette Wheel Selection
  • fitness-proportional selection
  • an individual $a_i$ is selected with probability $p_i = F(a_i) / \sum_{j=1}^{n} F(a_j)$
• Properties: not extinctive, i.e. every individual has, in principle, a chance to be selected
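The wheel mechanics as a minimal sketch, assuming non-negative fitness values:

import random

def roulette_wheel_select(population, fitness):
    # spin once: the pointer lands in a sector proportional to fitness,
    # so every individual keeps a nonzero chance (not extinctive)
    total = sum(fitness(a) for a in population)
    spin = random.uniform(0.0, total)
    cumulative = 0.0
    for a in population:
        cumulative += fitness(a)
        if cumulative >= spin:
            return a
    return population[-1]      # guard against floating-point rounding

pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(6)]
print(roulette_wheel_select(pop, fitness=sum))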
Genetic Operators: Crossover and Mutation
• 1-point crossover (CO1P): one crossover point is chosen at random; the two parents exchange the substrings behind it, e.g. (illustrative strings) 1101011|1001 and 0101100|0110 yield 1101011|0110 and 0101100|1001
• 2-point crossover (CO2P): two crossover points are chosen; the parents exchange the substring between them
• Mutation (MU): individual bits are inverted with a small probability, e.g. 0101101011011111011000110100011 → 0101101011011101011000110110011
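The operators as minimal Python sketches on bit lists; the example parents are hypothetical:

import random

def crossover_2point(a, b):
    # CO2P: choose two cut points and swap the middle segment
    p1, p2 = sorted(random.sample(range(1, len(a)), 2))
    return (a[:p1] + b[p1:p2] + a[p2:],
            b[:p1] + a[p1:p2] + b[p2:])

def mutate(a, p_mut=0.05):
    # MU: invert each bit independently with probability p_mut
    return [bit ^ (random.random() < p_mut) for bit in a]

a = [1, 1, 0, 1, 0, 1, 1, 1]
b = [0, 1, 0, 1, 1, 0, 0, 0]
c, d = crossover_2point(a, b)
print(mutate(c), mutate(d))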
A Simple Optimization Example
• Optimization of $f(x) = x^2$, with $x \in [0, 31]$
• Problem representation: encoding of the variable $x$ as a binary vector, $[0, 31] \rightarrow [00000, 11111]$ (5 bits)
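A minimal sketch of the encoding and of evaluating $f$ on decoded individuals; the 5-bit width follows directly from the range [0, 31]:

def encode(x, n_bits=5):
    # integer in [0, 2**n_bits - 1] -> fixed-length binary string
    return format(x, "0{}b".format(n_bits))

def decode(bits):
    # binary string -> integer
    return int(bits, 2)

def f(x):
    return x * x

# if f is maximized, the best individual is '11111' (x = 31)
for x in (0, 19, 31):
    print(encode(x), f(decode(encode(x))))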
Main Difficulties of Genetic Algorithms • Adjustment of the GA control parameters • population size • crossover probability • mutation probability • Specification of the termination condition • Representation of the problem solutions
Genetic Algorithm: Population Snapshots
• Figures: populations P(0) through P(5) on the goal function landscape, illustrating population-based optimization
• Complete optimization trajectory (P(0) - P(5))
Additional Information • Literature • Goldberg, D.E.: Genetic Algorithms in Search, Optimization and Machine Learning; Addison-Wesley, 1989. • Michalewicz, Z.: Genetic Algorithms + Data Structures = Evolution Programs; Springer, 1992. • Bäck, Th.: Evolutionary Algorithms in Theory and Practice; Oxford University Press, New York, 1996. • Mazumder, P.; Rudnick, E.: Genetic Algorithms for VLSI Design, Layout & Test Automation; Prentice Hall, 1999. • Information on the Web • http://www-illigal.ge.uiuc.edu/illigal.home.html • http://www.aic.nrl.navy.mil/galist/ • http://www.aracnet.com/~wwir/NovaGenetica/