Evolutionary Computational Intelligence, Lecture 5a: Overview of Evolutionary Programming. Ferrante Neri, University of Jyväskylä.
EP quick overview
• Developed: USA in the 1960s
• Early names: L. Fogel, D. Fogel
• Typically applied to:
  • traditional EP: machine learning tasks by finite state machines
  • contemporary EP: (numerical) optimization
• Attributed features:
  • very open framework: any representation and mutation operators are OK
  • crossbred with ES (contemporary EP)
  • consequently: hard to say what "standard" EP is
• Special:
  • no recombination
  • self-adaptation of parameters standard (contemporary EP)
EP technical summary tableau
• Representation: real-valued vectors (object variables plus mutation step sizes)
• Recombination: none
• Mutation: Gaussian perturbation with self-adaptive step sizes
• Parent selection: deterministic (each individual creates one offspring)
• Survivor selection: probabilistic, pairwise round-robin tournament over parents and offspring
• Speciality: self-adaptation of mutation step sizes (contemporary EP)
Historical EP perspective
• EP aimed at achieving intelligence
• Intelligence was viewed as adaptive behaviour
• Prediction of the environment was considered a prerequisite to adaptive behaviour
• Thus: the capability to predict is key to intelligence
Finite State Machine as predictor
• Consider the following FSM
• Task: predict the next input symbol
• Quality: % of predictions where out_i = in_{i+1}
• Given initial state C
• Input sequence 011101
• Leads to output 110111
• Quality: 3 out of 5
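To make the scoring rule concrete, the Python sketch below runs a small FSM over an input string and counts how often out_i matches in_{i+1}. The transition table is a made-up illustration, not the machine drawn on the slide, so its score will differ from the 3-out-of-5 example above.

```python
def fsm_quality(transitions, start_state, inputs):
    """Run the FSM over `inputs` and score how often its output
    predicts the next input symbol (out_i == in_{i+1})."""
    state, outputs = start_state, []
    for symbol in inputs:
        next_state, out = transitions[(state, symbol)]
        outputs.append(out)
        state = next_state
    # Compare each output with the following input symbol.
    hits = sum(o == nxt for o, nxt in zip(outputs, inputs[1:]))
    return hits, len(inputs) - 1

# Hypothetical 3-state machine over the alphabet {0, 1}
# (for illustration only; not the FSM from the slide's figure).
transitions = {
    ('A', 0): ('B', 1), ('A', 1): ('A', 0),
    ('B', 0): ('C', 1), ('B', 1): ('A', 1),
    ('C', 0): ('A', 0), ('C', 1): ('B', 1),
}
hits, total = fsm_quality(transitions, 'C', [0, 1, 1, 1, 0, 1])
print(f"{hits} correct predictions out of {total}")
```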
Representation
• For continuous parameter optimization
• Chromosomes consist of two parts:
  • Object variables: x1,…,xn
  • Mutation step sizes: σ1,…,σn
• Full size: ⟨x1,…,xn, σ1,…,σn⟩
Mutation
• Chromosomes: ⟨x1,…,xn, σ1,…,σn⟩
• σi' = σi • (1 + α • N(0,1))
• xi' = xi + σi' • Ni(0,1)
• α ≈ 0.2
• boundary rule: σ' < ε0 ⇒ σ' = ε0
• Other variants proposed & tried:
  • Lognormal scheme as in ES
  • Using variance instead of standard deviation
  • Mutate σ last
  • Other distributions, e.g., Cauchy instead of Gaussian
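A minimal sketch of this self-adaptive mutation, operating on a chromosome ⟨x1,…,xn, σ1,…,σn⟩ as described above; the value α = 0.2 comes from the slide, while the lower bound ε0 = 1e-3 and the helper name ep_mutate are assumptions for illustration.

```python
import numpy as np

def ep_mutate(x, sigma, alpha=0.2, eps0=1e-3, rng=np.random.default_rng()):
    """Meta-EP style mutation: perturb the step sizes first, then use the
    new step sizes to perturb the object variables.
    eps0 is an assumed lower bound for the step sizes (boundary rule)."""
    sigma_new = sigma * (1 + alpha * rng.standard_normal(len(sigma)))
    sigma_new = np.maximum(sigma_new, eps0)      # boundary rule: sigma' >= eps0
    x_new = x + sigma_new * rng.standard_normal(len(x))
    return x_new, sigma_new

# Example: a chromosome <x1,...,xn, sigma1,...,sigman> with n = 3
x = np.array([0.5, -1.2, 3.0])
sigma = np.array([0.1, 0.1, 0.1])
child_x, child_sigma = ep_mutate(x, sigma)
```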
Recombination
• None
• Rationale: a point in the search space stands for a species, not for an individual, so there can be no crossover between species
• Much historical debate: "mutation vs. crossover"
• A pragmatic approach seems to prevail today
Parent selection
• Each individual creates one child by mutation
• Thus:
  • Deterministic
  • Not biased by fitness
Survivor selection
• P(t): parents, P'(t): offspring
• Pairwise competitions in round-robin format:
  • Each solution x from P(t) ∪ P'(t) is evaluated against q other randomly chosen solutions
  • For each comparison, a "win" is assigned if x is better than its opponent
  • The solutions with the greatest number of wins are retained to be parents of the next generation
• The parameter q allows tuning of the selection pressure
• Typically q = 10
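A sketch of this round-robin tournament, assuming minimisation and a merged pool P(t) ∪ P'(t) with more than q members; the function name and defaults are illustrative, not prescribed by the slides.

```python
import numpy as np

def round_robin_select(population, fitness, mu, q=10, rng=np.random.default_rng()):
    """EP-style survivor selection over the merged parent + offspring pool:
    each solution meets q random opponents, and the mu solutions with the
    most wins survive. Assumes minimisation (lower fitness is better)."""
    fitness = np.asarray(fitness)
    n = len(population)
    wins = np.zeros(n, dtype=int)
    for i in range(n):
        opponents = rng.choice([j for j in range(n) if j != i], size=q, replace=False)
        wins[i] = np.sum(fitness[i] < fitness[opponents])
    survivors = np.argsort(-wins)[:mu]   # keep the mu highest win counts
    return [population[i] for i in survivors]
```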
Example application: evolving checkers players (Fogel '02)
• Neural nets for evaluating future values of moves are evolved
• NNs have a fixed structure with 5046 weights; these are evolved, plus one weight for "kings"
• Representation:
  • vector of 5046 real numbers for object variables (weights)
  • vector of 5046 real numbers for σ's
• Mutation:
  • Gaussian, lognormal scheme with σ-first
  • Plus a special mechanism for the kings' weight
• Population size 15
Example application: evolving checkers players (Fogel '02)
• Tournament size q = 5
• Programs (with the NN inside) play against other programs; no human trainer or hard-wired intelligence
• After 840 generations (6 months!) the best strategy was tested against humans via the Internet
• The program earned an "expert class" rating, outperforming 99.61% of all rated players
Evolutionary Computational Intelligence, Lecture 5b: Differential Evolution
Brief historical overview
• The term Differential Evolution was coined in 1994 by Storn and Price (Germany/USA)
• Some important investigations have recently been carried out by Lampinen
• The so-far only existing book was published in 2005
Representation
• Differential Evolution in its original implementation is intended for vectors of real numbers
• Nevertheless, it can also be employed for integer problems, probably losing some efficiency
Population models
• GAs and "comma" ES employ a generational logic: the offspring population entirely replaces the previous population
• "Plus" ES considers both parents and offspring and, after sorting them, selects a predetermined number of best-performing individuals
• Differential Evolution (DE) employs a steady-state logic (also used in some GAs): the successful offspring immediately "kills" the weakest parent
Initial Sampling
• A set of vectors is sampled, usually at random within the boundaries of the decision space
• These vectors represent the design variables that we wish to optimize
• The population size must be at least four
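A small sketch of this initial sampling step, assuming box constraints given as lower and upper bound vectors; the helper name de_init is an illustrative assumption.

```python
import numpy as np

def de_init(pop_size, lower, upper, rng=np.random.default_rng()):
    """Sample an initial DE population uniformly at random within the
    decision-space bounds. DE needs at least four individuals."""
    assert pop_size >= 4, "DE requires a population of at least four vectors"
    lower, upper = np.asarray(lower), np.asarray(upper)
    return rng.uniform(lower, upper, size=(pop_size, len(lower)))

# Example: 20 vectors in a 5-dimensional box [-10, 10]^5
pop = de_init(20, [-10] * 5, [10] * 5)
```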
Parent selection
• Four individuals x1, x2, x3, x4 are selected at random from the population, by means of a uniform distribution
• As in ES, there is no selection pressure in the choice of the parents undergoing the variation operators (recombination and mutation)
Recombination
• A provisional offspring xoffp is generated by: xoffp = x1 + K(x2 - x3), where K is a constant value, usually set equal to 0.7
Mutation
• With a certain probability, some genes of the provisional offspring are replaced with the corresponding genes of x4
• The probability of such a mutation is usually set to 0.3
Survivor selection
• The offspring xoff is thus generated
• The fitness value of xoff is calculated and, according to a steady-state strategy:
  • if xoff outperforms x4, it replaces x4
  • if, on the contrary, f(xoff) > f(x4), no replacement occurs
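Tying the last four slides together, the sketch below performs one steady-state DE step under the lecture's formulation: random parent selection, xoffp = x1 + K(x2 - x3), gene replacement from x4 with probability 0.3, and replacement of x4 if the offspring is better. Minimisation and the function and variable names are assumptions for illustration.

```python
import numpy as np

def de_step(pop, fitness, f, K=0.7, p_mut=0.3, rng=np.random.default_rng()):
    """One steady-state DE step, following the lecture's formulation:
    pick x1..x4 at random, build xoffp = x1 + K*(x2 - x3), copy genes
    from x4 with probability p_mut, and replace x4 if the offspring is
    better (minimisation assumed). pop and fitness are updated in place."""
    i1, i2, i3, i4 = rng.choice(len(pop), size=4, replace=False)
    x1, x2, x3, x4 = pop[i1], pop[i2], pop[i3], pop[i4]

    x_offp = x1 + K * (x2 - x3)                  # recombination
    mask = rng.random(len(x_offp)) < p_mut       # mutation: genes taken from x4
    x_off = np.where(mask, x4, x_offp)

    f_off = f(x_off)
    if f_off < fitness[i4]:                      # steady-state replacement
        pop[i4], fitness[i4] = x_off, f_off

# Example usage with a simple sphere function
f = lambda x: float(np.sum(x ** 2))
pop = np.random.default_rng(0).uniform(-10, 10, size=(20, 5))
fitness = np.array([f(x) for x in pop])
for _ in range(1000):
    de_step(pop, fitness, f)
```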
Observations
• The steady-state logic gives the DE structure no generation loops, since a replacement occurs as soon as a better solution is generated
• The exploratory logic of DE has a slight analogy with Nelder-Mead, since the search directions are led by existing solutions; in the two-dimensional case the analogy is rather strong
• DE is very promising, but its biggest limitation is the risk of stagnation
Premature Convergence / Stagnation
• These are the main defects in EAs
• Premature convergence: it occurs when the population no longer contains any differences (a single genotype) and the corresponding fitness value is suboptimal (+ strategy)
• Stagnation: it occurs when, notwithstanding a high diversity, there are no improvements (superfit individual)
Evolutionary Computational Intelligence, Lecture 5c: Handling Multimodality
Motivation 1: Multimodality
Most interesting problems have more than one locally optimal solution.
Motivation 2: Genetic Drift
• A finite population with global (panmictic) mixing and selection eventually converges around one optimum
• We often want to identify several possible peaks
• This can aid global optimisation when a sub-optimum has the largest basin of attraction
Biological Motivation 1: Speciation
• In nature, different species adapt to occupy different environmental niches, which contain finite resources, so individuals are in competition with each other
• Species only reproduce with other members of the same species (mating restriction)
• These forces tend to lead to phenotypic homogeneity within species, but differences between species
Biological Motivation 2: Punctuated Equilibria
• The theory that periods of stasis are interrupted by rapid growth when the main population is "invaded" by individuals from a previously spatially isolated group of the same species
• The separated sub-populations (demes) often show local adaptations in response to slight changes in their local environments
Implications for Evolutionary Optimization
• Two main approaches to diversity maintenance:
  • Implicit approaches:
    • Impose an equivalent of geographical separation
    • Impose an equivalent of speciation
  • Explicit approaches:
    • Make similar individuals compete for resources (fitness)
    • Make similar individuals compete with each other for survival