
Genetic algorithms: an introduction


Presentation Transcript


  1. Genetic algorithms: an introduction Artem Eremin, junior researcher, IMMI KSU

  2. Motivation

  3. Motivation

  4. Motivation Material properties (Cij) are unknown and must be recovered from experimental data: • Doppler laser vibrometry for measuring out-of-plane velocities • Time-of-Flight (TOF) estimation via the wavelet transform

  5. Optimization “Optimization is the process of making something better” Every day we subconsciously solve optimization problems!

  6. Optimization

  7. Minimum-seeking algorithms • Exhaustive Search = Brute Force • Analytical Optimization • Nelder-Mead downhill Simplex Method • Optimization based on Line Minimization (the coordinate search method, the steepest descent algorithm, Newton’s method, the Davidon-Fletcher-Powell (DFP) algorithm, etc.)

  8. Minimum-seeking algorithms Methods 1–4 above can converge to a local minimum! Natural optimization methods (not a panacea, but…): • Simulated annealing (Kirkpatrick et al., 1983) • Particle swarm optimization (Parsopoulos and Vrahatis, 2002) • Genetic algorithms (Holland, 1975) • Evolutionary algorithms (Schwefel, 1995) No derivatives, large search spaces, “nature-based”

  9. Biological background (Cell and Chromosomes) • Every animal cell is a complex of many small “factories” working together; at the center of it all is the cell nucleus; the nucleus contains the genetic information in chromosomes - strings of DNA • Each chromosome contains a set of genes - blocks of DNA • Each gene determines some aspect of the organism (like eye colour) • A collection of genes is sometimes called a genotype • A collection of aspects (like eye colour) is sometimes called a phenotype

  10. Biological background (Reproduction) Organisms produce a number of offspring similar to themselves, but the offspring can have variations due to: – Mutations (random changes) – Sexual reproduction (offspring have combinations of features inherited from each parent)

  11. Biological background (Natural Selection) • The Origin of Species: “Preservation of favourable variations and rejection of unfavourable variations.” • There are more individuals born than can survive, so there is a continuous struggle for life. • Individuals with an advantage have a greater chance of survival: survival of the fittest. • Important aspects of natural selection are adaptation to the environment and isolation of populations into different groups which cannot mutually mate

  12. Genetic algorithms (GA) • GAs were initially developed by John Holland, University of Michigan (1970s) • Popularized by his student David Goldberg, who solved some very complex engineering problems (1989) • Based on ideas from Darwinian evolution • Provide efficient techniques for optimization and machine learning applications; widely used in business, science and engineering

  13. GA main features • Optimizes with continuous or discrete variables • Doesn’t require derivative information • Simultaneously searches from a wide sampling of the cost surface • Deals with a large number of variables • Is well suited for parallel computers • Optimizes variables with extremely complex cost surfaces (it can jump out of a local minimum) • Provides a list of optimum variables, not just a single solution • May encode the variables so that the optimization is done with the encoded variables • Works with numerically generated data, experimental data, or analytical functions

  14. To start with… Encoding (representation) maps the phenotype space into the genotype space = {0,1}^L, and decoding (inverse representation) maps it back; e.g. candidate solutions are encoded as bit strings such as 10010001, 10010010, 010001001, 011101001

  15. To start with… Gene – a single encoding of part of the solution space, i.e. either single bits or short blocks of adjacent bits that encode an element of the candidate solution Chromosome – a string of genes that represents a solution Population – the number of chromosomes available to test

  16. Chromosomes Chromosomes can be: – Bit strings (0110, 0011, 1101, …) – Real numbers (33.2, -12.11, 5.32, …) – Permutations of elements (1234, 3241, 4312, …) – Lists of rules (R1, R2, R3, … Rn) – Program elements (genetic programming) – … A chromosome is an array of Nvar variables (genes) pi

  17. How does it work? So…
  produce an initial population of individuals
  evaluate the fitness of all individuals
  while termination condition not met do
    select fitter individuals for reproduction
    recombine between individuals
    mutate individuals
    evaluate the fitness of the modified individuals
    generate a new population
  end while
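A minimal sketch of this loop in Python, assuming a bit-string chromosome and a toy fitness function; all names (fitness, tournament, POP_SIZE, …) are illustrative, not taken from the slides:

```python
import random

POP_SIZE, CHROM_LEN = 6, 10   # illustrative sizes
P_CROSS, P_MUT = 0.8, 0.05    # crossover and per-bit mutation rates

def fitness(chrom):
    # Toy fitness: the number of 1-bits ("OneMax")
    return sum(chrom)

def tournament(pop, k=2):
    # Pick k individuals at random; the fittest wins
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # Simple 1-point crossover, applied with probability P_CROSS
    if random.random() < P_CROSS:
        pt = random.randint(1, CHROM_LEN - 1)
        return a[:pt] + b[pt:], b[:pt] + a[pt:]
    return a[:], b[:]

def mutate(chrom):
    # Flip each bit independently with probability P_MUT
    return [bit ^ 1 if random.random() < P_MUT else bit for bit in chrom]

population = [[random.randint(0, 1) for _ in range(CHROM_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(50):          # termination: fixed iteration budget
    offspring = []
    while len(offspring) < POP_SIZE:  # select, recombine, mutate
        c1, c2 = crossover(tournament(population), tournament(population))
        offspring += [mutate(c1), mutate(c2)]
    population = offspring[:POP_SIZE] # the new generation

print(max(population, key=fitness))
```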

  18. How does it work? Or so… The Evolutionary Cycle: initiate and evaluate a population; select parents; modify them (recombination, mutation) to get offspring; evaluate the offspring and insert them into the population; discard the deleted members

  19. Generation of the initial population (example with Npop = 6):
  s1 = 1111010101, f(s1) = 7
  s2 = 0111000101, f(s2) = 5
  s3 = 1110110101, f(s3) = 7
  s4 = 0100010011, f(s4) = 4
  s5 = 1110111101, f(s5) = 8
  s6 = 0100110000, f(s6) = 3
  (here the fitness f is the number of 1-bits in each string)

  20. Selection We are kind! Let’s save everybody! …or we form a mating pool of selected individuals

  21. Selection: roulette wheel weighting

  22. Selection The roulette wheel method: individual i is chosen with probability proportional to its fitness, p_i = f_i / (f_1 + … + f_Npop); on the wheel, the area of each sector is proportional to the corresponding fitness value. We repeat the extraction as many times as necessary
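A sketch of roulette-wheel selection in Python (illustrative names; fitness values are assumed non-negative):

```python
import random

def roulette_select(population, fitnesses):
    # Spin the wheel: sector i has width f_i, so P(i) = f_i / sum(f)
    spin = random.uniform(0, sum(fitnesses))
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f
        if spin <= cumulative:
            return individual
    return population[-1]  # guard against floating-point round-off

# Repeat the extraction as many times as necessary to fill a mating pool
pop = ["1111010101", "0111000101", "1110111101"]
fits = [7, 5, 8]
print([roulette_select(pop, fits) for _ in range(6)])
```

The standard library's random.choices(pop, weights=fits) performs the same weighted draw in one call.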

  23. Selection Tournament selection: • randomly pick a small subset • perform a “tournament” • “the winner takes it all” Tournament + threshold = no SORTING needed!

  24. Mating (Crossover) Simple 1-point crossover: • Choose a random point on the two parents • Split the parents at this crossover point • Create children by exchanging tails • Pc is typically in the range (0.6, 0.9) Performance with 1-point crossover depends on the order in which variables occur in the representation: • it is more likely to keep together genes that are near each other • it can never keep together genes from opposite ends of the string • this is known as positional bias • it can be exploited if we know about the structure of our problem, but this is not usually the case

  25. Mating (Crossover) n-point crossover: • Choose n random crossover points • Split along those points • Glue the parts, alternating between parents • A generalisation of 1-point crossover (still some positional bias); see the sketch after the next slide

  26. Mating (Crossover) Uniform crossover looks at each bit in the parents and randomly assigns the bit from one parent to one offspring and the bit from the other parent to the other offspring
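Sketches of both operators on bit lists (illustrative helpers; 1-point crossover is the n = 1 case of the first):

```python
import random

def n_point_crossover(a, b, n):
    # Choose n distinct cut points, then alternate source parents between cuts
    cuts = sorted(random.sample(range(1, len(a)), n)) + [len(a)]
    child1, child2, swap, prev = [], [], False, 0
    for cut in cuts:
        src1, src2 = (b, a) if swap else (a, b)
        child1 += src1[prev:cut]
        child2 += src2[prev:cut]
        swap, prev = not swap, cut
    return child1, child2

def uniform_crossover(a, b):
    # Each position independently goes to one child or the other
    child1, child2 = [], []
    for x, y in zip(a, b):
        if random.random() < 0.5:
            child1.append(x); child2.append(y)
        else:
            child1.append(y); child2.append(x)
    return child1, child2

p1 = [1, 1, 1, 1, 0, 1, 0, 1, 0, 1]
p2 = [0, 1, 0, 0, 0, 1, 0, 0, 1, 1]
print(n_point_crossover(p1, p2, 2))
print(uniform_crossover(p1, p2))
```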

  27. Mutation • Alter each gene (or bit) independently with a probability pm • pm is called the mutation rate • Typically between 1/Npop and 1/L, where L is the chromosome length

  28. Crossover or/and Mutation • A long debate: which one is better / necessary / the main operator vs. the background operator? • Answer (at least, a rather wide agreement): • it depends on the problem, but • in general, it is good to have both • each has a different role • a mutation-only GA is possible; a crossover-only GA would not work

  29. Crossover or/and Mutation • Exploration: discovering promising areas in the search space, i.e. gaining information on the problem • Exploitation: optimising within a promising area, i.e. using information • There is co-operation AND competition between them • Crossover is explorative: it makes a big jump to an area somewhere “in between” two (parent) areas • Mutation is exploitative: it creates random small diversions, thereby staying near (in the area of) the parent • Only crossover can combine information from two parents • Only mutation can introduce new information • To hit the optimum you often need a ‘lucky’ mutation

  30. Real-valued problems Mapping real values onto bit strings: p_i ∈ [a_i, b_i] ⊂ R is represented by {a_1, …, a_L} ∈ {0,1}^L • the encoding [a_i, b_i] → {0,1}^L must be invertible (one phenotype per genotype) • the decoding Γ: {0,1}^L → [a_i, b_i] defines the representation • only 2^L values out of infinitely many are represented • L determines the maximum possible precision of the solution • high precision means long chromosomes (slow evolution)
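One common choice of Γ (an assumption; the slides do not fix a particular map) reads the L bits as an unsigned integer and scales it linearly into [a, b]:

```python
def decode(bits, a, b):
    # Read the bit list as an unsigned integer k in [0, 2**L - 1] ...
    L = len(bits)
    k = int("".join(map(str, bits)), 2)
    # ... and scale it linearly onto [a, b]: only 2**L values are representable
    return a + k * (b - a) / (2**L - 1)

def encode(value, a, b, L):
    # Inverse map, exact up to the quantisation step (b - a) / (2**L - 1)
    k = round((value - a) / (b - a) * (2**L - 1))
    return [int(c) for c in format(k, f"0{L}b")]

bits = encode(3.7, 0.0, 10.0, 16)
print(bits, decode(bits, 0.0, 10.0))  # recovers ~3.7
```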

  31. Floating point mutations General scheme of floating point mutations • Uniform mutation: the new value of each selected gene is drawn uniformly at random from its range [a_i, b_i] • Analogous to bit-flipping (binary) or random resetting (integers)

  32. Floating point mutations • Non-uniform mutations: • Many methods proposed, such as a time-varying range of change, etc. • Most schemes are probabilistic but usually only make a small change to the value • The most common method is to add a random deviate to each variable separately, taken from an N(0, σ) Gaussian distribution, and then curtail to the range • The standard deviation σ controls the amount of change (about 2/3 of deviations will lie in the range (-σ, +σ))
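A sketch of that Gaussian scheme (illustrative function and bounds):

```python
import random

def gaussian_mutate(genes, sigma, bounds):
    # Add an N(0, sigma) deviate to each variable, then curtail to its range
    mutated = []
    for value, (lo, hi) in zip(genes, bounds):
        value += random.gauss(0.0, sigma)
        mutated.append(min(max(value, lo), hi))
    return mutated

print(gaussian_mutate([3.7, -1.2], sigma=0.1, bounds=[(0, 10), (-5, 5)]))
```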

  33. Crossover for real valued GAs • Discrete: • each gene value in offspring z comes from one of its parents (x, y) with equal probability: z_i = x_i or y_i • could use n-point or uniform • Intermediate: • exploits the idea of creating children “between” parents (hence a.k.a. arithmetic recombination) • z_i = α·x_i + (1 - α)·y_i, where 0 ≤ α ≤ 1 • The parameter α can be: • constant: uniform arithmetical crossover • variable (e.g. depending on the age of the population) • picked at random every time

  34. Single arithmetic crossover • Parents: ⟨x_1, …, x_n⟩ and ⟨y_1, …, y_n⟩ • Pick a single gene (k) at random • child1 is ⟨x_1, …, x_{k-1}, α·y_k + (1 - α)·x_k, x_{k+1}, …, x_n⟩ • reverse for the other child, e.g. with α = 0.5

  35. Simple arithmetic crossover • Parents: ⟨x_1, …, x_n⟩ and ⟨y_1, …, y_n⟩ • Pick a random gene (k); after this point, mix values • child1 is ⟨x_1, …, x_k, α·y_{k+1} + (1 - α)·x_{k+1}, …, α·y_n + (1 - α)·x_n⟩ • reverse for the other child, e.g. with α = 0.5

  36. “Whole” arithmetic crossover • Most commonly used • Parents: ⟨x_1, …, x_n⟩ and ⟨y_1, …, y_n⟩ • child1 is α·⟨x_1, …, x_n⟩ + (1 - α)·⟨y_1, …, y_n⟩, i.e. every gene is blended • reverse for the other child, e.g. with α = 0.5
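A sketch covering all three arithmetic variants (illustrative function; alpha defaults to 0.5):

```python
import random

def arithmetic_crossover(x, y, alpha=0.5, mode="whole"):
    # blend(a, b) moves gene a toward gene b by a factor alpha
    blend = lambda a, b: alpha * b + (1 - alpha) * a
    k = random.randrange(len(x))  # the randomly picked gene position
    if mode == "single":    # blend only gene k
        keep = lambda i: i != k
    elif mode == "simple":  # blend every gene after k
        keep = lambda i: i <= k
    else:                   # "whole": blend every gene
        keep = lambda i: False
    c1 = [x[i] if keep(i) else blend(x[i], y[i]) for i in range(len(x))]
    c2 = [y[i] if keep(i) else blend(y[i], x[i]) for i in range(len(y))]
    return c1, c2

x, y = [0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]
print(arithmetic_crossover(x, y, mode="single"))
print(arithmetic_crossover(x, y, mode="whole"))
```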

  37. micro-GA First generation (random values) → tournament selection → SBX crossover → select the fittest individual → good results / enough iterations? If yes, stop; if no, start a new generation

  38. Benefits of GA • Concept is easy to understand • Modular: separate from the application (representation); building blocks can be used in hybrid applications • Supports multi-objective optimization • Good for “noisy” environments • Always results in an answer, which becomes better and better with time • Can easily run in parallel • The fitness function can be changed from iteration to iteration, which allows incorporating new data into the model if it becomes available

  39. Issues with GA Choosing parameters: – Population size – Crossover and mutation probabilities – Selection, deletion policies – Crossover, mutation operators, etc. – Termination criteria Performance: – Can be too slow, but covers a large search space – Is only as good as the fitness function

  40. Examples

  41. Experimental specimens: 4 CFRP plates

  42. Material properties

  43. Comparison of results

  44. Comparison of results

  45. Comparison of results

  46. GA for Permutations • Ordering/sequencing problems form a special type • The task is (or can be solved by) arranging some objects in a certain order • Example: a sort algorithm: the important thing is which elements occur before others (order) • Example: the Travelling Salesman Problem (TSP): the important thing is which elements occur next to each other (adjacency) • These problems are generally expressed as a permutation: • if there are n variables, then the representation is a list of n integers, each of which occurs exactly once

  47. The Traveling Salesman Problem (TSP) The traveling salesman must visit every city in his territory exactly once and then return to the starting point; given the cost of travel between all cities, how should he plan his itinerary for the minimum total cost of the entire tour? TSP is NP-complete. The search space is BIG: for 30 cities there are 30! ≈ 10^32 possible tours

  48. TSP (Representation, Initialization and Selection) A vector v = (i_1, i_2, …, i_n) represents a tour (v is a permutation of {1, 2, …, n}) The fitness f of a solution is the inverse cost of the corresponding tour Initialization: use either some heuristics or a random sample of permutations of {1, 2, …, n} We shall use fitness-proportionate selection
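A sketch of this representation and fitness (the city coordinates are an illustrative stand-in for real travel costs):

```python
import math, random

CITIES = [(0, 0), (1, 5), (4, 3), (6, 1), (2, 2)]  # illustrative coordinates

def tour_cost(tour):
    # Sum of leg lengths, including the return to the starting city
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def fitness(tour):
    # Fitness is the inverse cost of the corresponding tour
    return 1.0 / tour_cost(tour)

# Initialization: a random sample of permutations of {0, …, n-1}
population = [random.sample(range(len(CITIES)), len(CITIES))
              for _ in range(10)]
best = max(population, key=fitness)
print(best, tour_cost(best))
```

Fitness-proportionate selection then works exactly as in the roulette-wheel sketch above, using these fitness values as the sector weights.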

  49. Mutation operators for permutations • Normal mutation operators lead to inadmissible solutions • e.g. bit-wise mutation: let gene i have value j; changing it to some other value k would mean that k occurred twice and j no longer occurred • Therefore we must change at least two values • The mutation parameter now reflects the probability that the operator is applied once to the whole string, rather than individually at each position

  50. Insert Mutation for permutations • Pick two allele values at random • Move the second to follow the first, shifting the rest along to accommodate • Note that this preserves most of the order and the adjacency information
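A sketch of insert mutation on a permutation (illustrative helper name):

```python
import random

def insert_mutation(perm):
    # Pick two positions at random (i before j) ...
    i, j = sorted(random.sample(range(len(perm)), 2))
    mutated = perm[:]
    # ... and move the allele at j to follow the one at i, shifting the rest
    # along; most order and adjacency information is preserved
    mutated.insert(i + 1, mutated.pop(j))
    return mutated

print(insert_mutation([1, 2, 3, 4, 5, 6, 7, 8, 9]))
```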
