
Introduction



Presentation Transcript


  1. Introduction – Optimisation Methods

  2. Introduction • Course Material : http://www.intelligent-optimization.org/OM • Username : om2008 • Password : 0m_course • Lectures / Tutorials • Assessment

  3. Course Topics • Optimisation Overview • Minimisation or Maximisation of Functions • Simulated Annealing • Genetic Algorithms • Model Building • Linear Algebra • Linear Programming • Simplex Algorithm • Network Models • Integer Programming • Ant Colony Optimisation

  4. Optimisation Overview Topics • Global / Local Optimisation • Overview • Evolutionary Algorithms • Local Search • Population Based Search (PBS) • Algorithm • Applications • Atomic clusters • p-median problems • Future of PBS • Phased Local Search (PLS) • Algorithm • Applications • Max Clique / Weighted Max Clique • MIS / MVC / WMIS / WMVC

  5. Global / Local optimisation - Overview • Optimisation problems arise everywhere in everyday life • Optimisation amounts to searching a hyper-surface for the optimum of f(x), where x is a vector of decision variables • The number of local minima is one measure of the complexity of the problem

  6. Global optimisation • In general terms, the global optimisation problem is: find x* ∈ S such that f(x*) ≤ f(x) for all x ∈ S, where S ⊆ R^n and f : S → R • f is usually referred to as the objective function, and may be both non-continuous and non-convex • A number of deterministic methods exist (e.g. branch and bound, cutting planes) but these are only suitable for small problems • A number of stochastic methods exist (e.g. Random Search, IHR, SA, GA, ...) which may also incorporate the repeated use of a local optimiser
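As an illustration of the stochastic approach, here is a minimal sketch of pure random search over a box-constrained domain; the function, bounds, and iteration budget are illustrative choices, not from the course:

```python
import random

def random_search(f, bounds, iterations=10000, seed=0):
    """Pure random search: sample uniformly inside the box `bounds`
    and keep the best point seen so far."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(iterations):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Minimise a simple objective on the box [-5, 5]^2.
sphere = lambda x: sum(xi * xi for xi in x)
x_star, f_star = random_search(sphere, [(-5, 5), (-5, 5)])
```

With enough samples the best point drifts toward the global minimum, but there is no convergence guarantee for a finite budget, which is why such methods are often combined with a local optimiser.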

  7. Global optimisation – Genetic Algorithm

     randomly initialise a pool of solutions
     Next_Generation:
        Mutation_Loop:
           select a solution from pool using relative fitness
           mutate solution and save
        end_Loop
        Crossover_Loop:
           select pairs of solutions from pool using relative fitness
           crossover and save both child solutions
        end_Loop
        Termination_Check:
           if not finished
              create new pool from saved solutions
              goto Next_Generation
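The pseudocode above can be sketched as a runnable toy GA. The OneMax objective (maximise the number of 1-bits), the pool size, and the elitist pool update (keeping the best of old pool plus saved solutions, a small deviation from the slide's "new pool from saved solutions") are illustrative choices:

```python
import random

def ga_onemax(n_bits=20, pool_size=30, generations=60, seed=1):
    """Toy genetic algorithm for OneMax, mirroring the slide's loop:
    fitness-proportional selection, mutation, one-point crossover."""
    rng = random.Random(seed)
    fitness = lambda s: sum(s)
    pool = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pool_size)]

    def select():
        # Roulette-wheel selection using relative fitness (+1 avoids zero weights).
        weights = [fitness(s) + 1 for s in pool]
        return rng.choices(pool, weights=weights, k=1)[0]

    for _ in range(generations):
        saved = []
        for _ in range(pool_size // 2):          # mutation loop
            s = select()[:]
            s[rng.randrange(n_bits)] ^= 1        # flip one random bit
            saved.append(s)
        for _ in range(pool_size // 4):          # crossover loop
            a, b = select(), select()
            cut = rng.randrange(1, n_bits)
            saved.append(a[:cut] + b[cut:])      # save both child solutions
            saved.append(b[:cut] + a[cut:])
        # Elitist pool update (illustrative), then termination check.
        pool = sorted(pool + saved, key=fitness, reverse=True)[:pool_size]
        if fitness(pool[0]) == n_bits:
            break
    return pool[0]

best = ga_onemax()
```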

  8. Continuous local optimisation - algorithm

     Initial Step:
        set k = 0
        supply an initial guess, x_k, within any specified constraints
     Iterative Step:
        calculate a search direction p_k
        determine an appropriate step length λ_k
        set x_{k+1} = x_k + λ_k p_k
     Stopping Criteria:
        if the convergence criteria are reached, the optimum vector is x_{k+1}; stop
        otherwise set k = k + 1 and repeat the Iterative Step
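A concrete instance of this scheme is steepest descent with a fixed step length; the test function, step size, and convergence criterion below are illustrative:

```python
def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10000):
    """Continuous local optimisation following the slide's scheme with
    search direction p_k = -grad(x_k) and a fixed step length."""
    x = list(x0)
    for _ in range(max_iter):
        p = [-g for g in grad(x)]                            # search direction p_k
        x_new = [xi + step * pi for xi, pi in zip(x, p)]     # x_{k+1} = x_k + step * p_k
        if max(abs(a - b) for a, b in zip(x, x_new)) < tol:  # convergence criterion
            return x_new
        x = x_new
    return x

# Minimise f(x, y) = (x - 1)^2 + (y + 2)^2; gradient is (2(x - 1), 2(y + 2)).
xmin = gradient_descent(lambda v: [2 * (v[0] - 1), 2 * (v[1] + 2)], [0.0, 0.0])
```

In practice the step length would be chosen per iteration (e.g. by a line search) rather than held fixed.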

  9. Discrete local optimisation - algorithm • Typically performed using a local search algorithm, which starts from a candidate solution and then iteratively moves to a neighbour solution. This is only possible if a neighbourhood relation is defined on the search space. As an example, a neighbour of a vertex cover is another vertex cover differing by only one node. • Typically, every candidate solution has more than one neighbour solution; the choice of which one to move to uses only information about the solutions in the neighbourhood of the current one, hence the name local search. • Individual search steps are kept as simple as possible to minimise their time-complexity. This matches the common intuition that, in the context of stochastic local search algorithms, it is often preferable to perform many relatively simple, efficiently computable search steps rather than fewer complex ones (PLS).
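The vertex cover example can be sketched as a simple local search using exactly that neighbourhood (covers differing by one node); the trivial starting cover and the node-scan order are illustrative choices:

```python
def local_search_vertex_cover(edges, nodes):
    """Local search for a small vertex cover: start from the trivial
    cover (all nodes) and repeatedly move to a neighbour cover, i.e.
    drop any one node whose removal still covers every edge."""
    cover = set(nodes)

    def covers(c):
        return all(u in c or v in c for u, v in edges)

    improved = True
    while improved:
        improved = False
        for v in sorted(cover):
            if covers(cover - {v}):   # neighbour solution: differs by one node
                cover.remove(v)
                improved = True
                break
    return cover

# 4-cycle 0-1-2-3-0: a minimum vertex cover has two nodes.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
cover = local_search_vertex_cover(edges, range(4))
```

This greedy descent stops at a local optimum; stochastic local search methods such as PLS add mechanisms for escaping such points.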

  10. PBS - Recent Studies • W.J.Pullan, An Unbiased Population Based Search for the Geometry Optimisation of Lennard-Jones Clusters: 2 ≤ N ≤ 372, Journal of Computational Chemistry, 2005 • W.J.Pullan, Population based search for the p-median problem, IEEE Congress on Evolutionary Computation, 2008 • W.J.Pullan, Population based search for the p-center problem, Evolutionary Computation, In Press 2008 • W.J.Pullan, Population based search for the UFL problem, To be submitted, IJCAI-2009

  11. PBS for clustering problems - Algorithm

     randomly initialise a pool of solutions
     Next_Generation:
        for all solutions in the pool, mutate and locally optimise
        for all possible pairs of solutions in the pool, crossover and locally optimise
        Termination_Check:
           if not finished
              create new pool from saved solutions
              goto Next_Generation

     Key features: small population, structure niching, directed mutation operators, phenotype crossover operators, local optimisation, highly parallel
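A minimal skeleton of this loop is sketched below on a toy one-dimensional objective. The fixed-step hill-climbing "local optimiser", Gaussian mutation, averaging crossover, and elitist pool truncation are simple stand-ins, far cruder than PBS's directed, phenotype-aware operators:

```python
import random

def pbs_skeleton(init, mutate, crossover, local_opt, cost,
                 pool_size=8, generations=30, seed=2):
    """Skeleton of the slide's loop: mutate + locally optimise every pool
    member, crossover + locally optimise every pair, keep the best."""
    rng = random.Random(seed)
    pool = sorted((local_opt(init(rng)) for _ in range(pool_size)), key=cost)
    for _ in range(generations):
        saved = [local_opt(mutate(s, rng)) for s in pool]       # mutation pass
        for i in range(pool_size):                              # crossover pass
            for j in range(i + 1, pool_size):
                saved.append(local_opt(crossover(pool[i], pool[j], rng)))
        pool = sorted(pool + saved, key=cost)[:pool_size]       # next small pool
    return pool[0]

# Toy instance: minimise f(x) = (x - 3)^2.
f = lambda x: (x - 3.0) ** 2

def hill(x):
    # Stand-in local optimiser: small fixed-step hill climbing.
    for _ in range(200):
        for dx in (-0.01, 0.01):
            if f(x + dx) < f(x):
                x += dx
    return x

best = pbs_skeleton(init=lambda r: r.uniform(-10, 10),
                    mutate=lambda x, r: x + r.gauss(0, 1),
                    crossover=lambda a, b, r: (a + b) / 2,
                    local_opt=hill, cost=f)
```

The small pool is the point: with only ~8 members, the genetic operators and niching must work hard to preserve diversity, as the key-features list emphasises.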

  12. PBS for atomic clusters - Problem • Theoretical investigations of atomic clusters address the following optimisation problem: • Given N particles, interacting with two-body central forces, find the configuration in three-dimensional Euclidean space for which the total potential energy attains its global minimum. • The number of local minima is exponential in N - for 100 atoms, there appear to be at least 10^140 local minima • Variants - Lennard-Jones clusters, Binary Lennard-Jones (BLJ) clusters, Morse clusters, Molecular clusters

  13. PBS for atomic clusters - Problem • The potential energy of a cluster is the sum of pair potentials over all pairs of atoms, E = Σ_{i<j} v(r_ij), where, for Lennard-Jones clusters, v(r) has the form v(r) = 4ε[(σ/r)^12 − (σ/r)^6]
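Under the Lennard-Jones form above (with ε = σ = 1, i.e. reduced units), the cluster energy can be evaluated directly from the pairwise sum:

```python
from itertools import combinations
from math import dist

def lj_energy(coords, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones potential energy of a cluster:
    E = sum over pairs i < j of v(r_ij), with
    v(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    e = 0.0
    for a, b in combinations(coords, 2):
        sr6 = (sigma / dist(a, b)) ** 6         # (sigma / r_ij)^6
        e += 4.0 * epsilon * (sr6 * sr6 - sr6)  # pair potential v(r_ij)
    return e

# Two atoms at the pair-equilibrium separation 2^(1/6): energy is -epsilon.
e2 = lj_energy([(0.0, 0.0, 0.0), (2.0 ** (1.0 / 6.0), 0.0, 0.0)])
```

This O(N^2) evaluation (and its gradient) is what the LBFGS local optimiser inside PBS repeatedly minimises.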

  14. PBS for atomic clusters – Optimal structures Lennard-Jones Clusters

  15. PBS for atomic clusters – Key Features • Small population – 8 to 12 members, needs effective genetic operators to promote diversity • Structure niching – uses cluster strain energy to identify different shapes • Directed mutation operators – close atom removal • Phenotype crossover operators – rotated hemisphere • Local optimisation – LBFGS optimiser • Highly parallel – using MPI / LAM on Linux cluster

  16. PBS for atomic clusters – Results • The Cambridge Cluster Database • Lennard-Jones Atomic Clusters - Highly researched problem • 1987 – Lattice-Based • 1994 – GA 2 ≤ N ≤ 10 • 1996 – GA 2 ≤ N ≤ 100 • 1999 – GA 2 ≤ N ≤ 150 • 2002 – Conformational Space Annealing (CSA) 2 ≤ N ≤ 201 but only partially successful for N = 184, 188 - 192, 198, 199. • 2005 – PBS 2 ≤ N ≤ 372 • 2006 – PBS 2 ≤ N ≤ 500 • Morse Clusters • 2006 – PBS – First to optimise all Morse clusters 2 ≤ N ≤ 80, 147 • 2006 – PBS – First to optimise 81 ≤ N ≤ 146 • Binary Lennard-Jones Clusters • 2006 – PBS produced 10 new results

  17. PBS for atomic clusters – Structure Niching

  18. PBS for atomic clusters – Directed Mutation

  19. PBS for atomic clusters – Performance

  20. PBS for p-median problems - Problem • The p-median problem calls for finding the p facilities that minimise the total cost of servicing n clients, where the pair-wise cost of servicing each client from every facility is given. • In the uncapacitated version, each facility can service any number of clients. • Each client is serviced by a single facility, and client services are not combined. • Input is usually a weighted graph, in either distance-matrix format or as (x, y) coordinates. • The graph may be symmetric or asymmetric
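Evaluating a candidate solution follows directly from this definition: each client is served by its cheapest open facility. The sketch below uses a made-up 4-client, 5-facility cost matrix:

```python
def p_median_cost(costs, facilities):
    """Cost of an uncapacitated p-median solution: each client is
    serviced by its single cheapest open facility, costs are summed."""
    return sum(min(row[f] for f in facilities) for row in costs)

# costs[client][facility]: made-up service costs, 4 clients x 5 facilities.
costs = [
    [0, 3, 9, 8, 7],   # client 0
    [3, 0, 8, 9, 6],   # client 1
    [9, 8, 0, 2, 4],   # client 2
    [8, 9, 2, 0, 3],   # client 3
]
# Open facilities {0, 3}: clients 0, 1 use facility 0; clients 2, 3 use facility 3.
total = p_median_cost(costs, {0, 3})
```

The hard part, of course, is choosing which p of the n candidate sites to open; evaluation itself is only O(n·p) per candidate solution.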

  21. PBS for p-median problems - Problem

  22. PBS for p-median problems – Key Features • Small population – 7 + Np members, needs effective genetic operators to promote diversity • Structure niching – p-median cost • Directed mutation operators – facilities `close' to each other • Phenotype crossover operators – geometric for distance and xy matrices • Local optimisation – local search based on PLS • Highly parallel – MPI / LAM on Linux cluster

  23. PBS for p-median problems - Results • Of the 174 instances in the commonly used ORLIB, SL, GR, RW, Koerkel and TSPLIB classes, PBS: • Located 64 new best known results • Found 94 existing best known results • Failed on 16 instances, with an average percentage cost error of 0.004 (some of these instances may themselves be suspect) • The benchmark instances have between 100 and 5934 vertices, with p in the range 2 to 1500

  24. PBS for p-median problems – rl5934

             Best Known                    PBS-20
        p         Value    Source             MED    %ERR     Time
       10    9794951.00     HMP01      9794973.65   0.000     7.33
       20    6718848.19 RES04/PBS      6718848.19   0.001    99.44
       30    5374936.14 RES04/PBS      5375610.35   0.011    82.44
       40    4550327.09       PBS      4550471.49   0.003   104.22
       50    4032379.97 RES04/PBS      4032606.96   0.005    85.89
       60    3642064.70       PBS      3642843.00   0.018   105.00
       70    3343617.76       PBS      3344766.37   0.033   161.11
       80    3094507.17       PBS      3095034.50   0.020   139.56
       90    2893234.39       PBS      2894592.50   0.045   146.89
      100    2725020.51       PBS      2725537.73   0.019   231.56
      150    2147817.00       PBS      2148657.97   0.039   267.56
      200    1808010.37       PBS      1808612.52   0.032   348.22
      250    1569830.31       PBS      1570395.72   0.034   380.00
      300    1394045.57       PBS      1394332.32   0.020   553.00
      350    1256775.39       PBS      1257074.32   0.023   622.33
      400    1145631.85       PBS      1145945.94   0.027   609.33
      450    1053303.41       PBS      1053690.34   0.037   523.22
      500     973982.54       PBS       974232.56   0.026   856.78

  25. PBS for p-median problems – rl5934 (continued)

             Best Known                    PBS-20
        p         Value    Source             MED    %ERR     Time
      600     848266.43       PBS       848495.14   0.027   714.44
      700     752047.96       PBS       752260.07   0.028   916.89
      800     676778.00       PBS       676949.38   0.025  1888.22
      900     613352.32       PBS       613497.68   0.023  1725.11
     1000     558801.23       PBS       558941.38   0.025  1658.33
     1100     511804.96       PBS       511923.63   0.023  1301.67
     1200     470290.06       PBS       470409.38   0.025  1586.33
     1300     433587.06       PBS       433675.77   0.020  1932.22
     1400     401829.22       PBS       401931.92   0.025  2787.56
     1500     374001.06       PBS       374053.40   0.014  6148.67

  26. PBS for p-median problems – rw1000

  27. PBS for p-median problems – rw1000

  28. PBS for p-median problems - RTD

  29. PBS for p-median problems - Difficulty

  30. PBS for p-median problems - Scaling

  31. PBS for p-median problems – Population size

  32. PBS for p-median problems – Genetic Operators

  33. PBS for p-median problems – Local Search

  34. PBS for clusters – future research • Improve atomic cluster results • Extend the PBS algorithm to molecular clusters • The max covering problem can be solved as a p-median problem • Capacitated p-median, p-center and facility location problems • Set covering problems can be solved as a facility location problem • Weighted maximum satisfiability problems can also be solved as a facility location problem

  35. Phased Local Search (PLS) - Recent Studies • W.J.Pullan, H.H.Hoos, Dynamic Local Search for the Maximum Clique Problem, Journal of Artificial Intelligence Research, 25, 2006 • W.J.Pullan, Phased Local Search for the Maximum Clique Problem, Journal of Combinatorial Optimization, 12(3), 2006 • W.J.Pullan, Approximating the Maximum Vertex / Edge Weighted Clique Using Local Search, Journal of Heuristics, 2007 • W.J.Pullan, Protein Structure Alignment Using Maximum Cliques and Local Search, Australian Artificial Intelligence Conference, 2007 • A.Grosso, M.Locatelli, W.J.Pullan, Randomness, plateau search, penalties, restart rules: simple ingredients leading to very efficient heuristics for the Maximum Clique Problem, Journal of Heuristics, 2007 • W.J.Pullan, Optimising Unweighted / Weighted Maximum Independent Sets and Minimum Vertex Covers, To be submitted, LION 2009

  36. PLS - Algorithm • ...for any algorithm, any elevated performance over one class of problems is exactly paid for in performance over another class. (Wolpert & Macready, 1997) • PLS combines three different algorithms into a single unified search • Random search • Penalty based search • Greedy search • Within PLS, these three algorithms are sequentially applied within each iteration of PLS

  37. PLS - Algorithm • Within each sub-algorithm, PLS will, from a starting point, select nodes that increase the current clique using the appropriate selection criteria. • When no more nodes are available that increase the current clique, PLS constructs the set of nodes that are missing only one connection to the nodes in the current clique and, using the same selection criteria, performs node swaps for a fixed, small number of iterations. • Once Step 2 has completed, PLS generates a new starting point either: • by generating a current clique containing a single node; or • by `forcing' a node into the current clique and removing all nodes that are not connected to it.
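These two steps (greedy expansion, then swaps via nodes missing exactly one connection) can be sketched as follows. The single degree-based greedy criterion stands in for PLS's three alternating selection criteria, and the graph is a made-up example:

```python
import random

def expand_and_swap(adj, clique, rng, swaps=10):
    """One PLS-style phase: grow the current clique while nodes adjacent
    to every member exist, then swap in nodes missing one connection."""
    def improving(c):
        # Nodes that increase the current clique (adjacent to all members).
        return [v for v in adj if v not in c and all(u in adj[v] for u in c)]

    def one_missing(c):
        # Nodes missing exactly one connection, mapped to that non-neighbour.
        out = {}
        for v in adj:
            if v not in c:
                missing = [u for u in c if u not in adj[v]]
                if len(missing) == 1:
                    out[v] = missing[0]
        return out

    for _ in range(swaps):
        while improving(clique):        # Step 1: greedy expansion
            clique.add(max(improving(clique), key=lambda v: len(adj[v])))
        cand = one_missing(clique)
        if not cand:
            break
        v = rng.choice(sorted(cand))    # Step 2: node swap
        clique.remove(cand[v])          # drop v's single non-neighbour,
        clique.add(v)                   # then add v (still a clique)
    while improving(clique):
        clique.add(max(improving(clique), key=lambda v: len(adj[v])))
    return clique

# Small graph whose maximum clique is {0, 1, 2, 3}; node 4 is a dead end.
adj = {0: {1, 2, 3, 4}, 1: {0, 2, 3, 4}, 2: {0, 1, 3}, 3: {0, 1, 2}, 4: {0, 1}}
clique = expand_and_swap(adj, {4}, random.Random(0))
```

Starting from the dead-end node 4, expansion alone stalls at a 3-clique containing 4; the swap step is what lets the search leave it and reach the maximum clique.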

  38. PLS – Penalty Delay Tracking

  39. DIMACS Max Clique Problem Types

  40. PLS – MC Hyper-surface Expected cardinalities for a randomly generated graph with n = 1000 and edge probability P = 0.9, compared with experimental observations for C1000.9.

  41. PLS – MC Global Maximum? • For C1000.9, when |K| = ω(C1000.9) = 68, E(|C0(K)|) = 0.72 and E(|C1(K)|) = 5.45, which confirms that there is a high probability of multiple maximal cliques for this instance (70 distinct maximal cliques were found in 100 trials) and raises the possibility of a larger maximal clique existing. • For p_hat1500-1, when |K| = ω(p_hat1500-1) = 12, E(|C0(K)|) = 0.000088 and E(|C1(K)|) = 0.0031929, which suggests that p_hat1500-1 is an extreme variant of the corresponding random graph and that it is highly unlikely that more than a single maximal clique exists (a single unique maximal clique was found in 100 trials). • For brock800_1, when |K| = ω(brock800_1) = 23, E(|C0(K)|) = 0.04 and E(|C1(K)|) = 0.48 (a single unique maximal clique was found in 100 trials).

  42. PLS – Approach to Putative Global Maxima

  43. PLS – RLS DIMACS Comparison

  44. PLS / RLS – Freq. of Node Selection

  45. FRB59-23-6 : Distribution of Node Degrees
