
Solving Constrained Continuous Optimization Problems with GCACO II



  1. Solving Constrained Continuous Optimization Problems with GCACO II
  Presented by: M. Eftekhari, M.R. Moosavi, S.D. Katebi
  eftekhari@cse.shirazu.ac.ir

  2. Outline of the Lecture
  • State of the Art in ACO Meta-Heuristics
  • Extension of ACO to Continuous Spaces
  • Past Research
  • Proposed Method
  • Experimental Design and Parameter Selection
  • Results
  • Conclusions

  3. Introduction
  • The initial versions of the ACO meta-heuristic were developed by Dorigo et al. (AS, the Ant System)
  • The ACO meta-heuristic was designed for attacking hard combinatorial optimization problems (e.g., the TSP)
  • Dorigo et al. later improved AS, yielding ACS (the Ant Colony System)
  • In contrast to the conventional use of ACO, relatively few works aimed at extending ACO algorithms to continuous spaces have been reported
  • The first Continuous ACO (CACO) algorithm was introduced by Bilchev

  4. Outline of the Lecture
  • State of the Art in ACO Meta-Heuristics
  • Extension of ACO to Continuous Spaces
  • Past Research
  • Proposed Method
  • Experimental Design and Parameter Selection
  • Results
  • Conclusions

  5. Extension of ACO to Continuous Spaces: CACO (review of past research)
  • Bilchev (1995), Mathur et al. (2000), Dréo and Siarry (2002), Yan-jun and Tie-jun (2003)
  • Bilchev's approach and most of the more recent methods comprise two stages: global and local
  • The set of ants is divided into two classes:
  • One type searches globally for promising regions of the search space
  • The other ants perform a local search inside the most promising regions

  6. Extension of ACO to Continuous Spaces: CACO (review of past research)
  • The creation of new regions for the global search is handled by a Genetic Algorithm-like process
  • The local ants provide the metaphoric link to real ant colonies
  • Disadvantages:
  • Maintaining a history of regions is expensive
  • The first CACO did not handle constrained optimization problems (later addressed by Bilchev & Wodrich)
  A minimal conceptual sketch of this two-stage scheme is given below.
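  The following is a minimal, hedged sketch of the two-stage region-based CACO scheme described on slides 5-6. The names Region, local_search, and global_search, the contraction factor, and the blending of region centers are illustrative assumptions rather than Bilchev's original algorithm; only the division into global and local ants follows the slides.

      import random

      class Region:
          """A promising region: a center point, a search radius, and the best value found."""
          def __init__(self, center, radius):
              self.center = list(center)
              self.radius = radius
              self.value = float("inf")   # objective value at the center (minimization)

      def local_search(f, region, n_local_ants):
          # Local ants sample inside the region and keep any improvement they find.
          for _ in range(n_local_ants):
              candidate = [c + random.uniform(-region.radius, region.radius) for c in region.center]
              value = f(candidate)
              if value < region.value:
                  region.center, region.value = candidate, value
          region.radius *= 0.9            # contract the region (assumed contraction factor)

      def global_search(regions):
          # Global ants create new regions with a GA-like process: the worst region is
          # replaced by a blend of two of the better region centers (assumed operator).
          regions.sort(key=lambda r: r.value)
          parents = random.sample(regions[: max(2, len(regions) // 2)], 2)
          worst = regions[-1]
          worst.center = [(a + b) / 2 for a, b in zip(parents[0].center, parents[1].center)]
          worst.value = float("inf")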

  7. Our Proposed Method
  • In GCACO I, no separate global search stage is used
  • GCACO I targets unconstrained continuous numerical optimization problems
  • GCACO I uses no regions and none of the GA-like operators for creating them
  • GCACO II adds a dynamic penalization method (due to Joines and Houck) for constraint handling
  • Handling of linear/nonlinear and equality/inequality constraints is possible

  8. Flow chart of the algorithm (see the next slide for the movement step)

  9. Movement of each ant
  if (q <= q0)
      cur_location = pre_location + Random_generated_vector_proportional_to_Ra;
  else
      cur_location = pre_location + The_best_NGV_proportional_to_Ra;
  end
  sum_of_violations = Check_Constraints_for_violation(cur_location);

  10. Our Proposed Method (continued)
  • The Normalized Gradient Vectors (NGVs) are employed to guide the search process
  • Each ant moves from its previous position in one of two ways:
  • 1. Along the most promising NGV, i.e. the direction that led to better points in previous moves
  • 2. Along a random vector drawn uniformly from [-1, 1]
  A minimal sketch of this movement rule is given below.
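  The sketch below restates the movement rule of slides 9-10 as runnable Python. The function name move_ant, the per-dimension scaling by Ra, and the fallback to the random branch when no NGV has been memorized yet are assumptions; the branch on q <= q0 follows the pseudocode on slide 9.

      import random

      def move_ant(pre_location, best_ngv, Ra, q0):
          """One ant step from its previous position (minimization setting assumed)."""
          q = random.random()
          if q <= q0 or best_ngv is None:
              # Exploration: a step along a random vector, uniform in [-1, 1] per
              # dimension, scaled by the current ant radius Ra (slide 9, first branch).
              step = [Ra * random.uniform(-1.0, 1.0) for _ in pre_location]
          else:
              # Exploitation: a step along the most promising memorized NGV, i.e. the
              # direction that produced better points in previous moves (second branch).
              step = [Ra * g for g in best_ngv]
          return [x + s for x, s in zip(pre_location, step)]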

  11. Our Proposed Method (continued)
  • The movement of each ant in a colony: the ant radius is initialized to the colony radius (Ra = Rc)
  • The reduction of the colony radius: Ra = Ra * dF
  • After the movement of the ants in a colony, the best point found is taken as the nest of the next colony
  • If one of the random vectors leads to a better point, its NGV is memorized
  A minimal sketch of one colony iteration is given below.
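  The following sketch, assuming the move_ant function from the previous sketch, puts the pieces of slide 11 together into one colony iteration. The loop structure, the parameter names n_ants and n_steps, and the point at which the radius is multiplied by dF are assumptions; the slide only states that Ra starts at Rc, shrinks by the factor dF, that the best point becomes the next nest, and that the NGV of an improving random move is memorized (the sketch memorizes the NGV of any improving move for simplicity).

      import math

      def colony_iteration(f, nest, Rc, dF, q0, n_ants, n_steps, best_ngv=None):
          """Run one colony of ants starting from the nest; return the next nest and NGV memory."""
          Ra = Rc                                     # ants start with the colony radius
          ants = [list(nest) for _ in range(n_ants)]
          best_point, best_value = list(nest), f(nest)
          for _ in range(n_steps):
              for i, pos in enumerate(ants):
                  new_pos = move_ant(pos, best_ngv, Ra, q0)
                  value = f(new_pos)
                  if value < best_value:
                      best_point, best_value = new_pos, value
                      # Memorize the normalized direction of the improving move.
                      norm = math.sqrt(sum((a - b) ** 2 for a, b in zip(new_pos, pos))) or 1.0
                      best_ngv = [(a - b) / norm for a, b in zip(new_pos, pos)]
                  ants[i] = new_pos
              Ra *= dF                                # shrink the search radius each step
          return best_point, best_value, best_ngv     # the best point becomes the next nest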

  12. Our Proposed Method (continued)
  • Heuristic value of each memorized NGV (defined by a formula shown as a figure on the original slide)

  13. Our Proposed Method (continued): Constraint handling
  • F is the feasible region
  • The penalty term sums the jth constraint violations (the symbols and the formula itself appear as a figure on the original slide)
  • W is a constant, C_NO is the colony number, and θ is a constant that adjusts the weight of the penalization
  A hedged sketch of a dynamic penalty of this form is given below.
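  Since the exact GCACO II penalty formula is shown only as a figure, the sketch below assumes a Joines-and-Houck-style dynamic penalty assembled from the quantities named on slide 13 (W, C_NO, θ, and the per-constraint violations). The treatment of equality constraints via a small tolerance eps and the linear sum of violations are additional assumptions.

      def penalized_objective(f, g_list, h_list, x, W, C_NO, theta, eps=1e-4):
          """Dynamic-penalty evaluation of a candidate point x (minimization assumed).

          g_list: inequality constraints g_j(x) <= 0; h_list: equality constraints h_k(x) = 0.
          """
          violations = [max(0.0, g(x)) for g in g_list]               # inequality violations
          violations += [max(0.0, abs(h(x)) - eps) for h in h_list]   # equalities relaxed by eps
          sum_of_violations = sum(violations)
          # The penalty weight (W * C_NO) ** theta grows with the colony number C_NO,
          # so infeasibility is penalized more heavily as the search progresses.
          return f(x) + (W * C_NO) ** theta * sum_of_violations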

  14. Outline of the Lecture
  • State of the Art in ACO Meta-Heuristics
  • Extension of ACO to Continuous Spaces
  • Past Research
  • Proposed Method
  • Experimental Design and Parameter Selection
  • Results
  • Conclusions

  15. Experimental Design and Parameter Selection

  16. An Example of Constraints

  17. Parameters of the Proposed Algorithm for the Different Test Cases

  18. Outline of the Lecture
  • State of the Art in ACO Meta-Heuristics
  • Extension of ACO to Continuous Spaces
  • Past Research
  • Proposed Method
  • Experimental Design and Parameter Selection
  • Results
  • Conclusions

  19. Comparing the Results
  Comparing the global best evaluation-function values obtained by GCACO and by Michalewicz's methods.

  20. Outline of the Lecture
  • State of the Art in ACO Meta-Heuristics
  • Extension of ACO to Continuous Spaces
  • Past Research
  • Proposed Method
  • Experimental Design and Parameter Selection
  • Results
  • Conclusions

  21. Conclusions and Future Works
  • The basic structure of the general ACO has been preserved
  • Unlike the previously developed algorithms, GCACO II has no global search stage, yet it uses a memory to sense the environment globally
  • GCACO II compares well with other evolutionary methods
  • It can solve various types of numerical optimization problems, whereas the test cases used in this research had previously required different, specialized algorithms
  • It is a hybridization of conventional gradient-based and meta-heuristic optimization methods

  22. Future Works
  • Extending to multi-modal and dynamic optimization problems
  • Testing on real-world problems
  • Extending to a parallel and distributed form

  23. Thanks for your attention. Any questions?
  eftekhari@cse.shirazu.ac.ir
