Solving Constrained Continuous Optimization Problems with GCACO II Presented by: M. Eftekhari, M.R. Moosavi, S. D. Katebi eftekhari@cse.shirazu.ac.ir
Outlines of lecture • State of the Art in ACO Meta-Heuristics • Extension of ACO to Continuous Spaces • Past Research • Proposed Method • Experimental Design and Parameter Selection • Results • Conclusions
Introduction • The initial versions of ACO meta-heuristic algorithms (AS) were developed by Dorigo et al. • The ACO meta-heuristic was designed for attacking hard combinatorial optimization problems (TSP) • AS was later improved by Dorigo et al. into ACS • In contrast to this conventional use of ACO, relatively few works aiming to extend ACO algorithms to continuous spaces have been reported • The first Continuous ACO (CACO) algorithm was introduced by Bilchev
Extension of ACO to Continuous Spaces: CACO (review of past research) • Bilchev (1995), Mathur et al. (2000), Dréo and Siarry (2002), Yan-jun and Tie-jun (2003) • Bilchev's approach, and most of the recent methods, comprises two stages: global and local • The set of ants is divided into two classes: • one class searches globally for promising regions of the search space • the other class performs a local search inside the most promising regions
Extension of ACO to Continuous Spaces: CACO (review of past research) • The creation of new regions for the global search is handled by a Genetic Algorithm-like process • The local ants provide the metaphoric link to real ant colonies • Disadvantages: • maintaining a history of regions is expensive • the first CACO did not handle constrained optimization problems (Bilchev & Wodrich later addressed this)
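The two-stage structure reviewed above can be sketched in a few lines. This is an illustrative toy, not Bilchev's actual algorithm: the objective, the region count, the shrink factor 0.95, and the search bounds are all assumptions made for the example.

```python
import random

def sphere(x):
    """Simple test objective: minimise the sum of squares."""
    return sum(xi * xi for xi in x)

def two_stage_caco(f, dim=2, n_regions=5, local_ants=10, iters=50, radius=1.0):
    # Global stage: one class of ants samples candidate regions in [-5, 5]^dim.
    regions = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_regions)]
    best = min(regions, key=f)      # most promising region
    # Local stage: the other class of ants refines the best region
    # by perturbing it within a shrinking radius.
    for _ in range(iters):
        for _ in range(local_ants):
            trial = [xi + random.uniform(-radius, radius) for xi in best]
            if f(trial) < f(best):
                best = trial
        radius *= 0.95              # shrink the local search radius
    return best

random.seed(0)
sol = two_stage_caco(sphere)
print(sol, sphere(sol))
```

The split matters because the global ants keep the search from committing to one basin too early, while the local ants supply the fine-grained convergence that plain region sampling lacks.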
Our Proposed Method • In GCACOI, no global search stage is involved • GCACOI addresses unconstrained continuous numerical optimization problems • GCACOI uses no regions, and none of the GA-like notions for creating them • GCACOII adds a dynamic penalization method (by Joines and Houck) • Handling of linear/nonlinear and equality/inequality constraints is thus possible
Flow chart of algorithm (see next slide)
Movement of each ant
if (q <= q0)
    cur_location = pre_location + Random_generated_vector_proportional_to_Ra;
else
    cur_location = pre_location + The_best_NGV_proportional_to_Ra;
end if-else
sum_of_violations = Check_Constraint_for_violation(cur_location);
Our Proposed Method (continued) • Normalized Gradient Vectors (NGVs) are employed to guide the search process • Each ant moves from its previous position in one of two ways: • 1. along the most promising NGV, i.e. the one that led to better points in previous moves • 2. along a uniform random vector in [-1, 1]
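One ant move, combining the two motion modes above, can be made concrete as follows. This is a minimal sketch, not the paper's implementation: the finite-difference gradient, the descent sign on the NGV step, and the values of q0 and Ra are all assumptions.

```python
import math
import random

def numerical_gradient(f, x, h=1e-6):
    """Central-difference gradient estimate of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def normalize(v):
    """Scale v to unit length (the 'N' in NGV)."""
    n = math.sqrt(sum(vi * vi for vi in v)) or 1.0
    return [vi / n for vi in v]

def move_ant(pos, best_ngv, q0, Ra):
    """Branching as on the flow-chart slide: q <= q0 -> random move."""
    q = random.random()
    if q <= q0 or best_ngv is None:
        # Mode 2: uniform random vector in [-1, 1]^n, scaled by Ra.
        step = [Ra * random.uniform(-1, 1) for _ in pos]
    else:
        # Mode 1: follow the most promising NGV, scaled by Ra
        # (negated here, assuming minimisation).
        step = [-Ra * gi for gi in best_ngv]
    return [p + s for p, s in zip(pos, step)]

f = lambda x: sum(xi * xi for xi in x)      # toy objective
start = [2.0, -1.0]
ngv = normalize(numerical_gradient(f, start))
random.seed(0)                              # first random() > 0.5, so the NGV branch fires
new_pos = move_ant(start, ngv, q0=0.5, Ra=0.5)
print(new_pos)
```

With the NGV branch taken, the ant steps against the gradient of the sphere function and the objective value strictly decreases.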
Our Proposed Method (continued) • The movement of each ant in a colony starts from the colony radius: Ra = Rc • The ants' radius is reduced within the colony: Ra = Ra * dF • After the ants of a colony have moved, the best point found is taken as the nest of the next colony • If one of the random vectors leads to a better point, its NGV is memorized
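The colony/radius schedule above can be sketched as a short loop. The values of Rc, dF, the ant count, and the move count are illustrative assumptions; only the update rules Ra = Rc and Ra = Ra * dF and the nest hand-over come from the slide.

```python
import random

def run_colony(f, nest, Rc=1.0, dF=0.9, n_ants=20, moves=30):
    """One colony: ants perturb the best point within a shrinking radius Ra."""
    best = list(nest)
    Ra = Rc                          # Ra is reset to Rc for every new colony
    for _ in range(moves):
        for _ in range(n_ants):
            trial = [xi + random.uniform(-Ra, Ra) for xi in best]
            if f(trial) < f(best):
                best = trial         # best point becomes the candidate nest
        Ra *= dF                     # reduce the ants' radius within the colony
    return best

f = lambda x: sum(xi * xi for xi in x)
random.seed(1)
nest = [3.0, 3.0]
for _ in range(3):                   # three successive colonies
    nest = run_colony(f, nest)
print(nest, f(nest))
```

The geometric decay dF trades exploration early in a colony for precision late in it, and resetting Ra = Rc at each new nest restores exploration around the freshly found best point.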
Our Proposed Method (continued) • Heuristic value of each memorized NGV
Our Proposed Method (continued): Constraint handling • F is the feasible region • the jth constraint violation enters the penalty term • W is a constant and C_NO is the colony number; θ is a constant for adjusting the weight of the penalization
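A dynamic penalty in the spirit of the Joines and Houck scheme referenced earlier can be sketched as below. The exact functional form is an assumption: the original scheme uses (C·t)^α · Σ d_j^β, and here the colony number C_NO plays the role of t, while the slide's W and θ stand in for the scheme's constants (θ is used for both exponents in this sketch).

```python
def penalized(f, x, constraints, c_no, W=0.5, theta=2.0):
    """Objective plus a penalty whose weight grows with colony number c_no."""
    # Convention assumed here: g(x) <= 0 means feasible,
    # and a positive value of g(x) measures the violation.
    violations = [max(0.0, g(x)) for g in constraints]
    penalty = (W * c_no) ** theta * sum(v ** theta for v in violations)
    return f(x) + penalty

f = lambda x: x[0] + x[1]
g1 = lambda x: 1.0 - x[0] - x[1]            # constraint: x0 + x1 >= 1
print(penalized(f, [0.2, 0.2], [g1], c_no=1))   # infeasible point: penalised
print(penalized(f, [0.6, 0.6], [g1], c_no=1))   # feasible point: no penalty
```

Because the weight grows with C_NO, early colonies may wander through infeasible space while later colonies are pushed increasingly hard toward the feasible region F.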
Experimental Design and Parameter Selection
An Example of Constraints
Parameters of the proposed algorithm for different test cases
Comparing the Results • Comparison of the global best evaluation function obtained by GCACO and by Michalewicz's methods
Conclusions and Future Works • The basic structure of the general ACO algorithm has been preserved • Unlike previously developed algorithms, GCACOII has no global search stage, yet it utilizes a memory for sensing the environment globally • GCACOII compares well with other evolutionary methods • A single algorithm is able to solve various types of numerical optimization problems, whereas the test cases used in this research previously required different algorithms • The method is a hybridization of conventional gradient-based and meta-heuristic optimization
Future Works • Extending to multi-modal and dynamic optimization problems • Testing on real-world problems • Extending to parallel and distributed forms
Thanks for your attention. Any questions?