Optimization methods. Aleksey Minin, Saint-Petersburg State University, student of the ACOPhys master program (10th semester). Joint Advanced Students School: Applied and COmputational Physics
What is optimization?
Content:
• Applications of optimization
• Global Optimization
• Local Optimization
• Discrete optimization
• Constrained optimization
• Real application: Bounded Derivative Networks
Applications of optimization
• Advanced engineering design
• Biotechnology
• Data analysis
• Environmental management
• Financial planning
• Process control
• Scientific modeling, etc.
Global or local?
What is global optimization? • The objective of global optimization is to find the globally best solution of (possibly nonlinear) models, in the (possible or known) presence of multiple local optima.
Branch and bound
Branch and bound. Scientists are ready to carry out a set of experiments, but the quality of each result varies with the type of experiment, according to the following table:
[Tree diagrams: the search starts from the root assignment AAAA with quality 0.55 and branches on one experiment at a time. First-level candidates include ADCC (0.42), BAAA (0.42), CAAA (0.52), and DAAA (0.45); expanding the C branch yields CABD (0.38), CBAA (0.39), and CDAA (0.45), and deeper branching reaches CBAD (0.37) and CDBA (0.40). Branches whose bound cannot beat the best assignment found so far are pruned.]
Branch and bound
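The experiment-assignment idea above can be sketched as a generic branch and bound; the cost matrix in the usage example is illustrative, not the slides' table:

```python
def branch_and_bound(cost):
    """Assign each item i a choice j so that sum(cost[i][j]) is minimal.

    Lower bound for a partial assignment: the cost fixed so far plus
    the cheapest remaining choice for every unassigned item.  Branches
    whose bound cannot beat the incumbent are pruned.
    """
    n = len(cost)
    best = {"value": float("inf"), "assignment": None}

    def bound(partial):
        fixed = sum(cost[i][j] for i, j in enumerate(partial))
        optimistic = sum(min(row) for row in cost[len(partial):])
        return fixed + optimistic

    def branch(partial):
        if len(partial) == n:
            # Full assignment: its bound equals its exact cost, and we
            # only reach here when it improves on the incumbent.
            best["value"], best["assignment"] = bound(partial), list(partial)
            return
        for j in range(len(cost[0])):
            partial.append(j)
            if bound(partial) < best["value"]:   # prune hopeless branches
                branch(partial)
            partial.pop()

    branch([])
    return best["value"], best["assignment"]

# Illustrative 2x2 cost table (not the slides' data).
value, assignment = branch_and_bound([[3, 1], [2, 4]])
```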
Evolutionary algorithms
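Evolutionary algorithms maintain a population of candidate solutions and improve it by mutation and selection. A minimal elitist sketch (a generic version, not a specific variant from the slides):

```python
import random

def evolve(fitness, dim, pop_size=20, generations=100, sigma=0.3):
    """Minimal evolutionary algorithm: Gaussian mutation plus
    truncation selection.  Parents and children compete together
    (elitism), so the best fitness never gets worse.
    """
    random.seed(0)
    pop = [[random.uniform(-5, 5) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Mutate every individual, then keep the fittest half.
        children = [[g + random.gauss(0, sigma) for g in p] for p in pop]
        pop = sorted(pop + children, key=fitness)[:pop_size]
    return pop[0]

# Usage: minimize the sphere function in 2-D.
best = evolve(lambda x: sum(g * g for g in x), dim=2)
```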
Simulated annealing: apply a small perturbation to the current solution and accept it with a temperature-dependent probability, lowering the temperature T as the search proceeds; once T reaches 0, the current solution is returned. Repeat until a good solution is found.
Simulated annealing results
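The loop described above can be written compactly with Metropolis acceptance and geometric cooling; the objective function and cooling schedule below are illustrative choices, not from the slides:

```python
import math, random

def simulated_annealing(energy, neighbor, x0, t0=1.0, cooling=0.995,
                        steps=5000):
    """Textbook simulated annealing sketch.

    `energy` scores a solution (lower is better); `neighbor` proposes a
    small perturbation.  Worse moves are accepted with probability
    exp(-dE / T), and T is cooled geometrically toward zero, so the
    search explores early and settles late.
    """
    random.seed(1)
    x, e, t = x0, energy(x0), t0
    best, best_e = x, e
    for _ in range(steps):
        y = neighbor(x)
        ey = energy(y)
        if ey < e or random.random() < math.exp(-(ey - e) / max(t, 1e-12)):
            x, e = y, ey
            if e < best_e:
                best, best_e = x, e
        t *= cooling
    return best, best_e

# Usage: minimize an illustrative multimodal 1-D function.
f = lambda x: 0.1 * x * x + math.sin(3 * x)
sol, val = simulated_annealing(f, lambda x: x + random.gauss(0, 0.5), x0=4.0)
```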
Tree annealing, developed by Bilbro and Snyder [1991]
Swarm intelligence
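Particle swarm optimization is one common swarm-intelligence method (the slide does not name a specific algorithm); a standard sketch:

```python
import random

def pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Particle swarm optimization sketch.

    Each particle remembers its personal best position; the swarm
    shares a global best.  Velocities blend inertia (w), attraction to
    the personal best (c1), and attraction to the global best (c2).
    """
    random.seed(2)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

# Usage: minimize the sphere function in 2-D.
best = pso(lambda x: sum(v * v for v in x), dim=2)
```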
Tabu search
Tabu search implementation: [diagrams] the search walks a growing graph of candidate solutions (nodes 1–11), moving to the best neighbor at each step; each visited node (1, then 3, 6, 9) is appended to a fixed-length tabu list, and nodes on the list may not be revisited while they remain there, which drives the search away from recently explored regions.
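The walkthrough above can be sketched as follows; the neighborhood and objective function are illustrative:

```python
from collections import deque

def tabu_search(f, neighbors, x0, tabu_size=4, iters=50):
    """Minimal tabu search matching the idea in the slides: keep a
    short FIFO tabu list of recently visited solutions and always move
    to the best non-tabu neighbor, even when it is worse than the
    current solution -- that is what lets the search escape local optima.
    """
    x = x0
    best, best_val = x0, f(x0)
    tabu = deque([x0], maxlen=tabu_size)   # oldest entries fall off
    for _ in range(iters):
        candidates = [y for y in neighbors(x) if y not in tabu]
        if not candidates:
            break
        x = min(candidates, key=f)          # best non-tabu move
        tabu.append(x)
        if f(x) < best_val:
            best, best_val = x, f(x)
    return best, best_val

# Usage: minimize a bumpy function on the integers.
f = lambda x: (x - 7) ** 2 + (5 if x % 3 == 0 else 0)
sol, val = tabu_search(f, lambda x: [x - 1, x + 1], x0=0)
```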
What is local optimization? • The term LOCAL refers both to the fact that only information about the function from the neighborhood of the current approximation is used in updating the approximation, and to the expectation that such methods converge to whichever local extremum is closest to the starting approximation. • The global structure of the objective function is unknown to a local method.
Local optimization
Gradient descent
Gradient descent. Therefore we obtain a decreasing sequence: F(x0) > F(x1) > … > F(xn)
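A minimal sketch of the iteration, assuming a fixed step size:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Plain gradient descent: x_{k+1} = x_k - lr * grad(x_k).

    Each step moves against the gradient, so for a suitable step size
    the objective values form a decreasing sequence F(x0) > F(x1) > ...
    """
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Usage: minimize F(x, y) = (x - 3)^2 + (y + 1)^2, minimum at (3, -1).
x = gradient_descent(lambda v: [2 * (v[0] - 3), 2 * (v[1] + 1)], [0.0, 0.0])
```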
Quasi-Newton methods
• These methods build up curvature information at each iteration to formulate a quadratic model problem of the form: min F(x) = ½ xᵀHx + cᵀx + b, where the Hessian matrix H is a positive definite symmetric matrix, c is a constant vector, and b is a constant.
• The optimal solution for this problem occurs where the partial derivatives with respect to x vanish: ∇F(x*) = Hx* + c = 0, so x* = −H⁻¹c.
BFGS algorithm
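A compact BFGS sketch: the textbook inverse-Hessian update with a simple backtracking line search, not the slides' exact pseudocode:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """BFGS: maintain B, an approximation of the inverse Hessian,
    built from gradient differences, and take quasi-Newton steps
    x <- x - t * B @ grad(x) with backtracking (Armijo) line search.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -B @ g                     # quasi-Newton direction
        t = 1.0                        # backtracking line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        y = grad(x_new) - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition keeps B SPD
            rho = 1.0 / sy
            I = np.eye(n)
            B = (I - rho * np.outer(s, y)) @ B @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, grad(x_new)
    return x

# Usage: minimize the Rosenbrock function, minimum at (1, 1).
f = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0] ** 2),
                           200 * (v[1] - v[0] ** 2)])
x = bfgs(f, grad, [-1.2, 1.0])
```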
Gauss-Newton algorithm. Given m functions f1, f2, …, fm of n parameters p1, p2, …, pn (with m > n), we want to minimize the sum of squares S(p) = f1(p)² + f2(p)² + … + fm(p)².
Gauss-Newton algorithm. Each iteration linearizes the fi around the current p and updates p ← p − (JᵀJ)⁻¹ Jᵀ f(p), where J is the Jacobian of f = (f1, …, fm).
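A Gauss-Newton sketch; the exponential model, the data, and the initial guess below are illustrative:

```python
import numpy as np

def gauss_newton(residuals, jacobian, p0, iters=20):
    """Gauss-Newton: at each step solve the linearized least-squares
    problem J d = -f(p) (in the least-squares sense) and set p <- p + d.
    """
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        f = residuals(p)
        J = jacobian(p)
        d, *_ = np.linalg.lstsq(J, -f, rcond=None)   # min ||J d + f||
        p = p + d
    return p

# Usage: fit y = a * exp(b * t) to data generated with a=2, b=0.5.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(0.5 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),
                                 p[0] * t * np.exp(p[1] * t)])
p = gauss_newton(res, jac, [1.8, 0.4])   # start near the solution
```

Undamped Gauss-Newton can diverge from a poor starting point on exponential models, which is exactly what Levenberg-Marquardt addresses.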
Levenberg-Marquardt. This is an iterative procedure; a typical initial guess is pᵀ = (1, 1, …, 1).
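A Levenberg-Marquardt sketch with the standard adaptive damping rule; the fitting problem below (an exponential model) is illustrative, and the initial guess is all ones as suggested above:

```python
import numpy as np

def levenberg_marquardt(residuals, jacobian, p0, lam=1e-2, iters=50):
    """Levenberg-Marquardt: solve the damped normal equations
    (J'J + lam*I) d = -J'f each step.  Shrink lam when the step
    reduces the sum of squares (Gauss-Newton-like behavior), grow it
    otherwise (gradient-descent-like behavior).
    """
    p = np.asarray(p0, dtype=float)
    cost = lambda q: float(residuals(q) @ residuals(q))
    for _ in range(iters):
        f, J = residuals(p), jacobian(p)
        A = J.T @ J + lam * np.eye(p.size)
        d = np.linalg.solve(A, -J.T @ f)
        if cost(p + d) < cost(p):
            p, lam = p + d, lam * 0.5      # accept step, trust GN more
        else:
            lam *= 2.0                     # reject step, damp harder
    return p

# Usage: fit y = a * exp(b * t), data generated with a=2, b=0.5.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(0.5 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),
                                 p[0] * t * np.exp(p[1] * t)])
p = levenberg_marquardt(res, jac, [1.0, 1.0])
```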
SQP – constrained minimization. Reformulation.
SQP – constrained minimization. The principal idea is the formulation of a QP sub-problem based on a quadratic approximation of the Lagrangian function L(x, λ) = f(x) + Σ λi gi(x):
SQP – constrained minimization. Updating the Hessian matrix.
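A bare-bones SQP sketch for equality constraints g(x) = 0: each iteration solves the KKT system of the QP sub-problem. For brevity this illustration uses the exact Hessian, whereas real SQP codes update a quasi-Newton approximation of the Lagrangian Hessian, as the slide above indicates:

```python
import numpy as np

def sqp_equality(f_grad, f_hess, g, g_jac, x0, iters=20):
    """Equality-constrained SQP sketch.  Each QP sub-problem
        min  1/2 d'Hd + grad_f'd   s.t.  A d + g(x) = 0
    is solved exactly via its KKT system [H A'; A 0][d; lam] = [-grad_f; -g].
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        H, A = f_hess(x), g_jac(x)
        m = A.shape[0]
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])
        rhs = np.concatenate([-f_grad(x), -g(x)])
        d = np.linalg.solve(K, rhs)[: x.size]   # discard the multipliers
        x = x + d
    return x

# Usage: minimize x^2 + y^2 subject to x + y = 1; solution (0.5, 0.5).
x = sqp_equality(
    f_grad=lambda v: 2 * v,
    f_hess=lambda v: 2 * np.eye(2),
    g=lambda v: np.array([v[0] + v[1] - 1]),
    g_jac=lambda v: np.array([[1.0, 1.0]]),
    x0=[3.0, -1.0],
)
```

Because the objective here is quadratic and the constraint linear, a single SQP step already lands on the exact solution.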