ISMP 2006 • Inexact SQP methods for equality constrained optimization • Frank Edward Curtis, Department of IE/MS, Northwestern University • with Richard Byrd and Jorge Nocedal • August 1, 2006
Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Unconstrained optimization Goal: minimize a single nonlinear objective Algorithm: Newton’s method, with the conjugate gradient (CG) method as the inner solver (see below) Note: choosing any intermediate CG step ensures global convergence to a local solution of the NLP (Steihaug, 1983)
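In standard notation (the talk's own symbols may differ), the problem and the Newton system that CG solves approximately at the iterate $x_k$ are
$$\min_{x \in \mathbb{R}^n} f(x), \qquad \nabla^2 f(x_k)\, d_k = -\nabla f(x_k).$$
Steihaug's observation is that every intermediate CG iterate for this system is already a useful step: truncating the inner solver early does not destroy global convergence.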
Nonlinear equations Goal: solve a single nonlinear system Algorithm: Newton’s method Note: choosing any step whose linear-model residual satisfies the inexact Newton condition below ensures global convergence (Dembo, Eisenstat, and Steihaug, 1982) (Eisenstat and Walker, 1994)
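In standard notation, the system, the Newton step, and the inexact Newton acceptance condition are
$$F(x) = 0, \qquad F'(x_k)\, d_k = -F(x_k),$$
$$\|F(x_k) + F'(x_k)\, d_k\| \le \eta_k\, \|F(x_k)\|, \qquad 0 \le \eta_k \le \eta < 1;$$
that is, any step that cuts the residual of the linear model by a fixed fraction is acceptable.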
Equality constrained optimization Goal: solve the problem below Define: the Lagrangian and its derivatives Goal: solve the KKT conditions
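In standard notation, the problem, the Lagrangian, its derivatives, and the KKT conditions are
$$\min_x f(x) \quad \text{s.t.} \quad c(x) = 0,$$
$$\mathcal{L}(x,\lambda) = f(x) + \lambda^T c(x), \qquad g(x) = \nabla f(x), \qquad A(x) = \text{the Jacobian of } c,$$
$$\begin{bmatrix} g(x) + A(x)^T \lambda \\ c(x) \end{bmatrix} = 0.$$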
Equality constrained optimization • Two “equivalent” step computation techniques (shown below): Newton’s method applied to the KKT conditions, and the SQP subproblem • Question: can we ensure convergence to a local solution by choosing any step inside a ball around the exact solution? • Question: can we ensure convergence with a • step toward the constraints? • step to reduce the objective? • Preferably both, but… (Heinkenschloss and Vicente, 2001) … what if we can’t do both? • The exact solution minimizes the model of the objective subject to satisfying the linearized constraints (but it can be expensive to find) Question: what inexact solutions are acceptable?
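With $W_k$ denoting (an approximation to) the Hessian of the Lagrangian at $(x_k, \lambda_k)$, and $g_k$, $A_k$, $c_k$ the problem data at $x_k$, the two computations are, in standard notation, the Newton-KKT system
$$\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix} \begin{bmatrix} d_k \\ \delta_k \end{bmatrix} = -\begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix}$$
and the SQP subproblem
$$\min_d \ g_k^T d + \tfrac{1}{2} d^T W_k d \quad \text{s.t.} \quad c_k + A_k d = 0.$$
They are “equivalent” in the sense that, when $W_k$ is positive definite on the null space of $A_k$, the primal-dual solution of the subproblem solves the linear system.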
Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Proposed algorithm • Step computation: inexact SQP step • Globalization strategy: exact penalty (merit) function … with an Armijo line search condition (see below)
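A standard choice consistent with the talk (with penalty parameter $\pi$, steplength $\alpha_k$, some $\eta \in (0,1)$, and $\Delta q_\pi$ the model reduction defined below) is
$$\phi_\pi(x) = f(x) + \pi\, \|c(x)\|,$$
$$\phi_\pi(x_k + \alpha_k d_k) \le \phi_\pi(x_k) - \eta\, \alpha_k\, \Delta q_\pi(d_k).$$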
First attempt… not robust • Proposition: accept any step with a sufficiently small (relative) residual • Test: 61 problems from the CUTEr test set • … not enough for complete robustness • We have multiple goals (feasibility and optimality) • The Lagrange multipliers may be completely off
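The natural analogue of the inexact Newton test, writing $\rho_k$ and $r_k$ for the residuals in the two blocks of the Newton-KKT system (this residual notation is an assumption, with $\kappa \in (0,1)$), is
$$\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix} \begin{bmatrix} d_k \\ \delta_k \end{bmatrix} + \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} = \begin{bmatrix} \rho_k \\ r_k \end{bmatrix}, \qquad \left\| \begin{bmatrix} \rho_k \\ r_k \end{bmatrix} \right\| \le \kappa \left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\|.$$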
Quadratic/linear model of merit function • Create a model of the merit function at the current iterate • Quantify the reduction obtained from a step (see below)
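In the notation above, a standard quadratic/linear model of $\phi_\pi$ at $x_k$ and the reduction it assigns to a step $d$ are
$$q_\pi(d) = f_k + g_k^T d + \tfrac{1}{2} d^T W_k d + \pi\, \|c_k + A_k d\|,$$
$$\Delta q_\pi(d) = q_\pi(0) - q_\pi(d) = -g_k^T d - \tfrac{1}{2} d^T W_k d + \pi \left( \|c_k\| - \|c_k + A_k d\| \right).$$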
What are acceptable steps? The two pieces of the model reduction, the quadratic objective reduction and the constraint reduction, can each be positive or negative: 4 possibilities.
Option #1: “constraint reduction” Penalty parameter can be increased to ensure sufficiently large model reduction
Option #2: “quadratic objective reduction” Question: if model reduction is positive, is it large enough?
Central idea: “sufficient reduction” • Sufficient reduction condition: require the model reduction to be at least a fraction of the penalized constraint violation (a representative form is shown below)
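A representative form of the condition, for some $\sigma \in (0,1)$ (the precise inequality used in the talk may differ):
$$\Delta q_\pi(d_k) \ \ge\ \sigma\, \pi\, \|c_k\|.$$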
Option #1: “constraint reduction” Step is acceptable if it makes sufficient progress on the linearized constraints and the sufficient reduction condition holds, increasing the penalty parameter if necessary
Option #2: “quadratic objective reduction” Step is acceptable if the reduction in the quadratic model of the objective is positive and large enough for the sufficient reduction condition to hold with the current penalty parameter
Algorithm outline (a runnable sketch follows) • for k = 0, 1, 2, … • Iteratively solve the Newton-KKT system • Until an acceptance test is satisfied • Update the penalty parameter • Perform a backtracking line search • Update the primal-dual iterate
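A minimal runnable sketch of this loop on a toy problem, assuming the merit function and model reduction given earlier; an exact KKT solve stands in for the inexact iterative solve (whose acceptance tests are the subject of the talk), and every function name and constant here is illustrative rather than the authors':

```python
import numpy as np

# Toy problem: minimize x^2 + 2 y^2  subject to  x + y - 1 = 0.
def f(x):      return x[0]**2 + 2.0 * x[1]**2
def g(x):      return np.array([2.0 * x[0], 4.0 * x[1]])   # gradient of f
def c(x):      return np.array([x[0] + x[1] - 1.0])        # equality constraint
def A(x):      return np.array([[1.0, 1.0]])               # constraint Jacobian
def W(x, lam): return np.diag([2.0, 4.0])                  # Hessian of the Lagrangian

def phi(x, pi):  # exact (nonsmooth) penalty merit function
    return f(x) + pi * np.linalg.norm(c(x))

x, lam, pi = np.array([2.0, 2.0]), np.zeros(1), 1.0
sigma, eta, tau = 0.1, 1e-4, 0.5  # sufficient reduction, Armijo, backtracking factors

for k in range(50):
    gk, ck, Ak, Wk = g(x), c(x), A(x), W(x, lam)
    kkt = np.concatenate([gk + Ak.T @ lam, ck])  # KKT residual vector
    if np.linalg.norm(kkt) <= 1e-8:
        break
    # "Iteratively solve" the Newton-KKT system; solved exactly in this sketch.
    K = np.block([[Wk, Ak.T], [Ak, np.zeros((1, 1))]])
    sol = np.linalg.solve(K, -kkt)
    d, delta = sol[:2], sol[2:]
    # Model reduction Delta q as a function of the penalty parameter.
    def dq(pi_):
        return (-gk @ d - 0.5 * d @ Wk @ d
                + pi_ * (np.linalg.norm(ck) - np.linalg.norm(ck + Ak @ d)))
    # Update the penalty parameter until the model reduction is sufficient.
    while dq(pi) < sigma * pi * np.linalg.norm(ck):
        pi *= 10.0
    # Backtracking Armijo line search on the merit function.
    alpha = 1.0
    while phi(x + alpha * d, pi) > phi(x, pi) - eta * alpha * dq(pi):
        alpha *= tau
    x, lam = x + alpha * d, lam + alpha * delta  # update the primal-dual iterate

print(x, lam)  # expect x near (2/3, 1/3) and lam near -4/3
```

With an iterative solver in place of np.linalg.solve, the talk's acceptance tests would decide when the inner iteration may stop early.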
Termination test • Observe the KKT conditions (see below)
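One natural outer-loop test, in the notation above: stop when
$$\left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\| \le \epsilon$$
for a tolerance $\epsilon > 0$; this vector is exactly the right-hand side of the Newton-KKT system, so it is available at no extra cost.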
Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Assumptions • The sequence of iterates is contained in a convex set over which the following hold: • the objective function is bounded below • the objective and constraint functions and their first and second derivatives are uniformly bounded in norm • the constraint Jacobian has full row rank, with singular values bounded away from zero • the Hessian of the Lagrangian (or its approximation) is positive definite, with eigenvalues bounded away from zero
Sufficient reduction to sufficient decrease • A Taylor expansion of the merit function bounds its decrease along the step by the model reduction • The accepted step therefore satisfies the Armijo sufficient decrease condition (see below)
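In the notation above, these two facts typically take the form (the constants here are illustrative)
$$\phi_\pi(x_k + \alpha d_k) - \phi_\pi(x_k) \le -\alpha\, \Delta q_\pi(d_k) + \gamma\, \alpha^2 \|d_k\|^2 \quad \text{for some } \gamma > 0,$$
so the backtracking line search terminates with a steplength $\alpha_k$ satisfying
$$\phi_\pi(x_k + \alpha_k d_k) \le \phi_\pi(x_k) - \eta\, \alpha_k\, \Delta q_\pi(d_k).$$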
Intermediate results • the penalty parameter is bounded above • the step norms are bounded above • the steplengths are positive and bounded away from zero
Step in dual space • The primal iterates converge to an optimal solution, and (for sufficiently small residuals and steplengths) the dual steps drive the multipliers to their optimal values; therefore, the first-order (KKT) conditions are satisfied in the limit
Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Conclusion/Final remarks • Review • Defined a globally convergent inexact SQP algorithm • Requires only inexact solutions of the KKT system • Requires only matrix-vector products involving objective and constraint derivatives • Results also apply when only the reduced Hessian of the Lagrangian is assumed to be positive definite • Future challenges • Implementation and appropriate parameter values • Nearly singular constraint Jacobians • Inexact derivative information • Negative curvature • etc.