INFORMS Annual Meeting 2006
Inexact SQP Methods for Equality Constrained Optimization
Frank Edward Curtis, Department of IE/MS, Northwestern University
with Richard Byrd and Jorge Nocedal
November 6, 2006
Outline • Introduction • Problem formulation • Motivation for inexactness • Unconstrained optimization and nonlinear equations • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Equality constrained optimization

Goal: solve the problem
$$\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad c(x) = 0$$

Define: the derivatives
$$g(x) = \nabla f(x), \qquad A(x) = \nabla c(x)^T \ \text{(the constraint Jacobian)}$$

Define: the Lagrangian
$$\mathcal{L}(x, \lambda) = f(x) + \lambda^T c(x)$$

Goal: solve the KKT conditions
$$\nabla \mathcal{L}(x, \lambda) = \begin{bmatrix} g(x) + A(x)^T \lambda \\ c(x) \end{bmatrix} = 0$$
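To make the definitions concrete, here is a minimal numeric check of the KKT conditions on a toy problem; the problem, solution values, and all function names below are illustrative inventions, not from the talk.

```python
import numpy as np

# Toy problem (illustrative, not from the talk):
#   minimize f(x) = x1^2 + x2^2  subject to  c(x) = x1 + x2 - 1 = 0
# Solution: x* = (0.5, 0.5), lambda* = -1 (with L = f + lambda^T c).

def f_grad(x):          # g(x) = gradient of the objective
    return 2.0 * x

def c_val(x):           # constraint value c(x)
    return np.array([x[0] + x[1] - 1.0])

def c_jac(x):           # A(x) = constraint Jacobian (1 x 2)
    return np.array([[1.0, 1.0]])

def kkt_residual(x, lam):
    """Stacked KKT residual [g + A^T lam; c]; zero at a KKT point."""
    g, A, c = f_grad(x), c_jac(x), c_val(x)
    return np.concatenate([g + A.T @ lam, c])

x_star, lam_star = np.array([0.5, 0.5]), np.array([-1.0])
print(kkt_residual(x_star, lam_star))  # ~[0, 0, 0]
```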
Equality constrained optimization

Two "equivalent" step computation techniques

Algorithm: Newton's method — solve the primal-dual (KKT) system
$$\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix} \begin{bmatrix} d_k \\ \delta_k \end{bmatrix} = - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix}$$

Algorithm: the SQP subproblem
$$\min_d \ g_k^T d + \tfrac{1}{2} d^T W_k d \quad \text{subject to} \quad c_k + A_k d = 0$$

(Here $W_k$ is the Hessian of the Lagrangian, $g_k = g(x_k)$, $A_k = A(x_k)$, $c_k = c(x_k)$.)

For large problems the KKT matrix
• cannot be formed
• cannot be factored

so the linear system must be solved with
• an iterative method
• accepting inexactness in the solution
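A minimal sketch of the "iterative method + inexactness" idea on invented toy data: the KKT matrix is symmetric but indefinite, so a Krylov method such as MINRES applies, and truncating it produces an inexact step. All matrices below are illustrative.

```python
import numpy as np
from scipy.sparse.linalg import minres

# Toy KKT data (invented for this sketch): W is the Hessian of the
# Lagrangian, A the constraint Jacobian, g the objective gradient.
W = np.array([[2.0, 0.0], [0.0, 2.0]])
A = np.array([[1.0, 1.0]])
g = np.array([1.0, 1.0])
lam = np.array([0.0])
c = np.array([-0.2])

# The KKT matrix [[W, A^T], [A, 0]] is symmetric but indefinite, so plain
# CG does not apply; MINRES is a natural Krylov method for such systems.
K = np.block([[W, A.T], [A, np.zeros((1, 1))]])
rhs = -np.concatenate([g + A.T @ lam, c])

# Truncating the Krylov iteration (maxiter=1) yields an *inexact* step:
sol, info = minres(K, rhs, maxiter=1)
d, delta = sol[:2], sol[2:]          # primal step and multiplier step
print("relative residual:", np.linalg.norm(K @ sol - rhs) / np.linalg.norm(rhs))
```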
Unconstrained optimization

Goal: minimize a nonlinear objective
$$\min_{x \in \mathbb{R}^n} f(x)$$

Algorithm: Newton's method, with the conjugate gradient (CG) method applied to
$$\nabla^2 f(x_k) \, d_k = -\nabla f(x_k)$$

Note: choosing any intermediate CG step ensures global convergence to a local solution of the NLP (Steihaug, 1983)
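A sketch of a truncated Newton-CG step under the stated setting (positive definite Hessian); the quadratic test data is invented, and the point is only that even an early-terminated CG iterate is a descent direction.

```python
import numpy as np

def truncated_newton_cg(H, g, max_cg_iters=3):
    """A few CG iterations on H d = -g, starting from d = 0.
    When H is positive definite, every intermediate CG iterate is a
    descent direction for f (the property Steihaug's result exploits)."""
    d = np.zeros_like(g)
    r = -g.copy()                 # residual of H d = -g at d = 0
    p = r.copy()
    for _ in range(max_cg_iters):
        Hp = H @ p
        alpha = (r @ r) / (p @ Hp)
        d = d + alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < 1e-10:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return d

# Illustrative quadratic model data (invented here):
H = np.array([[4.0, 1.0], [1.0, 3.0]])
g = np.array([1.0, 2.0])
d = truncated_newton_cg(H, g, max_cg_iters=1)  # truncate after one iteration
print("descent?", g @ d < 0)                   # True even for the inexact step
```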
Nonlinear equations

Goal: solve a nonlinear system
$$F(x) = 0$$

Algorithm: Newton's method
$$F'(x_k) \, d_k = -F(x_k)$$

Note: choosing any step $d_k$ with
$$\|F(x_k) + F'(x_k) d_k\| \le \kappa \|F(x_k)\|, \qquad \kappa \in (0, 1),$$
and sufficient decrease in $\|F\|$ ensures global convergence (Dembo, Eisenstat, and Steihaug, 1982; Eisenstat and Walker, 1994)
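A sketch of the inexact Newton residual test on an invented 2x2 system; for simplicity the "iterative solver" is emulated by perturbing the exact Newton step and checking the residual condition, rather than by an actual Krylov solve.

```python
import numpy as np

# Illustrative system (invented for this sketch): the unit circle meets
# the line x0 = x1; one root is (1/sqrt(2), 1/sqrt(2)).
def F(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def J(x):  # Jacobian F'(x)
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0,       -1.0]])

kappa = 0.5                                   # forcing constant in (0, 1)
rng = np.random.default_rng(0)
x = np.array([2.0, 0.5])
for _ in range(50):
    Fx = F(x)
    nF = np.linalg.norm(Fx)
    if nF < 1e-10:
        break
    d = np.linalg.solve(J(x), -Fx)            # exact Newton step
    # Emulate a truncated iterative solver: perturb the step, keep it
    # only if the inexact Newton residual test still holds.
    d_try = d + 0.1 * nF * rng.standard_normal(2)
    if np.linalg.norm(Fx + J(x) @ d_try) <= kappa * nF:
        d = d_try                             # any such step is admissible
    x = x + d                                 # (a line search would go here)
print(x, np.linalg.norm(F(x)))
```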
Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Equality constrained optimization

Two "equivalent" step computation techniques: Newton's method applied to the KKT system, and the SQP subproblem (as above)

Question: can we ensure convergence to a local solution by choosing any step whose residual lies in a ball, i.e., any $(d_k, \delta_k)$ with
$$\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix} \begin{bmatrix} d_k \\ \delta_k \end{bmatrix} = - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} + \begin{bmatrix} \rho_k \\ r_k \end{bmatrix}, \qquad \left\| \begin{bmatrix} \rho_k \\ r_k \end{bmatrix} \right\| \le \kappa \left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\| \ ?$$
Globalization strategy

Step computation: inexact SQP step $(d_k, \delta_k)$ with residual $(\rho_k, r_k)$

Globalization strategy: the exact penalty merit function
$$\phi(x; \pi) = f(x) + \pi \|c(x)\|$$
with Armijo line search condition
$$\phi(x_k + \alpha_k d_k; \pi) \le \phi(x_k; \pi) + \eta \alpha_k D\phi(d_k; \pi), \qquad \eta \in (0, 1)$$
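A generic backtracking sketch of the Armijo condition on the exact penalty merit function; all constants, parameter names, and test data are illustrative.

```python
import numpy as np

def armijo_backtracking(phi, x, d, dphi_d, eta=1e-4, tau=0.5, max_backtracks=30):
    """Backtracking line search enforcing the Armijo condition
       phi(x + alpha d) <= phi(x) + eta * alpha * dphi_d,
    where dphi_d is (a bound on) the directional derivative of phi at x
    along d and must be negative. A generic sketch, not the talk's code."""
    alpha, phi0 = 1.0, phi(x)
    for _ in range(max_backtracks):
        if phi(x + alpha * d) <= phi0 + eta * alpha * dphi_d:
            return alpha          # sufficient decrease achieved
        alpha *= tau              # shrink the step and retry
    return alpha

# Exact penalty merit function phi(x; pi) = f(x) + pi ||c(x)|| on toy data:
pi = 10.0
f = lambda x: x[0]**2 + x[1]**2
c = lambda x: np.array([x[0] + x[1] - 1.0])
phi = lambda x: f(x) + pi * np.linalg.norm(c(x))

x, d = np.array([2.0, 2.0]), np.array([-1.0, -1.0])
# Here grad f . d = -8 and the penalty term decreases at rate 2*pi = 20,
# so D phi(d) = -28 at this x.
alpha = armijo_backtracking(phi, x, d, dphi_d=-28.0)
print(alpha, phi(x + alpha * d) < phi(x))
```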
First attempt

Proposition: accept any step with a sufficiently small residual,
$$\left\| \begin{bmatrix} \rho_k \\ r_k \end{bmatrix} \right\| \le \kappa \left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\|, \qquad \kappa \in (0, 1) \text{ small}$$

Test: 61 problems from the CUTEr test set
First attempt… not robust

A sufficiently small residual is not enough for complete robustness:
• we have multiple goals (feasibility and optimality), and a single residual bound does not balance them
• the Lagrange multiplier estimates may be far from optimal, contaminating the KKT residual
Second attempt

Step computation: inexact SQP step $(d_k, \delta_k)$ with residual $(\rho_k, r_k)$

Recall the line search condition
$$\phi(x_k + \alpha_k d_k; \pi) \le \phi(x_k; \pi) + \eta \alpha_k D\phi(d_k; \pi)$$

We can show
$$D\phi(d_k; \pi) \le g_k^T d_k - \pi \left( \|c_k\| - \|c_k + A_k d_k\| \right)$$
… but how negative should this quantity be?
Quadratic/linear model of merit function

Create the model
$$m(d; \pi) = f_k + g_k^T d + \tfrac{1}{2} d^T W_k d + \pi \|c_k + A_k d\|$$

Quantify the reduction obtained from step $d_k$
$$\Delta m(d_k; \pi) = m(0; \pi) - m(d_k; \pi) = -g_k^T d_k - \tfrac{1}{2} d_k^T W_k d_k + \pi \left( \|c_k\| - \|c_k + A_k d_k\| \right)$$
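The model reduction is cheap to evaluate; below is a direct transcription of the formula above, exercised on invented toy data.

```python
import numpy as np

def model_reduction(g, W, A, c, d, pi):
    """Reduction in the quadratic/linear merit model produced by step d:
    Delta m = -g^T d - 0.5 d^T W d + pi (||c|| - ||c + A d||).
    A positive value means the model predicts merit function decrease."""
    quad = g @ d + 0.5 * d @ (W @ d)
    feas_gain = np.linalg.norm(c) - np.linalg.norm(c + A @ d)
    return -quad + pi * feas_gain

# Illustrative data (invented): the same toy quantities used earlier.
g = np.array([1.0, 1.0])
W = np.array([[2.0, 0.0], [0.0, 2.0]])
A = np.array([[1.0, 1.0]])
c = np.array([-0.2])
d = np.array([0.05, 0.05])          # a step that improves feasibility
print(model_reduction(g, W, A, c, d, pi=10.0))   # positive here
```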
Exact case

The exact step minimizes the model objective on the linearized constraints ($c_k + A_k d = 0$)… which may lead to an increase in the objective, but that's OK: with $A_k d_k = -c_k$ the model reduction becomes $\Delta m(d_k; \pi) = -g_k^T d_k - \tfrac{1}{2} d_k^T W_k d_k + \pi \|c_k\|$, so a large enough penalty on infeasibility outweighs the objective increase
Option #1: current penalty parameter

Step is acceptable if the model reduction is sufficiently large,
$$\Delta m(d_k; \pi_{k-1}) \ge \max\left\{ \tfrac{1}{2} d_k^T W_k d_k, \ \theta \|c_k\| \right\},$$
for the current penalty parameter $\pi_k = \pi_{k-1}$ and a constant $\theta > 0$
Option #2: new penalty parameter

Step is acceptable if it makes sufficient progress on the linearized constraints,
$$\|c_k + A_k d_k\| \le \epsilon \|c_k\|, \qquad \epsilon \in (0, 1),$$
for a new (possibly increased) penalty parameter $\pi_k \ge \pi_{k-1}$ chosen large enough that the model reduction condition $\Delta m(d_k; \pi_k) \ge \max\{\tfrac{1}{2} d_k^T W_k d_k, \ \theta \|c_k\|\}$ again holds
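A sketch of a standard penalty update of the kind Option #2 suggests: increase $\pi$ just enough that the model reduction becomes positive with a margin $\tau$. The rule below is the textbook form of such an update, not necessarily the talk's exact rule.

```python
import numpy as np

def update_penalty(g, W, A, c, d, pi_prev, tau=0.1):
    """Smallest penalty parameter pi >= pi_prev with
        pi * (1 - tau) * (||c|| - ||c + A d||) >= g^T d + 0.5 d^T W d,
    which makes Delta m(d; pi) >= tau/(1-tau) * (quadratic term) > 0.
    Assumes the step strictly improves linearized feasibility."""
    quad = g @ d + 0.5 * d @ (W @ d)
    feas_gain = np.linalg.norm(c) - np.linalg.norm(c + A @ d)
    if quad <= 0.0:
        return pi_prev                 # model already decreases: keep pi
    return max(pi_prev, quad / ((1.0 - tau) * feas_gain))
```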
Algorithm outline

for k = 0, 1, 2, …
• Iteratively solve the KKT system
$$\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix} \begin{bmatrix} d_k \\ \delta_k \end{bmatrix} = - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} + \begin{bmatrix} \rho_k \\ r_k \end{bmatrix}$$
• until the step acceptance test of Option #1 or Option #2 is satisfied
• Update the penalty parameter $\pi_k$
• Perform a backtracking line search on $\phi(\cdot\,; \pi_k)$
• Update the iterate: $x_{k+1} = x_k + \alpha_k d_k$ and $\lambda_{k+1} = \lambda_k + \alpha_k \delta_k$ or $\lambda_{k+1} = \lambda_k + \delta_k$
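Putting the pieces together, a schematic and heavily simplified rendering of the loop above; the truncated MINRES solve, the simplified acceptance and penalty rules, and all parameter defaults are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np
from scipy.sparse.linalg import minres

def inexact_sqp(f, grad_f, c, jac_c, hess_lag, x, lam, pi=1.0,
                tau=0.1, eta=1e-4, tol=1e-8, max_iters=100):
    """Schematic inexact SQP loop on dense toy data (a sketch only)."""
    for _ in range(max_iters):
        g, A, ck = grad_f(x), jac_c(x), c(x)
        kkt = np.concatenate([g + A.T @ lam, ck])
        if np.linalg.norm(kkt) < tol:
            break                                  # KKT residual small
        W = hess_lag(x, lam)
        m, n = A.shape
        K = np.block([[W, A.T], [A, np.zeros((m, m))]])
        # Inexact step: truncated Krylov solve of the KKT system.
        sol, _ = minres(K, -kkt, maxiter=10)
        d, delta = sol[:n], sol[n:]
        # Increase pi if needed so the model reduction is positive.
        quad = g @ d + 0.5 * d @ (W @ d)
        feas = np.linalg.norm(ck) - np.linalg.norm(ck + A @ d)
        if quad > 0.0 and feas > 0.0:
            pi = max(pi, quad / ((1.0 - tau) * feas))
        dphi = g @ d - pi * feas                   # directional deriv. bound
        phi = lambda z: f(z) + pi * np.linalg.norm(c(z))
        alpha, phi0 = 1.0, phi(x)
        while phi(x + alpha * d) > phi0 + eta * alpha * dphi and alpha > 1e-12:
            alpha *= 0.5                           # backtracking line search
        x, lam = x + alpha * d, lam + alpha * delta
    return x, lam, pi

# Usage on the earlier toy problem (all data illustrative):
f  = lambda x: x[0]**2 + x[1]**2
gf = lambda x: 2.0 * x
c  = lambda x: np.array([x[0] + x[1] - 1.0])
Jc = lambda x: np.array([[1.0, 1.0]])
Hl = lambda x, lam: np.array([[2.0, 0.0], [0.0, 2.0]])
x, lam, pi = inexact_sqp(f, gf, c, Jc, Hl, np.array([2.0, 2.0]), np.array([0.0]))
print(x, lam)   # expect x near (0.5, 0.5), lam near -1
```

Note one deliberate simplification: here the inner solver's truncation level is fixed in advance (maxiter), whereas the point of the algorithm is that truncation is dictated by the step acceptance tests themselves.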
Termination test

Observe the KKT conditions: overall progress is measured by the primal-dual residual
$$\left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\|,$$
and the algorithm terminates once this residual is sufficiently small
Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Assumptions • The sequence of iterates is contained in a convex set over which the following hold: • the objective function is bounded below • the objective and constraint functions and their first and second derivatives are uniformly bounded in norm • the constraint Jacobian has full row rank and its smallest singular value is bounded below by a positive constant • the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant
Sufficient reduction to sufficient decrease

Taylor expansion of the merit function yields, for some constant $\gamma > 0$,
$$\phi(x_k + \alpha d_k; \pi) \le \phi(x_k; \pi) + \alpha D\phi(d_k; \pi) + \gamma \alpha^2 \|d_k\|^2$$

The accepted step satisfies
$$D\phi(d_k; \pi) \le -\Delta m(d_k; \pi),$$
and the acceptance tests bound $\Delta m(d_k; \pi)$ below, so the backtracking line search yields steplengths, and hence merit decreases, bounded away from zero
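For completeness, the one-line algebra connecting the directional-derivative bound to the model reduction, using only the definitions already on these slides:

```latex
\begin{align*}
\Delta m(d;\pi) &= -g^T d - \tfrac{1}{2} d^T W d + \pi\bigl(\|c\| - \|c + A d\|\bigr) \\
\Longrightarrow \quad
D\phi(d;\pi) &\le g^T d - \pi\bigl(\|c\| - \|c + A d\|\bigr)
   = -\Delta m(d;\pi) - \tfrac{1}{2} d^T W d \\
&\le -\Delta m(d;\pi) \qquad \text{(since $W$ is positive definite)}.
\end{align*}
```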
Intermediate results

• the penalty parameter $\pi_k$ is bounded above
• the step norms $\|(d_k, \delta_k)\|$ are bounded above
• the line search steplengths $\alpha_k$ are bounded below by a positive constant
Step in dual space

We converge to an optimal primal solution, and (for sufficiently small $\kappa$ and $\epsilon$) the dual residual vanishes as well. Therefore,
$$\lim_{k \to \infty} \left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\| = 0,$$
i.e., the iterates asymptotically satisfy the first-order (KKT) conditions
Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks
Conclusion/Final remarks

Review
• Defined a globally convergent inexact SQP algorithm
• Requires only inexact solutions of the KKT system
• Requires only matrix-vector products involving objective and constraint function derivatives
• Results also apply when only the reduced Hessian of the Lagrangian is assumed to be positive definite

Future challenges
• Implementation and appropriate parameter values
• Nearly singular constraint Jacobians
• Inexact derivative information
• Negative curvature
• …