
Inexact SQP methods for equality constrained optimization

ISMP 2006. Frank Edward Curtis, Department of IE/MS, Northwestern University, with Richard Byrd and Jorge Nocedal. August 1, 2006.


Presentation Transcript


  1. ISMP 2006 Inexact SQP methods for equality constrained optimization Frank Edward Curtis Department of IE/MS, Northwestern University with Richard Byrd and Jorge Nocedal August 1, 2006

  2. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  3. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  4. Unconstrained optimization Goal: minimize a single nonlinear objective Algorithm: Newton’s method (CG)

  5. Unconstrained optimization Goal: minimize a single nonlinear objective Algorithm: Newton’s method (CG) Note: choosing any intermediate CG iterate as the step ensures global convergence to a local solution of the NLP (Steihaug, 1983)
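The formulas on this slide were images and did not survive extraction; the standard Newton-CG setup the slide refers to is presumably:

```latex
% Newton system at iterate x_k:
\nabla^2 f(x_k)\, d_k = -\nabla f(x_k)
% Steihaug (1983): any intermediate iterate d_k^{(j)} of CG applied to
% this system, initialized at d_k^{(0)} = 0, is a descent direction for f,
% so CG may be truncated early without losing global convergence.
```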

  6. Nonlinear equations Goal: solve a single nonlinear system Algorithm: Newton’s method

  7. Nonlinear equations Goal: solve a single nonlinear system Algorithm: Newton’s method Note: choosing any step whose Newton-system residual is a sufficiently small fraction of the current residual ensures global convergence (Dembo, Eisenstat, and Steihaug, 1982) (Eisenstat and Walker, 1994)
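The conditions on this slide were images; the standard inexact Newton conditions from the cited papers are:

```latex
% Newton system for F(x) = 0 at iterate x_k:
F'(x_k)\, d_k = -F(x_k)
% inexact Newton condition (Dembo, Eisenstat, and Steihaug, 1982):
\| F(x_k) + F'(x_k)\, d_k \| \;\le\; \eta_k\, \| F(x_k) \|,
\qquad 0 \le \eta_k \le \eta < 1
```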

  8. Equality constrained optimization Goal: solve the problem Define: the Lagrangian Define: the derivatives Goal: solve KKT conditions
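The slide's equations were images and did not survive extraction; the standard setup it describes, in common SQP notation, is presumably:

```latex
% problem and Lagrangian:
\min_{x} \; f(x) \quad \text{s.t.} \quad c(x) = 0,
\qquad \mathcal{L}(x,\lambda) = f(x) + \lambda^T c(x)
% derivatives: gradient g, constraint Jacobian A, Hessian W:
g(x) = \nabla f(x), \quad A(x) = \nabla c(x)^T, \quad
W(x,\lambda) = \nabla_{xx}^2 \mathcal{L}(x,\lambda)
% KKT conditions:
g(x) + A(x)^T \lambda = 0, \qquad c(x) = 0
```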

  9. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem
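The two "equivalent" step computations on this slide were images; in the notation above they are presumably the primal-dual Newton system and the SQP subproblem:

```latex
% Newton's method applied to the KKT conditions at (x_k, \lambda_k):
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix}
% The same d_k (with updated multiplier \lambda_k + \delta_k)
% solves the SQP subproblem:
\min_{d} \; g_k^T d + \tfrac{1}{2}\, d^T W_k d
\quad \text{s.t.} \quad A_k d + c_k = 0
```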

  10. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem Question: can we ensure convergence to a local solution by choosing any step into the ball?

  11. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem • Question: can we ensure convergence with a • step toward satisfying the constraints? • step that reduces the objective? • Preferably both, but… (Heinkenschloss and Vicente, 2001)

  12. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem … what if we can’t do both?

  13. Equality constrained optimization • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem Exact solution minimizes the objective subject to satisfying the constraints (but this can be expensive to find) Question: what inexact solutions are acceptable?

  14. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  15. Proposed algorithm • Step computation: inexact SQP step • Globalization strategy: exact merit function … with Armijo line search condition
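The merit function and line search condition on this slide were images; their presumed form, with Δm_k the model reduction quantified on slides 18-19, is:

```latex
% exact penalty merit function with penalty parameter \pi:
\phi_\pi(x) = f(x) + \pi\, \|c(x)\|
% Armijo line search condition on the step d_k:
\phi_\pi(x_k + \alpha_k d_k) \;\le\; \phi_\pi(x_k)
  - \eta\, \alpha_k\, \Delta m_k(d_k),
\qquad \eta \in (0,1)
```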

  16. First attempt • Proposition: sufficiently small residual • Test: 61 problems from CUTEr test set

  17. First attempt… Not robust • Proposition: sufficiently small residual • … not enough for complete robustness • We have multiple goals (feasibility and optimality) • Lagrange multipliers may be completely off

  18. Quadratic/linear model of merit function • Create model • Quantify reduction obtained from step

  19. Quadratic/linear model of merit function • Create model • Quantify reduction obtained from step
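The model and its reduction were images on this slide; a form consistent with the merit function and SQP subproblem above is presumably:

```latex
% quadratic/linear model of \phi_\pi at x_k:
m_k(d) = f_k + g_k^T d + \tfrac{1}{2}\, d^T W_k d + \pi\, \| c_k + A_k d \|
% reduction obtained from a step d:
\Delta m_k(d) = m_k(0) - m_k(d)
  = -g_k^T d - \tfrac{1}{2}\, d^T W_k d
    + \pi \left( \|c_k\| - \|c_k + A_k d\| \right)
```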

  20. What are acceptable steps? The objective-model reduction and the constraint reduction can each be positive or negative: 2 × 2 = 4 possibilities

  21. What are acceptable steps?

  22. Option #1: “constraint reduction”

  23. Option #1: “constraint reduction” Penalty parameter can be increased to ensure sufficiently large model reduction

  24. Option #2: “quadratic objective reduction”

  25. Option #2: “quadratic objective reduction” Question: if model reduction is positive, is it large enough?

  26. Split reduction in two parts

  27. Central idea: “sufficient reduction” • Sufficient reduction condition:
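The condition itself was an image; one representative form, requiring the model reduction to dominate the constraint violation (the exact inequality used in the algorithm appears in the Byrd-Curtis-Nocedal paper), is:

```latex
\Delta m_k(d_k) \;\ge\; \sigma\, \pi_k\, \|c_k\|,
\qquad \sigma \in (0,1)
```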

  28. Option #1: “constraint reduction”

  29. Option #1: “constraint reduction”

  30. Option #1: “constraint reduction” Step is acceptable if:

  31. Option #1: “constraint reduction” Step is acceptable if:

  32. Option #2: “quadratic objective reduction”

  33. Option #2: “quadratic objective reduction” Step is acceptable if:

  34. Option #2: “quadratic objective reduction” Step is acceptable if:

  35. Algorithm outline • for k = 0, 1, 2, … • Iteratively solve the primal-dual (KKT) system • Until the termination test is satisfied • Update penalty parameter • Perform backtracking line search • Update the iterate
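As a concrete (hypothetical) illustration of this outline, the sketch below runs the loop on a toy equality-constrained problem. The KKT system is solved exactly in closed form, standing in for the inner iterative solve and its termination test; the penalty parameter is fixed at a value large enough for this problem; and names like `sqp` and `model_reduction` are inventions for the example, not the authors' code.

```python
# Toy problem: minimize f(x) = x1^2 + x2^2  s.t.  c(x) = x1 + x2 - 1 = 0.
# Solution: x* = (0.5, 0.5), lambda* = -1.

def f(x):
    return x[0] ** 2 + x[1] ** 2

def grad_f(x):
    return [2.0 * x[0], 2.0 * x[1]]

def c(x):
    return x[0] + x[1] - 1.0

A = [1.0, 1.0]   # constraint Jacobian (constant for this linear c)
PI = 2.0         # penalty parameter, > |lambda*| so the merit function is exact

def merit(x):
    # exact penalty merit function phi(x) = f(x) + PI * |c(x)|
    return f(x) + PI * abs(c(x))

def model_reduction(x, d):
    # Delta m(d) = -g^T d - 0.5 d^T W d + PI * (|c| - |c + A d|), with W = 2I
    g = grad_f(x)
    gd = g[0] * d[0] + g[1] * d[1]
    dWd = 2.0 * (d[0] ** 2 + d[1] ** 2)
    lin_c = c(x) + A[0] * d[0] + A[1] * d[1]
    return -gd - 0.5 * dWd + PI * (abs(c(x)) - abs(lin_c))

def sqp(x, lam, tol=1e-8, max_iter=20):
    for _ in range(max_iter):
        gL = [grad_f(x)[i] + lam * A[i] for i in range(2)]  # grad of Lagrangian
        if max(abs(gL[0]), abs(gL[1]), abs(c(x))) <= tol:
            break                                           # KKT conditions hold
        # Solve [2I A^T; A 0][d; delta] = -[gL; c] in closed form:
        delta = c(x) - 0.5 * (gL[0] + gL[1])
        d = [(-gL[i] - delta) / 2.0 for i in range(2)]
        # Backtracking Armijo line search on the merit function:
        dm = model_reduction(x, d)
        alpha, eta = 1.0, 1e-4
        while merit([x[0] + alpha * d[0], x[1] + alpha * d[1]]) > \
                merit(x) - eta * alpha * dm:
            alpha *= 0.5
        x = [x[0] + alpha * d[0], x[1] + alpha * d[1]]      # primal update
        lam += alpha * delta                                # dual update
    return x, lam
```

From a starting point such as (3, -1), a single full Newton step lands on the solution here because the objective is quadratic and the constraint linear; in general the loop would iterate.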

  36. Termination test • Observe KKT conditions
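The quantities the termination test monitors were images on this slide; presumably they are the residual of the primal-dual system for a trial step, together with the outer KKT test:

```latex
% residual of the primal-dual (KKT) system for a trial step (d, \delta):
\rho_k = \begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d \\ \delta \end{bmatrix}
+ \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix}
% outer loop: stop when the KKT conditions hold to tolerance,
\| g_k + A_k^T \lambda_k \| \le \epsilon
\quad \text{and} \quad \| c_k \| \le \epsilon
```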

  37. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  38. Assumptions • The sequence of iterates is contained in a convex set over which the following hold: • the objective function is bounded below • the objective and constraint functions and their first and second derivatives are uniformly bounded in norm • the constraint Jacobian has full row rank, with singular values bounded away from zero • the Hessian of the Lagrangian is positive definite, with eigenvalues bounded above and away from zero

  39. Sufficient reduction to sufficient decrease • Taylor expansion of the merit function bounds the actual decrease in terms of the model reduction • The accepted step satisfies the Armijo condition
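The inequalities on this slide were images; in the notation of the merit function and model above, the presumed argument is:

```latex
% Taylor expansion of the merit function along a step d_k:
\phi_\pi(x_k + \alpha d_k) - \phi_\pi(x_k)
\;\le\; -\alpha\, \Delta m_k(d_k) + O(\alpha^2)
% so backtracking terminates with \alpha_k bounded away from zero, and
% the accepted step satisfies
\phi_\pi(x_{k+1}) \;\le\; \phi_\pi(x_k) - \eta\, \alpha_k\, \Delta m_k(d_k)
```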

  40. Intermediate results • The penalty parameter is bounded above • The step norms are bounded above • The line-search step size is positive and bounded away from zero

  41. Sufficient decrease in merit function

  42. Step in dual space • We converge to an optimal primal solution and, for sufficiently small residuals, the dual iterates also converge to an optimal Lagrange multiplier

  43. Outline • Introduction/Motivation • Unconstrained optimization • Nonlinear equations • Constrained optimization • Algorithm Development • Step computation • Step acceptance • Global Analysis • Merit function and sufficient decrease • Satisfying first-order conditions • Conclusions/Final remarks

  44. Conclusion/Final remarks • Review • Defined a globally convergent inexact SQP algorithm • Require only inexact solutions of KKT system • Require only matrix-vector products involving objective and constraint function derivatives • Results also apply when only reduced Hessian of Lagrangian is assumed to be positive definite • Future challenges • Implementation and appropriate parameter values • Nearly-singular constraint Jacobian • Inexact derivative information • Negative curvature • etc., etc., etc….
