ENGINEERING OPTIMIZATION: Methods and Applications
A. Ravindran, K. M. Ragsdell, G. V. Reklaitis
Book Review
Chapter 5: Constrained Optimality Criteria
Part 1: Ferhat Dikbiyik, Part 2: Yi Zhang
Review Session, July 2, 2010
Constraints: Good guys or bad guys?
• Good: a constraint reduces the region in which we search for the optimum.
• Bad: constraints make the optimization process much more complicated.
Outline of Part 1 • Equality-Constrained Problems • Lagrange Multipliers • Economic Interpretation of Lagrange Multipliers • Kuhn-Tucker Conditions • Kuhn-Tucker Theorem
Equality-Constrained Problems GOAL: solve the problem as an unconstrained problem by explicitly eliminating K independent variables using the K equality constraints.
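As a minimal sketch of variable elimination (a toy problem, not one of the book's numbered examples): minimize x1² + x2² subject to x1 + x2 = 2. The constraint lets us substitute x2 = 2 − x1 and minimize a one-variable function by any unconstrained method; ternary search is used here only for illustration.

```python
# Variable elimination: min f(x1, x2) = x1^2 + x2^2  s.t.  x1 + x2 = 2.
# Substitute x2 = 2 - x1 and minimize the reduced function
# g(x1) = x1^2 + (2 - x1)^2 as an UNCONSTRAINED one-variable problem.

def g(x1):
    x2 = 2.0 - x1          # eliminate x2 via the equality constraint
    return x1**2 + x2**2

def ternary_min(f, lo, hi, iters=200):
    """Minimize a unimodal function on [lo, hi] by ternary search."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

x1_star = ternary_min(g, -10.0, 10.0)
x2_star = 2.0 - x1_star
print(x1_star, x2_star, g(x1_star))   # -> approximately 1.0, 1.0, 2.0
```

The reduced problem has K fewer variables and no constraints, which is exactly the appeal of explicit elimination when the equality constraints can be solved for some of the variables in closed form.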
Lagrange Multipliers Convert the constrained problem to an unconstrained problem with the help of certain unspecified parameters known as Lagrange multipliers.
Lagrange Multipliers
Lagrange function: L(x; v) = f(x) − Σ_{k=1}^{K} v_k h_k(x), where the v_k are the Lagrange multipliers.
Test whether a stationary point (x*; v*) corresponds to a minimum or a maximum: if the Hessian of L at (x*; v*) is positive definite, x* is a minimum; if it is negative definite, x* is a maximum.
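The same toy problem as before (min x1² + x2² subject to x1 + x2 = 2; not one of the book's numbered examples) can be worked through the Lagrange-multiplier route. Stationarity of L gives a small linear system with a closed-form solution, and the Hessian test confirms a minimum:

```python
# Lagrange function for  min x1^2 + x2^2  s.t.  h(x) = x1 + x2 - 2 = 0:
#   L(x; v) = x1^2 + x2^2 - v * (x1 + x2 - 2)
# Stationarity (dL/dx1 = dL/dx2 = dL/dv = 0) gives the linear system
#   2*x1 - v = 0,   2*x2 - v = 0,   x1 + x2 = 2,
# whose solution is x1 = x2 = 1, v = 2.

x1, x2, v = 1.0, 1.0, 2.0

# Verify stationarity of the Lagrangian at (x*; v*):
assert abs(2 * x1 - v) < 1e-12
assert abs(2 * x2 - v) < 1e-12
assert abs(x1 + x2 - 2) < 1e-12

# Second-order test: the Hessian of L with respect to x is [[2, 0], [0, 2]].
# Check positive definiteness of the 2x2 matrix via leading principal minors.
h11, h12, h22 = 2.0, 0.0, 2.0
positive_definite = h11 > 0 and (h11 * h22 - h12 * h12) > 0
print(positive_definite)   # True, so x* = (1, 1) is a minimum
```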
Economic Interpretation of Lagrange Multipliers The Lagrange multipliers have an important economic interpretation as shadow prices of the constraints, and their optimal values are very useful in sensitivity analysis.
Kuhn-Tucker Theorems
• Kuhn-Tucker Necessity Theorem
• Kuhn-Tucker Sufficiency Theorem
Kuhn-Tucker Necessity Theorem
• Let
• f, g, and h be differentiable functions,
• x* be a feasible solution to the NLP problem,
• the gradients of the active inequality constraints and the gradients of the equality constraints, for k = 1, …, K, be linearly independent at the optimum (the constraint qualification).
If x* is an optimal solution to the NLP problem, then there exists a (u*, v*) such that (x*, u*, v*) solves the KTP given by the KTCs.
! The constraint qualification is hard to verify, since it requires that the optimum solution be known beforehand.
Kuhn-Tucker Necessity Theorem
For certain special NLP problems, the constraint qualification is satisfied:
• when all the inequality and equality constraints are linear, or
• when all the inequality constraints are concave functions and the equality constraints are linear.
! When the constraint qualification is not met at the optimum, there may not exist a solution to the KTP.
Example 5.5: at the optimum x* = (1, 0), the gradients of the active constraints are not linearly independent, so the constraint qualification fails and there is no Kuhn-Tucker point at the optimum.
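A quick numeric check of this kind of failure, using the classic textbook instance with optimum x* = (1, 0) (hedged: the specific functions of Example 5.5 may differ from the ones assumed here):

```python
# Classic constraint-qualification failure (assumed functions, optimum x* = (1, 0)):
#   min f(x) = -x1
#   s.t. g1(x) = (1 - x1)^3 - x2 >= 0,   g2(x) = x2 >= 0
# Both constraints are active at x* = (1, 0).

x1, x2 = 1.0, 0.0

grad_f = (-1.0, 0.0)
grad_g1 = (-3.0 * (1.0 - x1) ** 2, -1.0)   # = (0, -1) at x*
grad_g2 = (0.0, 1.0)

# 2x2 determinant of the active-constraint gradients:
# zero determinant => gradients linearly dependent => CQ fails.
det = grad_g1[0] * grad_g2[1] - grad_g1[1] * grad_g2[0]
print(det == 0.0)   # True: the constraint qualification is violated

# Stationarity would require grad_f = u1*grad_g1 + u2*grad_g2 with u >= 0,
# but the first component demands -1 = 0: no Kuhn-Tucker point exists,
# even though x* = (1, 0) is the true optimum.
```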
Kuhn-Tucker Necessity Theorem
Given a feasible point that satisfies the constraint qualification: if it does not satisfy the KTCs, it is not optimal; if it does satisfy the KTCs, it is a candidate for the optimum.
Kuhn-Tucker Sufficiency Theorem
• Let
• f(x) be convex,
• the inequality constraints gj(x) for j = 1, …, J all be concave functions,
• the equality constraints hk(x) for k = 1, …, K be linear.
If there exists a solution (x*, u*, v*) that satisfies the KTCs, then x* is an optimal solution.
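A minimal sketch of using the theorem (a toy convex problem, not one of the book's examples): for min x1² + x2² subject to g(x) = x1 + x2 − 2 ≥ 0, the objective is convex and the constraint is linear, hence concave, so verifying the KTCs at a candidate point certifies global optimality:

```python
# Kuhn-Tucker sufficiency check on a toy convex problem:
#   min f(x) = x1^2 + x2^2   s.t.   g(x) = x1 + x2 - 2 >= 0
# f is convex and g is linear (hence concave), so the sufficiency
# theorem applies: a KT point is a global optimum.
# Candidate: x* = (1, 1) with multiplier u* = 2.

x = (1.0, 1.0)
u = 2.0

grad_f = (2 * x[0], 2 * x[1])      # gradient of the objective at x*
grad_g = (1.0, 1.0)                # gradient of the constraint
g_val = x[0] + x[1] - 2.0

stationarity = all(abs(gf - u * gg) < 1e-9 for gf, gg in zip(grad_f, grad_g))
feasible = g_val >= -1e-9
complementary = abs(u * g_val) < 1e-9   # u * g(x*) = 0
nonnegative = u >= 0

print(stationarity and feasible and complementary and nonnegative)  # True
```

All four conditions hold, so by the sufficiency theorem x* = (1, 1) is the global minimum of this toy problem.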
Example 5.4: verify the sufficiency conditions.
• f(x) is convex: its Hessian is positive semidefinite. ✓
• The inequality constraints gj(x), j = 1, …, J, are all concave: g1(x) is linear, hence both convex and concave, and the nonlinear inequality constraint has a negative definite Hessian, hence is concave. ✓
• The equality constraints hk(x), k = 1, …, K, are linear. ✓
All conditions of the sufficiency theorem hold, so the Kuhn-Tucker point is an optimal solution.
Remarks For practical problems, the constraint qualification will generally hold. If the functions are differentiable, a Kuhn–Tucker point is a possible candidate for the optimum. Hence, many of the NLP methods attempt to converge to a Kuhn–Tucker point.
Remarks When the sufficiency conditions of Theorem 5.2 hold, a Kuhn–Tucker point automatically becomes the global minimum. Unfortunately, the sufficiency conditions are difficult to verify, and practical problems often do not possess these nice properties. Note that the presence of even one nonlinear equality constraint is enough to violate the assumptions of Theorem 5.2.
Remarks The sufficiency conditions of Theorem 5.2 have been generalized further to nonconvex inequality constraints, nonconvex objectives, and nonlinear equality constraints. These generalizations use broader classes of functions such as quasiconvex and pseudoconvex functions.