Optimization with Equality Constraints
Optimum Values and Extreme Values of a Function of Two Variables
[Figures: surfaces of z = f(x, y) illustrating a local maximum and a local minimum]
Extreme Values of a Function of Two Variables
[Figure: surface of z = f(x, y) illustrating a saddle point, a stationary point that is neither a local maximum nor a local minimum]
First Order Conditions (The Necessary Conditions)
Given the problem of maximizing (or minimizing) the objective function z = f(x, y), the stationary values are found by solving the following system:
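The system itself does not survive in the extracted text; the standard first-order conditions for z = f(x, y) are

fx(x, y) = ∂z/∂x = 0
fy(x, y) = ∂z/∂y = 0

i.e. both first partial derivatives vanish at a stationary point.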
Examples
• z = f(x, y) = x^2 + y^2
• z = f(x, y) = x^2 - y^2
• z = f(x, y) = xy
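As a quick check (not part of the original slides), the first- and second-order conditions can be applied to the three examples with SymPy; all three have their only stationary point at the origin, where x^2 + y^2 has a minimum and the other two have saddle points:

import sympy as sp

x, y = sp.symbols('x y')

examples = {
    'x^2 + y^2': x**2 + y**2,   # minimum at the origin
    'x^2 - y^2': x**2 - y**2,   # saddle point at the origin
    'x*y':       x*y,           # saddle point at the origin
}

for name, f in examples.items():
    # First order conditions: fx = 0 and fy = 0
    stationary = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y], dict=True)
    # Second order conditions: Hessian determinant and sign of fxx
    H = sp.hessian(f, (x, y))
    print(name, stationary, 'det H =', H.det(), 'fxx =', H[0, 0])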
Second Order Conditions
The Hessian matrix H is evaluated at the stationary point (x0, y0). Writing |H(x0, y0)| for its determinant:
|H(x0, y0)| > 0 and fxx > 0 : minimum
|H(x0, y0)| > 0 and fxx < 0 : maximum
|H(x0, y0)| < 0 : saddle point
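The matrix itself does not survive in the extracted text; for z = f(x, y) the Hessian evaluated at (x0, y0) is

H(x0, y0) = | fxx  fxy |
            | fyx  fyy |

and its determinant is |H| = fxx·fyy - (fxy)^2, since fxy = fyx for a twice continuously differentiable f.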
Extreme Values of a Function of Two Variables
The method of Lagrange multipliers provides a strategy for finding the maxima and minima of a function f(x, y) subject to a constraint g(x, y) = c.
Extreme Values of a Function of Two Variables
For instance, consider minimizing an objective function subject to an equality constraint, as in the example sketched below.
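The specific objective and constraint do not survive in the extracted text. A standard example that is consistent with the minimum at P(1/2, 1/2) obtained on the following slides is (assumption, used in the worked steps below):

minimize f(x, y) = x^2 + y^2
subject to g(x, y) = x + y = 1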
Direct Method
We can substitute the constraint into the objective function and minimize with respect to the remaining variable. The minimum is at P(1/2, 1/2).
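With the assumed example above, the substitution works out as follows:

y = 1 - x (from the constraint)
f(x, 1 - x) = x^2 + (1 - x)^2 = 2x^2 - 2x + 1
d/dx (2x^2 - 2x + 1) = 4x - 2 = 0  ⇒  x = 1/2, and hence y = 1/2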
Lagrange Multiplier
We introduce a new variable (λ), called a Lagrange multiplier, and study the Lagrange function:
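The Lagrange function is not reproduced in the extracted text; in the standard formulation, and for the assumed example above, it is

L(x, y, λ) = f(x, y) + λ·(c - g(x, y))
L(x, y, λ) = x^2 + y^2 + λ·(1 - x - y)

Setting ∂L/∂x = ∂L/∂y = ∂L/∂λ = 0 gives 2x = λ, 2y = λ and x + y = 1, so x = y = 1/2 and λ = 1, in agreement with the direct method.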
Second Order Conditions
The bordered Hessian matrix of second-order derivatives is used to classify a constrained stationary point: if its determinant is negative, the point is a minimum; if it is positive, the point is a maximum (for two variables and one constraint, as stated below for n = 2, m = 1).
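The matrix itself does not survive in the extracted text; for two variables x, y and one constraint g(x, y) = c, its standard form is

| 0    gx   gy  |
| gx   Lxx  Lxy |
| gy   Lyx  Lyy |

where L is the Lagrange function and all derivatives are evaluated at the stationary point.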
Lagrange Multiplier
Given the problem of maximizing (or minimizing) an objective function of n variables subject to m equality constraints:
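The formulation itself is not reproduced in the extracted text; the standard statement is

maximize (or minimize)  z = f(x1, ..., xn)
subject to              g^j(x1, ..., xn) = c_j,   j = 1, ..., m,   with m < n.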
First Order Conditions (Necessary Conditions)
We build a Lagrangian function and find the stationary values by setting all of its first partial derivatives to zero.
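Neither formula survives in the extracted text; the standard construction, consistent with the two-variable case above, is

L(x1, ..., xn, λ1, ..., λm) = f(x1, ..., xn) + Σ_j λ_j·(c_j - g^j(x1, ..., xn))

and the stationary values solve

∂L/∂x_i = 0,   i = 1, ..., n
∂L/∂λ_j = c_j - g^j(x1, ..., xn) = 0,   j = 1, ..., m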
Second Order Conditions (Sufficient Conditions)
• To classify a constrained stationary point, we must check the sign of the determinant of a bordered Hessian, as sketched below:
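The general form does not survive in the extracted text; in the usual textbook convention, for n variables and m constraints the bordered Hessian is the (n + m) × (n + m) matrix

| 0      Jg   |
| Jg^T   Lxx  |

where Jg is the m × n Jacobian of the constraints and Lxx is the n × n Hessian of the Lagrangian with respect to x1, ..., xn. The n = 2, m = 1 case and its sign conditions are given on the next slide.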
For n = 2 and m = 1, the bordered Hessian matrix of second-order derivatives is the 3 × 3 matrix shown above, and:
Det > 0 implies a maximum
Det < 0 implies a minimum
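A quick check (not from the original slides; it assumes the example objective x^2 + y^2 and constraint x + y = 1 used earlier):

import sympy as sp

x, y, lam = sp.symbols('x y lambda')
f = x**2 + y**2              # assumed objective
g = x + y - 1                # assumed constraint, written as g(x, y) = 0
L = f + lam * (1 - x - y)    # Lagrange function L = f + lambda*(c - g)

# Bordered Hessian for n = 2 variables and m = 1 constraint
Hbar = sp.Matrix([
    [0,             sp.diff(g, x),     sp.diff(g, y)],
    [sp.diff(g, x), sp.diff(L, x, 2),  sp.diff(L, x, y)],
    [sp.diff(g, y), sp.diff(L, y, x),  sp.diff(L, y, 2)],
])
print(Hbar.det())            # -4 < 0, so P(1/2, 1/2) is a constrained minimum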
For n = 3 and m = 2, the matrix of second-order derivatives is given by:
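The matrix does not survive in the extracted text; in the standard convention, for variables x1, x2, x3 and constraints g^1, g^2 it is the 5 × 5 bordered Hessian

| 0       0       g^1_1   g^1_2   g^1_3 |
| 0       0       g^2_1   g^2_2   g^2_3 |
| g^1_1   g^2_1   L_11    L_12    L_13  |
| g^1_2   g^2_2   L_21    L_22    L_23  |
| g^1_3   g^2_3   L_31    L_32    L_33  |

where g^j_i = ∂g^j/∂x_i, L_ij = ∂²L/∂x_i∂x_j, and all entries are evaluated at the stationary point.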