Ch. 9: Direction Generation Method Based on Linearization
Generalized Reduced Gradient Method
Mohammad Farhan Habib
NetLab, CS, UC Davis
July 30, 2010
Objective • Methods to solve general NLP problems • Equality constraints • Inequality constraints
Implicit Variable Elimination • Eliminate variables by solving the equality constraints • Explicit elimination is not always possible • Elimination reduces the problem dimension
Implicit Variable Elimination • Suppose $x^{(1)}$ satisfies the constraints of the equality-constrained problem • Linear approximation to the problem constraints at $x^{(1)}$: $h_k(x^{(1)}) + \nabla h_k(x^{(1)})(x - x^{(1)}) = 0$, $k = 1, \dots, K$ • This system of equations has more unknowns ($N$) than equations ($K$) • Solve for $K$ of the variables in terms of the other $N-K$
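The "solve for $K$ variables" step amounts to one linear solve. A minimal numpy sketch (not from the chapter; the function name and the tiny one-constraint example are illustrative): with the constraint Jacobian split into a $K \times K$ basic block and a $K \times (N-K)$ non-basic block, any chosen non-basic step determines the basic step through the linearized constraints.

```python
import numpy as np

def solve_basic_step(J_hat, J_bar, step_bar):
    """Given the Jacobian split into a K x K basic block J_hat and a
    K x (N-K) non-basic block J_bar, return the basic-variable step implied
    by the linearized constraints:
        J_hat @ step_hat + J_bar @ step_bar = 0
        =>  step_hat = -J_hat^{-1} J_bar step_bar
    """
    return -np.linalg.solve(J_hat, J_bar @ step_bar)

# Tiny example: one constraint h(x) = x1 + 2*x2 - 3 = 0 (K = 1, N = 2),
# with x1 basic and x2 non-basic.
J_hat = np.array([[1.0]])    # dh/dx1
J_bar = np.array([[2.0]])    # dh/dx2
step_hat = solve_basic_step(J_hat, J_bar, np.array([1.0]))
print(step_hat)  # -> [-2.] : x1 must drop by 2 to offset a unit rise in x2
```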
Implicit Variable Elimination • Call the first $K$ variables basic, $\hat{x}$ • Call the remaining $N-K$ variables non-basic, $\bar{x}$ • Partition the row vector $\nabla f(x^{(1)})$ into $\hat{\nabla} f$ and $\bar{\nabla} f$, and the constraint Jacobian into $\hat{J}$ (basic columns) and $\bar{J}$ (non-basic columns) • Solving the linearized constraints for the basic variables, $\hat{x} - \hat{x}^{(1)} = -\hat{J}^{-1}\bar{J}(\bar{x} - \bar{x}^{(1)})$, equation 9.14 becomes $\tilde{f}(\bar{x}) = f(x^{(1)}) + (\bar{\nabla} f - \hat{\nabla} f\,\hat{J}^{-1}\bar{J})(\bar{x} - \bar{x}^{(1)})$
Implicit Variable Elimination • $\tilde{f}$ appears to be an unconstrained function involving only the $N-K$ non-basic variables $\bar{x}$
Implicit Variable Elimination • The first-order necessary condition for $x^{(1)}$ to be a local minimum of $\tilde{f}$ is $\tilde{\nabla} f = \bar{\nabla} f - \hat{\nabla} f\,\hat{J}^{-1}\bar{J} = 0$, where $\tilde{\nabla} f$ is the reduced gradient
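The reduced-gradient formula can be checked on a small problem. A numpy sketch (illustrative, not from the chapter): for $f(x) = x_1^2 + x_2^2$ subject to $x_1 + 2x_2 - 3 = 0$ with $x_1$ basic, eliminating $x_1 = 3 - 2x_2$ gives $\tilde{f}(x_2) = (3-2x_2)^2 + x_2^2$, whose derivative at $x_2 = 1$ is $-2$; the formula reproduces this.

```python
import numpy as np

def reduced_gradient(grad_hat, grad_bar, J_hat, J_bar):
    """Reduced gradient in row-vector convention:
        grad_tilde = grad_bar - grad_hat @ J_hat^{-1} J_bar
    """
    return grad_bar - grad_hat @ np.linalg.solve(J_hat, J_bar)

# f(x) = x1^2 + x2^2, constraint x1 + 2*x2 - 3 = 0, basic variable x1.
# At x = (1, 1): grad f = (2, 2), J_hat = [1], J_bar = [2].
rg = reduced_gradient(np.array([2.0]), np.array([2.0]),
                      np.array([[1.0]]), np.array([[2.0]]))
print(rg)  # -> [-2.] ; matches d/dx2 of (3 - 2*x2)^2 + x2^2 at x2 = 1
```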
Basic Generalized Reduced Gradient (GRG) algorithm • Suppose at iteration $t$ a feasible point $x^{(t)}$ and the partition $x = (\hat{x}, \bar{x})$ are available
Basic GRG algorithm • Set $\bar{d} = -\tilde{\nabla} f^{T}$, so that $d = (\hat{d}, \bar{d})$ is a descent direction • From a first-order Taylor expansion of equation 9.16, $\hat{d} = -\hat{J}^{-1}\bar{J}\,\bar{d}$ • The basic-variable step $\hat{d}$ is implicit in the above construction
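Putting the two pieces together gives the composite GRG direction. A numpy sketch (illustrative names; same small example as above, assumed for demonstration): the non-basic step follows the negative reduced gradient, the basic step is chosen to satisfy the linearized constraints, and the slope $\nabla f \cdot d$ comes out negative, confirming descent.

```python
import numpy as np

def grg_direction(grad_hat, grad_bar, J_hat, J_bar):
    """Composite GRG search direction d = (d_hat, d_bar):
    move non-basic variables along the negative reduced gradient,
    then pick the basic move keeping J_hat d_hat + J_bar d_bar = 0."""
    red_grad = grad_bar - grad_hat @ np.linalg.solve(J_hat, J_bar)
    d_bar = -red_grad
    d_hat = -np.linalg.solve(J_hat, J_bar @ d_bar)
    return d_hat, d_bar

# f(x) = x1^2 + x2^2, constraint x1 + 2*x2 - 3 = 0, at x = (1, 1).
grad_hat, grad_bar = np.array([2.0]), np.array([2.0])
d_hat, d_bar = grg_direction(grad_hat, grad_bar,
                             np.array([[1.0]]), np.array([[2.0]]))
slope = grad_hat @ d_hat + grad_bar @ d_bar  # directional derivative
print(d_hat, d_bar, slope)  # slope < 0 confirms a descent direction
```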
Basic GRG algorithm – Example 1 • Linear approximation – • Most points along $d$ do not satisfy the (nonlinear) equality constraints • $d$ is a descent direction • In general, $d$ leads to infeasible points
Basic GRG algorithm • More precisely, $\bar{d}$ is a descent direction in the space of the non-basic variables, but the composite direction vector $d$ in general yields infeasible points
Basic GRG algorithm – Example 2 • For every trial value of $\alpha$, the constraint equations must be solved for values of the dependent (basic) variables that make the resulting point feasible • Newton's iteration formula for solving the set of equations $h(\hat{x}, \bar{x}) = 0$ is $\hat{x}^{(i+1)} = \hat{x}^{(i)} - \hat{J}^{-1} h(\hat{x}^{(i)}, \bar{x})$ • In this problem, …
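The feasibility-restoration step above can be sketched as a short Newton loop on the basic variables only, with the non-basic variables frozen at their trial values. A minimal numpy sketch (function names and the circle-constraint example are illustrative, not the chapter's Example 2):

```python
import numpy as np

def restore_feasibility(h, J_hat_fn, x_hat, x_bar, tol=1e-10, max_iter=50):
    """Newton iteration on the basic variables:
        x_hat <- x_hat - J_hat^{-1} h(x_hat, x_bar)
    x_bar stays fixed at the trial point chosen by the line search."""
    for _ in range(max_iter):
        r = h(x_hat, x_bar)
        if np.max(np.abs(r)) < tol:
            break
        x_hat = x_hat - np.linalg.solve(J_hat_fn(x_hat, x_bar), r)
    return x_hat

# Example constraint: h(x) = x1^2 + x2^2 - 2 = 0, with x1 basic.
h = lambda xh, xb: np.array([xh[0]**2 + xb[0]**2 - 2.0])
J = lambda xh, xb: np.array([[2.0 * xh[0]]])      # dh/dx1
x_hat = restore_feasibility(h, J, np.array([2.0]), np.array([1.0]))
print(x_hat)  # converges to [1.] since 1^2 + 1^2 = 2
```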
Extension of GRG – Inequality Constraints and Bounds on Variables • Upper and lower variable bounds • A check must be made to ensure that only variables not on or very near their bounds are labeled as basic variables • The direction vector is modified so that the bounds on the independent variables are not violated by movement along it; this is accomplished by setting $\bar{d}_i = 0$ whenever $\bar{x}_i$ is at its lower bound and $\bar{d}_i < 0$, or at its upper bound and $\bar{d}_i > 0$ • Checks must be inserted in step 3 of the basic GRG algorithm to ensure that the bounds are not exceeded either during the line search on $\alpha$ or during the Newton iterations
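The "set $\bar{d}_i = 0$" rule is a simple componentwise mask. A numpy sketch (illustrative; the tolerance `eps` and the example values are assumptions):

```python
import numpy as np

def clip_direction(d_bar, x_bar, lower, upper, eps=1e-8):
    """Zero out non-basic components that would immediately push a
    variable past an active bound: d_i = 0 if x_i is at its lower bound
    moving down, or at its upper bound moving up."""
    d = d_bar.copy()
    at_lower = (x_bar - lower <= eps) & (d < 0.0)
    at_upper = (upper - x_bar <= eps) & (d > 0.0)
    d[at_lower | at_upper] = 0.0
    return d

d = clip_direction(np.array([-1.0, 0.5, 2.0]),   # proposed direction
                   np.array([0.0, 0.5, 1.0]),    # current values
                   lower=np.array([0.0, 0.0, 0.0]),
                   upper=np.array([1.0, 1.0, 1.0]))
print(d)  # -> [0.  0.5 0.] : first variable sits on its lower bound moving
          #    down, third on its upper bound moving up; both are frozen
```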
Extension of GRG – Inequality Constraints and Bounds on Variables • Inequality constraints can be handled in two ways: • explicitly, by writing these constraints as equalities using slack variables • implicitly, by using the concept of an active constraint set, as in feasible direction methods
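The explicit (slack-variable) route can be sketched in a few lines (illustrative; the circle constraint is an assumed example): each $g_j(x) \ge 0$ becomes the equality $g_j(x) - s_j = 0$ with a bounded slack $s_j \ge 0$, so the equality-only GRG machinery applies unchanged.

```python
import numpy as np

def inequality_to_equality(g):
    """Convert constraints g_j(x) >= 0 into equalities g_j(x) - s_j = 0
    with non-negative slack variables s_j."""
    def h(x, s):
        return g(x) - s
    return h

# Example: g(x) = 4 - x1^2 - x2^2 >= 0 (inside a circle of radius 2).
g = lambda x: np.array([4.0 - x[0]**2 - x[1]**2])
h = inequality_to_equality(g)
x = np.array([1.0, 1.0])
s = g(x)          # a feasible slack value: s = g(x) = [2.] >= 0
print(h(x, s))    # -> [0.] : the equality form is satisfied
```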
Summary • Linearization of the nonlinear problem functions is used to generate good search directions • Two types of algorithms • Feasible direction methods – require the solution of an LP sub-problem at each iteration • GRG algorithm – solves a set of linear equations to determine a good descent direction