“True” Constrained Minimization
Classification: Optimization algorithms for constrained optimization are often classified into primal and transformation methods. By a primal method of solution we mean a search method that works on the original problem directly, searching through the feasible region for the optimal solution. Transformation methods convert a constrained optimization problem into a sequence of unconstrained optimization problems; they include barrier and penalty function methods.
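As a concrete illustration of the transformation idea, the standard quadratic exterior-penalty method replaces the constrained problem with a sequence of unconstrained ones; the penalty parameter r_k and the inequality form g_j(x) <= 0 are notational assumptions for this sketch, not taken from the slides.

```latex
\min_{x} f(x)\ \ \text{s.t.}\ \ g_j(x) \le 0
\quad\Longrightarrow\quad
\min_{x}\ \Phi_k(x) = f(x) + r_k \sum_j \bigl[\max\{0,\, g_j(x)\}\bigr]^2,
\qquad r_k \to \infty .
```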
Sequential Linear Programming: Sequential Linear Programming (SLP) was developed in the early 1960s. It is also known as Kelley's cutting-plane method and as Griffith and Stewart's Method of Approximation Programming. SLP is considered unattractive by theoreticians; however, the concept has proven to be quite powerful and efficient for engineering design. The basic concept is to linearize the objective and constraints and then solve the resulting linear problem with an optimizer of choice.
SLP Algorithm: The basic idea is to use linear approximations of the nonlinear functions and apply standard linear programming techniques; the process is repeated successively as the optimization proceeds. The major concern is how far from the point of interest these approximations remain valid. This is generally addressed by introducing move limits (often set by trial and error); note that appropriate move limits depend on the degree of nonlinearity. The traditional Griffith and Stewart method uses a first-order Taylor expansion, as written out below.
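The first-order expansion referred to above (the formula itself does not survive in this text) has the standard form; here x^0 denotes the current design point and delta_i the move limits, notation assumed for this sketch:

```latex
\tilde{g}(x) \;=\; g(x^{0}) \;+\; \nabla g(x^{0})^{\mathsf T}\,(x - x^{0}),
\qquad
\bigl|\,x_i - x_i^{0}\,\bigr| \;\le\; \delta_i .
```

Applying this to the objective and to every constraint, together with the move limits, yields the LP subproblem solved at each SLP iteration.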
Variations on SLP: Better results can be obtained by retaining the second-order terms of the Taylor series expansion, as written out below.
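Retaining the full second-order term gives the standard quadratic expansion, where H(x^0) is the Hessian at the current point (same assumed notation as before):

```latex
\tilde{g}(x) \;=\; g(x^{0})
  \;+\; \nabla g(x^{0})^{\mathsf T}\,\Delta x
  \;+\; \tfrac{1}{2}\,\Delta x^{\mathsf T} H(x^{0})\,\Delta x,
\qquad \Delta x = x - x^{0} .
```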
ALP Algorithm in DSIDES: The ALP algorithm is a variation on SLP. Its unique features include the use of second-order terms in the linearization, the normalization of the constraints and goals and their transformation into generally well-behaved convex functions in the region of interest, and an “intelligent” constraint suppression and accumulation scheme. ALP uses only the diagonal terms of the Hessian, so the second-order Taylor series reduces to the form below.
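With only the diagonal second derivatives retained, the expansion reduces to the separable form below (same assumed notation); dropping the cross terms makes each variable's contribution independent, which keeps the approximation cheap to construct.

```latex
\tilde{g}(x) \;=\; g(x^{0})
  \;+\; \sum_{i=1}^{n} \left.\frac{\partial g}{\partial x_i}\right|_{x^{0}} \Delta x_i
  \;+\; \tfrac{1}{2} \sum_{i=1}^{n} \left.\frac{\partial^{2} g}{\partial x_i^{2}}\right|_{x^{0}} \Delta x_i^{2} .
```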
Creation of a Hyperplane: The hatched area represents the linear hyperplane in the three-dimensional space. Note that g(x) is the objective function value.
Creation of a Second Improved Hyperplane: Given the solution of the first approximation, one can construct an improved hyperplane by using this solution together with the existing quadratic approximation. Error test: e = g(X_F) - h(X_F), i.e., the difference between the exact function value and the hyperplane value at the solution point X_F of the first approximation.
Constraint Accumulation: A three-dimensional view of constraint accumulation, i.e., you "remember" previous approximations of a constraint in order to improve the overall approximation (a sketch of the idea follows).
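As an illustration of the accumulation idea (not the DSIDES implementation), the sketch below keeps every past linearization of a convex constraint as an extra cut in the LP subproblem, so the approximation tightens as iterations proceed. The problem data (a linear objective with one convex quadratic constraint), the move limit `delta`, and all function names are assumptions made for this example; scipy's `linprog` stands in for the LP solver of choice.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative convex constraint g(x) <= 0 and its gradient (assumed example).
def g(x):
    return x[0]**2 + x[1]**2 - 1.0          # unit-disc constraint

def grad_g(x):
    return np.array([2.0 * x[0], 2.0 * x[1]])

c = np.array([-1.0, -1.0])                   # linear objective: minimize -x1 - x2
x = np.array([0.0, 0.0])                     # starting point
delta = 0.5                                  # move limit (trial-and-error choice)
A_cuts, b_cuts = [], []                      # accumulated linearizations ("memory")

for k in range(20):
    # Linearize g at the current point and keep the cut:
    #   g(x_k) + grad^T (x - x_k) <= 0   ->   grad^T x <= grad^T x_k - g(x_k)
    # Because g is convex, every such cut is valid everywhere, so old cuts
    # never exclude feasible designs and can safely be accumulated.
    grad = grad_g(x)
    A_cuts.append(grad)
    b_cuts.append(grad @ x - g(x))

    # LP subproblem: all accumulated cuts plus move limits around the point.
    bounds = [(xi - delta, xi + delta) for xi in x]
    res = linprog(c, A_ub=np.array(A_cuts), b_ub=np.array(b_cuts),
                  bounds=bounds, method="highs")
    if not res.success:
        break
    if np.linalg.norm(res.x - x) < 1e-6:     # converged: iterates no longer move
        break
    x = res.x

print("approximate optimum:", x)             # tends toward (1/sqrt(2), 1/sqrt(2))
```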
Convexity: Accumulation is only useful for convex constraints. Why not for non-convex ones? For a convex constraint every linearization is a valid outer bound on the feasible region, so old cuts can safely be kept; for a non-convex constraint an old linearization may cut off feasible designs. (Figure: a constraint with a low degree of convexity.)
Adaptation of Convex Constraint: The new, improved constraint approximation is added to the first approximation of a constraint with a high degree of convexity.
Adaptation of Non-Convex Constraint: The original (first) linear approximation is replaced (or modified) with the new approximation.
Other ALP Features: The software implementation has various features (see the manual):
- The linear solver is a version of Revised Multiplex, hence lexicographic multi-objective optimization is possible (a generic sketch of lexicographic solving follows this list).
- Generation of an initial point and exploration of the design space.
- Adaptive reduced move (optional).
- Automatic constraint suppression (embodied in the code).
- Perturbation step size control.
- ...
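The lexicographic (preemptive) idea mentioned in the first bullet can be illustrated generically: solve for the highest-priority goal first, then fix its achieved value as a constraint while optimizing the next goal. The sketch below is not the Revised Multiplex algorithm; it simply chains two `linprog` solves, and the example data are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Shared feasible region (assumed example): x1 + x2 <= 4, x >= 0.
A_ub = np.array([[1.0, 1.0]])
b_ub = np.array([4.0])
bounds = [(0.0, None), (0.0, None)]

# Priority 1: minimize c1.x (maximize x1); priority 2: minimize c2.x (maximize x2).
c1 = np.array([-1.0, 0.0])
c2 = np.array([0.0, -1.0])

# Level 1: optimize the highest-priority objective on its own.
res1 = linprog(c1, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

# Level 2: keep the level-1 achievement (c1.x <= res1.fun, up to a tolerance)
# as an extra constraint and optimize the next objective.
A_lex = np.vstack([A_ub, c1])
b_lex = np.append(b_ub, res1.fun + 1e-9)
res2 = linprog(c2, A_ub=A_lex, b_ub=b_lex, bounds=bounds, method="highs")

print("lexicographic solution:", res2.x)     # here: x1 = 4 is fixed first, then x2 = 0
```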