MATLAB Optimization Toolbox Presented by Chin Pei February 28, 2003
Presentation Outline • Introduction • Function Optimization • Optimization Toolbox • Routines / Algorithms available • Minimization Problems • Unconstrained • Constrained • Example • The Algorithm Description • Multiobjective Optimization • Optimal PID Control Example
Function Optimization • Optimization concerns the minimization or maximization of functions • Standard optimization problem: minimize the objective function f(x), subject to: • Equality constraints: h(x) = 0 • Inequality constraints: g(x) ≤ 0 • Side constraints: xL ≤ x ≤ xU
Function Optimization (Cont.) • f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem we minimize the function; maximization is equivalent to minimizing the negative of the objective function. • x is a column vector of design variables, which affect the performance of the system.
Function Optimization (Cont.) • Constraints are limitations on the design space; they can be linear or nonlinear, explicit or implicit functions • Equality constraints: h(x) = 0 • Inequality constraints: g(x) ≤ 0 (most algorithms require the less-than form!) • Side constraints: xL ≤ x ≤ xU
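Since most algorithms expect the less-than form, a greater-than constraint is negated before being handed to a routine. A minimal sketch (the function g here is illustrative, not from the slides):

```matlab
% Rewrite g(x) >= 5 as 5 - g(x) <= 0 so a solver can accept it.
g = @(x) x(1)^2 + x(2)^2;   % illustrative constraint function
C = @(x) 5 - g(x);          % C(x) <= 0 now encodes g(x) >= 5
```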
Optimization Toolbox • A collection of functions that extends the capability of MATLAB. The toolbox includes routines for: • Unconstrained optimization • Constrained nonlinear optimization, including goal attainment, minimax, and semi-infinite minimization problems • Quadratic and linear programming • Nonlinear least squares and curve fitting • Solving nonlinear systems of equations • Constrained linear least squares • Specialized algorithms for large-scale problems
Minimization Algorithms
Minimization Algorithms (Cont.)
Equation Solving Algorithms
Least-Squares Algorithms
Implementing the Optimization Toolbox • Most of these optimization routines require the definition of an M-file containing the function, f, to be minimized. • Maximization is achieved by supplying the routines with -f. • Default optimization parameters can be changed through an options structure passed to the routines.
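As a sketch of the maximization trick above (the quadratic g is an illustrative stand-in, not from the slides):

```matlab
% Maximize g(x) = -(x - 3)^2 + 5 by minimizing its negative.
negg = @(x) -(-(x - 3)^2 + 5);   % -g(x), the function actually minimized
x0 = 0;                          % initial guess
xmax = fminunc(negg, x0);        % maximizer of g, near x = 3
gmax = -negg(xmax);              % recover the maximum value of g
```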
Unconstrained Minimization • Consider the problem of finding a set of values [x1 x2]T that minimizes f(x) = exp(x1)·(4x1² + 2x2² + 4x1x2 + 2x2 + 1)
Steps • Step 1: Create an M-file that returns the objective function value; call it objfun.m • Step 2: Invoke the unconstrained minimization routine fminunc
Step 1 – Objective Function

function f = objfun(x)
f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
Step 2 – Invoke Routine

x0 = [-1, 1];                              % starting guess
options = optimset('LargeScale', 'off');   % optimization parameter settings
[xmin, feval, exitflag, output] = fminunc(@objfun, x0, options);
Results

xmin =
    0.5000   -1.0000             % minimum point of the design variables
feval =
    1.3028e-010                  % objective function value at the minimum
exitflag =
    1                            % exitflag > 0: a local minimum was found
output =
    iterations: 7
    funcCount: 40
    stepsize: 1
    firstorderopt: 8.1998e-004
    algorithm: 'medium-scale: Quasi-Newton line search'
More on fminunc – Inputs

[xmin, feval, exitflag, output, grad, hessian] = fminunc(fun, x0, options, P1, P2, ...)

• fun: the objective function (a function handle or name). • x0: the initial guess; a vector whose size is the number of design variables. • options: sets some of the optimization parameters (more in a few slides). • P1, P2, ...: additional parameters passed through to the objective function.

Ref. Manual: pp. 5-5 to 5-9
More on fminunc – Outputs

[xmin, feval, exitflag, output, grad, hessian] = fminunc(fun, x0, options, P1, P2, ...)

• xmin: vector of the minimum (optimal) point; its size is the number of design variables. • feval: the objective function value at the optimal point. • exitflag: indicates whether the routine terminated successfully (converged if > 0). • output: a structure giving more details about the optimization. • grad: the gradient at the optimal point. • hessian: the Hessian at the optimal point.

Ref. Manual: pp. 5-5 to 5-9
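A sketch of requesting the extra outputs from fminunc, reusing the slide's objective (no expected values shown, since they depend on the solver version):

```matlab
objfun = @(x) exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
x0 = [-1, 1];                              % initial guess
options = optimset('LargeScale', 'off');
[xmin, feval, exitflag, output, grad, hessian] = ...
    fminunc(objfun, x0, options);
% At a local minimum, grad should be near zero and hessian positive definite.
```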
Options Setting – optimset

options = optimset('param1', value1, 'param2', value2, ...)

• The routines in the Optimization Toolbox have a set of default optimization parameters. • The toolbox allows you to alter some of them: for example, the tolerances, the step size, user-supplied gradient or Hessian values, the maximum number of iterations, etc. • There are also features such as displaying the values at each iteration and checking user-supplied gradients or Hessians. • You can also choose the algorithm you wish to use.

Ref. Manual: pp. 5-10 to 5-14
Options Setting (Cont.)

options = optimset('param1', value1, 'param2', value2, ...)

• Type help optimset in the command window and a list of the available option settings will be displayed. • How to read an entry? For example: LargeScale - Use large-scale algorithm if possible [ {on} | off ] Here LargeScale is the parameter (param1), on/off are its possible values (value1), and the default is the one shown in braces { }.
Options Setting (Cont.)

LargeScale - Use large-scale algorithm if possible [ {on} | off ]

• Since the default is on, to turn it off we just type: options = optimset('LargeScale', 'off') and pass options to the input of fminunc.
Useful Option Settings (highly recommended!) • Display - Level of display [ off | iter | notify | final ] • MaxIter - Maximum number of iterations allowed [ positive integer ] • TolCon - Termination tolerance on the constraint violation [ positive scalar ] • TolFun - Termination tolerance on the function value [ positive scalar ] • TolX - Termination tolerance on X [ positive scalar ] Ref. Manual: pp. 5-10 to 5-14
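A sketch combining the recommended settings above into one options structure (the numeric values are illustrative):

```matlab
options = optimset('Display', 'iter', ...   % print progress every iteration
                   'MaxIter', 200, ...      % cap on iterations
                   'TolFun', 1e-8, ...      % termination tolerance on f
                   'TolX', 1e-8);           % termination tolerance on x
```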
fminunc and fminsearch • fminunc uses gradient and Hessian information. • Two modes: • Large-scale: interior-reflective Newton • Medium-scale: quasi-Newton (BFGS) • Not preferred for solving highly discontinuous functions. • This function may only give local solutions.
fminunc and fminsearch (Cont.) • fminsearch is generally less efficient than fminunc for problems of order greater than two; however, when the problem is highly discontinuous, fminsearch may be more robust. • It is a direct search method that uses neither numerical nor analytic gradients, unlike fminunc. • This function may only give local solutions.
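A minimal sketch of calling fminsearch on the same objective used earlier; only function values are needed, no gradients:

```matlab
objfun = @(x) exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
x0 = [-1, 1];                            % initial guess
[xmin, fval] = fminsearch(objfun, x0);   % direct-search minimization
```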
Constrained Minimization

[xmin, feval, exitflag, output, lambda, grad, hessian] = fmincon(fun, x0, A, B, Aeq, Beq, LB, UB, NONLCON, options, P1, P2, ...)

lambda is the vector of Lagrange multipliers at the optimal point.
Example

function f = myfun(x)
f = -x(1)*x(2)*x(3);    % minimizing -x1*x2*x3 maximizes x1*x2*x3

Subject to: 0 ≤ x1 + 2x2 + 2x3 ≤ 72, the nonlinear constraint 2x1² + x2 ≤ 0, and the side constraints 0 ≤ x1, x2, x3 ≤ 30 (as encoded on the following slides)
Example (Cont.) For the nonlinear constraint 2x1² + x2 ≤ 0, create a function called nonlcon which returns the two constraint vectors [C, Ceq]:

function [C, Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);   % nonlinear inequalities, C <= 0
Ceq = [];              % remember to return an empty matrix if the constraint type does not apply
Example (Cont.)

x0 = [10; 10; 10];         % initial guess (3 design variables)
A = [-1 -2 -2; 1 2 2];     % linear inequalities A*x <= B
B = [0 72]';
LB = [0 0 0]';             % lower bounds
UB = [30 30 30]';          % upper bounds
[x, feval] = fmincon(@myfun, x0, A, B, [], [], LB, UB, @nonlcon)

CAREFUL!!! Keep the argument order: fmincon(fun, x0, A, B, Aeq, Beq, LB, UB, NONLCON, options, P1, P2, ...)
Example (Cont.)

Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
Optimization terminated successfully:
 Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
Active Constraints: 2 9
x =
   0.00050378663220   0.00000000000000  30.00000000000000
feval =
  -4.657237250542452e-035

The active-constraint indices count through the constraints in the argument sequence A, B, Aeq, Beq, LB, UB, C, Ceq: here the linear inequalities are constraints 1 and 2, the lower bounds 3 to 5, the upper bounds 6 to 8, and the nonlinear inequality 9.
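One way to see which constraints are active is to request the Lagrange multipliers; nonzero entries mark active constraints. A sketch using the example's setup:

```matlab
[x, feval, exitflag, output, lambda] = ...
    fmincon(@myfun, x0, A, B, [], [], LB, UB, @nonlcon);
lambda.ineqlin      % multipliers for the linear inequalities A*x <= B
lambda.lower        % multipliers for the lower bounds LB
lambda.upper        % multipliers for the upper bounds UB
lambda.ineqnonlin   % multipliers for the nonlinear inequalities C(x) <= 0
```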
Multiobjective Optimization • Previous examples involved problems with a single objective function. • Now let us look at solving a problem with a multiobjective function using lsqnonlin. • The example is designing an optimal PID controller for a plant.
Simulink Example Goal: optimize the control parameters in the Simulink model optsim.mdl in order to minimize the error between the output and the input. • Plant description: • Third order, under-damped, with actuator limits. • The actuation limits are a saturation limit and a slew rate limit. • The saturation limit cuts off the input at ±2 units. • The slew rate limit is 0.8 units/sec.
Simulink Example (Cont.) Initial PID Controller Design
Solving Methodology • The design variables are the gains in the PID controller (KP, KI, and KD). • The objective function is the error between the output and the input.
Solving Methodology (Cont.) • Let pid = [Kp Ki Kd]T. • Let the step input be unity, so the error vector is F = yout - 1. • Construct a function tracklsq as the objective function.
Objective Function

function F = tracklsq(pid, a1, a2)
Kp = pid(1); Ki = pid(2); Kd = pid(3);
% Compute function value: get the simulation data from Simulink
opt = simset('solver', 'ode5', 'SrcWorkspace', 'Current');
[tout, xout, yout] = sim('optsim', [0 100], opt);
F = yout - 1;

The idea is to perform nonlinear least-squares minimization of the errors from time 0 to 100 at a time step of 1, so there are 101 objective function values to minimize.
The lsqnonlin Routine

[X, RESNORM, RESIDUAL, EXITFLAG, OUTPUT, LAMBDA, JACOBIAN] = lsqnonlin(FUN, X0, LB, UB, OPTIONS, P1, P2, ...)
Invoking the Routine

clear all
optsim;                            % open the Simulink model
pid0 = [0.63 0.0504 1.9688];       % initial controller gains
a1 = 3; a2 = 43;                   % plant parameters passed to tracklsq
options = optimset('LargeScale', 'off', 'Display', 'iter', ...
                   'TolX', 0.001, 'TolFun', 0.001);
pid = lsqnonlin(@tracklsq, pid0, [], [], options, a1, a2);
Kp = pid(1); Ki = pid(2); Kd = pid(3);
Results: optimal gains (iteration output shown on slide)
Results (Cont.): response plots for the initial design, the optimization process, and the optimal controller (shown on slide)
Conclusion • Easy to use! But we do not know what is happening behind each routine, so it is still important to understand the limitations of each one. • Basic steps: • Recognize the class of optimization problem • Define the design variables • Create the objective function • Recognize the constraints • Choose an initial guess • Invoke a suitable routine • Analyze the results (they might not make sense)
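The basic steps above can be sketched end to end; the objective, constraint, and bounds here are illustrative, not from the slides:

```matlab
% 1-2: constrained minimization over design variables x(1), x(2)
objfun = @(x) (x(1) - 1)^2 + (x(2) - 2)^2;   % 3: objective function
A = [1 1]; B = 2;                            % 4: linear constraint x1 + x2 <= 2
LB = [0 0]; UB = [5 5];                      %    side constraints
x0 = [0.5 0.5];                              % 5: initial guess
[xmin, fval, exitflag] = ...
    fmincon(objfun, x0, A, B, [], [], LB, UB);   % 6: invoke a suitable routine
if exitflag > 0                              % 7: analyze the results
    disp(xmin)                               % converged to a local minimum
end
```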
Thank You! Questions & Suggestions?