
Chapter 7 Optimization


Presentation Transcript


  1. Chapter 7 Optimization

  2. Content • Introduction • One-dimensional unconstrained optimization • Multidimensional unconstrained optimization • Example

  3. Introduction (1) Root finding and optimization are closely related! A root finder locates x where f(x) = 0, while an optimizer locates x where f′(x) = 0 (a maximum or minimum of f). An optimum of f is therefore a root of its derivative.
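To make the connection concrete, here is a minimal Python sketch (the quadratic objective is an invented example, not one from the slides): a bisection root finder applied to f′ locates the minimizer of f.

    # Optimization via root finding: an interior optimum of a smooth f
    # satisfies f'(x) = 0, so a root finder applied to f' locates it.

    def bisect(g, lo, hi, tol=1e-8):
        """Root of g on [lo, hi] by bisection; g(lo) and g(hi) must differ in sign."""
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if g(lo) * g(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    f = lambda x: (x - 2.0) ** 2 + 1.0   # invented objective, minimum at x = 2
    df = lambda x: 2.0 * (x - 2.0)       # its derivative

    x_star = bisect(df, 0.0, 5.0)
    print(x_star, f(x_star))             # ~2.0, ~1.0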

  4. Introduction (2) An optimization problem: “Find x, which minimizes or maximizes f(x) subject to d_i(x) ≤ a_i for i = 1, …, m and e_i(x) = b_i for i = 1, …, p.” Here x is an n-dimensional design vector, f(x) is the objective function, the d_i(x) are inequality constraints, the e_i(x) are equality constraints, and the a_i and b_i are constants.

  5. Introduction (3) Classification • Constrained optimization problems: 1) Linear programming: f(x) and the constraints are linear. 2) Quadratic programming: f(x) is quadratic and the constraints are linear. 3) Nonlinear programming: f(x) is neither linear nor quadratic, and/or the constraints are nonlinear. • Unconstrained optimization problems: no constraints on x.
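As a concrete instance of the linear-programming class, here is a hedged sketch using SciPy's linprog; the objective and constraint numbers are invented for illustration.

    # Linear program: both the objective and the constraints are linear.
    # maximize x + 2y  <=>  minimize -x - 2y
    # subject to x + y <= 4, x >= 0, y >= 0
    from scipy.optimize import linprog

    res = linprog(c=[-1, -2], A_ub=[[1, 1]], b_ub=[4],
                  bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)   # optimal point [0, 4] and maximized value 8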

  6. One-dimensional (1) We are interested in finding the absolute highest or lowest value of a function. A multimodal function has several local optima in addition to the global one.

  7. One-dimensional (2) To distinguish the global minimum (or maximum) from local ones we can try: • Graphing, to gain insight into the behavior of the function. • Using randomly generated starting guesses and picking the best of the optima found as the global one. • Perturbing the starting point to see whether the routine returns a better point or the same local optimum.
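The second strategy is straightforward to automate. A minimal sketch, assuming an invented multimodal test function and using SciPy's local minimizer from many random starts:

    # Multistart: run a local optimizer from random starting guesses
    # and keep the best result as the candidate global minimum.
    import numpy as np
    from scipy.optimize import minimize

    f = lambda x: np.sin(x[0]) + 0.1 * x[0] ** 2   # invented multimodal function

    rng = np.random.default_rng(0)
    starts = rng.uniform(-10, 10, size=20)
    results = [minimize(f, x0=[s]) for s in starts]
    best = min(results, key=lambda r: r.fun)
    print(best.x, best.fun)   # best local minimum found across all starts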

  8. One-dimensional (3) Golden-section search (GSS, for a unimodal function) • A unimodal function has a single maximum or minimum in a given interval. • Steps of GSS: • Pick two points [xl, xu] that bracket the extremum. • Pick a third point within this interval to determine whether an extremum has occurred there. • Then pick a fourth point to determine whether the extremum lies within the first three or the last three points. • The key to efficiency is choosing the intermediate points wisely, so that old function values can be reused and the number of function evaluations is minimized.

  9. One-dimensional (4) The GSS principle: always keep the ratio of the section lengths equal at each iteration. This requirement leads to the golden ratio R = (√5 − 1)/2 ≈ 0.618.

  10. One-dimensional (5) Step I: Start with two initial guesses (xl, xu). Step II: Calculate the interior points according to the golden ratio: x1 = xl + d and x2 = xu − d, where d = R (xu − xl) and R = (√5 − 1)/2. Step III: Evaluate the function at x1 and x2.

  11. One-dimensional (6) Step IV: If f(x1) > f(x2), set xu = x1 (the minimum lies in [xl, x1]); if f(x1) < f(x2), set xl = x2 (the minimum lies in [x2, xu]). Step V: Calculate new interior points x1 and x2 for the reduced interval and repeat until the interval is smaller than a tolerance. Try it on the following example!

  12. One-dimensional (7) Ex: Use the GSS to find the minimum of f(x), assuming xl = 0 and xu = 4. Answer: x = 1.4276.
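Putting Steps I–V together, here is a minimal golden-section minimizer in Python (the function and variable names are my own; applied to the slide's objective on [0, 4] it should return x ≈ 1.4276):

    import math

    def golden_min(f, xl, xu, tol=1e-6):
        """Golden-section search for the minimum of a unimodal f on [xl, xu]."""
        R = (math.sqrt(5) - 1) / 2        # golden ratio, ~0.618
        d = R * (xu - xl)
        x1, x2 = xl + d, xu - d           # interior points, x2 < x1
        f1, f2 = f(x1), f(x2)
        while xu - xl > tol:
            if f1 > f2:                   # minimum lies in [xl, x1]
                xu = x1
                x1, f1 = x2, f2           # old x2 is reused as the new x1
                x2 = xu - R * (xu - xl)
                f2 = f(x2)
            else:                         # minimum lies in [x2, xu]
                xl = x2
                x2, f2 = x1, f1           # old x1 is reused as the new x2
                x1 = xl + R * (xu - xl)
                f1 = f(x1)
        return (xl + xu) / 2

    # e.g. golden_min(f, 0.0, 4.0) with the slide's objective -> ~1.4276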

  13. Multidimensional… (1) • Techniques to find the minimum or maximum of a function of several variables. • Classified by whether they require derivative evaluation: • Require derivative evaluation: gradient, or descent (ascent), methods. • Do not require derivative evaluation: non-gradient, or direct, methods.


  15. Multidimensional… (3) DIRECT METHODS: Random Search • Based on evaluating the function at randomly selected values of the independent variables. • If a sufficient number of samples is taken, the optimum will eventually be located. • Ex: Locate the maximum of f(x, y) = y − x − 2x² − 2xy − y².
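A sketch of random search on this example (the sampling box and sample count are assumptions; setting both partial derivatives to zero shows the true maximum is f(−1, 1.5) = 1.25):

    # Random search: sample (x, y) uniformly over a box, keep the best.
    import random

    def f(x, y):
        return y - x - 2 * x ** 2 - 2 * x * y - y ** 2

    random.seed(1)
    best = (None, None, float("-inf"))
    for _ in range(100_000):
        x = random.uniform(-2.0, 2.0)    # assumed search box
        y = random.uniform(-2.0, 2.0)
        if f(x, y) > best[2]:
            best = (x, y, f(x, y))

    print(best)   # approaches the true maximum at (-1, 1.5), f = 1.25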


  17. Multidimensional… (5) • Advantages: • Works even for discontinuous and nondifferentiable functions. • Finds the global optimum rather than a merely local one. • Disadvantages: • As the number of independent variables grows, the task becomes onerous. • Not efficient: it does not account for the behavior of the underlying function.

  18. Multidimensional… (6) Univariate and Pattern Searches • More efficient than random search, and still requires no derivative evaluation. • The basic strategy: change one variable at a time while the other variables are held constant. The problem is thus reduced to a sequence of one-dimensional searches that can be solved by a variety of methods. • The search becomes less efficient as the maximum is approached.
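A sketch of the one-variable-at-a-time strategy on the earlier random-search example (the starting point, bounds, and cycle count are assumptions); each one-dimensional search uses SciPy's bounded scalar minimizer on −f:

    # Univariate (cyclic coordinate) search: optimize x with y fixed,
    # then y with x fixed, and repeat.
    from scipy.optimize import minimize_scalar

    def f(x, y):
        return y - x - 2 * x ** 2 - 2 * x * y - y ** 2

    x, y = 0.0, 0.0                       # assumed starting point
    for _ in range(20):                   # cycle through the coordinates
        x = minimize_scalar(lambda t: -f(t, y), bounds=(-4, 4), method='bounded').x
        y = minimize_scalar(lambda t: -f(x, t), bounds=(-4, 4), method='bounded').x

    print(x, y, f(x, y))                  # converges toward (-1, 1.5), f = 1.25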

  19. Multidimensional… (7) Univariate and Pattern Searches

  20. Multidimensional… (8) Gradient method • If f(x, y) is a two-dimensional function, the gradient vector ∇f = [∂f/∂x, ∂f/∂y]ᵀ tells us: • in which direction the steepest ascent lies, and • how much we gain by taking a step in that direction. • The directional derivative of f(x, y) at the point x = a, y = b, along a direction making angle θ with the x-axis, is (∂f/∂x) cos θ + (∂f/∂y) sin θ, with the partial derivatives evaluated at (a, b).
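A numerical check of these definitions on the running example f(x, y) (the evaluation point (a, b) and the angle θ are arbitrary choices):

    import math

    def f(x, y):
        return y - x - 2 * x ** 2 - 2 * x * y - y ** 2

    def grad(x, y):
        # analytic partial derivatives of f
        return (-1 - 4 * x - 2 * y, 1 - 2 * x - 2 * y)

    a, b = 0.0, 0.0                  # assumed evaluation point
    fx, fy = grad(a, b)
    theta = math.pi / 4              # assumed direction
    d_dir = fx * math.cos(theta) + fy * math.sin(theta)
    print((fx, fy), d_dir)           # gradient and directional derivative at (a, b)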

  21. Multidimensional… (9) Gradient method (cont’d) • For n dimensions: ∇f = [∂f/∂x1, ∂f/∂x2, …, ∂f/∂xn]ᵀ.

  22. Multidimensional… (10) Steepest ascent method • Use the gradient vector to update x: x_{i+1} = x_i + h ∇f(x_i). • The step size h is chosen by a one-dimensional search along the gradient direction (the h-axis). • The gradient vector represents the direction of the maximum rate of increase of the function f(x) at x.
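A minimal steepest-ascent sketch of the update above, applied to the running example; for simplicity it uses a fixed step size h (an assumption) instead of the one-dimensional line search:

    def f(x, y):
        return y - x - 2 * x ** 2 - 2 * x * y - y ** 2

    def grad(x, y):
        return (-1 - 4 * x - 2 * y, 1 - 2 * x - 2 * y)

    x, y, h = 0.0, 0.0, 0.1               # assumed start and step size
    for _ in range(200):
        gx, gy = grad(x, y)
        x, y = x + h * gx, y + h * gy     # step along the gradient

    print(x, y, f(x, y))                  # approaches the maximum at (-1, 1.5)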

  23. Multidimensional… (11) Example: Minimize the function, starting at x = 0.6, y = 4.

  24. Multidimensional… (12) Example: Maximize the function, starting at x = −1, y = 1. Answer: x = 2, y = 1.

  25. Multidimensional… (13) Example: Maximize the function, starting at x = 0.6, y = 4.
