
CS B553: Algorithms for Optimization and Learning


Presentation Transcript


  1. CS B553: Algorithms for Optimization and Learning Univariate optimization

  2. [Figure: plot of a univariate objective function f(x) against x]

  3. Key Ideas • Critical points • Direct methods • Exhaustive search • Golden section search • Root finding algorithms • Bisection • [More next time] • Local vs. global optimization • Analyzing errors, convergence rates

  4. Figure 1: a function f(x) with local maxima, local minima, and an inflection point.

  5. Figure 2a: a function f(x) on a closed interval [a, b].

  6. Figure 2b: Find the critical points of f on [a, b], then apply the second-derivative test.

  7. Figure 2b (continued).

  8. Figure 2c: The global minimum must be one of these candidate points: a critical point or an endpoint of [a, b].
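
A minimal Python sketch of the recipe on slides 6-8, assuming the derivative is worked out by hand; the objective f(x) = x^3 - x and every name below are illustrative, not from the slides:

    import numpy as np

    def f(x):
        return x**3 - x                    # example objective; f'(x) = 3x^2 - 1

    def f_second(x):
        return 6 * x                       # second derivative, for the test

    a, b = -2.0, 2.0
    critical = [-1 / np.sqrt(3), 1 / np.sqrt(3)]   # roots of f'(x) = 0, solved by hand

    local_minima = [x for x in critical if f_second(x) > 0]   # f'' > 0 => local min
    candidates = local_minima + [a, b]     # the endpoints always compete
    x_star = min(candidates, key=f)
    print(x_star, f(x_star))               # here the endpoint wins: f(-2) = -6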

  9. Figure 3: Exhaustive grid search over [a, b].

  10. Exhaustive grid search (continued).
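
A minimal sketch of exhaustive grid search (the function name grid_search and the multimodal test objective are mine, not from the slides). The cost is about (b - a)/ε evaluations of f:

    import numpy as np

    def grid_search(f, a, b, eps):
        """Evaluate f at grid points spaced at most eps apart; return the best."""
        n = int(np.ceil((b - a) / eps)) + 1
        xs = np.linspace(a, b, n)          # grid covering [a, b]
        i = np.argmin(f(xs))               # index of the smallest sampled value
        return xs[i], f(xs[i])

    # Multimodal example: many local minima, global minimum near x = 5
    f = lambda x: np.sin(3 * x) + 0.1 * (x - 5) ** 2
    print(grid_search(f, 0.0, 10.0, eps=0.01))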

  11. Figure 4: Two types of errors for an estimate xt of the true optimum x*: geometric error |xt - x*| (along the x-axis) and analytical error |f(xt) - f(x*)| (along the f-axis).

  12. Does exhaustive grid search achieve ε/2 geometric error? (Here ε is the grid spacing.)

  13. Does exhaustive grid search achieve ε/2 geometric error? Not necessarily for multimodal objective functions: some grid point lies within ε/2 of x*, but the grid point with the lowest f value may sit in a different basin, far from x*.

  14. Figure 5: Lipschitz continuity: |f(x) - f(y)| ≤ K|x - y| (the graph stays within a double cone of slopes +K and -K).

  15. Figure 6: Exhaustive grid search achieves Kε/2 analytical error in the worst case.
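
A quick worked version of this bound (my reconstruction of the argument, not verbatim from the slides): let xg be the grid point nearest x*, so |xg - x*| ≤ ε/2; if xt is the grid point with the smallest sampled value, then

    f(xt) - f(x*) ≤ f(xg) - f(x*) ≤ K|xg - x*| ≤ Kε/2,

where the middle step is exactly the Lipschitz condition of Figure 5.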

  16. Figure 7a: Golden section search. Bracket [a, b] with an intermediate point m such that f(m) < f(a), f(b).

  17. Figure 7b: Golden section search. A new point c (with a < c < m) yields candidate bracket 1 [a, m] and candidate bracket 2 [c, b].

  18. Figure 7b (continued).

  19. Figure 7b (continued).

  20. Figure 7b (continued).

  21. Optimal choice: based on the golden ratio. Choose c so that (c - a)/(m - c) = φ, where φ = (1 + √5)/2 ≈ 1.618 is the golden ratio => the bracket is reduced by a factor of φ - 1 ≈ 0.618 at each step.
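
A runnable sketch of golden section search, written in the standard two-interior-point form (equivalent bookkeeping to the slide's m and c, with the same φ - 1 shrink factor; all names are mine):

    import math

    INV_PHI = (math.sqrt(5) - 1) / 2       # φ - 1 ≈ 0.618, the bracket shrink factor

    def golden_section_search(f, a, b, tol=1e-8):
        """Minimize a unimodal f on [a, b]; return the final bracket midpoint."""
        c = b - INV_PHI * (b - a)          # lower interior point
        d = a + INV_PHI * (b - a)          # upper interior point
        while b - a > tol:
            if f(c) < f(d):                # minimum lies in [a, d]
                b, d = d, c
                c = b - INV_PHI * (b - a)
            else:                          # minimum lies in [c, b]
                a, c = c, d
                d = a + INV_PHI * (b - a)
        return (a + b) / 2

    # Example: the minimum of (x - 2)^2 + 1 is at x = 2
    print(golden_section_search(lambda x: (x - 2) ** 2 + 1, 0.0, 5.0))

For brevity this re-evaluates both interior points on each pass; the classic implementation caches one of the two f values so each iteration costs a single evaluation.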

  22. Notes • Exhaustive search is a global optimization: its error bound is for finding the true optimum • GSS is a local optimization: its error bound holds only for finding a local minimum • Convergence rate is linear: the bracket, and hence the bound on |xn - x*|, shrinks by the constant factor φ - 1 ≈ 0.618 per iteration, where xn is the sequence of bracket midpoints

  23. Figure 8: Root finding: find the x-value where f'(x) crosses 0 (the figure plots f(x) and its derivative f'(x)).

  24. Figure 9a: Bisection on g(x). Bracket [a, b]; invariant: sign(g(a)) != sign(g(b)).

  25. Figure 9: Bisection (continued). Evaluate the midpoint m = (a + b)/2 and keep the half-bracket that preserves the sign invariant.

  26. Figure 9: Bisection (continued).

  27. Figure 9: Bisection (continued).

  28. Figure 9: Bisection (continued).

  29. Figure 9: Bisection (continued).

  30. Figure 9: Bisection (continued). Bracket [a, b]; invariant: sign(g(a)) != sign(g(b)). Linear convergence: the bracket size is reduced by a factor of 0.5 at each iteration.
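
A minimal sketch of bisection (names are mine; per slide 23, minimizing f means running this on g = f'):

    def bisection(g, a, b, tol=1e-10):
        """Find a root of g in [a, b], assuming sign(g(a)) != sign(g(b))."""
        if g(a) * g(b) > 0:
            raise ValueError("bracket endpoints must have opposite signs")
        while b - a > tol:
            m = (a + b) / 2                # midpoint splits the bracket in half
            if g(a) * g(m) <= 0:           # sign change (or root) in [a, m]
                b = m
            else:                          # otherwise the root is in [m, b]
                a = m
        return (a + b) / 2

    # Minimize f(x) = (x - 2)^2 + 1 by finding the root of f'(x) = 2(x - 2)
    fprime = lambda x: 2 * (x - 2)
    print(bisection(fprime, 0.0, 5.0))     # -> 2.0, the critical point of f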

  31. Next time • Root finding methods with superlinear convergence • Practical issues
