CS B553: Algorithms for Optimization and Learning Root finding
[Figure: graph of g(x) versus x; the roots of g are the points where the curve crosses the x axis]
Key Ideas
• Newton's method
• Secant method
• Superlinear convergence rates
• Initialization and termination
• Approximate differentiation
• Numerical considerations
Figure 10: Newton's method. [Graph of g(x) with tangent lines drawn at successive iterates x0, x1, x2.] In a neighborhood of a root, the line tangent to the graph crosses the x axis near the root; iterating this step drives the estimates toward the root.
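The tangent-line step in Figure 10 is the Newton update x_{k+1} = x_k - g(x_k)/g'(x_k). A minimal Python sketch (function names, tolerance, and iteration cap are illustrative choices, not from the slides):

```python
def newton(g, dg, x0, tol=1e-10, max_iter=50):
    """Find a root of g by Newton's method; dg is the derivative of g."""
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) < tol:           # close enough to a root
            return x
        dgx = dg(x)
        if dgx == 0:                # horizontal tangent: the step is undefined
            raise ZeroDivisionError("g'(x) = 0 at x = %g" % x)
        x = x - gx / dgx            # where the tangent line crosses the x axis
    return x                        # best estimate after max_iter steps

# Example: starting near x0 = 1, the iterates converge to sqrt(2)
root = newton(lambda x: x**2 - 2, lambda x: 2*x, x0=1.0)
```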
Figure 11: Divergence. [Graph of g(x) with successive Newton iterates x1, x2, x3, x4, x5 stepping farther and farther from the root.] Started too far from a root, the tangent steps can overshoot and the iterates move away from it.
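A classic illustration of this failure (my own example, not from the slides) is g(x) = arctan(x): for starting points far enough from the root at x = 0, each Newton step overshoots by more than the previous one and the iterates blow up.

```python
import math

def newton_step(g, dg, x):
    """One Newton update: where the tangent at x crosses the x axis."""
    return x - g(x) / dg(x)

g  = math.atan                       # single root at x = 0
dg = lambda x: 1.0 / (1.0 + x * x)

x = 2.0                              # far enough from 0 that Newton overshoots
for k in range(5):
    x = newton_step(g, dg, x)
    print(k + 1, x)                  # |x| grows every iteration: divergence
```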
Figure 12: Oscillation. [Newton iterates bouncing back and forth between the same points on the x axis without ever approaching a root.]
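A standard example of such a cycle (again an illustration, not taken from the slides) is g(x) = x^3 - 2x + 2 started at x0 = 0: Newton's method hops between 0 and 1 forever, while the actual root lies near -1.77.

```python
g  = lambda x: x**3 - 2*x + 2
dg = lambda x: 3*x**2 - 2

x = 0.0
for k in range(6):
    x = x - g(x) / dg(x)
    print(k + 1, x)    # prints 1, 0, 1, 0, ...: a 2-cycle that never reaches the root
```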
Figure 13: Secant method. [Graph of g(x) with secant lines drawn through successive pairs of iterates x0, x1, x2, x3.] Idea: use the line through two points on the graph as an approximation of the derivative.
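Replacing the derivative in the Newton update with the slope of that secant gives x_{k+1} = x_k - g(x_k)(x_k - x_{k-1}) / (g(x_k) - g(x_{k-1})). A minimal sketch (names and tolerances are illustrative):

```python
def secant(g, x0, x1, tol=1e-10, max_iter=100):
    """Find a root of g with the secant method; no derivative required."""
    g0, g1 = g(x0), g(x1)
    for _ in range(max_iter):
        if abs(g1) < tol:                      # close enough to a root
            return x1
        # Slope of the line through (x0, g0) and (x1, g1) stands in for g'(x1)
        x2 = x1 - g1 * (x1 - x0) / (g1 - g0)
        x0, g0 = x1, g1
        x1, g1 = x2, g(x2)
    return x1

# Example: starting from the pair (1, 2), the iterates approach sqrt(2)
root = secant(lambda x: x**2 - 2, 1.0, 2.0)
```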
Orders of convergence
• Bisection: linear
• Newton's method: quadratic
• Secant method: superlinear, order ≈ 1.6 (the golden ratio)
• Only bisection has guaranteed convergence (given an initial interval over which g changes sign)
• Newton's method needs derivatives
• Most "out of the box" subroutines take a hybrid approach, pairing bisection's guaranteed convergence with faster superlinear steps; see the sketch below
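One widely used hybrid is Brent's method, which combines bisection with secant and inverse quadratic interpolation steps. A minimal usage sketch with SciPy's brentq (assuming SciPy is available; the test function is my own example):

```python
from scipy.optimize import brentq

g = lambda x: x**3 - 2*x - 5     # g(2) = -1 < 0 < g(3) = 16, so [2, 3] brackets a root

# Like bisection, brentq needs a sign-changing interval, but it converges
# superlinearly on well-behaved functions.
root = brentq(g, 2.0, 3.0)
print(root)                      # about 2.0946
```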
Figure 14: Basins of attraction in the complex plane for x^5 - 1 = 0 (each starting point is colored by the root Newton's method converges to)
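A short sketch (my own illustration, not from the slides) of how such a picture can be computed: run Newton's method for g(z) = z^5 - 1 from a grid of complex starting points and record which of the five roots each point ends up at.

```python
import numpy as np

g  = lambda z: z**5 - 1
dg = lambda z: 5 * z**4
roots = np.exp(2j * np.pi * np.arange(5) / 5)    # the five fifth roots of unity

# Grid of complex starting points
re, im = np.meshgrid(np.linspace(-2, 2, 400), np.linspace(-2, 2, 400))
z = re + 1j * im

for _ in range(40):              # Newton iterations applied to the whole grid
    z = z - g(z) / dg(z)

# Basin index: the root each starting point converged to (0..4)
basin = np.argmin(np.abs(z[..., None] - roots), axis=-1)
# basin can then be plotted, e.g. with matplotlib's imshow, to reproduce the figure
```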