This chapter explores different root finding methods, including the Bisection Method, Newton's Method, Secant Method, and Fixed-point Iteration. It discusses their advantages, disadvantages, convergence, error estimation, and application examples.
3.1 The Bisection Method • Let f be a continuous function. Suppose we know that f(a) f(b) < 0; then, by the Intermediate Value Theorem, there is a root between a and b.
Example 3.1 • A formal statement is given in Algorithm 3.1.
Bisection Method • Advantage: • A global method: it always converges, no matter how far the starting interval is from the actual root. • Disadvantage: • It cannot be used to find roots where the function is tangent to the axis and does not cross it (for example, a function that touches the axis without a sign change). • It converges slowly compared with other methods.
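The halving procedure described above can be sketched in Python. This is a minimal illustration, not the text's Algorithm 3.1; the function name, tolerance, and iteration cap are illustrative choices.

```python
def bisect(f, a, b, tol=1e-10, max_iter=200):
    """Find a root of f in [a, b], assuming f(a)*f(b) < 0."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        c = 0.5 * (a + b)          # midpoint of the current bracket
        fc = f(c)
        if fc == 0 or 0.5 * (b - a) < tol:
            return c
        if fa * fc < 0:            # sign change in [a, c]: root is there
            b, fb = c, fc
        else:                      # otherwise the root is in [c, b]
            a, fa = c, fc
    return 0.5 * (a + b)

# Example: the root of x^3 - 2 in [1, 2] is 2**(1/3)
root = bisect(lambda x: x**3 - 2, 1.0, 2.0)
```

Because the bracket length is halved every step, the error bound shrinks by a factor of 2 per iteration, which is exactly the slow-but-guaranteed behavior noted above.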
3.2 Newton’s Method: Derivation and Examples • Newton’s method is the classic algorithm for finding roots of functions. • Two good derivations of Newton’s method: • Geometric derivation • Analytic derivation
Newton’s Method: Geometric Derivation • The fundamental idea in Newton’s method is to use the tangent-line approximation to the function f at the current point x0. • The point-slope formula for the equation of the straight line gives us y = f(x0) + f'(x0)(x − x0); setting y = 0 and solving for x yields x1 = x0 − f(x0)/f'(x0). • Continuing the process with another tangent line at each new point gives the general step x_{n+1} = x_n − f(x_n)/f'(x_n).
Newton’s Method • Advantage: • Very fast (quadratic convergence near a simple root) • Disadvantage: • Not a global method • For example: Figure 3.3 (root x = 0.5) • Another example: Figure 3.4 (root x = 0.05) • In these examples, the initial point must be chosen carefully. • For a poor initial guess, Newton’s method may cycle indefinitely, hopping back and forth between two values. • For example: consider a function with root x = 0 for which the iterates alternate between two points.
[Table: Newton iterates from the initial value; the early iterates are wrong predictions (the root is positive, but the iterates are not), while the later iterates are very close to the actual root.]
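The basic iteration derived above can be sketched as follows; the function name, tolerance, and example problem are illustrative, not from the text.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton iteration: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # stop when the residual is tiny
            return x
        x = x - fx / fprime(x)     # one tangent-line step
    return x

# From a reasonable starting point the convergence is very fast:
r = newton(lambda x: x**2 - 2, lambda x: 2.0*x, 1.0)
```

Note that nothing here guards against a bad starting point: if f'(x) is near zero or the iterates cycle, the loop simply exhausts `max_iter`, which mirrors the "not a global method" caveat above.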
3.3 How to Stop Newton’s Method • Ideally, we would stop when the error is sufficiently small; since the exact root is unknown, in practice we stop when |x_{n+1} − x_n| or |f(x_n)| falls below a tolerance. (p. 12)
3.4 Application: Division using Newton’s Method • The purpose is to illustrate the use of Newton’s method and the analysis of the resulting iteration. • To approximate 1/a without performing division, apply Newton’s method to f(x) = 1/x − a, for which f'(x) = −1/x²; the Newton step simplifies to x_{n+1} = x_n(2 − a·x_n), which uses only multiplication and subtraction.
Questions: • When does this iteration converge, and how fast? • What initial guesses x0 will work for us? • The way that the computer stores numbers:
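Assuming the standard choice f(x) = 1/x − a described above, the multiply-and-subtract iteration can be sketched as follows; the function name and the example values are illustrative.

```python
def reciprocal(a, x0, n=10):
    """Approximate 1/a using only multiplication and subtraction:
    Newton's method on f(x) = 1/x - a gives x_{k+1} = x_k * (2 - a*x_k)."""
    x = x0
    for _ in range(n):
        x = x * (2.0 - a * x)
    return x

# The iteration converges for 0 < x0 < 2/a; e.g. a = 7 with x0 = 0.1:
approx = reciprocal(7.0, 0.1)
```

A short calculation shows why the initial guess matters: the relative error r_n = 1 − a·x_n satisfies r_{n+1} = r_n², so the iteration converges quadratically exactly when |r_0| < 1, i.e. when 0 < x0 < 2/a.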
Definition 3.1 • A sequence converges to α with order p if lim |α − x_{n+1}| / |α − x_n|^p = C for some nonzero, finite constant C. • The requirement that C be nonzero and finite forces p to be a single, unique value. • Linear convergence: p = 1 (with 0 < C < 1) • Quadratic convergence: p = 2 • Superlinear convergence: p = 1, but C = 0, i.e. the error ratio |α − x_{n+1}| / |α − x_n| tends to zero.
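The order p can be checked numerically from successive errors, since |α − x_{n+1}| ≈ C·|α − x_n|^p implies p ≈ log(e_{n+1}/e_n) / log(e_n/e_{n−1}). This hypothetical sketch applies the check to Newton's method on x² − 2 = 0 (an example chosen here, not taken from the text):

```python
import math

alpha = math.sqrt(2.0)      # the exact root, known for this test problem
x = 1.0
errs = []
for _ in range(5):
    errs.append(abs(x - alpha))
    x = x - (x*x - 2.0) / (2.0*x)   # Newton step for f(x) = x^2 - 2

# Fit p from three consecutive errors (early iterates, before rounding
# error dominates):
p_est = math.log(errs[3] / errs[2]) / math.log(errs[2] / errs[1])
```

For this problem `p_est` comes out close to 2, consistent with the quadratic convergence of Newton's method at a simple root.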
3.6 Newton’s Method:Theory and Convergence • Its proof is shown at pp. 106-108.
Questions: • Can we find an initial guess such that Newton’s method will always converge for b on this interval? • How rapidly will it converge? • The Newton error formula (3.12) applied here gives (3.25). • The relative error satisfies (3.26).
How do we find the initial value? • Choose the midpoint of the interval. • Or use linear interpolation, which is possible since b is known.
3.8 The Secant Method: Derivation and Examples • An obvious drawback of Newton’s method is that it requires a formula for the derivative of f. • One obvious way to deal with this problem is to use an approximation to the derivative in the Newton formula, e.g. a difference quotient. • Another method is the secant method. • It uses a secant line through the two most recent iterates in place of the tangent line, giving x_{n+1} = x_n − f(x_n)(x_n − x_{n−1}) / (f(x_n) − f(x_{n−1})).
The Secant Method • Its advantages over Newton’s method: • It does not require the derivative. • It can be coded so that it requires only a single new function evaluation per iteration. • Newton’s method requires two evaluations per iteration: one for the function and one for the derivative.
Error Estimation • The error formula for the secant method is analogous to the Newton error formula, but involves the two previous errors: α − x_{n+1} ≈ −(f''(α) / (2 f'(α))) (α − x_n)(α − x_{n−1}).
The Convergence • The analysis is almost the same as for Newton’s method; the secant method converges superlinearly, with order p = (1 + √5)/2 ≈ 1.62.
3.9 Fixed-point Iteration • The goal of this section is to use the added understanding of simple iteration to enhance our understanding of, and ability to solve, root-finding problems. • A root of f corresponds to a fixed point of an associated function g: a point α with g(α) = α.
Fixed-point Iteration • Because g(α) = α, such a point is called a fixed point of the function g, and an iteration of the form x_{n+1} = g(x_n) (3.33) is called a fixed-point iteration for g.
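The iteration x_{n+1} = g(x_n) can be sketched as follows; the function name, tolerance, and the particular choice of g are illustrative, not from the text.

```python
def fixed_point(g, x0, tol=1e-12, max_iter=200):
    """Iterate x_{n+1} = g(x_n) until successive iterates agree."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:   # iterates have stopped moving
            return x_new
        x = x_new
    return x

# Example: a root of f(x) = x^2 - 2 via g(x) = (x + 2/x)/2,
# since g(x) = x exactly when x^2 = 2.
fp = fixed_point(lambda x: 0.5*(x + 2.0/x), 1.0)
```

The choice of g matters: the same root can be written as a fixed point of many different functions g, and the iteration converges only when |g'(α)| < 1 near the fixed point.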
[Figure: graph of y = g(x) against y = x; the fixed point of g coincides with the root of f.]