Roots of Equations Open Methods (Part 1) Fixed Point Iteration & Newton-Raphson Methods
The following root finding methods will be introduced:
A. Bracketing Methods
 A.1. Bisection Method
 A.2. Regula Falsi
B. Open Methods
 B.1. Fixed Point Iteration
 B.2. Newton-Raphson Method
 B.3. Secant Method
B. Open Methods To find a root of f(x) = 0, we construct a magic formula xi+1 = g(xi) and use it to predict the root iteratively until xi converges to a root. However, xi may diverge! (Figures: the bisection method; an open method that diverges; an open method that converges.)
What you should know about Open Methods How do we construct the magic formula g(x)? How can we ensure convergence? What makes a method converge quickly or diverge? How fast does a method converge?
B.1. Fixed Point Iteration • Also known as one-point iteration or successive substitution • To find a root of f(x) = 0, we reformulate f(x) = 0 so that x appears alone on one side of the equation, i.e., x = g(x). • If we can solve g(x) = x, we solve f(x) = 0. • x is known as the fixed point of g(x). • We solve g(x) = x by computing xi+1 = g(xi) until xi+1 converges to x.
Fixed Point Iteration – Example Reason: If xi converges, i.e., xi+1 ≈ xi = x, then taking limits in xi+1 = g(xi) gives x = g(x), and hence f(x) = 0.
Example Find a root of f(x) = e^-x - x = 0. (Answer: α = 0.56714329)
Two Curve Graphical Method The point x where the two curves, f1(x) = x and f2(x) = g(x), intersect is the solution to f(x) = 0.
Fixed Point Iteration • There are infinitely many ways to construct g(x) from f(x). For example, f(x) = x^2 - 2x - 3 = 0 (ans: x = 3 or -1):
Case a: x = sqrt(2x + 3)
Case b: x = 3 / (x - 2)
Case c: x = (x^2 - 3) / 2
So which one is better?
Case a: x0 = 4, x1 = 3.31662, x2 = 3.10375, x3 = 3.03439, x4 = 3.01144, x5 = 3.00381 → Converges!
Case b: x0 = 4, x1 = 1.5, x2 = -6, x3 = -0.375, x4 = -1.263158, x5 = -0.919355, x6 = -1.02762, x7 = -0.990876, x8 = -1.00305 → Converges, but more slowly
Case c: x0 = 4, x1 = 6.5, x2 = 19.625, x3 = 191.070 → Diverges!
How to choose g(x)? • Can we know which g(x) will converge to the solution before we do the computation?
Convergence of Fixed Point Iteration
By definition: α = g(α)
Fixed point iteration: xi+1 = g(xi)
Subtracting: α - xi+1 = g(α) - g(xi)
Convergence of Fixed Point Iteration According to the mean value theorem, if g(x) and g'(x) are continuous over an interval xi ≤ x ≤ α, there exists a value x = c within the interval such that g(α) - g(xi) = g'(c)(α - xi). Writing Ei = α - xi, this gives Ei+1 = g'(c) Ei. • Therefore, if |g'(c)| < 1, the error decreases with each iteration. If |g'(c)| > 1, the error increases. • If the derivative is positive, the iterative solution will be monotonic. • If the derivative is negative, the errors will oscillate.
(a) |g'(x)| < 1, g'(x) is +ve • converge, monotonic (b) |g'(x)| < 1, g'(x) is -ve • converge, oscillate (c) |g'(x)| > 1, g'(x) is +ve • diverge, monotonic (d) |g'(x)| > 1, g'(x) is -ve • diverge, oscillate
Fixed Point Iteration Impl. (as C function)

// x0: Initial guess of the root
// es: Acceptable relative percentage error
// iter_max: Maximum number of iterations allowed
double FixedPt(double x0, double es, int iter_max)
{
    double xr = x0;     // Estimated root
    double xr_old;      // Keep xr from previous iteration
    double ea = 100.0;  // Approximate relative error (%)
    int iter = 0;       // Keep track of # of iterations

    do {
        xr_old = xr;
        xr = g(xr_old);  // g(x) has to be supplied
        if (xr != 0)
            ea = fabs((xr - xr_old) / xr) * 100;
        iter++;
    } while (ea > es && iter < iter_max);

    return xr;
}
The following root finding methods will be introduced:
A. Bracketing Methods
 A.1. Bisection Method
 A.2. Regula Falsi
B. Open Methods
 B.1. Fixed Point Iteration
 B.2. Newton-Raphson Method
 B.3. Secant Method
B.2. Newton-Raphson Method Use the slope of f(x) to predict the location of the root: xi+1 = xi - f(xi) / f'(xi). xi+1 is the point where the tangent to f at xi intersects the x-axis.
Newton-Raphson Method What would happen when f'(α) = 0? For example, f(x) = (x - 1)^2 = 0
Error Analysis of Newton-Raphson Method
By definition: Ei = α - xi
Newton-Raphson method: xi+1 = xi - f(xi) / f'(xi)
Error Analysis of Newton-Raphson Method Suppose α is the true root (i.e., f(α) = 0). Using Taylor's series, 0 = f(α) = f(xi) + f'(xi)(α - xi) + (f''(c)/2)(α - xi)^2, where c is between xi and α. Substituting f(xi) = -f'(xi)(xi+1 - xi) from the Newton-Raphson step gives Ei+1 = -f''(c) / (2 f'(xi)) · Ei^2. When xi and α are very close to each other, Ei+1 ≈ -f''(α) / (2 f'(α)) · Ei^2. The iterative process is said to be of second order.
The Order of Iterative Process (Definition) Using an iterative process we get xk+1 from xk and other info. We have x0, x1, x2, …, xk+1 as estimates of the root α. Let δk = α – xk. Then we may observe |δk+1| ≈ C |δk|^p for some constant C > 0. The process in such a case is said to be of p-th order. • It is called superlinear if p > 1. • It is called quadratic if p = 2. • It is called linear if p = 1. • It is called sublinear if p < 1.
Error of the Newton-Raphson Method Each error is approximately proportional to the square of the previous error. This means that the number of correct decimal places roughly doubles with each approximation. Example: Find the root of f(x) = e^-x - x = 0 (Ans: α = 0.56714329)
Newton-Raphson vs. Fixed Point Iteration Find a root of f(x) = e^-x - x = 0. (Answer: α = 0.56714329) (Table: iterations of fixed point iteration, with g(x) = e^-x, compared against Newton-Raphson.)
Pitfalls of the Newton-Raphson Method • Sometimes convergence is slow
Pitfalls of the Newton-Raphson Method Figure (a): An inflection point (f''(x) = 0) in the vicinity of a root can cause divergence. Figure (b): A local maximum or minimum causes oscillations.
Pitfalls of the Newton-Raphson Method Figure (c): The iterates may jump from a location close to one root to a location several roots away. Figure (d): A zero slope causes division by zero.
Overcoming the Pitfalls? • There are no general convergence criteria for the Newton-Raphson method. • Convergence depends on the nature of the function and the accuracy of the initial guess. • A guess close to the true root is always a better choice. • Good knowledge of the function or graphical analysis can help you make good guesses. • Good software should recognize slow convergence or divergence. • At the end of the computation, the final root estimate should always be substituted into the original function to verify the solution.
Other Facts • The Newton-Raphson method converges quadratically (when it converges), except when the root is a multiple root. • When the initial guess is close to the root, the Newton-Raphson method usually converges. • To improve the chance of convergence, we could use a bracketing method to locate a good initial value for the Newton-Raphson method.
Summary • Differences between bracketing methods and open methods for locating roots • Guarantee of convergence? • Performance? • Convergence criterion for the fixed point iteration method • Rate of convergence • Linear, quadratic, superlinear, sublinear • Understand what conditions make the Newton-Raphson method converge quickly or diverge