The linear system • The problem: solve Ax = b • Suppose A is invertible; then there exists a unique solution x = A^(-1) b • How can the solution be computed efficiently numerically?
Review of direct methods • Gaussian elimination with pivoting • Memory cost: O(n^2) • Computational cost: O(n^3) • Can only be used for small n, e.g. n <= 1000 • LU decomposition • Memory cost: O(n^2) • Computational cost: O(n^3) for the factorization, then O(n^2) per additional right-hand side • Can only be used for small n, e.g. n <= 1000 • Well suited for solving the same linear system with several different right-hand sides
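The payoff of LU for repeated right-hand sides can be seen in a minimal sketch using SciPy; the matrix and right-hand sides below are made up for illustration only.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Illustrative 4x4 system (not from the slides).
A = np.array([[4.0, 1.0, 0.0, 0.0],
              [1.0, 4.0, 1.0, 0.0],
              [0.0, 1.0, 4.0, 1.0],
              [0.0, 0.0, 1.0, 4.0]])

# Factor once: O(n^3) work, O(n^2) storage for the LU factors.
lu, piv = lu_factor(A)

# Reuse the factors for several right-hand sides: O(n^2) work each.
for b in (np.ones(4), np.arange(4.0)):
    x = lu_solve((lu, piv), b)
    print(x, np.allclose(A @ x, b))
```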
Review of direct methods • For tridiagonal matrices • Thomas algorithm based on Crout factorization • Memory cost: O(n) & Computational cost: O(n) • Can be extended to banded matrices • For linear systems from discretization of the Poisson equation by FDM • Direct Poisson solver based on the FFT • Memory cost: O(n) & Computational cost: O(n ln n) • For linear systems from discretization of elliptic equations by FEM • Multigrid method (MG) or algebraic multigrid method (AMG) • Memory cost: O(n) & Computational cost: O(n) • For linear systems from the integral formulation of the Poisson equation • Fast multipole method • Memory cost: O(n) & Computational cost: O(n)
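A minimal sketch of the Thomas algorithm for a tridiagonal system is shown below, assuming the three diagonals are passed as separate length-n arrays (a: sub-diagonal with a[0] unused, b: main diagonal, c: super-diagonal with c[-1] unused); the variable names and the test system are my own choices, not from the slides.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system in O(n) time and memory.
    No pivoting is performed, so the diagonal is assumed suitably dominant."""
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward elimination (Crout-style factorization).
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # Back substitution.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Illustrative test: the 1D Laplacian-like system -x_{i-1} + 2x_i - x_{i+1} = 1.
n = 5
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0); d = np.ones(n)
x = thomas(a, b, c, d)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(A @ x, d))
```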
Iterative methods • Aim: to solve large sparse linear systems • Basic iterative methods • Jacobi method • Gauss-Seidel method • Successive over-relaxation method (SOR) • Krylov subspace (modern iterative) methods • Steepest descent method • Conjugate gradient (CG) method • GMRES for nonsymmetric systems
Basic iterative methods • Rewrite A = M − N with M invertible; then Ax = b is equivalent to x = M^(-1) N x + M^(-1) b, which suggests the fixed-point iteration x^(m+1) = M^(-1) N x^(m) + M^(-1) b
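To make the rewrite concrete, here is a small sketch of the general splitting iteration; choosing M as the diagonal of A gives Jacobi and M as the lower triangle gives Gauss-Seidel. The function name, tolerances, and test matrix are illustrative choices of mine, not from the slides.

```python
import numpy as np

def splitting_iteration(A, b, M, x0, tol=1e-10, max_iter=1000):
    """Generic iteration x_{m+1} = M^{-1} (N x_m + b) with N = M - A.
    Converges for every initial guess iff the spectral radius of M^{-1} N is < 1."""
    N = M - A
    x = x0.copy()
    for _ in range(max_iter):
        x_new = np.linalg.solve(M, N @ x + b)
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x0 = np.zeros(2)
x_jacobi = splitting_iteration(A, b, np.diag(np.diag(A)), x0)  # Jacobi: M = D
x_gs = splitting_iteration(A, b, np.tril(A), x0)               # Gauss-Seidel: M = D + L
print(x_jacobi, x_gs)
```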
Jacobi iterative method • The linear system Ax = b with A = D + L + U (D diagonal, L strictly lower triangular, U strictly upper triangular) • Equation form: x_i^(m+1) = (b_i − Σ_{j≠i} a_ij x_j^(m)) / a_ii • Matrix form: x^(m+1) = D^(-1) (b − (L + U) x^(m))
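A minimal NumPy sketch of one possible Jacobi implementation in the matrix form above; the stopping test and the test system are my own choices for illustration.

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=500):
    """Jacobi iteration: every component is updated from the previous iterate only."""
    D = np.diag(A)               # diagonal entries a_ii
    off = A - np.diagflat(D)     # L + U, the off-diagonal part
    x = x0.copy()
    for _ in range(max_iter):
        x_new = (b - off @ x) / D
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x

# Illustrative strictly diagonally dominant system (not from the slides).
A = np.array([[10.0, -1.0, 2.0],
              [-1.0, 11.0, -1.0],
              [2.0, -1.0, 10.0]])
b = np.array([6.0, 25.0, -11.0])
print(jacobi(A, b, np.zeros(3)))
```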
Jacobi iterative method • An example • The method • Initial guess
Jacobi iterative method • The results
Gauss-Seidel method • Idea: use the new values as soon as they become available • Equation form: x_i^(m+1) = (b_i − Σ_{j<i} a_ij x_j^(m+1) − Σ_{j>i} a_ij x_j^(m)) / a_ii • Matrix form: x^(m+1) = (D + L)^(-1) (b − U x^(m))
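A sketch of the Gauss-Seidel sweep in equation form; the key difference from Jacobi is that each updated component is reused immediately within the same sweep. Test data and tolerances below are illustrative.

```python
import numpy as np

def gauss_seidel(A, b, x0, tol=1e-10, max_iter=500):
    """Gauss-Seidel: update components in place so new values are reused at once."""
    n = len(b)
    x = x0.copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Same illustrative system as in the Jacobi sketch.
A = np.array([[10.0, -1.0, 2.0],
              [-1.0, 11.0, -1.0],
              [2.0, -1.0, 10.0]])
b = np.array([6.0, 25.0, -11.0])
print(gauss_seidel(A, b, np.zeros(3)))
```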
Gauss-Seidel method • An example • The method • Initial guess
Gauss-Seidel method • The results
SOR method • Idea: improve the Gauss-Seidel method by taking a linear combination of the old value and the new Gauss-Seidel value, weighted by a relaxation parameter ω • Equation form: x_i^(m+1) = (1 − ω) x_i^(m) + ω (b_i − Σ_{j<i} a_ij x_j^(m+1) − Σ_{j>i} a_ij x_j^(m)) / a_ii • Matrix form: (D + ωL) x^(m+1) = ((1 − ω) D − ωU) x^(m) + ωb
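A sketch of SOR in equation form: compute the Gauss-Seidel update for each component and blend it with the old value using ω (ω = 1 recovers Gauss-Seidel). The value ω = 1.1 and the test system are illustrative choices only.

```python
import numpy as np

def sor(A, b, x0, omega=1.1, tol=1e-10, max_iter=500):
    """SOR: x_i <- (1 - omega) * old value + omega * (Gauss-Seidel update)."""
    n = len(b)
    x = x0.copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            gs_update = (b[i] - s) / A[i, i]
            x[i] = (1.0 - omega) * x_old[i] + omega * gs_update
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(sor(A, b, np.zeros(3), omega=1.1))
```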
Convergence analysis • General form of basic iterative methods: x^(m+1) = R x^(m) + c • Exact solution satisfies x = R x + c • Define the error at the m-th iteration: e^(m) = x^(m) − x • Error equation: e^(m+1) = R e^(m), hence e^(m) = R^m e^(0)
Convergence analysis • Convergence • Lemma: For any square matrix R, there exists a nonsingular matrix T such that T^(-1) R T = J, the Jordan canonical form of R
Convergence analysis • Definition: the spectral radius of R is ρ(R) = max_i |λ_i(R)|, the largest eigenvalue modulus • Lemma: For any square matrix R, R^m → 0 as m → ∞ if and only if ρ(R) < 1 • Theorem: The iterative method x^(m+1) = R x^(m) + c converges to the exact solution of x = R x + c for any initial guess iff ρ(R) < 1
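The theorem can be checked numerically: form the iteration matrix R for a given splitting and compute ρ(R) as the largest eigenvalue modulus. The helper functions and the test matrix below are a sketch of mine, not the slides' code.

```python
import numpy as np

def spectral_radius(R):
    """rho(R) = max |lambda_i(R)|."""
    return np.max(np.abs(np.linalg.eigvals(R)))

def iteration_matrix(A, M):
    """R = M^{-1} N for the splitting A = M - N."""
    return np.linalg.solve(M, M - A)

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
R_jacobi = iteration_matrix(A, np.diag(np.diag(A)))  # Jacobi: M = D
R_gs = iteration_matrix(A, np.tril(A))               # Gauss-Seidel: M = D + L
print(spectral_radius(R_jacobi), spectral_radius(R_gs))  # both < 1 here, so both converge
```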
Convergence rate • Thm: For the iterative method x^(m+1) = R x^(m) + c, suppose ||R|| = q < 1 in some induced matrix norm; then • The iterative method converges for any initial guess • Linear convergence rate with factor q < 1 • Error bound: ||x^(m) − x|| <= q^m ||x^(0) − x||
Proof for convergence rate • Fact: ||e^(m)|| = ||R e^(m−1)|| <= q ||e^(m−1)|| • Error bound: ||x^(m) − x|| <= q^m ||x^(0) − x|| • Another error bound: ||x^(m) − x|| <= (q / (1 − q)) ||x^(m) − x^(m−1)|| • Error bound: ||x^(m) − x|| <= (q^m / (1 − q)) ||x^(1) − x^(0)||
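A sketch of the standard argument behind these bounds (assuming ||R|| = q < 1 in some induced norm and using x^(m+1) − x^(m) = R(x^(m) − x^(m−1))); this is my reconstruction, written out since the slide only lists headings.

```latex
\[
\|e^{(m)}\| = \|R\,e^{(m-1)}\| \le q\,\|e^{(m-1)}\| \le \cdots \le q^{m}\,\|e^{(0)}\|,
\]
\[
\|x^{(m)} - x\| \le \|x^{(m+1)} - x^{(m)}\| + \|x^{(m+1)} - x\|
               \le \|x^{(m+1)} - x^{(m)}\| + q\,\|x^{(m)} - x\|,
\]
\[
\Rightarrow\ \|x^{(m)} - x\| \le \frac{1}{1-q}\,\|x^{(m+1)} - x^{(m)}\|
            \le \frac{q}{1-q}\,\|x^{(m)} - x^{(m-1)}\|
            \le \cdots
            \le \frac{q^{m}}{1-q}\,\|x^{(1)} - x^{(0)}\|.
\]
```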
Convergence results • If A is strictly row diagonally dominant, then both the Jacobi and Gauss-Seidel methods converge • The Gauss-Seidel method converges if A is symmetric positive definite • The relaxation parameter ω lying in (0, 2) is a necessary condition for the convergence of the SOR method; in addition, if A is symmetric positive definite, then this condition is also sufficient for the convergence of the SOR method
Convergence results • Definition: A is strictly row diagonally dominant if |a_ii| > Σ_{j≠i} |a_ij| for every row i • Examples • Thm: If A is strictly row diagonally dominant, then it is invertible!
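A small sketch of how one might test strict row diagonal dominance numerically (the function name and example matrices are mine): each diagonal entry must dominate the sum of the absolute off-diagonal entries in its row.

```python
import numpy as np

def is_strictly_row_diagonally_dominant(A):
    """|a_ii| > sum_{j != i} |a_ij| for every row i."""
    diag = np.abs(np.diag(A))
    off_sum = np.sum(np.abs(A), axis=1) - diag
    return bool(np.all(diag > off_sum))

print(is_strictly_row_diagonally_dominant(np.array([[4.0, 1.0], [2.0, 3.0]])))  # True
print(is_strictly_row_diagonally_dominant(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False
```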
Convergence results • Thm: If A is strictly row diagonally dominant, then both the Jacobi and Gauss-Seidel methods converge. In fact, ||R_GS||_∞ <= ||R_J||_∞ < 1 • Proof
Convergence results • Thm: Let A be a symmetric positive definite matrix; then the Gauss-Seidel method converges for any initial guess • Proof: See details in class • Remark: There are linear systems for which the Jacobi method converges but the Gauss-Seidel method diverges, e.g. (an illustrative example follows below)
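The slide's specific counterexample is not reproduced here; the matrix below is a commonly used illustration of the phenomenon (my choice, not necessarily the slides' example): its Jacobi iteration matrix has spectral radius 0, while its Gauss-Seidel iteration matrix has spectral radius 2.

```python
import numpy as np

A = np.array([[1.0, 2.0, -2.0],
              [1.0, 1.0, 1.0],
              [2.0, 2.0, 1.0]])

D = np.diag(np.diag(A))
R_jacobi = np.linalg.solve(D, D - A)                  # rho = 0: Jacobi converges (in 3 steps)
R_gs = np.linalg.solve(np.tril(A), np.tril(A) - A)    # rho = 2: Gauss-Seidel diverges

rho = lambda R: np.max(np.abs(np.linalg.eigvals(R)))
print(rho(R_jacobi), rho(R_gs))
```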
Convergence results • Thm: For the SOR method, ρ(R_ω) >= |ω − 1|. Thus the relaxation parameter ω lying in (0, 2) is necessary for SOR to converge • Proof:
Convergence results • Thm: If A is symmetric positive definite, then ρ(R_ω) < 1 for ω in (0, 2). That is, SOR converges for all ω in (0, 2) • Proof: See details in class • Remark: • Over-relaxation: ω > 1 • Under-relaxation: ω < 1 • Optimal relaxation parameter: the ω that minimizes ρ(R_ω) (problem dependent)
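One way to see the effect of ω is to sweep it and record the spectral radius of the SOR iteration matrix R_ω = (D + ωL)^(-1)((1 − ω)D − ωU); the ω minimizing ρ(R_ω) is a numerical estimate of the optimal relaxation parameter. The test matrix is illustrative, not from the slides.

```python
import numpy as np

def sor_iteration_matrix(A, omega):
    """R_omega = (D + omega*L)^{-1} ((1 - omega)*D - omega*U)."""
    D = np.diag(np.diag(A))
    L = np.tril(A, -1)
    U = np.triu(A, 1)
    return np.linalg.solve(D + omega * L, (1.0 - omega) * D - omega * U)

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])

omegas = np.linspace(0.05, 1.95, 39)
radii = [np.max(np.abs(np.linalg.eigvals(sor_iteration_matrix(A, w)))) for w in omegas]
best = omegas[int(np.argmin(radii))]
print(f"estimated optimal omega ~ {best:.2f}, rho = {min(radii):.3f}")
```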