
Krylov-Subspace Methods - I



  1. Krylov-Subspace Methods - I Lecture 6 Alessandra Nardi Thanks to Prof. Jacob White, Deepak Ramaswamy, Michal Rewienski, and Karen Veroy

  2. Last lecture review • Iterative Methods Overview • Stationary • Non Stationary • QR factorization to solve Mx=b • Modified Gram-Schmidt Algorithm • QR Pivoting • Minimization View of QR • Basic Minimization approach • Orthogonalized Search Directions • Pointer to Krylov Subspace Methods

  3. Last lecture reminder: QR Factorization – By picture

  4. QR Factorization – Minimization View: Minimization Algorithm
     For i = 1 to N “For each Target Column”
       For j = 1 to i-1 “For each Source Column left of target”
         wi ← wi − (wjT wi) wj  (Orthogonalize Search Direction)
       end
       wi ← wi / ‖wi‖2  (Normalize)
     end
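The loop above can be sketched in pure Python as a Modified Gram-Schmidt QR on a small matrix stored as a list of columns. This is an illustrative sketch, not the lecture's code; the function name and data layout are my own.

```python
# Minimal Modified Gram-Schmidt sketch: orthogonalize each target column
# against the already-orthonormalized source columns to its left, then normalize.

def mgs_qr(cols):
    """cols: list of columns (each a list of floats). Returns (Q columns, R)."""
    n = len(cols)
    q = []                               # orthonormal columns built so far
    r = [[0.0] * n for _ in range(n)]
    for i in range(n):                   # "for each target column"
        v = list(cols[i])
        for j in range(len(q)):          # "for each source column left of target"
            r[j][i] = sum(a * b for a, b in zip(q[j], v))
            v = [a - r[j][i] * b for a, b in zip(v, q[j])]  # orthogonalize
        r[i][i] = sum(a * a for a in v) ** 0.5
        q.append([a / r[i][i] for a in v])                  # normalize
    return q, r
```

Subtracting each projection immediately (rather than all at once, as in classical Gram-Schmidt) is what makes the modified variant numerically more stable.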

  5. Iterative Methods. Solve Mx=b by minimizing the residual r = b − Mx. Stationary: x(k+1) = Gx(k) + c • Jacobi • Gauss-Seidel • Successive Overrelaxation. Non-stationary: x(k+1) = x(k) + αk pk • CG (Conjugate Gradient), requires M symmetric and positive definite • GCR (Generalized Conjugate Residual) • GMRES, etc.
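As a concrete instance of the stationary form x(k+1) = Gx(k) + c, here is a pure-Python Jacobi sketch (the example system is mine, chosen diagonally dominant so the iteration converges):

```python
# Jacobi iteration: solve each equation i for x_i using the previous iterate
# everywhere else; equivalent to x(k+1) = G x(k) + c with G = -D^{-1}(L+U).

def jacobi(M, b, iters=50):
    n = len(M)
    x = [0.0] * n
    for _ in range(iters):
        # build the new iterate entirely from the old one (true Jacobi)
        x = [(b[i] - sum(M[i][j] * x[j] for j in range(n) if j != i)) / M[i][i]
             for i in range(n)]
    return x
```

Gauss-Seidel differs only in reusing already-updated entries of x within the sweep, which usually converges faster when both converge.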

  6. Iterative Methods – CG. Convergence is related to: • the number of distinct eigenvalues • the ratio between the max and min eigenvalue. Why? How?

  7. Outline • General Subspace Minimization Algorithm • Review orthogonalization and projection formulas • Generalized Conjugate Residual Algorithm • Krylov-subspace • Simplification in the symmetric case. • Convergence properties • Eigenvalue and Eigenvector Review • Norms and Spectral Radius • Spectral Mapping Theorem

  8. Arbitrary Subspace Methods: Residual Minimization. Pick xk = Σi αi wi in the subspace span{w0,…,wk} and choose the αi to minimize ‖rk‖2 = ‖b − Σi αi M wi‖2.

  9. Arbitrary Subspace Methods: Residual Minimization. The minimization decouples if the vectors M wi are orthonormal — so use Gram-Schmidt on the M wi’s!

  10. Arbitrary Subspace Methods: Orthogonalization. Use Gram-Schmidt to build new directions p0,…,pk from the wi’s so that the images M p0,…,M pk are mutually orthogonal.

  11. Arbitrary Subspace Solution Algorithm • Given M, b and a set of search directions: {w0,…,wk} • Make the wi’s orthogonal in the MTM inner product ((M pi)T(M pj) = 0 for i ≠ j) and get new search directions: {p0,…,pk} • Minimize the residual ‖rk‖2 = ‖b − Σi αi M pi‖2, which now decouples into one scalar αi per direction.

  12. Arbitrary Subspace Solution Algorithm
     r = b, x = 0
     For i = 0 to k
       pi = wi
       For j = 0 to i-1
         pi ← pi − ((M pj)T (M pi)) pj  (Orthogonalize Search Direction)
       end
       pi ← pi / ‖M pi‖2  (Normalize)
       x ← x + ((M pi)T r) pi,  r ← r − ((M pi)T r) M pi  (Update Solution)
     end
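The algorithm above can be sketched directly in pure Python. This is an illustrative reconstruction (the function name, data layout, and test matrix are mine): the M-images of the directions are made orthonormal, after which each step length is just an inner product with the residual.

```python
# Arbitrary-subspace residual minimization: orthonormalize the M w_i,
# then take the optimal step along each resulting direction p_i.

def subspace_solve(M, b, W):
    """Minimize ||b - Mx|| over x in span(W). Returns (x, residual)."""
    n = len(b)
    mv = lambda v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    x, r = [0.0] * n, list(b)
    P, MP = [], []                        # directions and their M-images
    for w in W:
        p, mp = list(w), mv(w)
        for q, mq in zip(P, MP):          # orthogonalize against earlier M p_j
            beta = dot(mp, mq)
            p = [a - beta * c for a, c in zip(p, q)]
            mp = [a - beta * c for a, c in zip(mp, mq)]
        nrm = dot(mp, mp) ** 0.5          # normalize so ||M p|| = 1
        p = [a / nrm for a in p]
        mp = [a / nrm for a in mp]
        alpha = dot(r, mp)                # optimal step in this direction
        x = [a + alpha * c for a, c in zip(x, p)]
        r = [a - alpha * c for a, c in zip(r, mp)]
        P.append(p)
        MP.append(mp)
    return x, r
```

With W spanning the whole space, the residual is driven to zero in exact arithmetic.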

  13. Krylov Subspace • How about the initial set of search directions {w0,…,wk}? • A particular choice that is commonly used is: {w0,…,wk} = {b, Mb, …, M^k b} • K_m(A,v) ≡ span{v, Av, A^2 v, …, A^(m-1) v} is called a Krylov subspace
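A minimal sketch of generating those search directions — repeatedly applying M to b — follows; the helper name and the tiny example matrix are mine, for illustration only.

```python
# Build the Krylov vectors {b, Mb, M^2 b, ...} by repeated matrix-vector products.

def krylov_vectors(M, b, k):
    """Return the k vectors [b, Mb, ..., M^(k-1) b] spanning K_k(M, b)."""
    vecs = [list(b)]
    for _ in range(k - 1):
        v = vecs[-1]
        vecs.append([sum(M[i][j] * v[j] for j in range(len(v)))
                     for i in range(len(M))])
    return vecs
```

Note that each new vector costs only one matrix-vector product, which is O(n) when M is sparse — the key to the cost estimates later in the lecture.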

  14. Krylov Subspace Methods. If the iterate x^(k+1) is drawn from the Krylov subspace span{b, Mb, …, M^k b}, it can be written as a k-th order polynomial in M applied to b: x^(k+1) = (a0 I + a1 M + … + ak M^k) b.

  15. Krylov Subspace Methods: Subspace Generation. The set of residuals can also be used as a representation of the Krylov subspace — this is the Generalized Conjugate Residual Algorithm. It is convenient because the residuals generate the next search directions.

  16. Krylov-Subspace Methods: Generalized Conjugate Residual Method (k-th step). Determine the optimal stepsize in the k-th search direction: αk = (r^k)T (M p^k) / (M p^k)T (M p^k). Update the solution (trying to minimize the residual) and the residual: x^(k+1) = x^k + αk p^k, r^(k+1) = r^k − αk M p^k. Compute the new orthogonalized search direction from the most recent residual: p^(k+1) = r^(k+1) − Σ_{j≤k} βj p^j, with the βj chosen so that (M p^(k+1))T (M p^j) = 0.
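The k-th step above can be sketched as a complete pure-Python GCR loop. This is an illustrative reconstruction under the convention x0 = 0 (so r0 = b); names and the test system are mine, and the M-images of the directions are kept normalized so each β and α is a single inner product.

```python
# GCR sketch: the newest residual is the candidate search direction,
# orthogonalized so that the vectors M p_j stay orthonormal.

def gcr(M, b, tol=1e-10, max_iter=50):
    n = len(b)
    mv = lambda v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    x, r = [0.0] * n, list(b)            # x0 = 0, so r0 = b
    P, MP = [], []
    for _ in range(max_iter):
        if dot(r, r) ** 0.5 < tol:
            break
        p, mp = list(r), mv(r)           # candidate direction: the residual
        for q, mq in zip(P, MP):         # orthogonalize against earlier M p_j
            beta = dot(mp, mq)
            p = [a - beta * c for a, c in zip(p, q)]
            mp = [a - beta * c for a, c in zip(mp, mq)]
        nrm = dot(mp, mp) ** 0.5
        p = [a / nrm for a in p]
        mp = [a / nrm for a in mp]
        alpha = dot(r, mp)               # optimal step: minimizes ||r - alpha M p||
        x = [a + alpha * c for a, c in zip(x, p)]
        r = [a - alpha * c for a, c in zip(r, mp)]
        P.append(p)
        MP.append(mp)
    return x
```

In exact arithmetic the loop terminates in at most n steps, since the residual is minimized over a subspace that grows by one dimension per iteration.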

  17. Krylov-Subspace Methods: Generalized Conjugate Residual Method (Computational Complexity for k-th step). Vector inner products: O(n). Matrix-vector product: O(n) if M is sparse. Vector adds: O(n). Orthogonalization: O(k) inner products, so the k-th step costs O(kn) and k steps cost O(k^2 n) in total. If M is sparse and k (# of iterations) approaches n, the total cost approaches O(n^3) — better converge fast!

  18. Krylov-Subspace Methods: Generalized Conjugate Residual Method (Symmetric Case – Conjugate Gradient Method). An amazing fact that will not be derived: if M is symmetric, the new direction is automatically orthogonal to all but the most recent one, so orthogonalization takes one step. If k (# of iterations) approaches n, then for symmetric, sparse M, GCR is O(n^2). Better converge fast!
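The symmetric-case shortcut is easiest to see in the classical Conjugate Gradient recurrence, sketched below. This is the standard CG formulation (which minimizes the M-norm of the error rather than the residual, a closely related cousin of the symmetric GCR variant the slide refers to), not code from the lecture; the test system is mine.

```python
# CG sketch for symmetric positive definite M: note the single beta per
# iteration -- the "one step" orthogonalization of the symmetric case.

def cg(M, b, tol=1e-10, max_iter=50):
    n = len(b)
    mv = lambda v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    x = [0.0] * n
    r = list(b)                      # r0 = b since x0 = 0
    p = list(r)
    rs = dot(r, r)
    for _ in range(max_iter):
        if rs ** 0.5 < tol:
            break
        mp = mv(p)
        alpha = rs / dot(p, mp)      # optimal step along p
        x = [a + alpha * c for a, c in zip(x, p)]
        r = [a - alpha * c for a, c in zip(r, mp)]
        rs_new = dot(r, r)
        beta = rs_new / rs           # one-step orthogonalization
        p = [a + beta * c for a, c in zip(r, p)]
        rs = rs_new
    return x
```

With only O(1) inner products per iteration instead of O(k), k ≈ n sparse iterations cost O(n^2) rather than O(n^3).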

  19. Summary • What an iterative non-stationary method is: x(k+1) = x(k) + αk pk • How to calculate: • the search directions (pk) • the step along each search direction (αk) • Krylov Subspace  GCR • GCR is O(k^2 n) — better converge fast!  Next: the convergence properties of GCR

  20. Krylov Methods Convergence Analysis: Basic Properties. Since x^(k+1) is a k-th order polynomial in M applied to b, the residual satisfies r^(k+1) = b − M x^(k+1) = (I − M q_k(M)) b ≡ P_{k+1}(M) b, where P_{k+1} is a (k+1)-th order polynomial with P_{k+1}(0) = 1.

  21. Krylov Methods Convergence Analysis: Optimality of the GCR Polynomial • GCR optimality property (key property of the algorithm): GCR picks the best (k+1)-th order polynomial, i.e., the P_{k+1} minimizing ‖r^(k+1)‖2 = ‖P_{k+1}(M) b‖2 subject to the constraint P_{k+1}(0) = 1.

  22. Krylov Methods Convergence Analysis: Optimality of the GCR Polynomial. By the GCR optimality property, ‖r^(k+1)‖2 ≤ ‖Q_{k+1}(M) b‖2 for any (k+1)-th order polynomial Q_{k+1} with Q_{k+1}(0) = 1. Therefore any polynomial which satisfies the constraint can be used to get an upper bound on ‖r^(k+1)‖2.

  23. Eigenvalues and Eigenvectors Review: Basic Definitions. Eigenvalues λi and eigenvectors ui of a matrix M satisfy M ui = λi ui (λi: eigenvalue, ui: eigenvector).

  24. Eigenvalues and Eigenvectors Review: A Simplifying Assumption. Almost all N×N matrices have N linearly independent eigenvectors. The set of all eigenvalues of M is known as the spectrum of M.

  25. Eigenvalues and Eigenvectors Review: A Simplifying Assumption. Almost all N×N matrices have N linearly independent eigenvectors, so M can be decomposed as M = U Λ U^(-1), where the columns of U are the eigenvectors and Λ = diag(λ1,…,λN).

  26. Eigenvalues and Eigenvectors Review: Spectral Radius. The spectral radius ρ(M) = maxi |λi| of M is the radius of the smallest circle, centered at the origin, which encloses all of M’s eigenvalues.

  27. Eigenvalues and Eigenvectors Review: Vector Norms. L2 (Euclidean) norm: ‖x‖2 = (Σi |xi|^2)^(1/2) — unit ball is the unit circle. L1 norm: ‖x‖1 = Σi |xi| — unit ball is a diamond with vertices at ±1 on each axis. L∞ norm: ‖x‖∞ = maxi |xi| — unit ball is the unit square.
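The three vector norms can be computed directly; a minimal sketch (the helper name is mine):

```python
# Compute the L1, L2, and L-infinity norms of a vector given as a list.

def norms(x):
    l1 = sum(abs(a) for a in x)              # sum of absolute values
    l2 = sum(a * a for a in x) ** 0.5        # Euclidean length
    linf = max(abs(a) for a in x)            # largest absolute entry
    return l1, l2, linf
```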

  28. Eigenvalues and Eigenvectors Review: Matrix Norms. Vector-induced norm: ‖A‖ = max over x ≠ 0 of ‖Ax‖/‖x‖ — the induced norm of A is the maximum “magnification” of x by A. ‖A‖1 = max absolute column sum, ‖A‖∞ = max absolute row sum, ‖A‖2 = (largest eigenvalue of ATA)^(1/2).
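The 1- and ∞-induced norms have the closed forms given above and are one line each in pure Python (the 2-norm needs the eigenvalues of ATA, so it is omitted here; function names are mine):

```python
# Induced matrix norms for a matrix stored as a list of rows.

def norm1(A):
    """Max absolute column sum (the induced 1-norm)."""
    return max(sum(abs(A[i][j]) for i in range(len(A)))
               for j in range(len(A[0])))

def norminf(A):
    """Max absolute row sum (the induced infinity-norm)."""
    return max(sum(abs(v) for v in row) for row in A)
```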

  29. Eigenvalues and Eigenvectors Review: Induced Norms. Theorem: any induced norm is a bound on the spectral radius: ρ(A) ≤ ‖A‖. Proof: for any eigenpair A ui = λi ui, ‖A‖ ≥ ‖A ui‖/‖ui‖ = |λi| ‖ui‖/‖ui‖ = |λi|, so ‖A‖ ≥ maxi |λi| = ρ(A).
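The bound ρ(A) ≤ ‖A‖ can be checked numerically on a small example. The 2×2 eigenvalue formula and the example matrix below are my own illustration (real eigenvalues assumed), not from the lecture:

```python
# Verify numerically that the spectral radius is bounded by induced norms.

def norm1(A):   # max absolute column sum
    return max(sum(abs(A[i][j]) for i in range(len(A)))
               for j in range(len(A[0])))

def norminf(A): # max absolute row sum
    return max(sum(abs(v) for v in row) for row in A)

def eig2(A):
    """Eigenvalues of a 2x2 matrix from its characteristic polynomial
    (discriminant assumed nonnegative for this example)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = (tr * tr / 4 - det) ** 0.5
    return tr / 2 - disc, tr / 2 + disc

def spectral_radius(A):
    return max(abs(l) for l in eig2(A))
```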

  30. Useful Eigenproperties: Spectral Mapping Theorem. Given a polynomial f(x) = a0 + a1 x + … + an x^n, apply the polynomial to a matrix: f(A) = a0 I + a1 A + … + an A^n. Then the eigenvalues of f(A) are exactly f(λi) for each eigenvalue λi of A: λ(f(A)) = f(λ(A)).
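The theorem can be checked concretely: apply a polynomial to a small matrix entrywise via matrix products and compare eigenvalues. The polynomial, example matrix, and 2×2 eigenvalue helper below are my own illustration:

```python
# Check the spectral mapping theorem on a 2x2 example with f(x) = x^2 + 2x + 1.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def eig2(A):
    """Eigenvalues of a 2x2 matrix, sorted (real discriminant assumed)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = (tr * tr / 4 - det) ** 0.5
    return sorted([tr / 2 - disc, tr / 2 + disc])

A = [[2.0, 0.0], [1.0, 3.0]]             # eigenvalues 2 and 3
f = lambda x: x * x + 2 * x + 1
A2 = matmul(A, A)
# f(A) = A^2 + 2A + I
fA = [[A2[i][j] + 2 * A[i][j] + (1.0 if i == j else 0.0) for j in range(2)]
      for i in range(2)]
```

This is exactly the fact the convergence analysis needs: the eigenvalues of the residual polynomial P_{k+1}(M) are P_{k+1}(λi).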

  31. Krylov Methods Convergence Analysis: Overview. Matrix norm property: ‖P_{k+1}(M) b‖2 ≤ ‖P_{k+1}(M)‖2 ‖b‖2. Combined with the GCR optimality property, ‖r^(k+1)‖2 ≤ ‖Q_{k+1}(M)‖2 ‖b‖2, where Q_{k+1} is any (k+1)-th order polynomial subject to Q_{k+1}(0) = 1 — so any such polynomial may be used to get an upper bound on the residual.

  32. Krylov Methods Convergence AnalysisOverview • Review on eigenvalues and eigenvectors • Induced norms: relate matrix eigenvalues to the matrix norms • Spectral mapping theorem: relate matrix eigenvalues to matrix polynomials • Now ready to relate the convergence properties of Krylov Subspace methods to eigenvalues of M

  33. Summary • Generalized Conjugate Residual Algorithm • Krylov-subspace • Simplification in the symmetric case • Convergence properties • Eigenvalue and Eigenvector Review • Norms and Spectral Radius • Spectral Mapping Theorem
