This talk discusses the use of algebraic analysis methods for preconditioners. It provides an overview of different algebraic tools and techniques, including rectangular factorizations and Vaidya's spanning trees, and explores recent extensions and approximate inverse preconditioning.
Algebraic Tools for Analyzing Preconditioners • Bruce Hendrickson & Erik Boman • Sandia National Labs
Long History • Many have worked on algebraic analysis methods • Abridged history of this line of work: • Beauwens, Notay, Axelsson & others (80s) • Vaidya (’91) • Miller, Gremban, Guattery (mid 90s) • Gilbert, Boman, Toledo, Chen, H., etc. (current)
Outline • Definitions and concepts • Basic tools • Rectangular factorizations • Special cases • Vaidya’s spanning trees • Recent extensions • Approximate inverse preconditioning
Starting Point: Preconditioned CG • Solving system Ax = f with preconditioner B • For now, focus on symmetric positive definite matrices • General systems later in the talk • Iterations of preconditioned CG bounded by spectral condition number κ₂(A,B) of the matrix pencil
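A minimal sketch of this setup (an illustration, not from the talk): run CG on a 2-D Poisson model problem with and without a simple preconditioner B. Here B is assumed to be the tridiagonal part of A, applied exactly through a sparse LU factorization; the model problem and this choice of B are assumptions for the demo only.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator, splu

# 2-D Poisson model problem (symmetric positive definite), n = k*k unknowns.
k = 20
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(k, k))
I = sp.identity(k)
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
n = k * k
f = np.ones(n)

# Preconditioner B: the tridiagonal ("line") part of A, inverted via sparse LU.
B = sp.diags([A.diagonal(-1), A.diagonal(), A.diagonal(1)],
             [-1, 0, 1], format="csc")
Binv = splu(B)
M = LinearOperator((n, n), matvec=Binv.solve)

def run(M=None):
    its = []                      # count CG iterations via the callback
    x, info = cg(A, f, M=M, maxiter=10 * n, callback=lambda xk: its.append(1))
    assert info == 0              # converged
    return x, len(its)

x_plain, it_plain = run()
x_prec, it_prec = run(M)
print(it_plain, it_prec)          # the preconditioned run needs fewer iterations
```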
Support Number • Support number s(A,B) • s(A,B) = min { t | xᵀ(tB − A)x ≥ 0 ∀x } • Closely related to largest eigenvalue • λmax(A,B) ≤ s(A,B) • λmin(A,B) = 1/λmax(B,A) ≥ 1/s(B,A) • (equality if full rank) • So, κ₂(A,B) ≤ s(A,B) s(B,A)
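A small numerical check of these relations (random SPD test matrices, purely illustrative): when B has full rank, s(A,B) is just λmax(A,B), and the condition-number bound holds with equality.

```python
import numpy as np
from scipy.linalg import eigvalsh

rng = np.random.default_rng(0)

def spd(n):
    # Random symmetric positive definite test matrix.
    X = rng.standard_normal((n, n))
    return X @ X.T + n * np.eye(n)

A, B = spd(5), spd(5)

lam = eigvalsh(A, B)          # generalized eigenvalues of the pencil (A, B)
s_AB = lam.max()              # = s(A, B), since B has full rank
s_BA = eigvalsh(B, A).max()   # = s(B, A) = 1 / lambda_min(A, B)

kappa = lam.max() / lam.min() # spectral condition number kappa_2(A, B)
assert np.isclose(s_BA, 1.0 / lam.min())
assert kappa <= s_AB * s_BA + 1e-9     # kappa_2(A,B) <= s(A,B) s(B,A)
```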
Why Support Numbers? • s(A,B) is largest generalized eigenvalue projected onto range space of B • Equals lmax(A,B) when B full rank • Remains well defined when B rank deficient • Robust extension of largest eigenvalue • Easier to work with s(A,B) than with l(A,B)
Properties of Support Numbers • Splitting Lemma: If A = Σᵢ₌₁..ₖ Aᵢ and B = Σᵢ₌₁..ₖ Bᵢ • Then s(A,B) ≤ maxᵢ s(Aᵢ,Bᵢ) • Used in finite elements and domain decomposition • Our interest is algebraic • How to split? • Triangle Inequality: If B, C positive semidefinite • Then s(A,C) ≤ s(A,B) s(B,C) • When A, B are psd, then s(A,B) ≤ 1/(1 − s(A−B, A)) • Useful for analysis of incomplete Cholesky
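Both properties are easy to sanity-check numerically. The sketch below uses random SPD splittings (an illustrative assumption, not an example from the talk) and the full-rank identity s(A,B) = λmax(A,B).

```python
import numpy as np
from scipy.linalg import eigvalsh

rng = np.random.default_rng(1)

def s(A, B):
    # Support number for full-rank SPD B: largest eigenvalue of the pencil.
    return eigvalsh(A, B).max()

def spd(n):
    X = rng.standard_normal((n, n))
    return X @ X.T + n * np.eye(n)

A1, A2, B1, B2, C = (spd(4) for _ in range(5))
A, B = A1 + A2, B1 + B2

# Splitting Lemma: s(A, B) <= max_i s(A_i, B_i).
assert s(A, B) <= max(s(A1, B1), s(A2, B2)) + 1e-9

# Triangle inequality: s(A, C) <= s(A, B) * s(B, C).
assert s(A, C) <= s(A, B) * s(B, C) + 1e-9
```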
More Properties of Support Numbers • Symmetric Product Support: If A = UUᵀ and B = VVᵀ • Then s(A,B) = min_W ‖W‖₂² such that VW = U • Special case: • s(uuᵀ, VVᵀ) = min_w wᵀw, where Vw = u • (Many more properties in our papers)
M-Matrices, Graphs & Rectangular Factorizations • Consider a simple Laplacian
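For a concrete instance (the unit-weight path 1–2–3–4 is an assumed example; the original slide's figure is not preserved): the graph Laplacian factors as A = UUᵀ, where each column of U is an edge with entries +1 and −1 in its endpoint rows.

```python
import numpy as np

# Edge-incidence factor U of the unit-weight path 1-2-3-4:
# one column per edge, +1 and -1 in the two endpoint rows.
U = np.array([[ 1,  0,  0],
              [-1,  1,  0],
              [ 0, -1,  1],
              [ 0,  0, -1]], dtype=float)

A = U @ U.T   # the graph Laplacian: symmetric, diagonally dominant M-matrix
print(A)
# [[ 1. -1.  0.  0.]
#  [-1.  2. -1.  0.]
#  [ 0. -1.  2. -1.]
#  [ 0.  0. -1.  1.]]
```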
Rectangular Factorizations • A = UUT, where U is arbitrary • Interesting special cases: • Columns of U have 1 nonzero • Non-negative diagonal matrices • + columns with 2 nonzeros, same magnitude, opposite sign • Symmetric, diagonally dominant M-matrices • + columns with 2 nonzeros, same magnitude • Symmetric, diagonally dominant matrices • + columns with 2 nonzeros • Symmetric H-matrices with non-negative diagonal
Rectangular Factorizations and Finite Elements • Factor each element matrix before assembly • A= UUT, where U rectangular, and few nonzeros per column • Block column of U for each element • “Natural Factor” – Argyris & Bronlund
Very Special Case: Vaidya's Spanning Trees • Let u and columns of V look like • a(+1, 0, …, 0, −1)ᵀ, or b(…, 0, 1, 0, …)ᵀ • Matrices representable as UUᵀ: • Symmetric, diagonally-dominant M-matrices • Note: vectors with 2 nonzeros can be thought of as edges in a (weighted) graph
Support Paths • u = Vw, s(uuᵀ, VVᵀ) = wᵀw • Support number = length of path
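A minimal sketch of a support path, again using the unit-weight path 1–2–3–4 as the tree (an assumed example): the off-tree edge (1,4) is written as a combination of the three tree edges, and wᵀw recovers the path length.

```python
import numpy as np

# Tree edges of the path 1-2-3-4 (columns of V), unit weights.
V = np.array([[ 1,  0,  0],
              [-1,  1,  0],
              [ 0, -1,  1],
              [ 0,  0, -1]], dtype=float)

# Off-tree edge u = (1,4): supported by the path 1-2-3-4.
u = np.array([1, 0, 0, -1], dtype=float)

w, *_ = np.linalg.lstsq(V, u, rcond=None)
assert np.allclose(V @ w, u)       # u lies in the range of V
support = w @ w                    # s(u u^T, V V^T) = w^T w
assert np.isclose(support, 3.0)    # = length of the support path
```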
Vaidya’s Spanning Tree Preconditioners • Vaidya used this to analyze preconditioners built from max-weight spanning trees • Using support-path analysis, easy to show that • Worst case condition number = O(nm) • n = matrix size, m = number of nonzeros • “Exact factorization of an incomplete matrix.” (J. Gilbert)
Matrix Interpretation • Given rectangular U (m < n) • Find subset of columns V that makes a good basis • That is, VW = U, where W has small 2-norm • Vaidya's max-weight spanning tree ensures • Entries of V⁻¹U are all of magnitude no more than 1 • O(nm) condition number follows
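The construction can be sketched as follows (illustrative code, not Vaidya's implementation): build a max-weight spanning tree via scipy's `minimum_spanning_tree` on flipped weights, take the tree's Laplacian B as the preconditioner, and check the generalized eigenvalues of (A, B) restricted to the range (the complement of the constant null vector). Since the tree is a subgraph, B ≼ A and every eigenvalue is ≥ 1.

```python
import numpy as np
import scipy.sparse as sp
from scipy.linalg import eigvalsh, null_space
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(2)

# Random dense weighted graph on n vertices (illustrative test instance).
n = 8
W = np.triu(rng.uniform(1, 10, (n, n)), 1)
W = W + W.T

def laplacian(W):
    return np.diag(W.sum(axis=1)) - W

# Max-weight spanning tree: flip weights so a *minimum* spanning tree of
# the flipped graph is a *maximum* spanning tree of the original.
flipped = np.where(W > 0, W.max() + 1 - W, 0)
mask = minimum_spanning_tree(sp.csr_matrix(flipped)).toarray() > 0
T = np.where(mask | mask.T, W, 0)     # tree edges, original weights

A, B = laplacian(W), laplacian(T)

# Both Laplacians are singular (null vector = all ones); compare the
# pencil on the orthogonal complement of the constant vector.
P = null_space(np.ones((1, n)))
lam = eigvalsh(P.T @ A @ P, P.T @ B @ P)
assert lam.min() >= 1 - 1e-9          # B <= A since the tree is a subgraph
print("condition number of the pencil:", lam.max() / lam.min())
```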
Vaidya’s Augmentation • Can add edges in special way • Reduce condition number, but increase factorization cost • O(n^1.75) runtime for solving general problems • O(n^1.2) runtime for planar graphs • Bounds independent of sparsity pattern & numerical values!
How does Vaidya work in practice? • Chen & Toledo, ETNA 2003 • Sensitive to structure, not numerical values • Competitive with relaxed Modified Incomplete Cholesky on 2D problems • Sometimes worse, sometimes much better on 3D problems • Interesting convergence behavior
Beyond Vaidya: Other Spanning Trees • Max-weight spanning tree might have bad topology (long support paths) • Trade worse numerics for better structure • MASST (minimum average stretch spanning tree) • Alon/Karp/Peleg/West (’95) for networks • Gives condition number bound of ~O(m lg n) for general graphs
Hybrid Idea • Augmenting MASST trees • Spielman & Teng (’03) • Add extra edges to improve condition number • Solve general diagonally dominant M-matrices in O(n^1.31) time • No implementation yet
Beyond Vaidya: Broader Matrix Classes • Allow columns of U of the form • a(+1, 0, …, 0, +1)ᵀ • Now, UUᵀ = all symmetric, diagonally dominant matrices (SDD) • Two types of graph edges: signed graphs • Max spanning tree becomes max-weight basis of a matroid • With Chen/Toledo, devised efficient algorithms & Vaidya-like analysis • O(nm) condition number bound for all SDD matrices • Can augment and get better bounds for planar graphs
Factorized Approximate Inverses • A = UUT • What if we have V, an approximate “inverse” of U? • Could use VTV as a preconditioner • Possible advantages: • For some matrices, U is cheap to compute • Symmetric diagonally dominant • Finite elements • Columns of U capture natural structure • E.g. few columns = one finite element • Allows preconditioner to focus on bad elements
“Inverting” Rectangular Factors • Let A = U D Uᵀ, and let V be a solution of V A = D Uᵀ • Or alternatively, A Vᵀ = U D • Then A⁻¹ = Vᵀ D⁻¹ V
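A numerical check of this identity (random full-rank factors, purely illustrative): solving A Vᵀ = U D for V and forming Vᵀ D⁻¹ V recovers A⁻¹.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 4
U = rng.standard_normal((n, n)) + 3 * np.eye(n)  # full-rank factor
D = np.diag(rng.uniform(1, 2, n))                # positive diagonal
A = U @ D @ U.T

# Solve A V^T = U D, i.e. V = D U^T A^{-1}.
V = np.linalg.solve(A, U @ D).T

# Then V^T D^{-1} V = A^{-1}.
assert np.allclose(V.T @ np.linalg.inv(D) @ V, np.linalg.inv(A))
```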
Nonsymmetric Systems • Let A = E Fᵀ, and let X and Y solve A Xᵀ = E • And Y A = Fᵀ • Then A⁻¹ = Xᵀ Y
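The same check for the nonsymmetric case (illustrative random factors): with A Xᵀ = E and Y A = Fᵀ, the product Xᵀ Y reproduces A⁻¹.

```python
import numpy as np

rng = np.random.default_rng(4)

n = 4
E = rng.standard_normal((n, n)) + 3 * np.eye(n)
F = rng.standard_normal((n, n)) + 3 * np.eye(n)
A = E @ F.T

X = np.linalg.solve(A, E).T     # solves A X^T = E
Y = np.linalg.solve(A.T, F).T   # solves Y A = F^T  (i.e. A^T Y^T = F)

# X^T Y = A^{-1} E F^T A^{-1} = A^{-1}.
assert np.allclose(X.T @ Y, np.linalg.inv(A))
```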
Status • Can solve KKT-like systems approximately • Use few steps of iterative method, or • Specify sparsity and minimize Frobenius norm • Empirical testing underway
Other Ongoing Work • Finite elements • Use splitting lemma to decompose into elements • Use symmetric-product lemma to approximate each element • Assemble approximations and approximate the result • Incomplete factorizations • Simple proof of model problem results • Suggests alternative dropping strategies • Domain decomposition • Easy proof of known results for block Jacobi on model problem • Can generalize to some unstructured grids • Generalizing tools to handle nonsymmetric matrices
Conclusions • Support numbers are a nice analytical tool • Easy to prove algebraic properties • Rectangular factorizations are useful for analyzing and constructing preconditioners • Lots of open questions & opportunities for new insights
Acknowledgements • Collaborators: Marshall Bern, Doron Chen, Edmond Chow, Darin Diachin, Clark Dohrmann, John Gilbert, Steve Guattery, Nhat Nguyen, Ojas Parekh, Alex Pothen, Sivan Toledo • Work supported by DOE’s Applied Mathematical Science Program • Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the U.S. DOE under contract DE-AC04-94AL85000
For More Information • www.cs.sandia.gov/~bahendr/support.html • bah@cs.sandia.gov