Advances in Random Matrix Theory (stochastic eigenanalysis) Alan Edelman MIT: Dept of Mathematics; Computer Science & AI Laboratories
Stochastic Eigenanalysis • Counterpart to stochastic differential equations • Emphasis on applications to engineering & finance • Beautiful mathematics: Random Matrix Theory, Free Probability • Raw material from physics, combinatorics, numerical linear algebra, multivariate statistics
Scalars, Vectors, Matrices • Mathematics: notation = power & less ink! • Computation: use those caches! • Statistics: classical, multivariate, modern Random Matrix Theory The Stochastic Eigenproblem • Mathematics of probabilistic linear algebra • Emerging computational algorithms • Emerging statistical techniques Ideas from numerical computation that stand the test of time are right for mathematics!
Open Questions • Find new applications of spacing (or other) statistics • Cleanest derivation of Tracy-Widom? • “Finite” free probability? • Finite meets infinite • Muirhead meets Tracy-Widom • Software for stochastic eigen-analysis
Wigner’s Semi-Circle • The classical & most famous random eigenvalue theorem • Let S = random symmetric Gaussian, built from an n×n matrix of iid standard normals • MATLAB: A=randn(n); S=(A+A’)/2; • S is known as the Hermite ensemble • The normalized eigenvalue histogram is a semicircle • Precise statements require n → ∞, suitable scaling, etc.
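The slide's MATLAB one-liner has a direct NumPy analogue. A minimal sketch (the sqrt(n/2) rescaling is my normalization choice so the limiting support is [-2, 2]; it is not stated on the slide):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
A = rng.standard_normal((n, n))
S = (A + A.T) / 2                  # symmetric Gaussian (Hermite ensemble)
# Rescale so the semicircle's support is [-2, 2] in the large-n limit.
lam = np.linalg.eigvalsh(S) / np.sqrt(n / 2)

# Semicircle predictions: eigenvalues essentially fill [-2, 2], and the
# mean of lambda^2 tends to 1 (the semicircle's second moment).
print(lam.min(), lam.max(), np.mean(lam**2))
```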
Wigner’s original proof • Compute E(tr A^(2p)) as n → ∞ • Terms with too many distinct indices have some matrix element appearing to the first power; these vanish since the mean is 0 • Terms with too few distinct indices: not enough index choices to matter as n → ∞ • What remains is a Catalan number: the 2p-th moment is C_p = C(2p, p)/(p+1) when all is said and done • The semicircle is the only distribution whose even moments are the Catalan numbers
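The moment claim can be checked numerically, assuming the semicircle density sqrt(4 − x²)/(2π) on [−2, 2] used later in the deck:

```python
import math
import numpy as np

# The 2p-th moment of the semicircle density f(x) = sqrt(4 - x^2)/(2*pi)
# on [-2, 2] should be the Catalan number C_p = binom(2p, p)/(p + 1).
x = np.linspace(-2.0, 2.0, 200001)
dx = x[1] - x[0]
f = np.sqrt(4.0 - x**2) / (2.0 * np.pi)
moments = [float(np.sum(x**(2 * p) * f) * dx) for p in range(5)]
catalans = [math.comb(2 * p, p) // (p + 1) for p in range(5)]
print(moments)   # ~ [1, 1, 2, 5, 14]
print(catalans)  #   [1, 1, 2, 5, 14]
```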
Finite Versions of the Semicircle (n = 2, 3, 4, 5) Area under the curve on (−∞, x) can be expressed as sums of probabilities that certain tridiagonal determinants are positive.
Wigner’s Semi-Circle, general β • Real numbers (x): β=1 • Complex numbers (x+iy): β=2 • Quaternions (x+iy+jz+kw): β=4 • β=2½? (x+iy+jz?) • Defined through the joint eigenvalue density: const × ∏_(i<j) |x_i − x_j|^β ∏_i exp(−x_i²/2) • β = repulsion strength; β=0 means “no interference” and the spacings are Poisson • Classical research covers only β=1, 2, 4, missing the link to Poisson, continuous techniques, etc.
Largest eigenvalue “convection-diffusion?”
Haar or not Haar? • “Uniform distribution on orthogonal matrices” • Gram-Schmidt, or [Q,R]=qr(randn(n)) • Eigenvalues come out wrong: the naive Q is not Haar-distributed
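A sketch of the usual remedy (the sign correction is the standard fix, not something stated on the slide): LAPACK's sign convention for R biases the columns of Q, and multiplying each column by the sign of the corresponding diagonal entry of R restores Haar distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
Z = rng.standard_normal((n, n))
Q, R = np.linalg.qr(Z)

# Column-wise sign fix: Q_haar = Q * diag(sign(diag(R))).
Q_haar = Q * np.sign(np.diag(R))

# Both Q and Q_haar are orthogonal; only Q_haar is uniform on O(n).
print(np.allclose(Q_haar @ Q_haar.T, np.eye(n)))
```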
Longest Increasing Subsequence (n=4) (Baik-Deift-Johansson; Okounkov’s proof) [Figure legend: Green: 4, Yellow: 3, Red: 2, Purple: 1]
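The statistic behind Baik-Deift-Johansson can be computed by patience sorting; a minimal sketch, not tied to any code on the slide:

```python
import bisect

def lis_length(seq):
    """Length of the longest increasing subsequence (patience sorting, O(n log n))."""
    piles = []  # piles[k] = smallest possible tail of an increasing subsequence of length k+1
    for x in seq:
        i = bisect.bisect_left(piles, x)
        if i == len(piles):
            piles.append(x)
        else:
            piles[i] = x
    return len(piles)

print(lis_length([3, 1, 4, 2]))   # one of the n=4 permutations -> 2
```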
Bulk spacing statistics • Bus wait times in Mexico • Energy levels of heavy atoms • Parked Cars in London • Zeros of Riemann zeta • Mice Brain Wave Spikes “convection-diffusion?” Telltale Sign: Repulsion + optimality
“What’s my β?” web page • Cy’s tricks: maximum likelihood estimation, Bayesian probability, kernel density estimation (Epanechnikov kernel), confidence intervals • http://people.csail.mit.edu/cychan/BetaEstimator.html
Everyone’s Favorite Tridiagonal
(1/n²)·tridiag(1, −2, 1) discretizes d²/dx²
Add noise: (1/n²)·tridiag(1, −2, 1) + (1/(βn))^(1/2)·(Gaussian diagonal noise) discretizes d²/dx² + (2/β^(1/2)) dW

Stochastic Operator Limit
d²/dx² − x + (2/√β) dW
H_n^β ~ (1/√2) ·
  [ N(0,2)      χ_((n−1)β)                                ]
  [ χ_((n−1)β)  N(0,2)      χ_((n−2)β)                    ]
  [             χ_((n−2)β)  ⋱           ⋱                 ]
  [                         ⋱           N(0,2)    χ_β     ]
  [                                     χ_β       N(0,2)  ]
H_n^β ≈ H_n^∞ + (2/√β)·G_n
Cast of characters: Dumitriu, Sutton, Rider
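A sketch of sampling the Dumitriu-Edelman tridiagonal model above in NumPy; the division by sqrt(βn) at the end is my normalization choice, made so the empirical support approaches [−√2, √2]:

```python
import numpy as np

def beta_hermite(n, beta, rng):
    # Tridiagonal model: N(0,2) on the diagonal, chi_{(n-1)beta}, ..., chi_beta
    # on the off-diagonal, all divided by sqrt(2).
    d = np.sqrt(2.0) * rng.standard_normal(n)
    e = np.sqrt(rng.chisquare(beta * np.arange(n - 1, 0, -1)))
    return (np.diag(d) + np.diag(e, 1) + np.diag(e, -1)) / np.sqrt(2.0)

rng = np.random.default_rng(2)
n, beta = 300, 2.5          # beta is a continuous knob -- not just 1, 2, 4
lam = np.linalg.eigvalsh(beta_hermite(n, beta, rng)) / np.sqrt(beta * n)
print(lam.min(), lam.max())  # semicircle support ~[-sqrt(2), sqrt(2)] in this scaling
```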
Is it really the random matrices? • The excitement is that the random matrix statistics are everywhere • Random matrices, properly tridiagonalized, are discretizations of stochastic differential operators! • Eigenvalues of SDOs are not as well studied • Deep down, I believe this — not the random matrices — is the important mechanism behind the spacings (see Brian Sutton’s thesis and Brian Rider’s papers on the connection to Schrödinger operators) • For other statistics, though, deep down it is the matrices
Free Probability • Free Probability (name refers to “free algebras” meaning no strings attached) • Gets us past Gaussian ensembles and Wishart Matrices
The flipping coins example • Classical probability: a coin is +1 or −1 with p = .5 • x: −1 (50%) or +1 (50%); y: −1 (50%) or +1 (50%) • x+y: −2, 0, +2 with probabilities ¼, ½, ¼
The flipping coins example, Free • eig(A): −1, +1 (half each); eig(B): −1, +1 (half each) • eig(A+QBQ’): spread across the whole interval (−2, +2)
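The contrast can be simulated. A sketch, where the diagonal ±1 "coin" matrices and n = 500 are my illustrative choices, and Q is made Haar via QR with the usual sign fix:

```python
import numpy as np

# Free "coin flip": A and B each have eigenvalues -1 and +1, half each.
# Classically, x + y takes only the values -2, 0, +2.  Freely -- i.e. for
# A + Q B Q' with Q Haar orthogonal -- the eigenvalues spread over (-2, 2)
# with no atoms (the arcsine law).
rng = np.random.default_rng(3)
n = 500
A = np.diag(np.repeat([1.0, -1.0], n // 2))
B = A.copy()

Z = rng.standard_normal((n, n))
Q, R = np.linalg.qr(Z)
Q = Q * np.sign(np.diag(R))            # sign fix so Q is Haar-distributed
lam = np.linalg.eigvalsh(A + Q @ B @ Q.T)
print(lam.min(), lam.max())            # inside [-2, 2], spread out
```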
From Finite to Infinite Gaussian (m=1) → Wiggly → Wigner
Matrix Statistics • Many worked out in the 1950s and 1960s • Muirhead, “Aspects of Multivariate Statistical Theory” • Are two covariance matrices equal? • Does my matrix equal this matrix? • Is my matrix a multiple of the identity? • Answers require computation of hypergeometric functions of matrix argument • Long thought computationally intractable
The special functions of multivariate statistics • Hypergeometric functions of matrix argument • β=2: Schur polynomials • Other values: Jack polynomials • Orthogonal polynomials of matrix argument: begin with a weight w(x) on an interval I, and require ∫ p_κ(x) p_λ(x) |Δ(x)|^β ∏_i w(x_i) dx_i = δ_κλ • Jack polynomials are the orthogonal family for w=1 on the unit circle — analogs of x^m • Plamen Koev’s revolutionary computation • Dumitriu’s MOPS symbolic package
Multivariate Orthogonal Polynomials & Hypergeometrics of Matrix Argument • The important special functions of the 21st century • Begin with a weight w(x) on an interval I and require ∫ p_κ(x) p_λ(x) |Δ(x)|^β ∏_i w(x_i) dx_i = δ_κλ • Jack polynomials are the orthogonal family for w=1 on the unit circle — analogs of x^m
Smallest eigenvalue statistics — collect many samples before histogramming: for k=1:t, A=randn(m,n); s(k)=min(svd(A))^2; end; hist(s)
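In Python, collecting the smallest squared singular value over repeated trials (the loop and the values of m, n, and the trial count are my additions; a single draw gives only one number, so it cannot be histogrammed):

```python
import numpy as np

# Smallest-eigenvalue statistic of A'A for A = randn(m, n): the smallest
# squared singular value of A, sampled over many independent trials.
rng = np.random.default_rng(4)
m, n, trials = 20, 5, 2000
s = np.array([np.linalg.svd(rng.standard_normal((m, n)),
                            compute_uv=False).min() ** 2
              for _ in range(trials)])
counts, edges = np.histogram(s, bins=30)   # the histogram from the slide
print(s.mean())
```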
Symbolic MOPS applications A=randn(n); S=(A+A’)/2; symbolic computation of quantities such as trace(S^4), det(S^3)
Encoding the semicircle: the algebraic secret • f(x) = sqrt(4−x²)/(2π) • m(z) = (−z + i·sqrt(4−z²))/2 • L(m,z) ≡ m² + zm + 1 = 0 • m(z) = ∫ (x−z)⁻¹ f(x) dx (the Stieltjes transform) • Practical encoding: a polynomial L whose root m is the Stieltjes transform
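A quick consistency check of the three encodings — the closed form, the algebraic relation, and the integral definition. The test point z is an arbitrary choice of mine in the upper half-plane:

```python
import numpy as np

# Check that m(z) = (-z + i*sqrt(4 - z^2))/2 satisfies m^2 + z*m + 1 = 0
# and matches the Stieltjes integral of f(x) = sqrt(4 - x^2)/(2*pi).
z = 0.3 + 1.0j
m = (-z + 1j * np.sqrt(4 - z**2)) / 2
print(abs(m**2 + z * m + 1))            # ~0: m is a root of L(m, z)

x = np.linspace(-2.0, 2.0, 400001)
dx = x[1] - x[0]
f = np.sqrt(np.clip(4.0 - x**2, 0.0, None)) / (2.0 * np.pi)
integral = np.sum(f / (x - z)) * dx     # m(z) = int f(x)/(x - z) dx
print(m, integral)
```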
The Polynomial Method • RMTool • http://arxiv.org/abs/math/0601389 • The polynomial method for random matrices • Eigenvectors as well!
Plus + • X=randn(n); A=X+X’: m²+zm+1=0 • Y=randn(n,2n); B=Y*Y’: zm²+(2z−1)m+2=0 • A+B: m³+(z+2)m²+(2z−1)m+2=0
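A Monte Carlo sanity check of the cubic for A+B. The slide omits scalings, so I assume the normalizations A = (X+X')/sqrt(2n) (semicircle on [−2, 2], satisfying m²+zm+1=0) and B = YY'/(2n) with Y n-by-2n (Marchenko-Pastur, satisfying zm²+(2z−1)m+2=0); these are my choices, made so the first two polynomials hold:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, 2 * n))
A = (X + X.T) / np.sqrt(2.0 * n)      # semicircle on [-2, 2]
B = (Y @ Y.T) / (2.0 * n)             # Marchenko-Pastur, aspect ratio 1/2
lam = np.linalg.eigvalsh(A + B)

z = 6.0                               # a point safely to the right of the support
m = np.mean(1.0 / (lam - z))          # empirical Stieltjes transform at z
residual = m**3 + (z + 2) * m**2 + (2 * z - 1) * m + 2
print(m, abs(residual))               # residual near 0 at this n
```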
Times * • X=randn(n); A=X+X’: m²+zm+1=0 • Y=randn(n,2n); B=Y*Y’: zm²+(2z−1)m+2=0 • A*B: m⁴z²−2m³z+m²+4mz+4=0