The QR iteration for eigenvalues
The intention of the algorithm is to perform a sequence of similarity transformations on a real matrix so that the limit is a triangular matrix. If this were possible, then the eigenvalues would be exactly the diagonal elements.
But it may not be possible, since:
• Real matrices may have complex eigenvalues, and
• All of the arithmetic in the algorithm is real.
A sequence of real numbers cannot converge to anything other than real numbers. That is, it is impossible for the limit to have entries with non-zero imaginary parts, so if any eigenvalues have non-zero imaginary parts, the sequence will not converge to them.
If any eigenvalues have non-zero imaginary parts, the sequence will not converge to them. Are we dead? Nope, but we have to modify our expectations.
Instead of the limit being an upper triangular matrix, it is block upper triangular. The blocks are 2 by 2, and the eigenvalues we want are the complex conjugate pairs of eigenvalues of the blocks. This actually presents no major troubles.
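The slides don't show how the 2-by-2 blocks yield complex conjugate pairs, so here is a minimal sketch in Python/NumPy (the function name `block_eigenvalues` and the sample block are illustrative, not from the slides). For a 2-by-2 block B, the eigenvalues are the roots of λ² − tr(B)·λ + det(B), and a negative discriminant gives a conjugate pair:

```python
import numpy as np

def block_eigenvalues(B):
    """Eigenvalues of a 2x2 block via its trace and determinant.

    The characteristic polynomial is lambda^2 - tr(B)*lambda + det(B),
    so the roots are (tr(B) +/- sqrt(tr(B)^2 - 4*det(B))) / 2; they form
    a complex conjugate pair when the discriminant is negative.
    """
    tr = B[0, 0] + B[1, 1]
    det = B[0, 0] * B[1, 1] - B[0, 1] * B[1, 0]
    disc = tr * tr - 4.0 * det
    s = np.sqrt(complex(disc))  # complex sqrt handles disc < 0
    return (tr + s) / 2, (tr - s) / 2

# A rotation-like block whose eigenvalues are the conjugate pair 1 +/- 2i.
B = np.array([[1.0, 2.0],
              [-2.0, 1.0]])
lam1, lam2 = block_eigenvalues(B)
```

All of the arithmetic on the block itself is real; the complex pair only appears when we solve the quadratic at the very end.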
So this is the algorithm in a mathematical form (as opposed to a form representing what happens in storage):
• Set A1 = A
• For k = 1, 2, …
• Do a QR factorization of Ak: Ak = QkRk
• Set Ak+1 = RkQk
This is the algorithm in a programming form (A is overwritten in place):
• For k = 1, 2, …
• Do a QR factorization of A: A → QR
• Set A ← RQ
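The programming form above can be sketched directly in Python/NumPy (a minimal illustration, not the slides' implementation; the symmetric test matrix and the step count are assumptions):

```python
import numpy as np

def qr_iteration(A, steps=200):
    """Unshifted QR iteration: repeatedly factor A = QR, then set A = RQ."""
    A = np.array(A, dtype=float)
    for _ in range(steps):
        Q, R = np.linalg.qr(A)  # QR factorization of the current iterate
        A = R @ Q               # overwrite A with RQ, as in the slides
    return A

# A symmetric matrix has real eigenvalues, so here the iterates tend to a
# triangular (in fact diagonal) matrix with the eigenvalues on the diagonal.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
T = qr_iteration(A)
```

For a real matrix with complex eigenvalues the same loop would instead settle into the block upper triangular form described above, with 2-by-2 bumps on the diagonal.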
Since Ak = QkRk,
• QkT Ak = QkT Qk Rk = Rk
but then
• Ak+1 = Rk Qk = QkT Ak Qk
and since Qk is orthogonal, QkT = Qk-1, so
• Ak+1 = Qk-1 Ak Qk
That is, Ak+1 is similar to Ak.
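The identity Ak+1 = QkT Ak Qk can be checked numerically for a single QR step (a small sketch; the random 5-by-5 matrix is just an example):

```python
import numpy as np

# One QR step on a random matrix: A_next = RQ should equal Q^T A Q,
# i.e. the step is an orthogonal similarity transformation.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Q, R = np.linalg.qr(A)
A_next = R @ Q
similar = Q.T @ A @ Q  # the claimed equivalent form
```

Up to rounding error, `A_next` and `similar` agree, and Q satisfies Q^T Q = I.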
Ak+1 is similar to Ak, which is similar to Ak-1, which is similar to Ak-2, …, which is similar to A1 = A. So we have a sequence of similar matrices A1, A2, A3, … tending to a block triangular matrix whose eigenvalues are easy to obtain.
Not only are the matrices in the sequence similar, they are orthogonally similar: the similarity transformation is orthogonal. Since orthogonal matrices preserve lengths, this means:
• The matrices of the sequence do not get very large or very small, and
• The computations are done more accurately.
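The "does not get very large or very small" claim can be made concrete: an orthogonal similarity preserves the Frobenius norm, so every iterate has the same norm as the original matrix. A small check (the random 4-by-4 matrix and iteration count are illustrative assumptions):

```python
import numpy as np

# Run the QR iteration and record the Frobenius norm of each iterate;
# orthogonal similarity means the norm should stay constant.
rng = np.random.default_rng(1)
A0 = rng.standard_normal((4, 4))
A = A0.copy()
norms = []
for _ in range(50):
    Q, R = np.linalg.qr(A)
    A = R @ Q
    norms.append(np.linalg.norm(A, 'fro'))
```

Up to accumulated rounding error, every entry of `norms` matches the Frobenius norm of `A0`.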
Let’s see the algorithm in action. The sizes of the entries will be indicated by color. Since what will be interesting is seeing the subdiagonal components get smaller, we will use a logarithmic scale that emphasizes small numbers. The variants shown are:
• (Unshifted) QR
• Corner-shifted QR
• Double-shift QR