The QR iteration for eigenvalues
. . .
The intention of the algorithm is to perform a sequence of similarity
transformations on a real matrix so that the limit is a triangular matrix.
. . .
If this were possible then the eigenvalues would be exactly the
diagonal elements.
But it may not be possible, since:
• Real matrices may have complex eigenvalues, and
• All of the arithmetic in the algorithm is real.
There is no way the real numbers can converge to anything other than real numbers.
That is: it is impossible for the limit to have entries with non-zero imaginary parts.
If any eigenvalues have non-zero imaginary parts, the sequence will not converge to them.
Are we dead?
Nope, but we have to modify our expectations.
. . .
Instead of the limit being an upper triangular matrix
it is block upper triangular.
The blocks are 2 by 2 and… the eigenvalues we want are the
complex conjugate pairs of eigenvalues of the blocks.
. . .
This actually presents no major troubles.
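Getting a conjugate pair out of a 2-by-2 diagonal block is just the quadratic formula applied to the block's characteristic polynomial. A minimal sketch (the block entries here are an arbitrary illustrative example):

```python
import cmath

def block_eigenvalues(a, b, c, d):
    """Eigenvalues of the 2x2 block [[a, b], [c, d]] from the
    characteristic polynomial lam^2 - (a + d)*lam + (a*d - b*c)."""
    tr = a + d            # trace = sum of eigenvalues
    det = a * d - b * c   # determinant = product of eigenvalues
    disc = cmath.sqrt(tr * tr - 4.0 * det)  # complex sqrt handles disc < 0
    return (tr + disc) / 2.0, (tr - disc) / 2.0

# A real block whose eigenvalues are the conjugate pair 1 ± 2i:
print(block_eigenvalues(1.0, 2.0, -2.0, 1.0))
```

When the discriminant is negative the block contributes a complex conjugate pair, exactly the case the real arithmetic of the iteration cannot reach on the diagonal.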
So this is the algorithm in a mathematical form (as opposed to a form representing what happens in storage):
0. Set A1 = A
For k = 1, 2, …
1. Do a QR factorization of Ak: Ak = QkRk
2. Set Ak+1 = RkQk
This is the algorithm in a programming form:
For k = 1, 2, …
1. Do a QR factorization of A: A → QR
2. Set A ← RQ
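The programming form above can be sketched in a few lines of NumPy (the matrix and iteration count are illustrative assumptions, not part of the original):

```python
import numpy as np

def qr_iteration(A, iters=200):
    """Unshifted QR iteration: repeatedly factor A = QR and set A <- RQ.

    Each step is an orthogonal similarity transformation, so the
    eigenvalues are preserved while A drifts toward (block) upper
    triangular form.
    """
    A = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(A)  # A = QR, Q orthogonal, R upper triangular
        A = R @ Q               # overwrite A with RQ, as in the loop above
    return A

# A symmetric matrix has real eigenvalues, so the limit is (nearly)
# upper triangular and the diagonal approximates the eigenvalues.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
T = qr_iteration(A)
print(np.diag(T))                      # approximate eigenvalues
print(np.sort(np.linalg.eigvals(A)))   # compare with a library routine
```

Note that the programming form overwrites A in place, which is exactly what the loop body does here.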
Since Ak = QkRk,
QkT Ak = QkT QkRk = Rk
but then
Ak+1 = RkQk = QkT AkQk
and since Qk is orthogonal, QkT = Qk-1 and
Ak+1 = Qk-1 AkQk
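This one-step identity is easy to confirm numerically; a quick sketch with an arbitrary random matrix standing in for Ak:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # an arbitrary real matrix, playing Ak

Q, R = np.linalg.qr(A)            # Ak = QkRk
A_next = R @ Q                    # Ak+1 = RkQk

# Ak+1 = RkQk = QkT AkQk: one step is an orthogonal similarity transform
assert np.allclose(A_next, Q.T @ A @ Q)
# similarity preserves the trace and determinant (and all eigenvalues)
assert np.allclose(np.trace(A_next), np.trace(A))
assert np.allclose(np.linalg.det(A_next), np.linalg.det(A))
print("RQ equals Q^T A Q: one QR step is an orthogonal similarity")
```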
Ak+1 is similar to Ak
is similar to Ak-1
is similar to Ak-2
. . .
is similar to A1 = A
We have a sequence of similar matrices A1, A2, A3, … tending to a block triangular matrix
whose eigenvalues are easy to obtain.
Not only are the matrices in the sequence similar, they are
orthogonally similar: the similarity transformation is orthogonal.
Since orthogonal matrices preserve lengths, this means:
• The matrices of the sequence do not get very large or very small, and
• The computations are done more accurately.
Let’s see the algorithm in action.
The sizes will be indicated by color.
Since what will be interesting is seeing the subdiagonal components get smaller, we will use
a logarithmic scale that emphasizes small numbers.
1. (Unshifted) QR
2. Corner-shifted QR
3. Double shift QR
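Of the variants listed, the corner shift is simple to sketch: subtract μ = the bottom-right entry before factoring and add it back afterward. A minimal sketch (no deflation or convergence test, and the example matrix is an assumption for illustration):

```python
import numpy as np

def shifted_qr_step(A):
    """One corner-shifted QR step: factor A - mu*I = QR, return RQ + mu*I.

    The shift mu = A[-1, -1] greatly accelerates convergence of the
    last subdiagonal entry toward zero; the step is still an
    orthogonal similarity transformation of A.
    """
    n = A.shape[0]
    mu = A[-1, -1]                        # corner shift
    Q, R = np.linalg.qr(A - mu * np.eye(n))
    return R @ Q + mu * np.eye(n)

A0 = np.array([[4.0, 1.0, 0.0],
               [1.0, 3.0, 1.0],
               [0.0, 1.0, 2.0]])
B = A0.copy()
for _ in range(20):
    B = shifted_qr_step(B)
print(abs(B[2, 1]))   # the last subdiagonal entry shrinks rapidly
```

The double shift variant avoids complex arithmetic for conjugate-pair eigenvalues by combining two shifts into one real step; that bookkeeping is beyond this sketch.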