




Orthogonal Matrices and Spectral Representation

In Section 4.3 we saw that an n × n matrix A was similar to a diagonal matrix if and only if it had n linearly independent eigenvectors.

In this section we investigate ‘When is a matrix similar to a triangular matrix?’

To do this we need additional concepts. We will see that the answer to this question applies to both real and complex square matrices, whether or not they are diagonalizable.

In addition we will get a valuable result for symmetric matrices which we use in two applications.

Some review: The dot product of a pair of vectors x and y in R^n is defined as

x · y = x1y1 + x2y2 + ··· + xnyn.

The complex dot product of a pair of vectors x and y in C^n is defined as

x · y = x̄1y1 + x̄2y2 + ··· + x̄nyn,

where x̄k denotes the complex conjugate of xk.

Length of a vector: ||x|| = √(x · x). (The length of x is also called the norm of x.)
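For concreteness, here is a minimal MATLAB sketch of these definitions (the vectors are assumed examples, not taken from the text); in MATLAB the apostrophe is the conjugate transpose, so x'*y gives the dot product in both the real and complex cases.

```matlab
% A minimal sketch with assumed example vectors: the real dot product, the
% complex dot product, and the norm computed as sqrt(x'*x).
x = [1; 2; 3];   y = [4; -1; 2];
x' * y                      % real dot product: 1*4 + 2*(-1) + 3*2 = 8
u = [1+1i; 2];   v = [3; 1-1i];
u' * v                      % complex dot product: conj(u1)*v1 + conj(u2)*v2
sqrt(x' * x)                % length (norm) of x; same value as norm(x)
```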


Notation and Definitions:

In order to incorporate both the real and complex cases it is convenient to use the conjugate transpose of a matrix, which was introduced in the Exercises in Section 1.1. The conjugate transpose of a column vector x is denoted x* and is the row vector given by

x* = [ x̄1  x̄2  ···  x̄n ].

We also have that the dot product of x and y is given by the row-by-column product x*y,

and that the norm of a vector x in R^n or C^n can be expressed as ||x|| = √(x*x).

Definition A set S of n × 1 vectors in R^n or C^n is called an orthogonal set provided none of the vectors is the zero vector and each pair of distinct vectors in S is orthogonal; that is, x*y = 0 for distinct vectors x and y in S.

Example: Show that each of the following sets is an orthogonal set.

The columns of any identity matrix.
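One way to check the identity-matrix example is to form all pairwise dot products of the columns at once; a minimal MATLAB sketch using the 3 × 3 case as an assumed example:

```matlab
% A minimal sketch: the (i,j) entry of I3'*I3 is column i dotted with column j,
% so zero off-diagonal entries confirm the columns form an orthogonal set.
I3 = eye(3);
I3' * I3                    % identity matrix: off-diagonal dot products are all 0
```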


Definition An orthogonal set of vectors S is called an orthonormal set provided each of its vectors has length or norm 1; that is, x*x = 1, for each x in S.

Example: Show that each of the following sets is an orthonormal set.

If S is an orthogonal set you can turn S into an orthonormal set by replacing each vector x by its corresponding unit vector x/||x||.

Example: Convert each of the following orthogonal sets to an orthonormal set.

Orthogonal and orthonormal sets of vectors have an important property.

An orthogonal or orthonormal set of vectors is linearly independent. (How would you prove this?)
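A minimal MATLAB sketch of the last two points, using an assumed 2 × 2 orthogonal set (vecnorm needs a relatively recent MATLAB release):

```matlab
% A minimal sketch with an assumed orthogonal set (the columns of S): divide
% each vector by its norm to get an orthonormal set, then check independence.
S = [1  1;
     1 -1];                 % columns are orthogonal: [1;1]'*[1;-1] = 0
Q = S ./ vecnorm(S);        % replace each column x by the unit vector x/||x||
Q' * Q                      % identity matrix, so the columns are orthonormal
rank(Q)                     % 2 = number of vectors, so the set is linearly independent
```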


Next we define an important class of matrices that have rows and columns which form an orthonormal set.

Definition A real (complex) n × n matrix Q whose columns form an orthonormal set is called an orthogonal (unitary) matrix.

To combine the real and complex cases we will use the term unitary, since every real matrix is a complex matrix whose entries have imaginary parts equal to zero.
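As an illustration of the definition, here is a minimal MATLAB sketch (the rotation angle is an assumed example): a 2 × 2 rotation matrix is orthogonal, and Q'*Q = I confirms that its columns form an orthonormal set.

```matlab
% A minimal sketch: a rotation matrix is orthogonal, and the columns being an
% orthonormal set is equivalent to Q'*Q = I (Q*Q = I in the complex/unitary case).
theta = pi/6;               % an assumed angle
Q = [cos(theta) -sin(theta);
     sin(theta)  cos(theta)];
Q' * Q                      % 2-by-2 identity, so Q is orthogonal
```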


Recall that one goal of this chapter is to determine the eigenvalues of a matrix A. As we stated in Section 4.3, one way to do this is to find a similarity transformation that results in a diagonal or triangular matrix so we can, in effect, 'read off' the eigenvalues as the diagonal entries. It happens that unitary similarity transformations are just the right tool, as given in the following important result, which is known as Schur's Theorem.

Schur's Theorem An n × n complex matrix A is unitarily similar to an upper triangular matrix.


We will not prove Schur’s Theorem, but it does tell us that we can determine all the eigenvalues of A by finding an appropriate unitary matrix P so that P^{-1}AP = P*AP is upper triangular. The following statement is a special case that is useful in a variety of applications, two of which we investigate later in this section.


Theorem: Every Hermitian (symmetric) matrix is unitarily (orthogonally) similar to a diagonal matrix.

Recall a real matrix is symmetric if A^T = A and a complex matrix is Hermitian if A* = A.
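A minimal MATLAB sketch of both results, with assumed example matrices: schur produces the unitary similarity of Schur's Theorem, and eig applied to a symmetric matrix produces the orthogonal diagonalization.

```matlab
% A minimal sketch with assumed example matrices.  schur(A,'complex') returns a
% unitary P with P'*A*P = T upper triangular (Schur's Theorem); for a symmetric
% matrix, eig returns an orthogonal Q with Q'*S*Q = D diagonal.
A = [2 1 0;
     0 3 1;
     1 0 4];                     % not symmetric
[P, T] = schur(A, 'complex');    % T is upper triangular and P'*P = I
diag(T)                          % the eigenvalues of A sit on the diagonal of T
S = [2 1;
     1 2];                       % symmetric
[Q, D] = eig(S);                 % Q'*Q = I and Q'*S*Q = D is diagonal
```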

An important result of this theorem is the following:

Every symmetric matrix is diagonalizable.

In fact, we know even more by using some previous results.

If A is symmetric then all of its eigenvalues are real numbers, and we can find a corresponding set of eigenvectors which form an orthonormal set.

By combining these ideas we can show that eigenvalues and eigenvectors are fundamental building blocks of a symmetric matrix. (They are like LEGOS!)
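Here is a minimal MATLAB sketch of that building-block idea for an assumed 2 × 2 symmetric matrix: the matrix is reassembled from its eigenpairs as the sum λ1 p1 p1^T + λ2 p2 p2^T.

```matlab
% A minimal sketch with an assumed symmetric matrix: rebuild A from its
% eigenpairs as A = lambda1*p1*p1' + lambda2*p2*p2' (the "building blocks").
A = [4 1;
     1 3];
[P, D] = eig(A);                 % orthonormal eigenvectors p_k, real eigenvalues
B = zeros(size(A));
for k = 1:size(A,1)
    B = B + D(k,k) * P(:,k) * P(:,k)';   % add the k-th building block
end
B - A                            % essentially zero: the blocks reassemble A
```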


Application: The Image of the Unit Circle by a Symmetric Matrix

Previously we showed that the image of the unit circle in R2 by a diagonal matrix was an ellipse. (See Section 1.5.)

Here we investigate the image of the unit circle by a matrix transformation whose associated matrix A is symmetric.

We will show the fundamental role of the eigenpairs of A in determining both the size and orientation of the image.

Suppose A is 2 × 2 and symmetric. Then A is orthogonally similar to a diagonal matrix, so P^T A P = D for some orthogonal matrix P and diagonal matrix D, and so A = P D P^T.

Now we apply the transformation defined by matrix A step-by-step.


For A 2 × 2 and symmetric, A is orthogonally similar to a diagonal matrix, so P^T A P = D and A = P D P^T.

Applying the transformation defined by matrix A step-by-step to the unit circle gives the following result:

P^T takes the unit circle to another unit circle; each point is just rotated (or reflected).

D takes this unit circle to an ellipse in standard position.

P rotates or reflects the ellipse.

Thus the image of the unit circle by a symmetric matrix is an ellipse with center at the origin, but possibly with its axes not parallel to the coordinate axes.
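A minimal MATLAB sketch of this step-by-step process, with an assumed 2 × 2 symmetric matrix: the unit circle is pushed through P^T, then D, then P, and the result matches applying A directly.

```matlab
% A minimal sketch with an assumed 2-by-2 symmetric matrix: carry the unit
% circle through the three stages P', then D, then P, and confirm the composite
% agrees with applying A directly.
A = [3 1;
     1 2];
[P, D] = eig(A);                        % A = P*D*P'
t = linspace(0, 2*pi, 200);
circle = [cos(t); sin(t)];              % points on the unit circle
step1 = P' * circle;                    % still a unit circle (rotated or reflected)
step2 = D * step1;                      % ellipse in standard position
step3 = P * step2;                      % rotated/reflected ellipse, axes along eigenvectors
max(max(abs(step3 - A*circle)))         % essentially zero
```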


Next we show the role of the eigenvalues and eigenvectors of symmetric matrix A.

Since A is symmetric, P^T A P = D, and so A = P D P^T.

The eigenpairs of A are (λ1, p1) and (λ2, p2), where the eigenvectors are the columns of the orthogonal matrix P.

Eigenvectors of A graphed in the unit circle.

When P^T is applied to the unit circle, the images of the eigenvectors are the unit vectors i and j.

When D is applied to the unit circle we get an ellipse with major and minor axes in the coordinate directions, scaled by the eigenvalues.

When P is applied to the ellipse, a rotation or reflection is performed and the axes of the ellipse are in the directions of the eigenvectors of A.


The eigenvalues determine the stretching of axes.

The eigenvectors determine the orientation of the images of the axes.

So indeed the eigenpairs completely determine the image.

Demonstrate in MATLAB using routine mapcirc.
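If mapcirc (the textbook's routine) is not available, the following stand-in sketch, with an assumed symmetric matrix, plots the unit circle, its elliptical image, and the eigenvectors scaled by the eigenvalues.

```matlab
% A stand-in sketch (not the textbook's mapcirc routine): plot the unit circle,
% its image under a symmetric A, and the eigenvectors scaled by the eigenvalues.
A = [3 1;
     1 2];
[P, D] = eig(A);
t = linspace(0, 2*pi, 200);
circle = [cos(t); sin(t)];
img = A * circle;                % an ellipse centered at the origin
a1 = D(1,1) * P(:,1);            % semi-axis along the first eigenvector
a2 = D(2,2) * P(:,2);            % semi-axis along the second eigenvector
plot(circle(1,:), circle(2,:), img(1,:), img(2,:)); hold on
quiver([0 0], [0 0], [a1(1) a2(1)], [a1(2) a2(2)], 0)   % scaled eigenvectors
axis equal; hold off
```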