Eigenvalues and Eigenvectors and their Applications
By Dr. P.K.Sharma
Sr. Lecturer in Mathematics, D.A.V. College, Jalandhar.
Email Id: pksharma_davc@yahoo.co.in
The purpose of my lecture is to make you understand the following:
• What are eigenvectors and eigenvalues?
• What is the origin of eigenvectors and eigenvalues?
• Does every matrix have eigenvectors and eigenvalues?
• Are eigenvectors corresponding to a given eigenvalue unique?
• How many L.I. eigenvectors corresponding to a given eigenvalue exist?
• What are the eigenvalues corresponding to special types of matrices, like symmetric, skew-symmetric, orthogonal and unitary matrices?
• Some important theorems relating to eigenvalues
• Why are eigenvectors and eigenvalues important?
• What are the applications of eigenvectors and eigenvalues?
Linear algebra studies linear transformations, which are represented by matrices acting on vectors. Eigenvalues, eigenvectors and eigenspaces are properties of a matrix.
• In general, a matrix acts on a vector by changing both its magnitude and its direction. However, a matrix may act on certain vectors by changing only their magnitude, and leaving their direction unchanged (or possibly reversing it). These vectors are the eigenvectors of the matrix. A matrix acts on an eigenvector by multiplying its magnitude by a factor, which is positive if its direction is unchanged and negative if its direction is reversed. This factor is the eigenvalue associated with that eigenvector.
Definition. If A is an n × n matrix, then a nonzero vector x in Rn is called an eigenvector of A if Ax is a scalar multiple of x; that is, Ax = λx for some scalar λ. The scalar λ is called an eigenvalue of A, and x is called an eigenvector of A corresponding to the eigenvalue λ.
In short: (1) An eigenvector is a vector that maintains its direction after undergoing a linear transformation.
(2) An eigenvalue is the scalar by which the eigenvector is multiplied during the linear transformation.
Example 1: Eigenvector of a 2×2 Matrix
The vector x = [1; 2] is an eigenvector of the matrix A = [3 0; 8 -1] (rows separated by semicolons) corresponding to the eigenvalue λ = 3, since
Ax = [3 0; 8 -1][1; 2] = [3; 6] = 3x.
Example 2: x Is Not an Eigenvector of a 2×2 Matrix
The vector x = [2; 3] is not an eigenvector of the matrix A = [3 0; 8 -1], since
Ax = [3 0; 8 -1][2; 3] = [6; 13],
and there does not exist a scalar λ such that [6; 13] = λ[2; 3]. Hence the vector x is not an eigenvector of the matrix A.
Note: Not all matrices have eigenvalues and eigenvectors; only square matrices do.
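Both examples can be checked numerically. The sketch below uses a small helper of our own, `is_eigenvector` (not part of the lecture), which tests whether Ax is a scalar multiple of x:

```python
import numpy as np

def is_eigenvector(A, x, tol=1e-12):
    """Return (True, lam) if A @ x = lam * x for some scalar lam."""
    Ax = A @ x
    # candidate eigenvalue from the largest component of x
    i = np.argmax(np.abs(x))
    lam = Ax[i] / x[i]
    return bool(np.allclose(Ax, lam * x, atol=tol)), lam

A = np.array([[3.0, 0.0],
              [8.0, -1.0]])

ok1, lam1 = is_eigenvector(A, np.array([1.0, 2.0]))   # Example 1
ok2, _ = is_eigenvector(A, np.array([2.0, 3.0]))      # Example 2
print(ok1, lam1)  # True 3.0
print(ok2)        # False
```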
Origin of Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors have their origins in physics, in particular in problems where motion is involved, although their uses extend from solutions to stress and strain problems to differential equations and quantum mechanics.
Recall from last class that we used matrices to deform a body: the concept of STRAIN. Eigenvectors are vectors that point in directions where there is no rotation. Eigenvalues are the change in length of the eigenvector from the original length.
The basic equation in eigenvalue problems is: Ax = λx.
The multiplication of vector x by a scalar constant is the same as stretching or shrinking the coordinates by a constant value.
How to find eigenvalues of a square matrix of order n
• To find the eigenvalues of an n × n matrix A, we rewrite Ax = λx as Ax = λIx,
or equivalently, (λI - A)x = 0   (1)
Equation (1) has a nonzero solution if and only if
det (λI - A) = 0 (2)
Equation (2) is called the characteristic equation of A; the
scalars satisfying this equation are the eigenvalues of A. When
expanded, det (λI - A) is a polynomial p in λ called the
characteristic polynomial of A.
The set of all eigenvalues of A is called the Spectrum of A .
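As a quick numerical illustration (a sketch, not part of the lecture), NumPy's `np.poly` returns the coefficients of det(λI - A) for a square matrix, and the roots of that polynomial are the eigenvalues; here we use the 2×2 matrix A = [3 0; 8 -1] from Examples 1 and 2:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [8.0, -1.0]])

# Coefficients of det(lambda*I - A), highest degree first:
coeffs = np.poly(A)                     # lambda^2 - 2*lambda - 3
eigenvalues = np.sort(np.roots(coeffs)) # roots of the characteristic polynomial
print(coeffs)       # [ 1. -2. -3.]
print(eigenvalues)  # [-1.  3.]
```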
Example 3: Eigenvalues of a 3×3 Matrix
Find the eigenvalues of A = [0 1 0; 0 0 1; 4 -17 8].
Solution. The characteristic polynomial of A is
det(λI - A) = det [λ -1 0; 0 λ -1; -4 17 λ-8] = λ³ - 8λ² + 17λ - 4.
The eigenvalues of A must therefore satisfy the characteristic equation
λ³ - 8λ² + 17λ - 4 = 0   (2)
On solving, we find λ = 4, 2 + √3, 2 - √3.
Example 4: Eigenvalues of an Upper Triangular Matrix
Find the eigenvalues of the upper triangular matrix
A = [a11 a12 a13 a14; 0 a22 a23 a24; 0 0 a33 a34; 0 0 0 a44]
Solution.
Recalling that the determinant of a triangular matrix is the product of the entries on the main diagonal, we obtain
det(λI - A) = det [λ-a11 -a12 -a13 -a14; 0 λ-a22 -a23 -a24; 0 0 λ-a33 -a34; 0 0 0 λ-a44] = (λ-a11)(λ-a22)(λ-a33)(λ-a44).
Thus, the characteristic equation is (λ-a11)(λ-a22)(λ-a33)(λ-a44) = 0 and the eigenvalues are λ = a11, λ = a22, λ = a33, λ = a44, which are precisely the diagonal entries of A.
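A minimal NumPy check of this fact, using a hypothetical 4×4 upper triangular matrix of our own choosing:

```python
import numpy as np

# A hypothetical upper triangular matrix (entries chosen arbitrarily):
A = np.array([[4.0, 1.0, 7.0, 2.0],
              [0.0, 9.0, 3.0, 5.0],
              [0.0, 0.0, 1.0, 6.0],
              [0.0, 0.0, 0.0, 2.5]])

eigenvalues = np.linalg.eigvals(A)
# The eigenvalues are exactly the diagonal entries (up to ordering):
print(np.sort(eigenvalues))
print(np.sort(np.diag(A)))
```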
Eigenvalues of special types of matrices
• Symmetric (Aᵀ = A): real
• Skew-symmetric (Aᵀ = -A): purely imaginary or zero
• Orthogonal (AᵀA = I): unit modulus
• Hermitian (A* = A): real
• Skew-Hermitian (A* = -A): purely imaginary or zero
• Unitary (A*A = I): unit modulus
(Here Aᵀ denotes the transpose and A* the conjugate transpose of A.)
Theorem 1: If A is an n×n triangular matrix (upper triangular, lower triangular, or diagonal), then the eigenvalues of A are the entries on the main diagonal of A.
Theorem 2: If k is a positive integer, λ is an eigenvalue of a matrix A,
and x is a corresponding eigenvector, then λᵏ is an eigenvalue of Aᵏ and x is a corresponding eigenvector.
Theorem 3A square matrix A is invertible if and only if λ=0 is not an
eigenvalue of A.
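Theorem 2 can be illustrated numerically; a sketch using the eigenvector x = [1; 2] of the Example 1 matrix (eigenvalue 3):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [8.0, -1.0]])
x = np.array([1.0, 2.0])   # eigenvector of A with eigenvalue 3 (Example 1)
k = 4

Ak = np.linalg.matrix_power(A, k)
# x is an eigenvector of A^k with eigenvalue 3^k:
print(np.allclose(Ak @ x, 3.0**k * x))   # True
```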
Theorem 4Equivalent Statements
• If A is an n × n matrix and λ is a real number, then the following are equivalent.
a) λ is an eigenvalue of A.
b) The system of equations (λI-A)x = 0 has nontrivial solutions.
c) There is a nonzero vector x in Rn such that Ax = λx.
d) λ is a solution of the characteristic equation det(λI-A) = 0.
Finding Eigenvectors corresponding to a given eigenvalue (or Bases for Eigenspaces)
• The eigenvectors of A corresponding to an eigenvalue λ are the nonzero vectors x that satisfy Ax = λx.
• Equivalently, the eigenvectors corresponding to λ are the nonzero vectors in the solution space of (λI-A)x = 0. We call this solution space the eigenspace of A corresponding to λ.
• Is an eigenvector x corresponding to eigenvalue λ unique? No; every scalar multiple of an eigenvector x is also an eigenvector corresponding to λ, for A(kx) = k(Ax) = k(λx) = λ(kx).
• A set of L.I. eigenvectors forms a basis of the eigenspace.
Example 5: Finding Eigenvectors of a Square Matrix (or Bases for Eigenspaces)
Find the eigenvectors, and hence bases for the eigenspaces, of
A = [0 0 -2; 1 2 1; 1 0 3]   ......(1)
Solution. The characteristic equation of matrix A is λ³ - 5λ² + 8λ - 4 = 0, or (λ-1)(λ-2)² = 0;   ------(2)
Thus, the eigenvalues of A are λ = 1 and λ = 2, so we need to find the eigenvectors corresponding to these two distinct eigenvalues.
By definition, x = [x1; x2; x3] is an eigenvector of A corresponding to λ if and only if x is a nontrivial solution of (λI-A)x = 0, that is, of
[λ 0 2; -1 λ-2 -1; -1 0 λ-3][x1; x2; x3] = [0; 0; 0]   ......(3)
If λ = 2, then (3) becomes
[2 0 2; -1 0 -1; -1 0 -1][x1; x2; x3] = [0; 0; 0]
Solving this system yields x1 = -s, x2 = t, x3 = s. Thus, the eigenvectors of A corresponding to λ = 2 are the nonzero vectors of the form
x = [-s; t; s] = s[-1; 0; 1] + t[0; 1; 0].
Since [-1; 0; 1] and [0; 1; 0] are linearly independent eigenvectors, these vectors form a basis for the eigenspace corresponding to λ = 2.
Similarly, the eigenvectors corresponding to λ = 1 are the nonzero multiples of [-2; 1; 1], so this vector forms a basis for the eigenspace corresponding to λ = 1.
Note: The number of L.I. eigenvectors corresponding to an eigenvalue λ equals n - rank(λI - A), where n is the order of the square matrix A.
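The note on n - rank(λI - A) can be checked numerically; a sketch assuming the Example 5 matrix A = [0 0 -2; 1 2 1; 1 0 3]:

```python
import numpy as np

# Matrix assumed from Example 5:
A = np.array([[0.0, 0.0, -2.0],
              [1.0, 2.0, 1.0],
              [1.0, 0.0, 3.0]])
n = A.shape[0]

# geometric multiplicity of lambda = n - rank(lambda*I - A)
geo = {lam: int(n - np.linalg.matrix_rank(lam * np.eye(n) - A))
       for lam in (1.0, 2.0)}
print(geo)   # {1.0: 1, 2.0: 2}
```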
Theorem (5/1)Equivalent Statements
• If A is an n × n matrix, and if TA: Rn →Rn is multiplication by A, then
the following are equivalent.
a) A is invertible.
b) Ax = 0 has only the trivial solution.
c) The reduced row-echelon form of A is In.
d) A is expressible as a product of elementary matrices.
e) Ax = B is consistent for every n × 1 matrix B.
f) Ax = B has exactly one solution for every n × 1 matrix B.
g) det(A)≠0.
Theorem ( 5/2 )Equivalent Statements
h) The range of TA is Rn.
i) TA is one-to-one.
j) The column vectors of A are linearly independent.
k) The row vectors of A are linearly independent.
l) The column vectors of A span Rn.
m) The row vectors of A span Rn.
n) The column vectors of A form a basis for Rn.
o) The row vectors of A form a basis for Rn.
Theorem (5/3)Equivalent Statements
p) A has rank n.
q) A has nullity 0.
r) The orthogonal complement of the nullspace of A is Rn.
s) The orthogonal complement of the row space of A is {0}.
t) ATA is invertible.
u) λ = 0 is not an eigenvalue of A.
Diagonalization
Definition : A square matrix A is called
diagonalizable if there is an invertible
matrix P such that P-1AP is a diagonal
matrix; the matrix P is said to diagonalize
A.
Theorem 6If v1, v2, … vk, are eigenvectors of A corresponding to distinct eigenvalues λ1, λ2, …, λk , then {v1, v2, … vk} is a linearly independent set.
Theorem 7 If an n × n matrix A has n distinct eigenvalues, then A
is diagonalizable.
Theorem 8
• If A is an n × n matrix, then the following are
equivalent.
a) A is diagonalizable.
b) A has n linearly independent eigenvectors.
Procedure for Diagonalizing a Matrix
• The preceding theorem guarantees that an n × n matrix A with n
L.I. eigenvectors is diagonalizable, and the proof provides the
following method for diagonalizing A.
Step 1. Find n L.I. eigenvectors of A, say, p1, p2, …, pn.
Step 2. Form the matrix P having p1, p2, …, pn as its column vectors.
Step 3. The matrix P-1AP will then be diagonal with λ1, λ2, …, λn as
its successive diagonal entries, where λi is the eigenvalue
corresponding to pi, for i=1, 2, …, n.
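The three steps can be sketched with NumPy (a sketch, with `np.linalg.eig` standing in for Step 1; the matrix is assumed from Example 5):

```python
import numpy as np

# Matrix assumed from Example 5:
A = np.array([[0.0, 0.0, -2.0],
              [1.0, 2.0, 1.0],
              [1.0, 0.0, 3.0]])

vals, vecs = np.linalg.eig(A)      # Step 1: eigenvalues and L.I. eigenvectors
P = vecs                           # Step 2: eigenvectors as the columns of P
D = np.linalg.inv(P) @ A @ P       # Step 3: P^-1 A P is diagonal

print(np.allclose(D, np.diag(vals)))   # True
```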
Example 6: Finding a Matrix P That Diagonalizes a Matrix A
Find a matrix P that diagonalizes
A = [0 0 -2; 1 2 1; 1 0 3]
Solution.
From Example 5 of the preceding section we found the characteristic equation of A to be (λ-1)(λ-2)² = 0, and we found the following bases for the eigenspaces:
λ = 2: p1 = [-1; 0; 1], p2 = [0; 1; 0];   λ = 1: p3 = [-2; 1; 1]
Example 6 (Cont.): Finding a Matrix P That Diagonalizes a Matrix A
There are three basis vectors in total, so the matrix A is diagonalizable, and
P = [-1 0 -2; 0 1 1; 1 0 1]
diagonalizes A. As a check, the reader should verify that
P⁻¹AP = [1 0 2; 1 1 1; -1 0 -1][0 0 -2; 1 2 1; 1 0 3][-1 0 -2; 0 1 1; 1 0 1] = [2 0 0; 0 2 0; 0 0 1]
Example 7: A Matrix That Is Not Diagonalizable
Find a matrix P (if possible) that diagonalizes the matrix
A = [1 0 0; 1 2 0; -3 5 2]
Solution. The characteristic polynomial of A is
det(λI - A) = det [λ-1 0 0; -1 λ-2 0; 3 -5 λ-2] = (λ-1)(λ-2)²
Example 7 (Cont.): A Matrix That Is Not Diagonalizable
so the characteristic equation is
(λ-1)(λ-2)² = 0
Thus, the eigenvalues of A are λ = 1 and λ = 2. We can easily show that bases for the eigenspaces are
λ = 1: p1 = [1/8; -1/8; 1];   λ = 2: p2 = [0; 0; 1]
Since A is a 3×3 matrix and there are only two basis vectors in total, A is not diagonalizable.
Algebraic multiplicity of λ: the number of times λ occurs as a root of the characteristic equation; it is denoted by Mλ.
Geometric multiplicity of λ: the number of L.I. eigenvectors associated with λ (= the dimension of the eigenspace of λ); it is denoted by mλ.
In general, mλ ≤ Mλ.
Defect of λ: Mλ - mλ.
Remark: (1) Defective matrices (those with some mλ < Mλ) are not diagonalizable. (2) If eigenvalues are repeated, the matrix A may or may not have n independent eigenvectors.
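These multiplicities are easy to compute numerically; a sketch assuming the Example 7 matrix A = [1 0 0; 1 2 0; -3 5 2]:

```python
import numpy as np

# Matrix assumed from Example 7:
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [-3.0, 5.0, 2.0]])
n = A.shape[0]

vals = np.linalg.eigvals(A)   # 1, 2, 2
# algebraic multiplicity: how often lambda appears as a root
alg = {lam: int(np.sum(np.isclose(vals, lam))) for lam in (1.0, 2.0)}
# geometric multiplicity: n - rank(lambda*I - A)
geo = {lam: int(n - np.linalg.matrix_rank(lam * np.eye(n) - A))
       for lam in (1.0, 2.0)}
print(alg)   # {1.0: 1, 2.0: 2}
print(geo)   # {1.0: 1, 2.0: 1}  -> defect of lambda=2 is 1, so A is not diagonalizable
```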
Application of Eigenvalues and Eigenvectors
• Computing Powers of a Matrix :
There are numerous problems in applied mathematics that require
the computation of high powers of a square matrix. We shall
conclude this section by showing how diagonalization can be used
to simplify such computations for diagonalizable matrices.
If A is an n × n matrix and P is an invertible matrix, then
(P-1AP)2 = (P-1AP)(P-1AP) = P-1AIAP=P-1A2P
More generally, for any positive integer k
(P-1AP)k = P-1Ak P
Computing Powers of a Matrix (cont.)
It follows from this equation that if A is diagonalizable and P⁻¹AP = D is a diagonal matrix, then P⁻¹AᵏP = (P⁻¹AP)ᵏ = Dᵏ.
Solving this equation for Aᵏ yields: Aᵏ = PDᵏP⁻¹.
This last equation expresses the kth power of A in terms of the kth power of the diagonal matrix D. But Dᵏ is easy to compute; for example, if
D = [d1 0 ... 0; 0 d2 ... 0; ...; 0 0 ... dn], then Dᵏ = [d1ᵏ 0 ... 0; 0 d2ᵏ ... 0; ...; 0 0 ... dnᵏ]
Example 8: Power of a Matrix
Find A¹³, where
A = [0 0 -2; 1 2 1; 1 0 3]
Solution.
We showed in Example 5 that the matrix A has three L.I. eigenvectors, and so the matrix A is diagonalized by
P = [-1 0 -2; 0 1 1; 1 0 1]
and that
D = P⁻¹AP = [2 0 0; 0 2 0; 0 0 1]
Example 8 (Cont.): Power of a Matrix
Thus, we have
A¹³ = PD¹³P⁻¹ = [-1 0 -2; 0 1 1; 1 0 1][2¹³ 0 0; 0 2¹³ 0; 0 0 1¹³][1 0 2; 1 1 1; -1 0 -1]
    = [-8190 0 -16382; 8191 8192 8191; 8191 0 16383]
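The computation can be reproduced with NumPy (a sketch assuming the P and D of Example 8), and checked against direct repeated multiplication:

```python
import numpy as np

A = np.array([[0, 0, -2],
              [1, 2, 1],
              [1, 0, 3]])
P = np.array([[-1, 0, -2],
              [0, 1, 1],
              [1, 0, 1]])
D = np.diag([2, 2, 1])

# A^13 via diagonalization: P D^13 P^-1
A13 = P @ np.linalg.matrix_power(D, 13) @ np.linalg.inv(P)
print(np.round(A13).astype(int))
# Agrees with computing A^13 directly:
print(np.array_equal(np.round(A13).astype(int), np.linalg.matrix_power(A, 13)))  # True
```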
Cayley-Hamilton Theorem: Every square matrix satisfies its characteristic polynomial; i.e., if p(λ) = det(λI - A) is the characteristic polynomial of an n×n matrix A, then p(A) = 0.
Application of the Cayley-Hamilton Theorem: it can be used to find the powers of a matrix and the inverse of an n×n matrix A, by expressing these as polynomials in A of degree < n.
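A small numerical illustration of both uses, for the 2×2 matrix A = [3 0; 8 -1] of Example 1, whose characteristic polynomial is λ² - 2λ - 3:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [8.0, -1.0]])

c = np.poly(A)                           # [1, -2, -3]
# Cayley-Hamilton: p(A) = A^2 - 2A - 3I = 0
pA = A @ A + c[1] * A + c[2] * np.eye(2)
print(np.allclose(pA, 0))                # True

# Inverse: A^2 - 2A - 3I = 0  =>  A(A - 2I) = 3I  =>  A^-1 = (A - 2I)/3
A_inv = (A - 2.0 * np.eye(2)) / 3.0
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
```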
Use of eigenvectors and eigenvalues in solving linear differential equations
The eigenvalue and eigenvector method of mathematical analysis is useful in many fields because it can be used to solve homogeneous linear systems of differential equations with constant coefficients. Furthermore, in chemical engineering many models are formed on the basis of systems of differential equations that are either linear or can be linearized and solved using the eigenvalue-eigenvector method. In general, most ODEs can be linearized and therefore solved by this method.
How to solve an Initial Value Problem using Eigenvalues and Eigenvectors
Solve the following initial value problem:
du1/dt = a11 u1 + a12 u2 + ... + a1n un
du2/dt = a21 u1 + a22 u2 + ... + a2n un
...
dun/dt = an1 u1 + an2 u2 + ... + ann un
given that ui = bi when t = 0, for i = 1, 2, ..., n.
We write the above system in the matrix form du/dt = Au, where
u(t) = [u1(t); u2(t); ...; un(t)], A = [a11 a12 ... a1n; a21 a22 ... a2n; ...; an1 an2 ... ann], u(0) = [b1; b2; ...; bn]
Let λ1, λ2, ..., λn be the eigenvalues of the matrix A and X1, X2, ..., Xn the corresponding eigenvectors. Then solutions of this L.D.E. are given by u = X e^(λt), where X is the eigenvector corresponding to λ.
The general solution is given by
u(t) = c1 X1 e^(λ1 t) + c2 X2 e^(λ2 t) + ... + cn Xn e^(λn t)
Now u(0) = [b1; b2; ...; bn] = c1 X1 + c2 X2 + ... + cn Xn; on solving these equations, we can find ci for i = 1, 2, ..., n.
So the solution of the given initial value problem is
u(t) = c1 X1 e^(λ1 t) + c2 X2 e^(λ2 t) + ... + cn Xn e^(λn t),
i.e., componentwise, uj(t) = c1 x_j1 e^(λ1 t) + ... + cn x_jn e^(λn t), for j = 1, 2, ..., n.
Example 9: Solve the Initial Value Problem
Solve the initial value problem
dv/dt = 3v;        v = 6 at t = 0
dw/dt = 8v - w;    w = 5 at t = 0
The problem is to find v(t) and w(t) for t > 0. We write the system in the matrix form as
du/dt = Au, where u(t) = [v(t); w(t)], A = [3 0; 8 -1], u(0) = [6; 5]
As in Example 1, we see that the eigenvalues of the matrix A are λ1 = 3, λ2 = -1.
The corresponding eigenvectors are X1 = [1; 2], X2 = [0; 1]. The general solution of this L.D.E. is given by
u(t) = C1 X1 e^(3t) + C2 X2 e^(-t), where C1 and C2 are arbitrary constants.
So u(0) = [6; 5] = C1 [1; 2] + C2 [0; 1], which implies C1 = 6 and C2 = -7.
Thus, the solution of the given initial value problem is
u(t) = [v(t); w(t)] = 6 [1; 2] e^(3t) - 7 [0; 1] e^(-t)
On comparing, we get v(t) = 6e^(3t) and w(t) = 12e^(3t) - 7e^(-t).
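The same procedure can be carried out numerically; a sketch that builds the general solution from `np.linalg.eig` for the Example 9 system and compares it with the closed form:

```python
import numpy as np

# du/dt = A u, u(0) = b  (the system of Example 9)
A = np.array([[3.0, 0.0],
              [8.0, -1.0]])
b = np.array([6.0, 5.0])

vals, vecs = np.linalg.eig(A)     # eigenvalues 3 and -1
c = np.linalg.solve(vecs, b)      # u(0) = sum_i c_i X_i  determines the c_i

def u(t):
    # u(t) = sum_i c_i X_i e^(lambda_i t)
    return vecs @ (c * np.exp(vals * t))

t = 0.37                          # an arbitrary test time
v, w = u(t)
print(np.isclose(v, 6 * np.exp(3 * t)))                    # True
print(np.isclose(w, 12 * np.exp(3 * t) - 7 * np.exp(-t)))  # True
```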
Eigenvectors and eigenvalues are used in structural geology to determine the directions of principal strain; the directions where angles are not changing.
In seismology, these are the directions of least compression (tension), the compression axis, and the intermediate axis (in three dimensions). Some facts:
• The product of the eigenvalues = det(A)
• The sum of the eigenvalues = trace(A)
The (x, y) values of A can be thought of as representing points on an ellipse centered at (0, 0). The eigenvectors are then in the directions of the major and minor axes of the ellipse, and the eigenvalues are the lengths of these axes from (0, 0).
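Both facts are easy to verify numerically; a sketch with a hypothetical symmetric 2×2 matrix:

```python
import numpy as np

# A hypothetical symmetric 2x2 matrix:
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

vals = np.linalg.eigvals(A)
print(np.isclose(np.prod(vals), np.linalg.det(A)))   # True: product = det(A)
print(np.isclose(np.sum(vals), np.trace(A)))         # True: sum = trace(A)
```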
©DB Consulting, 1999, all rights reserved
Eigenvalues & Eigenvectors
• Examples of uses:
– Structural analysis (vibrations).
– Correlation analysis (statistical analysis and data mining).
• We use the Jacobi method, applicable to symmetric matrices only.
• A 2×2 rotation matrix is applied to the matrix to annihilate the largest off-diagonal element.
• This process is repeated until all off-diagonal elements are negligible.
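A minimal sketch of the classical Jacobi iteration just described (our own helper, not code from the lecture), compared against `np.linalg.eigvals`:

```python
import numpy as np

def jacobi_eigenvalues(A, tol=1e-12, max_iter=10000):
    """Classical Jacobi method for a symmetric matrix: repeatedly rotate
    away the largest off-diagonal element until all are negligible."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for _ in range(max_iter):
        # locate the largest off-diagonal element
        off = np.abs(A - np.diag(np.diag(A)))
        p, q = np.unravel_index(np.argmax(off), off.shape)
        if off[p, q] < tol:
            break
        # rotation angle that annihilates A[p, q]: tan(2*theta) = 2*a_pq / (a_qq - a_pp)
        theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
        c, s = np.cos(theta), np.sin(theta)
        J = np.eye(n)
        J[p, p] = J[q, q] = c
        J[p, q], J[q, p] = s, -s
        A = J.T @ A @ J   # similarity transform preserves eigenvalues
    return np.sort(np.diag(A))

# A hypothetical symmetric test matrix:
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])
print(jacobi_eigenvalues(A))
print(np.sort(np.linalg.eigvals(A)))   # should agree
```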
Questions?
Thanks