Matrix Algebra
Review of important concepts
• A matrix is a rectangular array of numbers.
• An m × n ("m by n") matrix has exactly m horizontal rows and n vertical columns.
• The rows in a matrix are usually indexed 1 to m from top to bottom, and the columns 1 to n from left to right. Elements are indexed by row, then column.
$$A = [a_{i,j}] = \begin{bmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,n} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m,1} & a_{m,2} & \cdots & a_{m,n} \end{bmatrix}$$
• The element a_{i,i} is called the ith diagonal element of A, and a_{i,j}, for i ≠ j, is called the (i, j)th element of A.
• The case m = n is important in practical applications. Such matrices are called square matrices of order n. Matrices for which m ≠ n are called non-square or rectangular matrices.
• An m × 1 matrix is said to be an m-vector or a column m-vector.
• A 1 × n matrix is said to be an n-vector or a row n-vector.
• Commonly R^n and C^n are notations for the sets of real and complex column n-vectors, and R^{m×n} and C^{m×n} are notations for the sets of m × n real and complex matrices, respectively. So a matrix A can be either real or complex.
Ex. $A = \begin{bmatrix} 1 & 0 & 2 \\ 2.5 & 3 & 4 \end{bmatrix} \in \mathbb{R}^{2 \times 3}$
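In code, a matrix of this kind is naturally stored as a list of rows. A minimal plain-Python sketch (the 2 × 3 matrix values here are illustrative):

```python
# A 2x3 matrix stored as a list of rows.
# Code indices are 0-based, so a_{i,j} in the text is A[i-1][j-1] here.
A = [
    [1.0, 0.0, 2.0],
    [2.5, 3.0, 4.0],
]

m = len(A)       # number of rows
n = len(A[0])    # number of columns

print(m, n)      # 2 3
print(A[0][2])   # entry a_{1,3} = 2.0
```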
• The Null/Zero matrix, written 0, is the matrix all of whose components are zero.
• The null matrix of order 3 × 3 is
$$0 = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$$
• The identity matrix, written I, is a square matrix all of which entries are zero except those on the main diagonal, which are ones.
$$I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
• A diagonal matrix is a square matrix all of which entries are zero except for those on the main diagonal, which may be arbitrary.
$$D = \begin{bmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{bmatrix} \quad \text{or} \quad D = \begin{bmatrix} 14 & 0 & 0 & 0 \\ 0 & 5 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 3 \end{bmatrix}$$
• An upper triangular matrix is a square matrix in which all
elements underneath the main diagonal vanish.
$$U = \begin{bmatrix} a & b & c \\ 0 & d & e \\ 0 & 0 & f \end{bmatrix} \quad \text{or} \quad U = \begin{bmatrix} 6 & 4 & 2 & 1 \\ 0 & 6 & 4 & 4 \\ 0 & 0 & 6 & 4 \\ 0 & 0 & 0 & 6 \end{bmatrix}$$
• A lower triangular matrix is a square matrix in which all entries above the main diagonal vanish.
$$L = \begin{bmatrix} a & 0 & 0 \\ b & c & 0 \\ d & e & f \end{bmatrix}$$
• A matrix S is said to be a submatrix of A if the rows and columns of S are consecutive rows and columns of A. If the rows and columns start from the first ones, S is called a leading submatrix.
Ex. $S = \begin{bmatrix} 5 & 3 \\ 1 & 1 \end{bmatrix}$ is a leading submatrix of $A = \begin{bmatrix} 5 & 3 \\ 1 & 1 \\ 4 & 2 \end{bmatrix}$
• Square matrices for which ai,j = aj,i are called symmetric about the main diagonal or simply symmetric.
• Square matrices for which ai,j = −aj,i are called antisymmetric or skew-symmetric. The diagonal entries of an antisymmetric matrix must be zero.
Equality:
• Two matrices A and B of the same order m × n are said to be equal if and only if all of their components are equal: ai,j = bi,j for all i = 1,...,m, j = 1,...,n. We then write A = B.
• If the equality test fails, the matrices are said to be unequal and we write A ≠ B.
• Two matrices of different order cannot be compared for equality or inequality.
Basic Operations
• Transpose: The transpose of an m × n matrix A is another matrix, denoted by AT, that has n rows and m columns.
• The rows of AT are the columns of A, and the rows of A are the columns of AT.
• Obviously the transpose of AT is again A, that is, (AT )T = A.
Ex. If $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$ then $A^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}$
• The transpose of a square matrix is also a square matrix. The transpose of a symmetric matrix A is equal to the original matrix, i.e., A = AT.
• The negated transpose of an antisymmetric matrix A is equal to the original matrix, i.e., A = −AT.
Trace of a Square Matrix: the trace of an n × n real square matrix A = [ai,j] is defined as the sum of the diagonal elements of A; i.e.,

$$\mathrm{tr}(A) = \sum_{k=1}^{n} a_{k,k}$$

Ex. If $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ then $\mathrm{tr}(A) = 1 + 4 = 5$
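The trace translates into a one-liner in plain Python (matrix stored as a list of rows):

```python
def trace(A):
    """Sum of the diagonal entries of a square matrix given as a list of rows."""
    return sum(A[k][k] for k in range(len(A)))

A = [[1, 2],
     [3, 4]]
print(trace(A))  # 5
```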
Scalar Multiplication
• Multiplication of a matrix A by a scalar α is defined by means of the relation
$$B = \alpha A = [\alpha a_{i,j}]_{m \times n}, \qquad b_{i,j} = \alpha a_{i,j}$$

Ex. With α = 3, each entry of B is b_{i,j} = 3a_{i,j}.
Dot product(Inner product) and Orthogonality
• Given two vectors

$$u = \begin{bmatrix} u_1 \\ \vdots \\ u_n \end{bmatrix}, \qquad v = \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix} \in \mathbb{C}^n$$

• The dot product or inner product of u and v is a scalar α and is defined as

$$\alpha = u \cdot v = u^* v = \begin{bmatrix} u_1^* & \cdots & u_n^* \end{bmatrix} \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix} = \sum_{k=1}^{n} u_k^* v_k$$
Vectors u and v are said to be orthogonal if u′v = 0. A set of vectors {v1, . . . , vm} is said to be orthogonal if v′i vj = 0 for all i ≠ j, and is said to be orthonormal if, in addition, v′i vi = 1 for all i = 1, …, m.
Ex. $u_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \; u_2 = \begin{bmatrix} 2 \\ -1 \end{bmatrix}, \; v_1 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \; v_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ -1 \end{bmatrix}$
The set {u1,u2} is orthogonal and the set {v1,v2} is orthonormal
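For real vectors the inner product reduces to Σ uk·vk, and the orthogonality and orthonormality conditions can be checked numerically. A plain-Python sketch (the specific vectors are illustrative):

```python
import math

def inner(u, v):
    """Dot product sum_k u_k * v_k for real vectors of equal length."""
    return sum(uk * vk for uk, vk in zip(u, v))

# Orthogonal pair: inner product is zero.
u1, u2 = [1, 2], [2, -1]
print(inner(u1, u2))   # 0

# Orthonormal pair: mutually orthogonal and each of unit length.
s = 1 / math.sqrt(2)
v1, v2 = [s, s], [s, -s]
print(inner(v1, v2))   # 0.0
print(inner(v1, v1))   # ~1.0 (unit length)
```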
• Matrix addition and scalar multiplication have the following properties:
1. A + B = B + A
2. A + (B + C) = (A + B) + C
3. (αβ)A = α(βA) = β(αA)
4. (A + B)T = AT + BT
Matrix Multiplication
• A = [aij] is a matrix of order m × n, B = [bjk] is a matrix of order n × p, and C = [cik] is a matrix of order m × p. The entries of the result matrix C = AB are defined by the formula

$$c_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk}$$

• The (i, k)th entry of C is computed by taking the inner product of the ith row of A with the kth column of B. For this definition to work and the product to be possible, the column dimension of A must be the same as the row dimension of B.
• Matrix multiplication has the following properties:
1. ABC = A(BC) = (AB)C
2. (A + B)C = AC + BC
3. A(B + C) = AB + AC
4. (AB)T = BTAT
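The row-by-column rule above translates directly into code. A plain-Python sketch, with the dimension check made explicit:

```python
def matmul(A, B):
    """C = AB with c_ik = sum_j a_ij * b_jk.

    Requires the column dimension of A to equal the row dimension of B.
    """
    m, n, p = len(A), len(B), len(B[0])
    assert len(A[0]) == n, "cols(A) must equal rows(B)"
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(p)]
            for i in range(m)]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```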
Determinant of a square matrix
• The determinant of a square matrix A = [aij] is denoted by |A| or det(A)
• For a general n x n matrix A =[aij], the determinant is defined as
$$\det(A) = \sum_{k=1}^{n} (-1)^{i+k} a_{i,k} \det(A_{i,k}) = \sum_{k=1}^{n} (-1)^{k+j} a_{k,j} \det(A_{k,j})$$
for any 1 ≤ i, j ≤ n, where Ap,q is the (n−1) × (n−1) matrix resulting from deletion of the pth row and the qth column of A.
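The cofactor expansion gives a direct, though O(n!), recursive implementation. A plain-Python sketch expanding along the first row (i = 1):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for k in range(n):
        # Minor A_{1,k+1}: delete row 0 and column k.
        minor = [row[:k] + row[k + 1:] for row in A[1:]]
        total += (-1) ** k * A[0][k] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                    # -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))   # 24
```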
Linear Independence of Vectors & Basis Vectors
Linear Independence of Vectors
Basis Vectors
Rank of Matrix
• The rank of an m × n matrix A, denoted by rank(A), is the largest number of columns (or rows) of A that form a set of linearly independent vectors.
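One standard way to compute the rank is to reduce A to row-echelon form and count the pivots. A pure-Python sketch (the function name and tolerance are illustrative choices):

```python
def rank(A, tol=1e-12):
    """Rank via Gaussian elimination with partial pivoting."""
    M = [row[:] for row in A]          # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0                              # index of the next pivot row
    for c in range(cols):
        if r == rows:
            break
        # Choose the largest entry in column c as pivot (for stability).
        pivot = max(range(r, rows), key=lambda i: abs(M[i][c]))
        if abs(M[pivot][c]) < tol:
            continue                   # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

print(rank([[1, 2], [2, 4]]))   # 1 (the rows are linearly dependent)
print(rank([[1, 0], [0, 1]]))   # 2
```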
Inverse of matrix
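The inverse of a nonsingular square matrix A is the matrix A⁻¹ satisfying AA⁻¹ = A⁻¹A = I. A minimal sketch for the 2 × 2 case using the adjugate formula (the function name is illustrative):

```python
def inv2(A):
    """Inverse of a 2x2 matrix: swap diagonal, negate off-diagonal, divide by det."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular, no inverse exists")
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

A = [[4, 7],
     [2, 6]]
print(inv2(A))  # [[0.6, -0.7], [-0.2, 0.4]]
```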
Eigenvalues and Eigenvectors of a Matrix and the Spectral Radius
Eigenvalues and eigenvectors have the following properties:
Theorem 1:
Theorem 2:
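An eigenpair (λ, x) of a square matrix A satisfies Ax = λx with x ≠ 0, and the spectral radius is the largest |λ|. As a concrete companion, the dominant eigenvalue can be estimated by power iteration, a standard method sketched below in plain Python (not necessarily the method of the theorems above; names are illustrative):

```python
def power_iteration(A, iters=100):
    """Estimate the dominant eigenvalue of A by repeated multiplication."""
    n = len(A)
    x = [1.0] * n              # arbitrary nonzero starting vector
    lam = 0.0
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        lam = max(y, key=abs)  # current eigenvalue estimate (scale factor)
        x = [yi / lam for yi in y]
    return lam

A = [[2, 0],
     [0, 5]]
print(power_iteration(A))  # converges to the dominant eigenvalue 5.0
```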
KRONECKER PRODUCT AND KRONECKER SUM
Kronecker product: let A = [aij] be an m × n matrix and B = [bij] a p × q matrix. The Kronecker product of A and B, denoted by A ⊗ B, is an mp × nq matrix defined by

$$A \otimes B = \begin{bmatrix} a_{11}B & \cdots & a_{1n}B \\ \vdots & \ddots & \vdots \\ a_{m1}B & \cdots & a_{mn}B \end{bmatrix}$$
Kronecker Sum: A ⊕ B = A ⊗ I + I ⊗ B (for square A and B).
If (λi, xi) is an eigenpair of A and (µj, yj) is an eigenpair of B, then (λi + µj, xi ⊗ yj) is an eigenpair of A ⊕ B.
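The block definition can be checked directly in code: each entry aij of A is replaced by the block aij·B. A plain-Python sketch:

```python
def kron(A, B):
    """Kronecker product A ⊗ B: block (i, j) of the result is a_ij * B."""
    return [[A[i][j] * B[r][s]
             for j in range(len(A[0])) for s in range(len(B[0]))]
            for i in range(len(A)) for r in range(len(B))]

A = [[1, 2]]        # 1 x 2
B = [[0, 1],
     [1, 0]]        # 2 x 2
print(kron(A, B))   # 2 x 4 result: [[0, 1, 0, 2], [1, 0, 2, 0]]
```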
Vector Norms
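The common vector norms are ‖x‖₁ = Σ|xi|, ‖x‖₂ = (Σ xi²)^(1/2), and ‖x‖∞ = max|xi| (standard definitions, stated here as an assumption since the text does not spell them out; function names are illustrative):

```python
import math

def norm1(x):
    """1-norm: sum of absolute values."""
    return sum(abs(xi) for xi in x)

def norm2(x):
    """2-norm (Euclidean length)."""
    return math.sqrt(sum(xi * xi for xi in x))

def norm_inf(x):
    """Infinity norm: largest absolute entry."""
    return max(abs(xi) for xi in x)

x = [3, -4]
print(norm1(x), norm2(x), norm_inf(x))  # 7 5.0 4
```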
Matrix Norms
Condition Numbers
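The condition number κ(A) = ‖A‖ · ‖A⁻¹‖ measures how sensitive the solution of Ax = b is to perturbations: κ near 1 is well-conditioned, large κ is ill-conditioned. A minimal sketch using the ∞-norm (maximum absolute row sum) and the 2 × 2 adjugate inverse (function names are illustrative):

```python
def mat_norm_inf(A):
    """Matrix infinity norm: maximum absolute row sum."""
    return max(sum(abs(v) for v in row) for row in A)

def cond_inf_2x2(A):
    """kappa(A) = ||A|| * ||A^-1|| in the infinity norm, for a 2x2 matrix."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    Ainv = [[ d / det, -b / det],
            [-c / det,  a / det]]
    return mat_norm_inf(A) * mat_norm_inf(Ainv)

print(cond_inf_2x2([[1, 0], [0, 1]]))       # 1.0 (identity: perfectly conditioned)
print(cond_inf_2x2([[1, 1], [1, 1.0001]]))  # very large: nearly singular
```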
Similarity Transformation
Singular Values, Singular Value Decomposition and Pseudo-Inverse
Singular value decomposition
Pseudo-Inverse