
MATRICES


1 Matrix Fundamentals

$$A = (a_{ij})_{m\times n} = \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix} = (a_1, \cdots, a_n),$$

where $a_j$ denotes the $j$-th column of $A$; $A$ may equally be partitioned into its rows.

1.1 Kronecker Product

$$A_{m\times n} \otimes B_{p\times q} = (a_{ij}B)_{mp\times nq}$$

1.2 Hadamard Product

$$A_{m\times n} \odot B_{m\times n} = (a_{ij}b_{ij})_{m\times n}$$
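As a quick illustration (not part of the original notes), here is a minimal numpy sketch with arbitrary matrices: np.kron computes the Kronecker product and elementwise * computes the Hadamard product.

```python
import numpy as np

# Arbitrary example matrices (not from the notes).
A = np.array([[1.0, 2.0], [3.0, 4.0]])             # 2x2
B = np.array([[0.0, 5.0, 1.0], [6.0, 7.0, 2.0]])   # 2x3

K = np.kron(A, B)    # Kronecker product: blocks a_ij * B, shape (2*2, 2*3)
H = A * A            # Hadamard product needs equal shapes; here A with itself

print(K.shape)                                 # (4, 6)
print(np.allclose(K[:2, :3], A[0, 0] * B))     # top-left block is a_11 * B
print(H)                                       # [[1, 4], [9, 16]]
```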

    1.3 Range and Nullspace

$$\operatorname{range}(A) = \{y \mid y = Ax \text{ for some } x \in \mathbb{R}^n\}, \qquad \operatorname{null}(A) = \{x \mid Ax = 0\}$$

Theorem 1.1 $\operatorname{range}(A) = \operatorname{span}\{a_1, \ldots, a_n\}$
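A small numpy sketch (matrix chosen arbitrarily) that recovers orthonormal bases of range(A) and null(A) from the SVD; the tolerance rule is one common choice, not something fixed by the notes.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])             # rank 1, so null(A) has dimension 2

U, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s.max()
r = int((s > tol).sum())                    # numerical rank

range_basis = U[:, :r]                      # orthonormal basis of range(A) = span of the columns
null_basis = Vt[r:].T                       # orthonormal basis of null(A)

print(r)                                    # 1
print(np.allclose(A @ null_basis, 0))       # every null-space vector x satisfies Ax = 0
```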

    2 Orthogonal Vectors and Matrices

    2.1 Inner Product

Inner product of vectors: for $x, y \in \mathbb{R}^m$, $x^T y = \sum_{i=1}^m x_i y_i$.

The Euclidean length of $x$ may be written as $\|x\|_2$, where $\|x\|_2 = \sqrt{x^T x} = \left(\sum_{i=1}^m x_i^2\right)^{1/2}$.

The cosine of the angle $\alpha$ between $x$ and $y$ is $\cos\alpha = \dfrac{x^T y}{\|x\|_2\,\|y\|_2}$.
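A minimal numpy check of these definitions on arbitrary vectors:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 0.0, 1.0])

inner = x @ y                                   # x^T y = sum_i x_i y_i
norm_x = np.sqrt(x @ x)                         # ||x||_2 = (x^T x)^{1/2}
cos_alpha = inner / (np.linalg.norm(x) * np.linalg.norm(y))

print(inner)                                    # 4.0
print(np.isclose(norm_x, np.linalg.norm(x)))    # True: same Euclidean length
print(cos_alpha)
```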

    2.2 Orthogonal Vectors

If $x^T y = 0$, then $x$ and $y$ are orthogonal.

    2.3 Orthogonal Sets

Two sets $X, Y$ are orthogonal if for every $x \in X$ and every $y \in Y$, $x^T y = 0$.

A set of nonzero vectors $S$ is orthogonal if its elements are pairwise orthogonal: if $x, y \in S$ and $x \neq y$, then $x^T y = 0$.

    2.4 Orthonormal

A set of vectors $S$ is orthonormal if it is orthogonal and each vector has unit length: for $x, y \in S$, $x \neq y \Rightarrow x^T y = 0$, and $x^T x = 1$.

Theorem 2.1 The vectors in an orthogonal set $S$ are linearly independent.
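A small sketch, using a randomly generated orthonormal set, illustrating that orthogonality forces linear independence (full column rank):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))   # columns of Q are orthonormal

print(np.allclose(Q.T @ Q, np.eye(3)))             # pairwise orthogonal, unit length
print(np.linalg.matrix_rank(Q))                    # 3: the columns are linearly independent
```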


2.5 Orthogonal Matrix

$Q \in \mathbb{R}^{m\times m}$ is orthogonal if $Q^T Q = QQ^T = I_m$; $Q \in \mathbb{C}^{m\times m}$ is unitary if $Q^H Q = QQ^H = I_m$ (here $Q^H$ is the conjugate transpose of $Q$).

Properties: $Q^{-1} = Q^T$; $|\det(Q)| = 1$; $\|Qx\|_2 = \|x\|_2$ for every $x$.
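A numpy sketch checking these properties on a random orthogonal matrix obtained from a QR factorization (the size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # square Q with orthonormal columns
x = rng.standard_normal(4)

print(np.allclose(Q.T @ Q, np.eye(4)))             # Q^T Q = I
print(np.allclose(np.linalg.inv(Q), Q.T))          # Q^{-1} = Q^T
print(np.isclose(abs(np.linalg.det(Q)), 1.0))      # |det Q| = 1
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # ||Qx||_2 = ||x||_2
```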

    2.6 Column Orthogonal Matrix

$Q_1 \in \mathbb{R}^{m\times n}$ with $n < m$ and $Q_1^T Q_1 = I_n$, but $Q_1 Q_1^T \neq I_m$ (since $\operatorname{rank}(Q_1) = n < m$). However, we can construct an orthogonal matrix from a column-orthogonal matrix by adding columns: take $Q_2 \in \mathbb{R}^{m\times(m-n)}$ with $Q_2^T Q_2 = I_{m-n}$ and $Q_1^T Q_2 = 0$; then $Q = [\,Q_1 \;\; Q_2\,]$ is orthogonal.
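One way to carry out this completion numerically is via the full SVD of $Q_1$, whose trailing left singular vectors span the orthogonal complement; a minimal sketch with arbitrary sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 2
Q1, _ = np.linalg.qr(rng.standard_normal((m, n)))   # m x n with orthonormal columns

U, s, Vt = np.linalg.svd(Q1)                         # full SVD: U is m x m orthogonal
Q2 = U[:, n:]                                        # m x (m-n), spans the orthogonal complement
Q = np.hstack([Q1, Q2])

print(np.allclose(Q1.T @ Q1, np.eye(n)))             # column orthogonal
print(np.allclose(Q1 @ Q1.T, np.eye(m)))             # False: not a two-sided identity
print(np.allclose(Q1.T @ Q2, 0))                     # Q1^T Q2 = 0
print(np.allclose(Q.T @ Q, np.eye(m)))               # completed Q is orthogonal
```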

    3 Rank

    column rank = row rank

Proposition 3.1 Let $A$ and $B$ be $m\times n$ and $n\times p$ matrices. Then
$\operatorname{rank}(A) = \operatorname{rank}(A^T) = \operatorname{rank}(A^T A) = \operatorname{rank}(AA^T)$;
$\operatorname{rank}(AB) \le \min\{\operatorname{rank}(A), \operatorname{rank}(B)\}$;
$\operatorname{rank}(AB) = \operatorname{rank}(A)$ if $B$ is a square matrix of full rank;
$\operatorname{rank}(A + B) \le \operatorname{rank}(A) + \operatorname{rank}(B)$.
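A numpy sketch verifying these rank relations on random matrices (sizes arbitrary; np.linalg.matrix_rank computes numerical rank):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 3))
rank = np.linalg.matrix_rank

print(rank(A) == rank(A.T) == rank(A.T @ A) == rank(A @ A.T))   # all four agree
print(rank(A @ B) <= min(rank(A), rank(B)))                     # product bound
S = rng.standard_normal((6, 6))                                 # full-rank square matrix
print(rank(A @ S) == rank(A))                                   # multiplication by it keeps rank
C = rng.standard_normal((4, 6))
print(rank(A + C) <= rank(A) + rank(C))                         # subadditivity
```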

Theorem 3.1 $\operatorname{rank}(A, B) \le \operatorname{rank}(A) + \operatorname{rank}(B)$

$$\operatorname{rank}\begin{pmatrix} A & B \\ C & D \end{pmatrix} = \operatorname{rank}\begin{pmatrix} B & A \\ D & C \end{pmatrix} = \operatorname{rank}\begin{pmatrix} C & D \\ A & B \end{pmatrix} = \operatorname{rank}\begin{pmatrix} D & C \\ B & A \end{pmatrix}$$

Theorem 3.2 (Rank factorization) Let $A$ be an $m\times n$ nonnull matrix of rank $r$. Then there exist an $m\times r$ matrix $B$ and an $r\times n$ matrix $T$ such that $A = BT$. Moreover, for any $m\times r$ matrix $B$ and $r\times n$ matrix $T$ such that $A = BT$, $\operatorname{rank}(B) = \operatorname{rank}(T) = r$.
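The theorem only asserts existence; a thin SVD gives one concrete rank factorization. A minimal sketch on a random rank-2 matrix:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 7))   # 5x7 matrix of rank 2

U, s, Vt = np.linalg.svd(A)
r = np.linalg.matrix_rank(A)
B = U[:, :r] * s[:r]          # 5 x r: leading left singular vectors scaled by singular values
T = Vt[:r]                    # r x 7: leading right singular vectors

print(r)                                                      # 2
print(np.allclose(A, B @ T))                                  # A = B T
print(np.linalg.matrix_rank(B), np.linalg.matrix_rank(T))     # both equal r
```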

    4 Inverse

Let $A$ be an $m\times m$ matrix with $\operatorname{rank}(A) = m$ and write $A = (a_1, \ldots, a_m)$. Then $\{a_1, \ldots, a_m\}$ is a basis of $\mathbb{R}^m$, so every vector in $\mathbb{R}^m$ is a linear combination of these basis vectors. In particular, each standard basis vector

$$e_i = (0, \ldots, 0, \underset{i}{1}, 0, \ldots, 0)^T$$

is such a combination, so

$$I = [e_1, \ldots, e_m] = (a_1, \ldots, a_m)\begin{pmatrix} b_{11} & \cdots & b_{1m} \\ \vdots & \ddots & \vdots \\ b_{m1} & \cdots & b_{mm} \end{pmatrix} = AB.$$


$B$ is the right inverse of $A$. Similarly, $I = CA$, and $C$ is called the left inverse of $A$. Since $C = CI = CAB = B$, the left inverse of a matrix equals its right inverse, and we have $AA^{-1} = A^{-1}A = I$.
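A two-line numpy check that the computed inverse acts as both a left and a right inverse (random full-rank matrix):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
A_inv = np.linalg.inv(A)

print(np.allclose(A @ A_inv, np.eye(4)))   # right inverse
print(np.allclose(A_inv @ A, np.eye(4)))   # left inverse
```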

Proposition 4.1 For $A \in \mathbb{R}^{m\times m}$, the following conditions are equivalent:
$A$ has an inverse $A^{-1}$;
$\operatorname{rank}(A) = m$;
$\operatorname{range}(A) = \mathbb{R}^m$;
$\operatorname{null}(A) = \{0\}$;
$0$ is not an eigenvalue of $A$;
$0$ is not a singular value of $A$;
$\det(A) \neq 0$.
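A small sketch checking several of these equivalent conditions numerically on a random (hence almost surely invertible) matrix; the 1e-12 thresholds are arbitrary numerical tolerances:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((4, 4))

eigvals = np.linalg.eigvals(A)
svals = np.linalg.svd(A, compute_uv=False)

print(np.linalg.matrix_rank(A) == 4)       # rank(A) = m
print(np.all(np.abs(eigvals) > 1e-12))     # 0 is not an eigenvalue
print(np.all(svals > 1e-12))               # 0 is not a singular value
print(abs(np.linalg.det(A)) > 1e-12)       # det(A) != 0
```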

Proposition 4.2 (The Sherman-Morrison-Woodbury formula) Let $A$ ($m\times m$) and $D$ ($n\times n$) be invertible, with $B$ of size $m\times n$ and $C$ of size $n\times m$. Then

$$\underbrace{(A - BD^{-1}C)^{-1}}_{m\times m} = A^{-1} + A^{-1}B\,\underbrace{(D - CA^{-1}B)^{-1}}_{n\times n}\,CA^{-1}.$$
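A numerical check of the identity; the identity-shifted, lightly perturbed random matrices are an arbitrary way to keep A, D, and D - C A^{-1} B safely invertible.

```python
import numpy as np

rng = np.random.default_rng(7)
m, n = 5, 2
A = np.eye(m) + 0.1 * rng.standard_normal((m, m))
D = np.eye(n) + 0.1 * rng.standard_normal((n, n))
B = 0.3 * rng.standard_normal((m, n))
C = 0.3 * rng.standard_normal((n, m))
inv = np.linalg.inv

lhs = inv(A - B @ inv(D) @ C)
rhs = inv(A) + inv(A) @ B @ inv(D - C @ inv(A) @ B) @ C @ inv(A)
print(np.allclose(lhs, rhs))   # True
```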

Proposition 4.3 (Matrix Inverse) $A$, $B$, $C$, $D$ are matrices of compatible sizes.

$$\begin{pmatrix} A & 0 \\ 0 & D \end{pmatrix}^{-1} = \begin{pmatrix} A^{-1} & 0 \\ 0 & D^{-1} \end{pmatrix}, \qquad \begin{pmatrix} I_n & 0 \\ V & I_m \end{pmatrix}^{-1} = \begin{pmatrix} I_n & 0 \\ -V & I_m \end{pmatrix}$$

$$\begin{pmatrix} A & 0 \\ C & D \end{pmatrix}^{-1} = \begin{pmatrix} A^{-1} & 0 \\ -D^{-1}CA^{-1} & D^{-1} \end{pmatrix}, \qquad \begin{pmatrix} A & B \\ 0 & D \end{pmatrix}^{-1} = \begin{pmatrix} A^{-1} & -A^{-1}BD^{-1} \\ 0 & D^{-1} \end{pmatrix}$$

$$\begin{pmatrix} A & B \\ C & D \end{pmatrix}^{-1} = \begin{pmatrix} A^{-1} + A^{-1}BQ^{-1}CA^{-1} & -A^{-1}BQ^{-1} \\ -Q^{-1}CA^{-1} & Q^{-1} \end{pmatrix}, \qquad Q = D - CA^{-1}B \ \text{(Schur complement)}$$

$\begin{pmatrix} A & B \\ C & D \end{pmatrix}$ is invertible iff $A$ and $Q$ are invertible.
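A sketch that assembles a 2x2 block matrix with np.block and compares its direct inverse against the Schur-complement formula above (block sizes arbitrary, matrices kept well conditioned):

```python
import numpy as np

rng = np.random.default_rng(8)
m, n = 3, 2
A = np.eye(m) + 0.2 * rng.standard_normal((m, m))
B = 0.3 * rng.standard_normal((m, n))
C = 0.3 * rng.standard_normal((n, m))
D = np.eye(n) + 0.2 * rng.standard_normal((n, n))

M = np.block([[A, B], [C, D]])
inv = np.linalg.inv
Ai = inv(A)
Q = D - C @ Ai @ B                       # Schur complement of A in M

top_left  = Ai + Ai @ B @ inv(Q) @ C @ Ai
top_right = -Ai @ B @ inv(Q)
bot_left  = -inv(Q) @ C @ Ai
bot_right = inv(Q)
M_inv_formula = np.block([[top_left, top_right], [bot_left, bot_right]])

print(np.allclose(inv(M), M_inv_formula))   # True
```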


Let $S = \begin{pmatrix} I_m & -BD^{-1} \\ 0 & I_n \end{pmatrix}$. Then

$$P = S\begin{pmatrix} A & B \\ C & D \end{pmatrix} = \begin{pmatrix} A - BD^{-1}C & 0 \\ C & D \end{pmatrix}.$$

Let $T = \begin{pmatrix} I_m & 0 \\ -D^{-1}C & I_n \end{pmatrix}$. Then

$$PT = \begin{pmatrix} A - BD^{-1}C & 0 \\ 0 & D \end{pmatrix}, \qquad \text{i.e.} \qquad S\begin{pmatrix} A & B \\ C & D \end{pmatrix}T = \begin{pmatrix} A - BD^{-1}C & 0 \\ 0 & D \end{pmatrix}.$$

Writing $\Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}$ and $\Sigma^{-1} = \Lambda = \begin{pmatrix} \Lambda_{11} & \Lambda_{12} \\ \Lambda_{21} & \Lambda_{22} \end{pmatrix}$, the block-inverse formula gives

$$\Lambda_{11}^{-1} = \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}, \qquad \Lambda_{11}^{-1}\Lambda_{12} = -\Sigma_{12}\Sigma_{22}^{-1}, \qquad \Lambda_{21}\Lambda_{11}^{-1} = -\Sigma_{22}^{-1}\Sigma_{21}.$$
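A numerical check of these partitioned-inverse identities for a random symmetric positive definite Sigma (block sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(9)
p, q = 2, 3
X = rng.standard_normal((p + q, p + q))
Sigma = X @ X.T + (p + q) * np.eye(p + q)      # symmetric positive definite covariance
Lam = np.linalg.inv(Sigma)                      # precision matrix

S11, S12 = Sigma[:p, :p], Sigma[:p, p:]
S21, S22 = Sigma[p:, :p], Sigma[p:, p:]
L11, L12 = Lam[:p, :p], Lam[:p, p:]
L21 = Lam[p:, :p]

inv = np.linalg.inv
print(np.allclose(inv(L11), S11 - S12 @ inv(S22) @ S21))   # Lambda_11^{-1}
print(np.allclose(inv(L11) @ L12, -S12 @ inv(S22)))        # Lambda_11^{-1} Lambda_12
print(np.allclose(L21 @ inv(L11), -inv(S22) @ S21))        # Lambda_21 Lambda_11^{-1}
```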

Probabilistic graphical model: let $X = (X_1, \ldots, X_n)^T \sim N(0, \Sigma)$ with precision matrix $\Lambda = \Sigma^{-1}$. Then $X_i \perp X_j \mid X_{\text{rest}}$ iff $\Lambda_{ij} = 0$, and $X_i \perp X_j$ iff $\Sigma_{ij} = 0$.

    5 Trace

The trace of an $m\times m$ matrix $A = [a_{ij}]$ is defined as $\operatorname{tr}(A) = \sum_{i=1}^m a_{ii}$.

Proposition 5.1 $\operatorname{tr}(\alpha_1 A + \alpha_2 B) = \alpha_1\operatorname{tr}(A) + \alpha_2\operatorname{tr}(B)$

$\operatorname{tr}(A) = \operatorname{tr}(A^T) = \sum_{i=1}^m \lambda_i$ (the sum of the eigenvalues of $A$)

$\operatorname{tr}(AB) = \operatorname{tr}(BA), \qquad \operatorname{tr}(xy^T) = \sum_{i=1}^m x_i y_i$

For $A = (a_1, \cdots, a_n)$, define $\operatorname{vec}(A) = \begin{pmatrix} a_1 \\ \vdots \\ a_n \end{pmatrix}$. Then

$$\operatorname{tr}(AB) = \operatorname{vec}(A^T)^T\operatorname{vec}(B) = \operatorname{vec}(B^T)^T\operatorname{vec}(A)$$

$$\operatorname{tr}(ABCD) = \operatorname{vec}(D^T)^T(C^T \otimes A)\operatorname{vec}(B) = \operatorname{vec}(A^T)^T(D^T \otimes B)\operatorname{vec}(C) = \operatorname{vec}(B^T)^T(A^T \otimes C)\operatorname{vec}(D) = \operatorname{vec}(C^T)^T(B^T \otimes D)\operatorname{vec}(A)$$

$\operatorname{tr}(A \otimes B) = \operatorname{tr}(A)\operatorname{tr}(B)$
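A numpy sketch of the trace identities above; vec is implemented as column-major flattening, which matches stacking the columns of a matrix into one long vector.

```python
import numpy as np

rng = np.random.default_rng(10)

def vec(M):
    # stack the columns of M into one long vector
    return M.reshape(-1, order="F")

A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 3))
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))            # tr(AB) = tr(BA)
print(np.isclose(np.trace(A @ B), vec(A.T) @ vec(B)))          # tr(AB) = vec(A^T)^T vec(B)

A2 = rng.standard_normal((2, 3)); B2 = rng.standard_normal((3, 4))
C2 = rng.standard_normal((4, 5)); D2 = rng.standard_normal((5, 2))
lhs = np.trace(A2 @ B2 @ C2 @ D2)
rhs = vec(D2.T) @ np.kron(C2.T, A2) @ vec(B2)
print(np.isclose(lhs, rhs))                                    # tr(ABCD) identity

P = rng.standard_normal((2, 2)); Q = rng.standard_normal((3, 3))
print(np.isclose(np.trace(np.kron(P, Q)), np.trace(P) * np.trace(Q)))  # tr(P (x) Q) = tr(P) tr(Q)
```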


6 Determinant

$A = [a_{ij}]_{m\times m}$, $|A| = \det(A)$

Proposition 6.1 $\det(A) = \prod_{i=1}^m \lambda_i$ (the product of the eigenvalues of $A$)

$\det(\alpha A) = \alpha^m\det(A), \qquad \det(AB) = \det(A)\det(B), \qquad \det(A^T) = \det(A), \qquad \det(A^{-1}) = \dfrac{1}{\det(A)}$

$\det(I_m + AA^T) = \det(I_n + A^T A)$ for $A \in \mathbb{R}^{m\times n}$
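A quick numerical check of these determinant rules, including Sylvester's identity det(I_m + AA^T) = det(I_n + A^T A), on random matrices of arbitrary size:

```python
import numpy as np

rng = np.random.default_rng(11)
M = rng.standard_normal((4, 4))
N = rng.standard_normal((4, 4))
A = rng.standard_normal((4, 2))
alpha = 2.5
det = np.linalg.det

print(np.isclose(det(M), np.prod(np.linalg.eigvals(M)).real))         # product of eigenvalues
print(np.isclose(det(alpha * M), alpha**4 * det(M)))                  # det(aM) = a^m det(M)
print(np.isclose(det(M @ N), det(M) * det(N)))                        # det(MN) = det(M) det(N)
print(np.isclose(det(np.linalg.inv(M)), 1.0 / det(M)))                # det(M^{-1}) = 1/det(M)
print(np.isclose(det(np.eye(4) + A @ A.T), det(np.eye(2) + A.T @ A))) # Sylvester's identity
```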

$$\det\begin{pmatrix} A & B \\ 0 & C \end{pmatrix} = \det(A)\det(C), \qquad \det\begin{pmatrix} A & B \\ C & 0 \end{pmatrix} = (-1)^m\det(B)\det(C) \ \text{(all blocks } m\times m\text{)}$$

Let $A$ be $m\times m$, $B$ be $m\times n$, $C$ be $n\times m$, and $D$ be $n\times n$.

If $A$ is nonsingular: $\det\begin{pmatrix} A & B \\ C & D \end{pmatrix} = \det(A)\det(D - CA^{-1}B)$.

If $D$ is nonsingular: $\det\begin{pmatrix} A & B \\ C & D \end{pmatrix} = \det(D)\det(A - BD^{-1}C)$.

$$\det\begin{pmatrix} A & a \\ b^T & c \end{pmatrix} = \det(A)\det(c - b^T A^{-1}a) = \det(A)(c - b^T A^{-1}a)$$

$\det(A \otimes B) = (\det A)^n(\det B)^m$ for $A \in \mathbb{R}^{m\times m}$, $B \in \mathbb{R}^{n\times n}$

$\det(A + BC) = \det(A)\det(I + A^{-1}BC) = \det(A)\det(I + CA^{-1}B)$
Proof: $\det(A + BC) = \det(A(I + A^{-1}BC)) = \det(A)\det(I + A^{-1}BC) = \det(A)\det(I + CA^{-1}B)$, where the last step uses the identity $\det(I_m + UV) = \det(I_n + VU)$ with $U = A^{-1}B$ and $V = C$.
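A sketch verifying the Schur-complement determinant formulas, the Kronecker-product rule, and det(A + BC) = det(A) det(I + C A^{-1} B) on random, well-conditioned matrices (sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(12)
m, n = 3, 2
A = np.eye(m) + 0.3 * rng.standard_normal((m, m))
B = rng.standard_normal((m, n))
C = rng.standard_normal((n, m))
D = np.eye(n) + 0.3 * rng.standard_normal((n, n))
det, inv = np.linalg.det, np.linalg.inv

M = np.block([[A, B], [C, D]])
print(np.isclose(det(M), det(A) * det(D - C @ inv(A) @ B)))     # A nonsingular
print(np.isclose(det(M), det(D) * det(A - B @ inv(D) @ C)))     # D nonsingular

P = rng.standard_normal((m, m))
Q = rng.standard_normal((n, n))
print(np.isclose(det(np.kron(P, Q)), det(P)**n * det(Q)**m))    # det(P (x) Q)

print(np.isclose(det(A + B @ C), det(A) * det(np.eye(n) + C @ inv(A) @ B)))  # det(A + BC)
```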
