
Review 1


Linear Algebra

Systems of linear equations

1. Matrix-vector product

A~v = v1~a1 + v2~a2 + · · ·+ vn~an

2. Gaussian elimination: matrix → (forward pass) → row echelon form → (backward pass) → reduced row echelon form

Vector and matrix equations

1. Linear combination of vectors

c1~u1 + c2~u2 + · · ·+ ck~uk

2. Span of S = {~u1, ~u2, · · · , ~uk}: the set of all linear combinations of ~u1, ~u2, · · · , ~uk

3. a set of vectors {~u1, ~u2, · · · , ~uk} is linearly independent ⇔ whenever c1~u1 + c2~u2 + · · · + ck~uk = ~0, then c1 = c2 = · · · = ck = 0

Matrix algebra

1. Matrix multiplication

• suppose the product of an m × n matrix A and an n × p matrix B is the m × p matrix C; then the entries of C are

cij = ai1 b1j + ai2 b2j + · · · + ain bnj

• AB ≠ BA in general

• property of transpose: (AC)^T = C^T A^T
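The entry formula translates directly into a triple loop (a small Python sketch; the matrices are made-up examples), and the result also lets us spot-check the transpose property:

```python
# Matrix product C = A B for an m x n A and n x p B via the entry
# formula c_ij = sum_k a_ik b_kj; then check (AB)^T == B^T A^T.
def matmul(A, B):
    m, n, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3 (made-up values)
B = [[7, 8],
     [9, 10],
     [11, 12]]           # 3 x 2
C = matmul(A, B)          # 2 x 2

# Transpose property on this example.
assert transpose(C) == matmul(transpose(B), transpose(A))
print(C)
```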

2. Linear correspondence property: if R = rref(A), then the columns of R and A have the same linear relations.

3. Matrix inversion: if row operations P reduce [A In] to [R B] with R = rref(A) = In, then B = A^(−1)
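The [A In] → [In A^(−1)] procedure can be sketched as Gauss-Jordan elimination on the augmented block (made-up 2 × 2 example, exact arithmetic; no pivoting or singularity handling):

```python
# Row-reduce [A | I]; when the left block becomes I, the right
# block is A^(-1). Example matrix A is made up (det = 1).
from fractions import Fraction

A = [[Fraction(2), Fraction(1)],
     [Fraction(5), Fraction(3)]]
n = len(A)
aug = [A[i] + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]

for i in range(n):
    pivot = aug[i][i]
    aug[i] = [a / pivot for a in aug[i]]          # scale pivot row to 1
    for r in range(n):
        if r != i:
            f = aug[r][i]
            aug[r] = [a - f * b for a, b in zip(aug[r], aug[i])]

A_inv = [row[n:] for row in aug]                  # right block
print(A_inv)
```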

4. LU decomposition: run the forward pass on A to get a row echelon form U, recording each multiplier lik = aik/akk in position (i, k) of a matrix L that starts as In; then A = LU
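A minimal sketch of the multiplier bookkeeping (made-up 2 × 2 example; no pivoting, so it assumes nonzero pivots):

```python
# Doolittle-style LU without pivoting: U is built by forward
# elimination, L records the multipliers l_ik = a_ik / a_kk.
from fractions import Fraction

A = [[Fraction(4), Fraction(3)],
     [Fraction(8), Fraction(7)]]
n = len(A)
U = [row[:] for row in A]
L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]

for k in range(n):
    for i in range(k + 1, n):
        L[i][k] = U[i][k] / U[k][k]       # recorded multiplier
        U[i] = [a - L[i][k] * b for a, b in zip(U[i], U[k])]

# Verify A == L U on this example.
prod = [[sum(L[i][k] * U[k][j] for k in range(n)) for j in range(n)]
        for i in range(n)]
assert prod == A
print(L, U)
```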

5. Linear transformation

(a) if T(~u + ~v) = T(~u) + T(~v) and T(c~u) = cT(~u), then T is linear.

(b) if T is linear:

• T(~v) = A~v for some matrix A

• T(~o) = ~o

• T(−~u) = −T(~u)

• T(a~u + b~v) = aT(~u) + bT(~v)

(c) Null A^T: solutions to A^T~v = ~o

(d) dim(Null A^T) = m − rank A

(e) for an m × n matrix, A = [T(~e1) T(~e2) · · · T(~en)] (see table on page 2)
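The construction in (e) can be illustrated directly (T here is a made-up linear map, not one from the text):

```python
# Build the standard matrix of a linear map from its action on the
# standard basis e_1, ..., e_n. Made-up example: T(x, y) = (x + y, x - y).
def T(v):
    x, y = v
    return [x + y, x - y]

e = [[1, 0], [0, 1]]
cols = [T(ei) for ei in e]                 # columns T(e1), T(e2)
A = [list(row) for row in zip(*cols)]      # stack the columns into A

# Applying A reproduces T on any vector.
v = [3, 5]
Av = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
assert Av == T(v)
print(A)
```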

Determinants

1. cofactor expansion (along row i): det A = ai1ci1 + ai2ci2 + · · · + aincin, where cij = (−1)^(i+j) det Aij

2. det A = (−1)^r u11 u22 · · · unn, where the uii are the diagonal entries of a row echelon form of A and r = # of row interchanges used (no scaling operations)
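Cofactor expansion along the first row translates into a short recursion (the matrix below is a made-up example; this approach is O(n!) and only practical for small n):

```python
# Recursive cofactor expansion along the first row:
# det A = sum_j (-1)^(1+j) * a_1j * det(A_1j).
def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # A_1j: delete row 1 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

A = [[2, 0, 1],
     [1, 3, 2],
     [0, 1, 4]]
print(det(A))
```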


Rank of A | # of solutions to A~x = ~b | columns of A                      | property of T
m         | at least one               | spanning set for Rm               | onto
n         | at most one                | linearly independent              | one-to-one
m = n     | unique solution            | linearly independent spanning set | invertible

Vector spaces

1. Subspaces: a set W is a subspace if

• ~0 ∈ W

• closed under addition

• closed under scalar multiplication

2. Null A: solution set of A~x = ~0

3. Col A: span of the columns of A

4. Row A: subspace spanned by the rows of A

5. Basis and dimension

• Basis: a linearly independentspanning set.

• dimension: number of vectors ina basis.

• to show B is a basis for a subspace V:

(a) show B is contained in V

(b) show B is linearly independent

(c) compute the dimension of V and confirm that the number of vectors in B equals dim V

• Nonzero rows of rref(A) form a basis for Row A.

subspace | containing space | dimension
Null A   | Rn               | nullity A = n − rank A
Row A    | Rn               | rank A
Col A    | Rm               | rank A

Eigenvalues and eigenvectors

A~v = λ~v

(A− λIn)~v = ~0

where λ is an eigenvalue of A and ~v ≠ ~0 is a corresponding eigenvector.

1. characteristic equation: det(A − λIn) = 0
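For a 2 × 2 matrix the characteristic equation is the quadratic λ² − (tr A)λ + det A = 0, so the eigenvalues can be read off with the quadratic formula (the matrix is a made-up symmetric example with real eigenvalues):

```python
# Eigenvalues of a 2x2 matrix as the roots of
# lambda^2 - (trace A) lambda + det A = 0.
import math

A = [[2, 1],
     [1, 2]]
tr = A[0][0] + A[1][1]
d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr * tr - 4 * d)           # assumes real eigenvalues
eigs = sorted([(tr - disc) / 2, (tr + disc) / 2])
print(eigs)
```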

2. diagonalization of matrices

A = PDP−1

where the columns of P = [~v1 ~v2 · · · ~vn] are eigenvectors of A and D = diag(λ1, λ2, · · · , λn) holds the corresponding eigenvalues
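A diagonalization can be verified without computing P^(−1) by checking the equivalent identity AP = PD (the matrix and its eigenpairs below are a worked example, not from the text):

```python
# Check A P == P D, which is A = P D P^(-1) without inverting P.
A = [[2, 1],
     [1, 2]]
P = [[1, 1],
     [-1, 1]]           # columns: eigenvectors for lambda = 1 and 3
D = [[1, 0],
     [0, 3]]            # matching eigenvalues on the diagonal

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

assert matmul(A, P) == matmul(P, D)
print(matmul(A, P))
```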

Orthogonality and least squares

1. ~U⊥~V if ~U · ~V = 0

2. Cauchy-Schwarz inequality|~U · ~V | ≤ |~U ||~V |

3. Triangle inequality|~U + ~V | ≤ |~U |+ |~V |
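Both inequalities are easy to spot-check numerically (the vectors are made up):

```python
# Numeric check of Cauchy-Schwarz and the triangle inequality.
import math

u = [1, 2, 2]
v = [3, 0, 4]

def norm(w):
    return math.sqrt(sum(a * a for a in w))

dot = sum(a * b for a, b in zip(u, v))
assert abs(dot) <= norm(u) * norm(v)               # Cauchy-Schwarz
s = [a + b for a, b in zip(u, v)]
assert norm(s) <= norm(u) + norm(v)                # triangle inequality
print(dot, norm(u), norm(v))
```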


4. Gram-Schmidt process

~v1 = ~u1

~vi = ~ui − ((~ui · ~v1)/|~v1|²) ~v1 − ((~ui · ~v2)/|~v2|²) ~v2 − · · · − ((~ui · ~vi−1)/|~vi−1|²) ~vi−1
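The recurrence translates line-for-line into code (made-up input vectors; exact arithmetic so the orthogonality check is exact):

```python
# Gram-Schmidt: subtract from each u_i its projections onto the
# previously built v_1, ..., v_{i-1}.
from fractions import Fraction

def gram_schmidt(us):
    vs = []
    for u in us:
        v = list(u)
        for w in vs:
            c = sum(a * b for a, b in zip(u, w)) / sum(a * a for a in w)
            v = [a - c * b for a, b in zip(v, w)]   # subtract (u.w)/|w|^2 w
        vs.append(v)
    return vs

us = [[Fraction(1), Fraction(1), Fraction(0)],
      [Fraction(1), Fraction(0), Fraction(1)]]
vs = gram_schmidt(us)

# The outputs are pairwise orthogonal.
assert sum(a * b for a, b in zip(vs[0], vs[1])) == 0
print(vs)
```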

5. QR factorization: given A = [~u1, ~u2, · · · , ~un]

• Gram-Schmidt → ~vi

• normalize → ~ei

• Q = [~e1, ~e2, · · · , ~en]

• R = Q^(−1)A = Q^T A

to minimize |~b − A~x|: write A~x = ~b as QR~x = ~b, so R~x = Q^(−1)~b = Q^T~b; solve for ~x by back-substitution.

6. Orthogonal projection of ~b ontosubspace W

~w = A(A^T A)^(−1) A^T~b

~w = V V^T~b

where the columns of V form an orthonormal basis for the subspace W

7. Least squares

A~x = ~b

A^T A~x = A^T~b

~x = (A^T A)^(−1) A^T~b
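A small worked instance of the normal equations (the data points are made up; fitting the best line y ≈ x0 + x1·t through (0,1), (1,2), (2,2), with the resulting 2 × 2 system solved by Cramer's rule for brevity):

```python
# Least squares via the normal equations A^T A x = A^T b.
from fractions import Fraction

A = [[1, 0],
     [1, 1],
     [1, 2]]            # rows [1, t] for t = 0, 1, 2
b = [1, 2, 2]           # observed y values (made up)

At = [list(r) for r in zip(*A)]
M = [[sum(At[i][k] * A[k][j] for k in range(3)) for j in range(2)]
     for i in range(2)]                                   # A^T A
rhs = [sum(At[i][k] * b[k] for k in range(3)) for i in range(2)]  # A^T b

# Solve the 2x2 system M x = rhs by Cramer's rule.
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
x0 = Fraction(rhs[0] * M[1][1] - M[0][1] * rhs[1], det)
x1 = Fraction(M[0][0] * rhs[1] - rhs[0] * M[1][0], det)
print(x0, x1)
```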

Symmetric matrices

1. Singular value decomposition

A = UΣV^T

where Σ is diagonal with the singular values s1, s2, · · · on its diagonal and zeros elsewhere

2. A is symmetric positive definite if and only if A = B^T B for a nonsingular matrix B.
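The "positive" part follows because ~x^T A~x = |B~x|² > 0 for ~x ≠ ~0 when B is nonsingular; a quick numeric check (B is a made-up nonsingular matrix):

```python
# A = B^T B is symmetric, and x^T A x = |Bx|^2 > 0 for x != 0.
B = [[1, 2],
     [0, 3]]            # nonsingular (det = 3), made up
Bt = [list(r) for r in zip(*B)]
A = [[sum(Bt[i][k] * B[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]

assert A == [list(r) for r in zip(*A)]          # symmetric
for x in ([1, 0], [0, 1], [1, -1], [2, 5]):     # a few nonzero vectors
    Bx = [sum(B[i][j] * x[j] for j in range(2)) for i in range(2)]
    assert sum(c * c for c in Bx) > 0           # x^T A x = |Bx|^2 > 0
print(A)
```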
