Section 6.1 Inner Product, Length, Orthogonality
ELOs:
• Compute inner products (dot products) in R^n and describe their properties.
• Find the length of a vector in R^n and the distance between two vectors.
• Use the inner product to check whether two vectors are orthogonal.
Goal: Introduce the geometric concepts of length, distance, and orthogonality in vector spaces.
Motivation:
Warm-up: Let u be a vector in R^n. We may consider vectors in R^n as n × 1 matrices and define the transpose u^T as a 1 × n matrix. Then the matrix product u^T u is a 1 × 1 matrix, which we write as a scalar without brackets.
Definition 1 Let u and v be vectors in R^n. The scalar u^T v = u · v is called the inner product (or dot product) of u and v:

u · v = u^T v = [u_1 u_2 . . . u_n] [v_1; v_2; . . . ; v_n] = u_1 v_1 + u_2 v_2 + · · · + u_n v_n
Note: if you have had calculus already, the concepts of length, distance, and perpendicularity via the dot product are all familiar in R^2 and R^3, but we need to extend these ideas to R^n for n > 3.

Example: if u = [1; 2; 3], then u^T u = [1 2 3][1; 2; 3] = 1^2 + 2^2 + 3^2 = 14.
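As a quick numeric check of Definition 1, a minimal NumPy sketch (u = (1, 2, 3) is the example above; v is a made-up second vector):

```python
import numpy as np

u = np.array([1, 2, 3])   # the example vector above
v = np.array([4, 5, 6])   # made-up companion vector

# Inner product as a matrix product u^T v; for 1-D NumPy arrays, @ is the dot product.
print(u @ u)   # 1^2 + 2^2 + 3^2 = 14
print(u @ v)   # 1*4 + 2*5 + 3*6 = 32
```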
Theorem 1 Let u, v, w ∈ R^n and c ∈ R.
(a) u · v = v · u
(b) (u + v) · w = u · w + v · w
(c) (cu) · v = c(u · v) = u · (cv)
(d) u · u ≥ 0, and u · u = 0 if and only if u = 0
Repeated application of (b) and (c) yields
(c_1 u_1 + · · · + c_p u_p) · w = c_1(u_1 · w) + · · · + c_p(u_p · w)
Example:
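The linearity identity above can be spot-checked numerically; a minimal NumPy sketch with made-up random vectors and weights:

```python
import numpy as np

rng = np.random.default_rng(0)
u1, u2, w = rng.normal(size=(3, 4))   # three random vectors in R^4
c1, c2 = 2.0, -3.0                    # made-up weights

lhs = (c1 * u1 + c2 * u2) @ w
rhs = c1 * (u1 @ w) + c2 * (u2 @ w)
print(np.isclose(lhs, rhs))           # the dot product is linear in its first slot
```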
Definition 2 The length (or norm) of v is the nonnegative scalar ‖v‖ defined by

‖v‖ = √(v · v) = √(v_1^2 + v_2^2 + · · · + v_n^2),  so that  ‖v‖^2 = v · v

A unit vector u is a vector of length 1. To create a unit vector, or to “normalize” a vector, divide the vector by its length. That is,

u = v / ‖v‖,  where u is in the same direction as v.
Example:
Observe: For any scalar c, ‖cv‖ = |c| ‖v‖.
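A short NumPy sketch of normalization and the scaling rule just observed (v = (3, 4) is a made-up example with ‖v‖ = 5):

```python
import numpy as np

v = np.array([3.0, 4.0])          # made-up vector; ||v|| = 5
u = v / np.linalg.norm(v)         # normalize: u points the same way, ||u|| = 1
print(np.linalg.norm(u))          # 1.0

c = -2.0
print(np.linalg.norm(c * v))      # |c| * ||v|| = 10.0
```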
Properties of Inner Products
In Theorem 1, (a) is commutativity, (b) is distributivity, and (c) is associativity with scalar multiplication; (d) says u · u ≥ 0.
Exercise: compute (u + v) · w for given u, v, w. Note: this extends what happens in R^2.
Exercise: for a given vector v, find a unit vector u in the same direction.
Distance in R^n
Definition 3 The distance between u and v in R^n is
dist(u, v) = ‖u − v‖
Example:
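Distance is just the norm of the difference; a minimal NumPy sketch with made-up vectors u = (7, 1) and v = (3, 2):

```python
import numpy as np

u = np.array([7.0, 1.0])          # made-up vectors
v = np.array([3.0, 2.0])

dist = np.linalg.norm(u - v)      # u - v = (4, -1), so dist = sqrt(4^2 + 1^2) = sqrt(17)
print(dist)
```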
Orthogonal Vectors
Two lines are geometrically perpendicular if and only if the distance from u to v is the same as the distance from u to −v; equivalently, the squares of these distances are the same.
Example: find the distance between two given vectors in R^2 (sketch the points u and v; the distance is the length of u − v).

For perpendicularity we need ‖u − v‖ = ‖u − (−v)‖, i.e. ‖u − v‖^2 = ‖u + v‖^2. Expanding both sides:

‖u − v‖^2 = (u − v) · (u − v) = u · u − 2u · v + v · v = ‖u‖^2 + ‖v‖^2 − 2u · v
‖u + v‖^2 = (u + v) · (u + v) = u · u + 2u · v + v · v = ‖u‖^2 + ‖v‖^2 + 2u · v

These are equal if and only if u · v = 0.
Definition 4 Two vectors u,v 2 Rn are orthogonal (to each other) if u · v = 0.
Observe: The zero vector is orthogonal to every vector in Rn.
Theorem 2 Two vectors u and v are orthogonal if and only if ‖u + v‖^2 = ‖u‖^2 + ‖v‖^2.
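Theorem 2 (the Pythagorean theorem in R^n) can be checked numerically; a minimal NumPy sketch with a made-up orthogonal pair:

```python
import numpy as np

u = np.array([3.0, 1.0])
v = np.array([-1.0, 3.0])         # u . v = -3 + 3 = 0, so u and v are orthogonal

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(u @ v, np.isclose(lhs, rhs))   # orthogonal, and ||u+v||^2 = ||u||^2 + ||v||^2
```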
Orthogonal Complements
Example:
Observation: The orthogonal complement of W is the collection of all vectors in R^n orthogonal to W:
W⊥ = {v ∈ R^n : v · w = 0 for all w ∈ W}.
Facts about W⊥:
• W⊥ is a subspace of R^n.
• x ∈ W⊥ if and only if x is orthogonal to every vector in a spanning set of W.
Theorem 3 Let A be an m × n matrix. The orthogonal complement of the row space of A is the null space of A, and the orthogonal complement of the column space of A is the null space of A^T:
(Row A)⊥ = Nul A and (Col A)⊥ = Nul A^T
This extends the Pythagorean-theorem idea from R^2 to R^n (Theorems 2 and 3): in R^2, sketch u ⊥ v with hypotenuse u + v.

In R^3, the z-axis, i.e. any vector of the form [0; 0; c] where c is a constant, is orthogonal to the xy-plane, i.e. the subspace of R^3 spanned by [1; 0; 0] and [0; 1; 0]. So we would say the z-axis is W⊥ (read “W perp”) when W is the xy-plane.

W⊥ is a subspace of R^n, and x ∈ W⊥ if and only if x is orthogonal to every vector in a spanning set of W.

Note: for A an m × n matrix, Nul A ⊆ R^n, Col A ⊆ R^m, Row A ⊆ R^n, and Nul A^T ⊆ R^m.
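Theorem 3 can be illustrated numerically; a minimal NumPy sketch, with a made-up matrix A whose null space is spanned by x = (1, −2, 1) (chosen by hand so that A x = 0):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, -2.0, 1.0])    # spans Nul A (verified below: A @ x = 0)

print(A @ x)                      # [0. 0.] -> x is in Nul A
print(A[0] @ x, A[1] @ x)         # each row of A is orthogonal to x, so x is in (Row A)-perp
```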
Section 6.2 Orthogonal Sets
ELOs:
• Define and give examples of orthogonal sets and orthogonal bases.
• Find an orthogonal projection in Rn.
Definition 1 A set of vectors {u_1, . . . , u_p} in R^n is an orthogonal set if each pair of distinct vectors from the set is orthogonal. That is, u_i · u_j = 0 when i ≠ j.
Example: Show that {[1; −1; 0], [1; 1; 0], [0; 0; 1]} is an orthogonal set.
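The example's three vectors can be checked pairwise with NumPy; a minimal sketch:

```python
import numpy as np

u1 = np.array([1, -1, 0])
u2 = np.array([1, 1, 0])
u3 = np.array([0, 0, 1])

# Every pair of distinct vectors has dot product 0, so the set is orthogonal.
print(u1 @ u2, u1 @ u3, u2 @ u3)   # 0 0 0
```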
Exercise: Construct an orthogonal set in R^3 that contains three vectors.
Theorem 4 If S = {u_1, . . . , u_p} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent and hence is a basis for the subspace spanned by S.
Partial Proof:
Proof: Assume c_1 u_1 + · · · + c_p u_p = 0 for scalars c_1, . . . , c_p; i.e., we are exploring whether S is a linearly independent set. Taking the dot product of both sides with u_1,
(c_1 u_1 + · · · + c_p u_p) · u_1 = 0 · u_1
c_1(u_1 · u_1) + c_2(u_2 · u_1) + · · · + c_p(u_p · u_1) = 0
c_1(u_1 · u_1) = 0, since u_j · u_1 = 0 for j ≠ 1 (S is an orthogonal set).
But u_1 ≠ 0, so u_1 · u_1 ≠ 0, and therefore c_1 = 0 must be true. By the same argument with u_2, . . . , u_p, we get c_2 = · · · = c_p = 0. Hence S is a linearly independent set. ∎
Definition 2 An orthogonal basis for a subspace W of R^n is a basis for W that is an orthogonal set.
Note: We can readily compute the weights/coefficients in a linear combination of orthogonal basis vectors.
Theorem 5 Let {u_1, . . . , u_p} be an orthogonal basis for a subspace W ⊆ R^n. For each y ∈ W, the weights in the linear combination
y = c_1 u_1 + c_2 u_2 + · · · + c_p u_p
are given by
c_j = (y · u_j) / (u_j · u_j)   (j = 1, . . . , p)
Example:
Let y = c_1 u_1 + · · · + c_p u_p, where S = {u_1, . . . , u_p} is an orthogonal basis of R^n. Then
y · u_1 = c_1(u_1 · u_1) + · · · + c_p(u_p · u_1) = c_1(u_1 · u_1),
so c_1 = (y · u_1)/(u_1 · u_1), and likewise c_j = (y · u_j)/(u_j · u_j).

Note: if you have had Calc 3, notice this is the projection coefficient, i.e. (y · u)/‖u‖^2.

Note: without an orthogonal basis we have to solve a system of equations to find the weights; with an orthogonal basis we can use the formula from Theorem 5, i.e. projections.
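Theorem 5 in action; a minimal NumPy sketch, reusing the orthogonal basis from the earlier example and a made-up target vector y = (4, 2, 7):

```python
import numpy as np

# Orthogonal basis of R^3 (the earlier example's set).
u1 = np.array([1.0, -1.0, 0.0])
u2 = np.array([1.0, 1.0, 0.0])
u3 = np.array([0.0, 0.0, 1.0])
y = np.array([4.0, 2.0, 7.0])     # made-up target vector

# Weights from Theorem 5: c_j = (y . u_j)/(u_j . u_j) -- no linear system needed.
c = [(y @ u) / (u @ u) for u in (u1, u2, u3)]
print(c)                                   # [1.0, 3.0, 7.0]
print(c[0] * u1 + c[1] * u2 + c[2] * u3)   # recovers y = (4, 2, 7)
```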
Orthogonal Projections
Problem: Suppose we have a preferred vector u. How can we decompose any vector y ∈ R^n into the sum of two vectors, one a multiple of u and the other orthogonal to u?
y = (multiple of u) + (multiple of a vector ⊥ u)
Example: Let y = [−8; 4], u = [3; 1]. Find the orthogonal projection of y onto u. Then write y as the sum of two orthogonal vectors, one in span{u} and one orthogonal to u.
We want y = αu + βz, where α, β ∈ R are constants and z ⊥ u. Dotting both sides with u,
y · u = α(u · u) + β(z · u) = α(u · u),  since z · u = 0,
so α = (y · u)/(u · u), and
ŷ = proj_u y = ((y · u)/(u · u)) u — this is the orthogonal projection of y onto u.

Geometrically: ŷ is the point on the line L = span{u} closest to y, and the distance from y to L is ‖y − proj_u y‖.
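The example's decomposition can be verified numerically; a minimal NumPy sketch with the example's y = (−8, 4) and u = (3, 1):

```python
import numpy as np

y = np.array([-8.0, 4.0])
u = np.array([3.0, 1.0])

y_hat = (y @ u) / (u @ u) * u     # proj_u y = ((y.u)/(u.u)) u = (-20/10) u = -2u
z = y - y_hat                     # component of y orthogonal to u
print(y_hat, z)                   # [-6. -2.] [-2.  6.]
print(z @ u)                      # 0.0 -> z really is orthogonal to u
```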
Orthonormal Sets
Definition 3 A set of vectors {u_1, u_2, . . . , u_p} in R^n is an orthonormal set if it is an orthogonal set of unit vectors. If W = span{u_1, u_2, . . . , u_p}, then {u_1, u_2, . . . , u_p} is an orthonormal basis of W since the set of p vectors is automatically linearly independent by Theorem 4.
Example:
Theorem 6 An m × n matrix U has orthonormal columns if and only if U^T U = I.
Proof:
Theorem 7 Let U be an m × n matrix with orthonormal columns, and let x and y be in R^n. Then
(a) ‖Ux‖ = ‖x‖
(b) (Ux) · (Uy) = x · y
(c) (Ux) · (Uy) = 0 if and only if x · y = 0
Example:
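Theorems 6 and 7(a) can be checked numerically; a minimal NumPy sketch with a made-up 3 × 2 matrix whose columns are orthonormal:

```python
import numpy as np

s = 1 / np.sqrt(2)
U = np.array([[s, 0.0],
              [s, 0.0],
              [0.0, 1.0]])        # made-up 3 x 2 matrix with orthonormal columns

print(np.allclose(U.T @ U, np.eye(2)))   # True (Theorem 6)

x = np.array([3.0, -4.0])
print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # True (Theorem 7a)
```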
Definition 4 An orthogonal matrix is a square invertible matrix U such that U^{−1} = U^T. An orthogonal matrix has orthonormal columns and orthonormal rows.
Exercise: Show that for an n × n invertible matrix U, U^T U = I and U U^T = I. Thus, U^T = U^{−1}.
Hint: Consider using an approach similar to the proof of Theorem 6.
To check a set is orthonormal: check each vector is a unit vector, and check each pair of vectors is orthogonal.

Notice U^T acts somewhat like an inverse, even though U^{−1} does not exist when m ≠ n.

Proof sketch of Theorem 7(b): (Ux) · (Uy) = (Ux)^T(Uy) = x^T U^T U y = x^T y = x · y, because U^T U = I.

Does a given U have orthonormal columns? Compute U^T U and compare with I.

So if U is a square matrix and its columns are orthonormal, then U^T U = I and U^T = U^{−1}.
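The square case can be checked with a classic example; a minimal NumPy sketch using a made-up rotation matrix (rotations are orthogonal matrices):

```python
import numpy as np

theta = 0.7   # made-up rotation angle
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # 2 x 2 rotation matrix

print(np.allclose(U.T @ U, np.eye(2)))   # True: orthonormal columns
print(np.allclose(U @ U.T, np.eye(2)))   # True: orthonormal rows, so U^{-1} = U^T
```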