

DSC 155 & MATH 182: “WINTER” 2020 TAKE HOME MIDTERM EXAM
Available Monday, February 10 Due Tuesday, February 11
Turn in the exam on Gradescope before midnight. You may use whatever text / online resources you like, provided you cite them in your solutions. Do not consult any other humans about this exam.
1. Let $A \in M_{m\times n}$ and $B \in M_{n\times p}$ be matrices.

   (a) (3 points) Show that $\mathrm{Nullspace}(B) \subseteq \mathrm{Nullspace}(AB)$.

   (b) (3 points) Conclude that $\operatorname{rank}(AB) \le \operatorname{rank}(B)$. [Hint: Use the rank-nullity theorem.]

   (c) (2 points) Show also that $\operatorname{rank}(AB) \le \operatorname{rank}(A)$. [Hint: You may use the fact that $\operatorname{rank}(C^\top) = \operatorname{rank}(C)$ for any matrix $C$.]
2. (8 points) Let $X$ be a random variable whose moment generating function $M_X(t) = E(e^{tX})$ is $< \infty$ for some fixed $t \in \mathbb{R}$. We wish to estimate the moment generating function's value $\theta = M_X(t)$ from data $x_1, \ldots, x_N$ modeled as i.i.d. samples of $X$. We consider the following two estimators:
$$T_N(x_1,\ldots,x_N) = \frac{1}{N}\sum_{j=1}^N e^{t x_j}, \qquad U_N(x_1,\ldots,x_N) = \exp\!\left(\frac{t}{N}\sum_{j=1}^N x_j\right).$$
Determine whether $T_N$ and $U_N$ are (a) unbiased, and (b) consistent estimators for $\theta$.
3. (6 points) Let $\{x_1, \ldots, x_N\}$ be data in $\mathbb{R}^m$ with sample covariance matrix $C$. Suppose $u$ is an eigenvector of $C$, with eigenvalue $\sigma^2$. Let $P$ be the orthogonal projection onto $\mathrm{span}\{u\}$. Let $y_j = P x_j$. Show that the sample variance of the data $\{y_1, \ldots, y_N\}$ is precisely $\sigma^2$.
4. Consider the following Gaussian noise model. Let $d \le N \le m$ be positive integers. Let $Z_1, \ldots, Z_N$ be $m$-dimensional random vectors with normal distribution $N(0, \upsilon^2 I_m)$, and let $X_j = \mu + Q\beta_j + Z_j$ for $1 \le j \le N$, where $\upsilon^2 > 0$, $\mu \in \mathbb{R}^m$, $\beta_j \in \mathbb{R}^d$, and $Q \in M_{m\times d}$, $Q^\top Q = I_d$ are unknown parameters.

   (a) (4 points) Show that the MLE for $(\mu, Q, \beta_1, \ldots, \beta_N)$ is precisely PCA. [Hint: Do not attempt to compute the MLE. Just set up the maximization problem that determines the MLE, and show that it leads to the same problem whose solution is PCA.]
   (b) (4 points) Find the MLE for the unknown common variance parameter $\upsilon^2$; express it in terms of the singular values of the sample covariance matrix of the given data.
1. (a) Suppose $v \in \mathrm{Nullspace}(B)$; this means $Bv = 0$. Then $(AB)v = A(Bv) = A0 = 0$, $\therefore v \in \mathrm{Nullspace}(AB)$. This shows $\mathrm{Nullspace}(B) \subseteq \mathrm{Nullspace}(AB)$.
(b) By the rank-nullity theorem, $\operatorname{rank}(B) + \operatorname{nullity}(B) = p = \#\text{columns of } B$,
$$\therefore \operatorname{rank}(B) = p - \operatorname{nullity}(B) = p - \dim \mathrm{Nullspace}(B).$$
Since $\mathrm{Nullspace}(B) \subseteq \mathrm{Nullspace}(AB)$, $\operatorname{nullity}(B) \le \operatorname{nullity}(AB)$,
$$\therefore \operatorname{rank}(B) \ge p - \operatorname{nullity}(AB).$$
Now, again by rank-nullity, $\operatorname{rank}(AB) + \operatorname{nullity}(AB) = p$ since $AB$ also has $p$ columns, so $p - \operatorname{nullity}(AB) = \operatorname{rank}(AB)$.
$$\Rightarrow \operatorname{rank}(B) \ge \operatorname{rank}(AB) \text{ as claimed.}$$
(c) $\operatorname{rank}(AB) = \operatorname{rank}((AB)^\top) = \operatorname{rank}(B^\top A^\top) \le \operatorname{rank}(A^\top)$ (by (b)) $= \operatorname{rank}(A)$.
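As a quick numerical sanity check (not part of the exam solution; the dimensions below are arbitrary choices of mine), these rank facts can be verified with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 5, 4, 6
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

rA = np.linalg.matrix_rank(A)
rB = np.linalg.matrix_rank(B)
rAB = np.linalg.matrix_rank(A @ B)

# rank(AB) <= rank(B) and rank(AB) <= rank(A), as proved in 1(b) and 1(c)
assert rAB <= rB and rAB <= rA

# The nullspace inclusion from 1(a): any v with Bv = 0 also satisfies (AB)v = 0.
# Since B has more columns than rows, its nullspace is nontrivial; the last
# right singular vector spans part of it.
_, s, Vt = np.linalg.svd(B)
v = Vt[-1]
assert np.allclose(B @ v, 0, atol=1e-10)
assert np.allclose(A @ (B @ v), 0, atol=1e-10)
```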
2. (a) Since the $x_j$ are i.i.d. samples of $X$, $E(e^{t x_j}) = M_X(t) = \theta$ for each $j$. So
$$E(T_N) = \frac{1}{N}\sum_{j=1}^N E(e^{t x_j}) = \theta, \qquad \therefore T_N \text{ is unbiased.}$$
The $e^{t x_j}$ are i.i.d. with mean $\theta$, so by the WLLN, $T_N \to \theta$ in probability; thus $T_N$ is also consistent.
For $U_N$: by independence,
$$E(U_N) = E\big(e^{(t/N)x_1} \cdots e^{(t/N)x_N}\big) = E\big(e^{(t/N)X}\big)^N = M_X(t/N)^N.$$
This is typically $\ne \theta$. E.g. if $X \sim N(0,1)$, then $M_X(t) = e^{t^2/2}$, while $M_X(t/N)^N = e^{t^2/(2N)} \ne e^{t^2/2}$. So $U_N$ is not generally unbiased.
2(b) (cont.) Also, we know from the LLN that $\bar{x}_N = \frac{1}{N}\sum_j x_j \to E(X)$ in probability, so $U_N = e^{t\bar{x}_N} \to e^{t E(X)}$. This is not typically equal to $M_X(t) = E(e^{tX}) = \theta$. E.g. $X \sim N(0,1)$: $M_X(t) = e^{t^2/2}$, but $E(X) = 0$, so $e^{t E(X)} = 1 \ne e^{t^2/2}$. Thus $U_N$ is not generally consistent. ($U_N$ is not really an estimator for $M_X(t)$; it is a consistent estimator for $e^{t E(X)} \ne E(e^{tX})$. Even then, it is a biased estimator of $e^{t E(X)}$.)
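To illustrate, a small Monte Carlo simulation (my own; the choices $X \sim N(0,1)$, $t = 1$, and the sample sizes are arbitrary) shows $T_N$ concentrating near $\theta = e^{1/2} \approx 1.65$ while $U_N$ concentrates near $e^{t E(X)} = 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
t, N, trials = 1.0, 10_000, 200
theta = np.exp(t**2 / 2)            # M_X(t) for X ~ N(0,1)

x = rng.standard_normal((trials, N))
T = np.mean(np.exp(t * x), axis=1)  # T_N = (1/N) sum_j e^{t x_j}
U = np.exp(t * x.mean(axis=1))      # U_N = exp(t * sample mean)

# T_N is unbiased and consistent for theta; U_N concentrates near e^{t E(X)} = 1
assert abs(T.mean() - theta) < 0.05
assert abs(U.mean() - 1.0) < 0.05
assert abs(U.mean() - theta) > 0.3   # U_N misses theta entirely
```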
3. The sample variance is
$$S^2(y_1,\ldots,y_N) = \frac{1}{N}\sum_{j=1}^N \|y_j - \bar{y}\|^2, \quad \text{where } \bar{y} = \frac{1}{N}\sum_{j=1}^N y_j.$$
Following the course notes (Proposition 1.28), we have
$$\bar{y} = \frac{1}{N}\sum_{j=1}^N y_j = \frac{1}{N}\sum_{j=1}^N P x_j = P\Big(\frac{1}{N}\sum_{j=1}^N x_j\Big) = P\bar{x},$$
and so $y_j - \bar{y} = P x_j - P\bar{x} = P(x_j - \bar{x})$. Thus
$$S^2(y_1,\ldots,y_N) = \frac{1}{N}\sum_{j=1}^N \|P(x_j - \bar{x})\|^2.$$
Now $P = $ projection onto $\mathrm{span}\{u\} = uu^\top$ (taking $u$ to be a unit vector), so
$$\|P(x_j - \bar{x})\|^2 = \|uu^\top(x_j - \bar{x})\|^2 = \big(u^\top(x_j - \bar{x})\big)^2 \quad (\text{since } \|u\| = 1).$$
Now, following exactly the course notes (Equations 1.6–1.8),
$$\frac{1}{N}\sum_{j=1}^N \big(u^\top(x_j - \bar{x})\big)^2 = u^\top C u.$$
Thus $S^2(y_1,\ldots,y_N) = u^\top C u = u^\top(\sigma^2 u) = \sigma^2$.
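This can be checked numerically (my own verification sketch, using the $1/N$ covariance convention assumed above and synthetic data):

```python
import numpy as np

rng = np.random.default_rng(2)
m, N = 4, 500
X = rng.standard_normal((N, m)) @ rng.standard_normal((m, m))  # rows are x_j

xbar = X.mean(axis=0)
C = (X - xbar).T @ (X - xbar) / N           # sample covariance (1/N convention)

sigma2, U = np.linalg.eigh(C)               # eigenvalues ascending
u = U[:, -1]                                # unit eigenvector, eigenvalue sigma2[-1]
P = np.outer(u, u)                          # orthogonal projection onto span{u}

Y = X @ P                                   # y_j = P x_j (P is symmetric)
ybar = Y.mean(axis=0)
sample_var_Y = np.sum((Y - ybar) ** 2) / N  # (1/N) sum_j ||y_j - ybar||^2

assert np.isclose(sample_var_Y, sigma2[-1])
```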
4. Since $X_j = \mu + Q\beta_j + Z_j$, the $X_j$ are independent with $X_j \sim N(\mu + Q\beta_j, \upsilon^2 I_m)$. Thus, the joint density of the $X_j$ is
$$f(x_1,\ldots,x_N) = \prod_{j=1}^N (2\pi\upsilon^2)^{-m/2} \exp\Big(-\frac{\|x_j - \mu - Q\beta_j\|^2}{2\upsilon^2}\Big),$$
with log-likelihood
$$\ell = \log f = \sum_{j=1}^N \Big[-\frac{m}{2}\log(2\pi\upsilon^2) - \frac{\|x_j - \mu - Q\beta_j\|^2}{2\upsilon^2}\Big] = -\frac{mN}{2}\log(2\pi\upsilon^2) - \frac{1}{2\upsilon^2}\sum_{j=1}^N \|x_j - \mu - Q\beta_j\|^2.$$
(a) Thinking of $\ell$ as a function of $(\mu, Q, \beta_1,\ldots,\beta_N)$, the MLE is the maximizer of $\ell$. The first term is constant with respect to $(\mu, Q, \beta_1,\ldots,\beta_N)$, and the second term is a negative multiple of the least-squares sum.
$$\therefore (\hat{\mu}, \hat{Q}, \hat{\beta}_1,\ldots,\hat{\beta}_N) = \operatorname{argmax} \ell = \operatorname{argmin}_{\mu,\, Q,\, \beta_j} \sum_{j=1}^N \|x_j - \mu - Q\beta_j\|^2.$$
This is the least-squares problem that defines PCA (cf. Definition 1.7 in the Course Notes), so the MLE is PCA.
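A numerical illustration of this equivalence (my own sketch, on synthetic data; `ls_sum` is a hypothetical helper that plugs in the optimal $\beta_j = Q^\top(x_j - \mu)$): the PCA choice of $(\mu, Q)$, namely the sample mean and the top-$d$ eigenvectors of $C$, should never lose to other candidates on the least-squares sum.

```python
import numpy as np

rng = np.random.default_rng(5)
m, d, N = 4, 2, 300
X = rng.standard_normal((N, m)) * np.array([3.0, 2.0, 0.4, 0.2])  # rows x_j

def ls_sum(mu, Q, X):
    # least-squares objective sum_j ||x_j - mu - Q beta_j||^2, with the
    # optimal beta_j = Q^T (x_j - mu) already plugged in
    R = (X - mu) - (X - mu) @ Q @ Q.T
    return np.sum(R ** 2)

xbar = X.mean(axis=0)
C = (X - xbar).T @ (X - xbar) / N
w, V = np.linalg.eigh(C)
Q_pca = V[:, np.argsort(w)[::-1][:d]]    # PCA: top-d eigenvectors of C

best = ls_sum(xbar, Q_pca, X)
for _ in range(50):                      # PCA beats random alternatives
    Qr, _ = np.linalg.qr(rng.standard_normal((m, d)))  # random Q with Q^T Q = I_d
    assert best <= ls_sum(xbar, Qr, X) + 1e-6
    assert best <= ls_sum(X[0], Qr, X) + 1e-6          # and a non-mean mu
```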
(b) Now, having found the optimal $(\hat{\mu}, \hat{Q}, \hat{\beta}_1,\ldots,\hat{\beta}_N)$, write $R = \sum_{j=1}^N \|x_j - \hat{\mu} - \hat{Q}\hat{\beta}_j\|^2$ for the least-squares residual, so
$$\ell(\upsilon) = -\frac{mN}{2}\log(2\pi\upsilon^2) - \frac{R}{2\upsilon^2}.$$
Then
$$\frac{\partial \ell}{\partial \upsilon} = -\frac{mN}{\upsilon} + \frac{R}{\upsilon^3} = 0 \implies \hat{\upsilon}^2 = \frac{R}{mN}.$$
By the PCA least-squares analysis, $R = N\sum_{j=d+1}^m \sigma_j^2$, where $\sigma_1^2 \ge \cdots \ge \sigma_m^2$ are the singular values (equivalently, the eigenvalues) of the sample covariance matrix. Thus
$$\hat{\upsilon}^2 = \frac{1}{m}\sum_{j=d+1}^m \sigma_j^2.$$
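As a final numerical sanity check (again my own illustration, with synthetic data), this formula agrees with the direct MLE $\hat{\upsilon}^2 = R/(mN)$ computed from the PCA fit:

```python
import numpy as np

rng = np.random.default_rng(4)
m, d, N = 5, 2, 400
X = rng.standard_normal((N, m)) * np.array([3.0, 2.0, 0.5, 0.4, 0.3])  # rows x_j

xbar = X.mean(axis=0)
C = (X - xbar).T @ (X - xbar) / N              # sample covariance (1/N convention)
sigma2 = np.sort(np.linalg.eigvalsh(C))[::-1]  # singular values of C, descending

# PCA fit: Qhat = top-d eigenvectors, betahat_j = Qhat^T (x_j - xbar)
w, V = np.linalg.eigh(C)
Qhat = V[:, np.argsort(w)[::-1][:d]]
resid = (X - xbar) - (X - xbar) @ Qhat @ Qhat.T
R = np.sum(resid ** 2)

v2_mle = R / (m * N)                           # from d ell / d upsilon = 0
v2_formula = sigma2[d:].sum() / m              # (1/m) sum_{j>d} sigma_j^2

assert np.isclose(v2_mle, v2_formula)
```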