The Multivariate Normal Distribution, Part 2
BMTRY 726, 1/14/2014
Multivariate Normal PDF
• Recall the pdf for the MVN distribution:

$$f(x) = \frac{1}{(2\pi)^{p/2}|\Sigma|^{1/2}} \exp\left\{-\tfrac{1}{2}(x-\mu)'\Sigma^{-1}(x-\mu)\right\}$$

• Where
  – x is a p-length vector of observed variables
  – μ is also a p-length vector and E(x) = μ
  – Σ is a p x p matrix, and Var(x) = Σ
• Note, Σ must also be positive definite
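As an illustrative check (not part of the original slides), the density formula above can be evaluated directly and compared against scipy.stats.multivariate_normal; the particular μ, Σ, and x below are arbitrary choices for demonstration.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary example parameters (p = 2), chosen only for illustration
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])   # symmetric, positive definite
x = np.array([0.5, 1.5])
p = len(mu)

# MVN pdf evaluated from the formula on this slide
diff = x - mu
quad_form = diff @ np.linalg.inv(Sigma) @ diff
pdf_by_hand = np.exp(-0.5 * quad_form) / ((2 * np.pi) ** (p / 2) * np.linalg.det(Sigma) ** 0.5)

# Same density from scipy for comparison
pdf_scipy = multivariate_normal(mean=mu, cov=Sigma).pdf(x)

print(pdf_by_hand, pdf_scipy)   # the two values should agree
```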
Univariate and Bivariate Normal
Contours of Constant Density
• Recall projections of f(x) onto the hyperplane created by x are called contours of constant density
• Properties include:
  – p-dimensional ellipsoid defined by:

$$(x - \mu)'\Sigma^{-1}(x - \mu) = c^2$$

  – Centered at μ
  – Axes of length $\pm c\sqrt{\lambda_i}\, e_i, \quad i = 1, 2, \ldots, p$
Bivariate Examples
[Figure: contour plots for two bivariate normal cases, each labeled by σ11, σ22 with ρ12 = 0]
Why Multivariate Normal
• Recall, statisticians like the MVN distribution because…
  – Mathematically simple
  – Multivariate central limit theorem applies
  – Natural phenomena are often well approximated by a MVN distribution
• So what are some “fun” mathematical properties that make it so nice?
Properties of MVN
Result 4.2: If $X \sim N_p(\mu, \Sigma)$, then

$$a'X = a_1 X_1 + a_2 X_2 + \cdots + a_p X_p$$

has a univariate normal distribution with mean

$$a'\mu = a_1\mu_1 + a_2\mu_2 + \cdots + a_p\mu_p$$

and variance

$$a'\Sigma a = \sum_{i=1}^{p}\sum_{j=1}^{p} a_i a_j \sigma_{ij}$$
Example

Given $X = (X_1, X_2, \ldots, X_p)' \sim N_p(\mu, \Sigma)$ and $a' = (1, 0, \ldots, 0)$, find the distribution of $a'X$.
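A quick simulation sketch of Result 4.2 (my addition, with an arbitrary μ and Σ): the sample mean and variance of a'X should be close to a'μ and a'Σa.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (p = 3)
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.5]])
a = np.array([1.0, 0.0, 0.0])     # as in the example: a' = (1, 0, ..., 0)

# Simulate X ~ N_p(mu, Sigma) and form the scalar a'X
X = rng.multivariate_normal(mu, Sigma, size=100_000)
aX = X @ a

# Result 4.2: a'X ~ N(a'mu, a'Sigma a)
print(aX.mean(), a @ mu)               # sample mean vs. a'mu
print(aX.var(ddof=1), a @ Sigma @ a)   # sample variance vs. a'Sigma a
```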
Properties of MVN
Result 4.3: Any linear transformation of a multivariate normal random vector has a normal distribution

So if

$$X \sim N_p(\mu, \Sigma)$$

and $d' = (d_1, d_2, \ldots, d_k)$ is a vector of constants, and B is a k x p matrix of constants, then

$$Y_{k \times 1} = BX + d \sim N_k(B\mu + d,\ B\Sigma B')$$
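A small simulation sketch of Result 4.3 with made-up B, d, μ, and Σ (not from the slides): the simulated mean and covariance of Y = BX + d should match Bμ + d and BΣB'.

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustrative parameters: p = 3, k = 2
mu = np.array([0.0, 1.0, 2.0])
Sigma = np.array([[1.0, 0.2, 0.0],
                  [0.2, 2.0, 0.3],
                  [0.0, 0.3, 1.5]])
B = np.array([[1.0, -1.0, 0.0],
              [0.5,  0.5, 1.0]])     # k x p matrix of constants
d = np.array([10.0, -5.0])           # k x 1 vector of constants

# Simulate X ~ N_p(mu, Sigma) and transform: Y = BX + d
X = rng.multivariate_normal(mu, Sigma, size=100_000)
Y = X @ B.T + d

# Result 4.3: Y ~ N_k(B mu + d, B Sigma B')
print(Y.mean(axis=0), B @ mu + d)                  # means should agree
print(np.cov(Y, rowvar=False), B @ Sigma @ B.T)    # covariances should agree
```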
Spectral Decomposition
Given Σ is a non-negative definite, symmetric, real matrix, then Σ can be decomposed according to:

$$\Sigma = \sum_{i=1}^{p} \lambda_i e_i e_i'$$

Where the eigenvalues are $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0$

The eigenvectors of Σ are $e_1, e_2, \ldots, e_p$, and these satisfy the expression

$$\Sigma e_i = \lambda_i e_i, \quad i = 1, 2, \ldots, p$$
Where

$$\Sigma = \sum_{i=1}^{p} \lambda_i e_i e_i' = A\Lambda A'$$

Recall that

$$A = \begin{pmatrix} e_1 & e_2 & \cdots & e_p \end{pmatrix}, \qquad A'A = AA' = I_{p \times p}$$

and

$$e_i'e_i = 1 \quad \text{and} \quad e_i'e_j = 0 \ (i \ne j)$$

Then

$$A'\Sigma A = A'A\Lambda A'A = \Lambda$$
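A numerical sketch of the spectral decomposition (my addition, arbitrary Σ): np.linalg.eigh returns the eigenpairs of a symmetric matrix, from which Σ = AΛA' = Σᵢ λᵢ eᵢeᵢ' can be rebuilt.

```python
import numpy as np

# Arbitrary symmetric, positive definite Sigma for illustration
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

# eigh returns eigenvalues (ascending) and orthonormal eigenvectors as columns of A
lam, A = np.linalg.eigh(Sigma)
Lambda = np.diag(lam)

# Sigma = A Lambda A'
print(np.allclose(Sigma, A @ Lambda @ A.T))

# Equivalent outer-product form: Sigma = sum_i lambda_i e_i e_i'
Sigma_rebuilt = sum(lam[i] * np.outer(A[:, i], A[:, i]) for i in range(len(lam)))
print(np.allclose(Sigma, Sigma_rebuilt))

# A is orthogonal (A'A = AA' = I) and Sigma e_i = lambda_i e_i
print(np.allclose(A.T @ A, np.eye(3)))
print(np.allclose(Sigma @ A[:, 0], lam[0] * A[:, 0]))
```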
Definition: The square root of Σ is

$$\Sigma^{1/2} = A\Lambda^{1/2}A' = \sum_{i=1}^{p} \sqrt{\lambda_i}\, e_i e_i'$$

And

$$\Sigma^{1/2}\Sigma^{1/2} = \Sigma$$

Also

$$\Sigma^{-1} = A\Lambda^{-1}A' = \sum_{i=1}^{p} \frac{1}{\lambda_i}\, e_i e_i'$$
From this it follows that the inverse square root of Σ is

$$\Sigma^{-1/2} = A\Lambda^{-1/2}A' = \sum_{i=1}^{p} \frac{1}{\sqrt{\lambda_i}}\, e_i e_i'$$

Note

$$\Sigma^{-1/2}\Sigma^{-1/2} = \Sigma^{-1} \qquad \text{and} \qquad \Sigma^{-1/2}\Sigma\Sigma^{-1/2} = I_{p \times p}$$

This leads us to the transformation to the canonical form: If $X \sim N_p(\mu, \Sigma)$, then

$$Z = \begin{pmatrix} Z_1 \\ \vdots \\ Z_p \end{pmatrix} = \Sigma^{-1/2}(X - \mu) \sim N_p(0, I) \qquad \text{and} \qquad Z_1, Z_2, \ldots, Z_p \overset{iid}{\sim} N(0, 1)$$
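A sketch of the square-root matrices and the canonical transformation (my addition, arbitrary μ and Σ): Σ^{1/2}Σ^{1/2} = Σ, Σ^{-1/2}ΣΣ^{-1/2} = I, and Z = Σ^{-1/2}(X − μ) has mean zero and identity covariance.

```python
import numpy as np

rng = np.random.default_rng(2)

# Arbitrary illustrative parameters
mu = np.array([1.0, -1.0, 0.0])
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

# Build Sigma^{1/2} and Sigma^{-1/2} from the spectral decomposition
lam, A = np.linalg.eigh(Sigma)
Sigma_half = A @ np.diag(np.sqrt(lam)) @ A.T
Sigma_neg_half = A @ np.diag(1.0 / np.sqrt(lam)) @ A.T

print(np.allclose(Sigma_half @ Sigma_half, Sigma))                      # Sigma^{1/2} Sigma^{1/2} = Sigma
print(np.allclose(Sigma_neg_half @ Sigma @ Sigma_neg_half, np.eye(3)))  # Sigma^{-1/2} Sigma Sigma^{-1/2} = I

# Canonical form: Z = Sigma^{-1/2}(X - mu) should have mean ~0 and ~identity covariance
X = rng.multivariate_normal(mu, Sigma, size=100_000)
Z = (X - mu) @ Sigma_neg_half.T   # Sigma^{-1/2} is symmetric, so .T is optional
print(Z.mean(axis=0))             # ~ 0
print(np.cov(Z, rowvar=False))    # ~ identity
```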
Marginal Distributions

Result 4.4: Consider subsets of the Xi’s in X. These subsets are also distributed (multivariate) normal.

If

$$X = \begin{pmatrix} X_{1\,(q \times 1)} \\ X_{2\,((p-q) \times 1)} \end{pmatrix} \sim N_p\left( \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix} \right)$$

Then the marginal distributions of X1 and X2 are

$$X_1 \sim N_q(\mu_1, \Sigma_{11}) \qquad \text{and} \qquad X_2 \sim N_{p-q}(\mu_2, \Sigma_{22})$$
Example

• Consider $X \sim N_5(\mu, \Sigma)$; find the marginal distribution of the 1st and 3rd components.
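The slide does not give a specific μ and Σ for this example, so the sketch below invents a 5-dimensional μ and Σ purely to show the mechanics of Result 4.4: the marginal of the 1st and 3rd components is obtained by picking out the corresponding entries of μ and the corresponding rows/columns of Σ.

```python
import numpy as np

# Hypothetical 5-dimensional parameters (not from the slides)
mu = np.array([1.0, 0.0, -1.0, 2.0, 0.5])
Sigma = np.array([[4.0, 1.0, 0.5, 0.0, 0.2],
                  [1.0, 3.0, 0.3, 0.1, 0.0],
                  [0.5, 0.3, 2.0, 0.4, 0.1],
                  [0.0, 0.1, 0.4, 5.0, 0.6],
                  [0.2, 0.0, 0.1, 0.6, 1.0]])

# Marginal of the 1st and 3rd components (indices 0 and 2)
idx = [0, 2]
mu_sub = mu[idx]
Sigma_sub = Sigma[np.ix_(idx, idx)]

# (X1, X3)' ~ N_2(mu_sub, Sigma_sub)
print(mu_sub)
print(Sigma_sub)
```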
Marginal Distributions cont’d

The converse of Result 4.4 is not always true; an additional assumption is needed.

Result 4.5(c): If

$$X_1 \sim N_q(\mu_1, \Sigma_{11}) \qquad \text{and} \qquad X_2 \sim N_{p-q}(\mu_2, \Sigma_{22})$$

and X1 is independent of X2, then

$$\begin{pmatrix} X_{1\,(q \times 1)} \\ X_{2\,((p-q) \times 1)} \end{pmatrix} \sim N_p\left( \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \begin{pmatrix} \Sigma_{11} & 0 \\ 0 & \Sigma_{22} \end{pmatrix} \right)$$
Result 4.5(a): If X1 (q x 1) and X2 ((p-q) x 1) are independent, then Cov(X1, X2) = 0

(b) If

$$\begin{pmatrix} X_1 \\ X_2 \end{pmatrix} \sim N_p\left( \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix} \right)$$

then X1 (q x 1) and X2 ((p-q) x 1) are independent iff

$$\Sigma_{12} = \Sigma_{21}' = 0$$
Example
• Consider

$$X = \begin{pmatrix} X_1 \\ X_2 \\ X_3 \end{pmatrix} \sim N_3\left( \mu, \begin{pmatrix} 4 & 1 & 0 \\ 1 & 3 & 0 \\ 0 & 0 & 2 \end{pmatrix} \right)$$

• Are X1 and X2 independent of X3?
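A minimal check of Result 4.5(b) on the example covariance matrix above (my addition): since the joint distribution is MVN, independence of (X1, X2) and X3 is equivalent to the off-diagonal block Σ12 being zero.

```python
import numpy as np

Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 0.0],
                  [0.0, 0.0, 2.0]])

# Partition: first block = (X1, X2), second block = (X3)
Sigma_12 = Sigma[:2, 2:]          # Cov((X1, X2)', X3)

# Result 4.5(b): independent iff this block is all zeros
print(Sigma_12)
print(np.all(Sigma_12 == 0))      # True here, so (X1, X2) is independent of X3
```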
Conditional Distributions

Result 4.6: Suppose

$$\begin{pmatrix} X_1 \\ X_2 \end{pmatrix} \sim N_p\left( \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix} \right)$$

Then the conditional distribution of X1 given that X2 = x2 is a normal distribution with

$$E(X_1 \mid X_2 = x_2) = \mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(x_2 - \mu_2)$$

$$V(X_1 \mid X_2 = x_2) = \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}$$

Note the covariance matrix does not depend on the value of x2.
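The conditional mean and covariance formulas of Result 4.6 translate directly into a small helper. The sketch below (my addition, with hypothetical names and an arbitrary partition) is one way to organize the computation.

```python
import numpy as np

def conditional_mvn(mu1, mu2, S11, S12, S21, S22, x2):
    """Parameters of X1 | X2 = x2 for a partitioned MVN (Result 4.6)."""
    S22_inv = np.linalg.inv(S22)
    cond_mean = mu1 + S12 @ S22_inv @ (x2 - mu2)   # mu_1 + Sigma_12 Sigma_22^{-1} (x2 - mu_2)
    cond_cov = S11 - S12 @ S22_inv @ S21           # Sigma_11 - Sigma_12 Sigma_22^{-1} Sigma_21
    return cond_mean, cond_cov

# Arbitrary illustrative parameters: X1 is 2-dimensional, X2 is 1-dimensional
mu1 = np.array([1.0, 2.0])
mu2 = np.array([0.0])
S11 = np.array([[2.0, 0.5], [0.5, 1.0]])
S12 = np.array([[0.4], [0.2]])
S21 = S12.T
S22 = np.array([[1.5]])

m, V = conditional_mvn(mu1, mu2, S11, S12, S21, S22, x2=np.array([1.0]))
print(m)   # conditional mean depends on the observed x2
print(V)   # conditional covariance does not depend on x2
```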
Proof of Result 4.6
Multiple Regression
Consider

$$\begin{pmatrix} Y \\ X_{p \times 1} \end{pmatrix} \sim N_{p+1}\left( \begin{pmatrix} \mu_Y \\ \mu_X \end{pmatrix}, \begin{pmatrix} \sigma_{YY} & \Sigma_{YX} \\ \Sigma_{XY} & \Sigma_{XX} \end{pmatrix} \right)$$

The conditional distribution of Y | X = x is univariate normal with

$$E(Y \mid X = x) = \mu_Y + \Sigma_{YX}\Sigma_{XX}^{-1}(x - \mu_X)$$

$$V(Y \mid X = x) = \sigma_{YY} - \Sigma_{YX}\Sigma_{XX}^{-1}\Sigma_{XY}$$
Example

Consider

$$X = \begin{pmatrix} X_1 \\ X_2 \\ X_3 \\ X_4 \end{pmatrix} \sim N_4\left( \begin{pmatrix} 1 \\ 2 \\ 0 \\ 3 \end{pmatrix}, \begin{pmatrix} 4 & 0 & 1 & 3 \\ 0 & 4 & 1 & 1 \\ 1 & 1 & 3 & 1 \\ 3 & 1 & 1 & 9 \end{pmatrix} \right)$$

find the conditional distribution of the 1st and 3rd components
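A sketch of the computation for this example (my addition), reading "the conditional distribution of the 1st and 3rd components" as conditioning on the 2nd and 4th components, which is an assumption on my part; the observed values x2 below are hypothetical, while μ and Σ are taken from the slide.

```python
import numpy as np

mu = np.array([1.0, 2.0, 0.0, 3.0])
Sigma = np.array([[4.0, 0.0, 1.0, 3.0],
                  [0.0, 4.0, 1.0, 1.0],
                  [1.0, 1.0, 3.0, 1.0],
                  [3.0, 1.0, 1.0, 9.0]])

keep = [0, 2]    # 1st and 3rd components (X1, X3)
cond = [1, 3]    # conditioning on the 2nd and 4th components (assumed)

mu1, mu2 = mu[keep], mu[cond]
S11 = Sigma[np.ix_(keep, keep)]
S12 = Sigma[np.ix_(keep, cond)]
S21 = Sigma[np.ix_(cond, keep)]
S22 = Sigma[np.ix_(cond, cond)]

x2 = np.array([1.0, 2.0])   # hypothetical observed values of (X2, X4)
S22_inv = np.linalg.inv(S22)
cond_mean = mu1 + S12 @ S22_inv @ (x2 - mu2)   # Result 4.6 conditional mean
cond_cov = S11 - S12 @ S22_inv @ S21           # Result 4.6 conditional covariance
print(cond_mean)
print(cond_cov)
```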
Result 4.7: If $X \sim N_p(\mu, \Sigma)$ and Σ is positive definite, then

$$(X - \mu)'\Sigma^{-1}(X - \mu) \sim \chi^2_p$$

Proof:
Result 4.7: If $X \sim N_p(\mu, \Sigma)$ and Σ is positive definite, then

$$(X - \mu)'\Sigma^{-1}(X - \mu) \sim \chi^2_p$$

Proof cont’d:
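A simulation sketch of Result 4.7 (my addition, arbitrary μ and Σ): the squared statistical distance (X − μ)'Σ⁻¹(X − μ) of simulated draws should behave like a χ²_p sample, checked here by comparing the empirical mean with p and the empirical 95th percentile with the χ²_p quantile.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)

# Arbitrary illustrative parameters (p = 3)
mu = np.array([1.0, 0.0, -1.0])
Sigma = np.array([[2.0, 0.5, 0.1],
                  [0.5, 1.5, 0.2],
                  [0.1, 0.2, 1.0]])
p = len(mu)

X = rng.multivariate_normal(mu, Sigma, size=100_000)
diff = X - mu
# (x - mu)' Sigma^{-1} (x - mu) computed row by row
d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)

# Under Result 4.7, d2 ~ chi-square with p degrees of freedom
print(d2.mean(), p)                                    # E(chi^2_p) = p
print(np.quantile(d2, 0.95), chi2.ppf(0.95, df=p))     # empirical vs. theoretical 95th percentile
```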
Result 4.8: If $X_1, X_2, \ldots, X_n$ are mutually independent with $X_j \sim N_p(\mu_j, \Sigma)$

Then

$$Y = \sum_{j=1}^{n} c_j X_j \sim N_p\left( \sum_{j=1}^{n} c_j \mu_j,\ \left(\sum_{j=1}^{n} c_j^2\right)\Sigma \right)$$

Where $c_1, c_2, \ldots, c_n$ are n constants.

Additionally, if we have $C_1, C_2, \ldots, C_n$, which are r x p matrices of constants, and a vector of constants $b_{r \times 1}$, we can also say

$$Y = b + \sum_{j=1}^{n} C_j X_j \sim N_r\left( b + \sum_{j=1}^{n} C_j \mu_j,\ \sum_{j=1}^{n} C_j \Sigma C_j' \right)$$
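A simulation sketch of the first part of Result 4.8 (my addition, with made-up μ_j, Σ, and c_j): Y = Σ c_j X_j for independent X_j ~ N_p(μ_j, Σ) should have mean Σ c_j μ_j and covariance (Σ c_j²) Σ.

```python
import numpy as np

rng = np.random.default_rng(4)

# Arbitrary illustrative parameters: n = 3 independent vectors in p = 2 dimensions
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
mus = [np.array([0.0, 0.0]), np.array([1.0, -1.0]), np.array([2.0, 3.0])]
c = np.array([0.5, -1.0, 2.0])
n_sim = 100_000

# Y = c1*X1 + c2*X2 + c3*X3 with the X_j mutually independent
Y = sum(c[j] * rng.multivariate_normal(mus[j], Sigma, size=n_sim) for j in range(3))

# Result 4.8: Y ~ N_p(sum_j c_j mu_j, (sum_j c_j^2) Sigma)
print(Y.mean(axis=0), sum(c[j] * mus[j] for j in range(3)))
print(np.cov(Y, rowvar=False), np.sum(c**2) * Sigma)
```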
Sample Data

• Let’s say that X1, X2, …, Xn are i.i.d. random vectors
• If the data vectors are sampled from a MVN distribution then

$$\begin{pmatrix} X_1 & X_2 & \cdots & X_n \end{pmatrix} = \begin{pmatrix} X_{11} & X_{21} & \cdots & X_{n1} \\ X_{12} & X_{22} & \cdots & X_{n2} \\ \vdots & \vdots & & \vdots \\ X_{1p} & X_{2p} & \cdots & X_{np} \end{pmatrix}$$

and

$$E(X_j) = \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_p \end{pmatrix} \qquad V(X_j) = \Sigma = \begin{pmatrix} \sigma_{11} & \sigma_{12} & \cdots & \sigma_{1p} \\ \sigma_{21} & \sigma_{22} & \cdots & \sigma_{2p} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{p1} & \sigma_{p2} & \cdots & \sigma_{pp} \end{pmatrix}$$

$$X_j \sim NID(\mu, \Sigma), \quad j = 1, 2, \ldots, n$$
Multivariate Normal Likelihood

• We can also look at the joint likelihood of our random sample

$$\text{Joint Density of } X_1, X_2, \ldots, X_n = \prod_{j=1}^{n} \frac{1}{(2\pi)^{p/2}|\Sigma|^{1/2}}\, e^{-\frac{1}{2}(x_j - \mu)'\Sigma^{-1}(x_j - \mu)}$$

$$= \frac{1}{(2\pi)^{np/2}|\Sigma|^{n/2}}\, e^{-\frac{1}{2}\sum_{j=1}^{n}(x_j - \mu)'\Sigma^{-1}(x_j - \mu)}$$
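The joint density above is usually handled on the log scale. A sketch (my addition, with arbitrary parameters and a simulated sample) of the log-likelihood, compared against scipy's logpdf summed over observations:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(5)

# Arbitrary illustrative parameters and a simulated sample
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
n, p = 50, 2
x = rng.multivariate_normal(mu, Sigma, size=n)

# Log of the joint density on this slide:
# -np/2 log(2 pi) - n/2 log|Sigma| - 1/2 sum_j (x_j - mu)' Sigma^{-1} (x_j - mu)
diff = x - mu
quad = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff).sum()
loglik_by_hand = (-0.5 * n * p * np.log(2 * np.pi)
                  - 0.5 * n * np.linalg.slogdet(Sigma)[1]
                  - 0.5 * quad)

# Same quantity via scipy
loglik_scipy = multivariate_normal(mean=mu, cov=Sigma).logpdf(x).sum()
print(loglik_by_hand, loglik_scipy)   # should agree
```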
Some needed Results

(1) Given A > 0 and $\lambda_1, \lambda_2, \ldots, \lambda_p$ are eigenvalues of A:

(a) $tr(A) = tr(P\Lambda P') = tr(\Lambda P'P) = tr(\Lambda) = \sum_{i=1}^{p} \lambda_i$

(b) $|A| = |P\Lambda P'| = |\Lambda||P'P| = |\Lambda| = \prod_{i=1}^{p} \lambda_i$

(c) $x'Ax = tr(x'Ax) = tr(Axx')$

(2) From (c) we can show that:

$$\sum_{j=1}^{n}(x_j - \mu)'\Sigma^{-1}(x_j - \mu) = tr\left[\Sigma^{-1}\left(\sum_{j=1}^{n}(x_j - \bar{x})(x_j - \bar{x})' + n(\bar{x} - \mu)(\bar{x} - \mu)'\right)\right]$$
Some needed Results

(2) Proof that:

$$\sum_{j=1}^{n}(x_j - \mu)'\Sigma^{-1}(x_j - \mu) = tr\left[\Sigma^{-1}\left(\sum_{j=1}^{n}(x_j - \bar{x})(x_j - \bar{x})' + n(\bar{x} - \mu)(\bar{x} - \mu)'\right)\right]$$
Some needed Results

(1) Given A > 0 and $\lambda_1, \lambda_2, \ldots, \lambda_p$ are eigenvalues of A:

(a) $tr(A) = \sum_{i=1}^{p} \lambda_i$   (b) $|A| = \prod_{i=1}^{p} \lambda_i$   (c) $x'Ax = tr(x'Ax) = tr(Axx')$

(2) From (c) we can show that:

$$\sum_{j=1}^{n}(x_j - \mu)'\Sigma^{-1}(x_j - \mu) = tr\left[\Sigma^{-1}\sum_{j=1}^{n}(x_j - \bar{x})(x_j - \bar{x})'\right] + n\, tr\left[\Sigma^{-1}(\bar{x} - \mu)(\bar{x} - \mu)'\right]$$

(3) Given $\Sigma_{p \times p} > 0$, $B_{p \times p} > 0$, and scalar b > 0:

$$\frac{1}{|\Sigma|^{b}}\, \exp\left(-\tfrac{1}{2}\, tr(\Sigma^{-1}B)\right) \le \frac{1}{|B|^{b}}\, (2b)^{pb}\, e^{-pb}$$

with equality holding when $\Sigma = \frac{1}{2b}B$.
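A quick numerical check of identity (2) (my addition, using an arbitrary simulated sample, μ, and Σ): the left-hand sum of quadratic forms equals the two trace terms.

```python
import numpy as np

rng = np.random.default_rng(6)

# Arbitrary illustrative sample and parameters
mu = np.array([0.5, -0.5, 1.0])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.5, 0.2],
                  [0.1, 0.2, 1.0]])
x = rng.multivariate_normal(mu, Sigma, size=20)
n = x.shape[0]
xbar = x.mean(axis=0)
Sigma_inv = np.linalg.inv(Sigma)

# Left-hand side: sum_j (x_j - mu)' Sigma^{-1} (x_j - mu)
diff_mu = x - mu
lhs = np.einsum('ij,jk,ik->i', diff_mu, Sigma_inv, diff_mu).sum()

# Right-hand side: tr[Sigma^{-1} sum_j (x_j - xbar)(x_j - xbar)'] + n tr[Sigma^{-1} (xbar - mu)(xbar - mu)']
diff_bar = x - xbar
S_centered = diff_bar.T @ diff_bar
rhs = np.trace(Sigma_inv @ S_centered) + n * np.trace(Sigma_inv @ np.outer(xbar - mu, xbar - mu))

print(lhs, rhs)   # the two sides should agree up to floating-point error
```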
MLEs for μ, Σ
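The slides leave the derivation to be worked by hand; as a reminder of where it ends up, the standard MLEs are μ̂ = x̄ and Σ̂ = (1/n) Σ_j (x_j − x̄)(x_j − x̄)'. A minimal sketch (my addition, with arbitrary simulated data) computing them:

```python
import numpy as np

rng = np.random.default_rng(7)

# Arbitrary illustrative parameters and sample
mu_true = np.array([1.0, 2.0])
Sigma_true = np.array([[2.0, 0.4],
                       [0.4, 1.0]])
x = rng.multivariate_normal(mu_true, Sigma_true, size=500)
n = x.shape[0]

# MLEs: mu_hat = xbar, Sigma_hat = (1/n) sum_j (x_j - xbar)(x_j - xbar)'
mu_hat = x.mean(axis=0)
centered = x - mu_hat
Sigma_hat = (centered.T @ centered) / n   # note the divisor n, not n - 1

print(mu_hat)     # close to mu_true for a sample this size
print(Sigma_hat)  # close to Sigma_true
```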
Next Time
• Sample means and covariance
• The Wishart distribution
• Introduction of some basic statistical tests