

Distribution reference — each entry lists, in order: PDF/PMF; support; EX; VarX; MGF; notes.

Bernoulli(p): PMF p^x (1-p)^{1-x}; x = 0,1; 0 ≤ p ≤ 1; EX = p; VarX = p(1-p); MGF = (1-p) + pe^t.

Binomial(n,p): PMF C(n,x) p^x (1-p)^{n-x}; x = 0,1,…,n; 0 ≤ p ≤ 1; EX = np; VarX = np(1-p); MGF = (pe^t + 1-p)^n. The multinomial distribution is the multivariate binomial.

Discrete Uniform(N): PMF 1/N; x = 1,…,N; N = 1,2,…; EX = (N+1)/2; VarX = (N^2-1)/12; MGF = (1/N) Σ_{x=1}^{N} e^{tx}.

Geometric(p): PMF p(1-p)^{x-1}; x = 1,2,…; 0 ≤ p ≤ 1; EX = 1/p; VarX = (1-p)/p^2; MGF = pe^t / (1-(1-p)e^t) for t < -log(1-p). Y = X-1 is negative binomial(1,p). Memoryless: P(X>s | X>t) = P(X>s-t).

Hypergeometric(N,M,K): PMF C(M,x) C(N-M,K-x) / C(N,K); x = 0,…,K; N,M,K ≥ 0; M-(N-K) ≤ x ≤ M; EX = KM/N; VarX = (KM/N)(N-M)(N-K) / (N(N-1)).

Negative Binomial(r,p): PMF C(r+x-1,x) p^r (1-p)^x; x = 0,1,…; 0 ≤ p ≤ 1; EX = r(1-p)/p; VarX = r(1-p)/p^2; MGF = [p / (1-(1-p)e^t)]^r. A gamma mixture of Poissons; in the alternative parameterization, Y = X + r is the trial at which the rth success occurs.

Poisson(λ): PMF e^{-λ} λ^x / x!; x = 0,1,…; 0 ≤ λ < ∞; EX = λ; VarX = λ; MGF = e^{λ(e^t-1)} (the MGF uses Σ (λe^t)^x / x! = e^{λe^t}). Good approximation to the binomial when n is large and p is small (e.g., n > 20, p < 0.05).
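The Poisson MGF line above compresses a one-step series manipulation; here it is written out (standard algebra, using only the symbols already in the table):

```latex
% Poisson MGF: sum the exponential series, as the table note indicates.
\[
\begin{aligned}
M_X(t) &= \sum_{x=0}^{\infty} e^{tx}\,\frac{e^{-\lambda}\lambda^x}{x!}
        = e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^t)^x}{x!} \\
       &= e^{-\lambda}\, e^{\lambda e^t}
        = e^{\lambda(e^t-1)}.
\end{aligned}
\]
```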

Beta(α,β): PDF [Γ(α+β) / (Γ(α)Γ(β))] x^{α-1} (1-x)^{β-1}; 0 ≤ x ≤ 1; α,β > 0; EX = α/(α+β); VarX = αβ / [(α+β)^2 (α+β+1)]. Beta(1,1) is Uniform(0,1); β > α is skewed right, β = α is symmetric; if X ~ Gamma(a,c) and Y ~ Gamma(b,c) are independent, then X/(X+Y) ~ Beta(a,b).

Chi-Squared(p): PDF [1 / (Γ(p/2) 2^{p/2})] x^{p/2-1} e^{-x/2}; 0 ≤ x < ∞; p = 1,2,…; EX = p; VarX = 2p; MGF = (1/(1-2t))^{p/2} for t < 1/2. Special case of the Gamma(p/2, 2); (n-1)s^2/σ^2 ~ χ^2_{n-1}; Σ Z_i^2 ~ χ^2_n.

Exponential(β): PDF (1/β) e^{-x/β}; 0 ≤ x < ∞; β > 0; EX = β; VarX = β^2; MGF = 1/(1-βt) for t < 1/β. Special case of the Gamma(1,β); memoryless. If X_i ~ Exp(1) iid, then Σ X_i ~ Gamma(n,1).

F(v1,v2): F_{v1,v2} = (χ^2_{v1}/v1) / (χ^2_{v2}/v2); 0 ≤ x < ∞; v1,v2 = 1,2,…; EX = v2/(v2-2) for v2 > 2; MGF does not exist. Ratio of independent chi-squares, each divided by its degrees of freedom; F_{1,v} = (t_v)^2; 1/F_{m,n} = F_{n,m}.

Gamma(α,β): PDF [1 / (Γ(α) β^α)] x^{α-1} e^{-x/β}; 0 ≤ x < ∞; α,β > 0; EX = αβ; VarX = αβ^2; MGF = (1/(1-βt))^α for t < 1/β. Exponential (α = 1) and chi-squared (α = p/2, β = 2) are special cases; Σ X_i ~ Gamma(Σ α_i, β) when X_i ~ Gamma(α_i, β) independently; cX ~ Gamma(α, cβ).
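A sketch of the Gamma MGF, from which the exponential (α = 1) and chi-squared (α = p/2, β = 2) MGFs above, and the additivity rule Σ X_i ~ Gamma(Σ α_i, β), all follow:

```latex
% Gamma(alpha, beta) MGF: absorb e^{tx} into the gamma kernel and renormalize.
\[
\begin{aligned}
M_X(t) &= \int_0^{\infty} e^{tx}\,\frac{1}{\Gamma(\alpha)\beta^{\alpha}}\,
          x^{\alpha-1} e^{-x/\beta}\,dx
        = \frac{1}{\Gamma(\alpha)\beta^{\alpha}}
          \int_0^{\infty} x^{\alpha-1} e^{-x(1/\beta - t)}\,dx \\
       &= \frac{\Gamma(\alpha)\,(1/\beta - t)^{-\alpha}}{\Gamma(\alpha)\beta^{\alpha}}
        = (1-\beta t)^{-\alpha}, \qquad t < 1/\beta.
\end{aligned}
\]
% For independent X_i ~ Gamma(alpha_i, beta), the product of MGFs is
% (1-\beta t)^{-\sum_i \alpha_i}, i.e. \sum_i X_i ~ Gamma(\sum_i \alpha_i, \beta).
```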

Normal(µ,σ^2): PDF [1/√(2πσ^2)] e^{-(x-µ)^2/(2σ^2)}; -∞ < x, µ < ∞; σ > 0; EX = µ; VarX = σ^2; MGF = e^{µt + σ^2 t^2/2}. Derive the MGF by completing the square; the MGF technique shows Z = (X-µ)/σ ~ N(0,1); sums of independent normals are normal; Z^2 ~ χ^2_1; the ratio of two independent standard normals is Cauchy.

t(v): PDF [Γ((v+1)/2) / (Γ(v/2)√(vπ))] (1 + x^2/v)^{-(v+1)/2}; -∞ < x < ∞; v > 0; EX = 0 (v > 1); VarX = v/(v-2) for v > 2; MGF does not exist. F_{1,v} = (t_v)^2; T = U/√(V/p) where U ~ N(0,1) and V ~ χ^2_p are independent (prove using the Jacobian with W = V, T = U/√(W/p)).

Uniform(a,b): PDF 1/(b-a); a ≤ x ≤ b; EX = (a+b)/2; VarX = (b-a)^2/12; MGF = (e^{bt} - e^{at}) / ((b-a)t). The a = 0, b = 1 case is Beta(1,1); X^2 ~ Beta(1/2, 1) if X ~ Unif(0,1); the jth order statistic of n iid Unif(0,1) draws is Beta(j, n-j+1).
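The Normal note says "complete the square"; the step spelled out (standard algebra, nothing beyond the density above):

```latex
% Normal(mu, sigma^2) MGF: complete the square in the exponent.
\[
\begin{aligned}
M_X(t) &= \int_{-\infty}^{\infty} e^{tx}\,
          \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}\,dx,\\
t x - \frac{(x-\mu)^2}{2\sigma^2}
       &= -\frac{\bigl(x-(\mu+\sigma^2 t)\bigr)^2}{2\sigma^2}
          + \mu t + \frac{\sigma^2 t^2}{2},\\
M_X(t) &= e^{\mu t + \sigma^2 t^2/2},
\end{aligned}
\]
% since what remains under the integral is the N(mu + sigma^2 t, sigma^2)
% density, which integrates to 1.
```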

Transformation formulas: for a monotone transformation Y = g(X), f_Y(y) = f_X(g^{-1}(y)) |d/dy g^{-1}(y)| for y ∈ Y. For a bivariate transformation (U1,U2) with inverse maps x1 = h1(u1,u2), x2 = h2(u1,u2): f_{U1,U2}(u1,u2) = f_{X1,X2}(h1(u1,u2), h2(u1,u2)) |J|, where J = det[∂x1/∂u1, ∂x1/∂u2; ∂x2/∂u1, ∂x2/∂u2]. Moments from the MGF: EX^n = d^n/dt^n M_X(t) |_{t=0}. Useful limit: lim_{n→∞} (1 + a/n)^n = e^a.
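A worked instance of the univariate change-of-variable formula, checking the Uniform-row note that X^2 ~ Beta(1/2, 1) when X ~ Unif(0,1) (only quantities already defined above are used):

```latex
% X ~ Unif(0,1), Y = g(X) = X^2, so g^{-1}(y) = sqrt(y) on 0 < y < 1.
\[
\begin{aligned}
f_Y(y) &= f_X\!\bigl(g^{-1}(y)\bigr)\,
          \Bigl|\tfrac{d}{dy}\, g^{-1}(y)\Bigr|
        = 1 \cdot \frac{1}{2\sqrt{y}}
        = \tfrac{1}{2}\, y^{-1/2} \\
       &= \frac{\Gamma(3/2)}{\Gamma(1/2)\,\Gamma(1)}\, y^{1/2-1}(1-y)^{1-1},
          \qquad 0 < y < 1,
\end{aligned}
\]
% which is exactly the Beta(1/2, 1) density.
```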

Transformations: i) single RV, nonlinear monotone (x^2, log x): use the CDF technique; ii) linear transformations of one or more RVs: use the MGF; iii) nonlinear transformations of multiple RVs: use the Jacobian.

Cov(X,Y) = E[(X-µ_X)(Y-µ_Y)] = EXY - EX·EY; Corr(X,Y) = Cov(X,Y)/(σ_X σ_Y); Var(X) = EX^2 - (EX)^2. EX = E_Y[E(X|Y)] (prove using the joint expectation); VarX = E[Var(X|Y)] + Var[E(X|Y)] (prove by adding and subtracting E(X|Y) inside VarX = E[(X-EX)^2]).

Chebyshev: P(|X-µ| ≥ tσ) ≤ 1/t^2, equivalently P(|X-µ| < tσ) ≥ 1 - 1/t^2; Markov: P(g(X) ≥ r) ≤ E[g(X)]/r. Marginal distributions: p_X(x) = Σ_y p_{X,Y}(x,y) (discrete), f_X(x) = ∫ f_{X,Y}(x,y) dy (continuous). s^2 = [1/(n-1)] Σ(x_i - x̄)^2 = [1/(n-1)][Σ(x_i - µ)^2 - n(x̄ - µ)^2]; (n-1)s^2/σ^2 ~ χ^2_{n-1}; T = U/√(V/p) with U ~ N(0,1), V ~ χ^2_p, U independent of V.
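The two inequalities are linked: Chebyshev is the Markov bound applied to g(X) = (X-µ)^2 with r = t^2 σ^2:

```latex
% Chebyshev from Markov: take g(X) = (X - mu)^2 and r = t^2 sigma^2.
\[
P\bigl(|X-\mu| \ge t\sigma\bigr)
  = P\bigl((X-\mu)^2 \ge t^2\sigma^2\bigr)
  \le \frac{E(X-\mu)^2}{t^2\sigma^2}
  = \frac{\sigma^2}{t^2\sigma^2}
  = \frac{1}{t^2}.
\]
```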

Order statistics: for X_(n) (the max) of a Unif(0,θ) sample, f(y) = (n y^{n-1}/θ^n) I_(0,θ)(y); in general the max has density n [F_X(y)]^{n-1} f_X(y). For X_(1) (the min) of a Unif(θ,1) sample, f(y) = n [1-(y-θ)/(1-θ)]^{n-1} [1/(1-θ)] I_(θ,1)(y); in general the min has density n [1-F_X(y)]^{n-1} f_X(y). The rth order statistic has density f(y) = [n! / ((r-1)!(n-r)!)] [F_X(y)]^{r-1} f_X(y) [1-F_X(y)]^{n-r}.
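Plugging F(y) = y, f(y) = 1 for Unif(0,1) into the rth-order-statistic formula recovers the Beta result quoted in the Uniform row:

```latex
% r-th order statistic of n iid Unif(0,1) draws.
\[
f_{X_{(r)}}(y)
  = \frac{n!}{(r-1)!\,(n-r)!}\, y^{r-1}(1-y)^{n-r}
  = \frac{\Gamma(n+1)}{\Gamma(r)\,\Gamma(n-r+1)}\, y^{r-1}(1-y)^{n-r},
  \qquad 0 < y < 1,
\]
% i.e. X_{(r)} ~ Beta(r, n-r+1).
```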

E|X-a| = ∫_{-∞}^{a} (a-x) f(x) dx + ∫_{a}^{∞} (x-a) f(x) dx; to minimize over a, set d/da E|X-a| = 0 (the minimizer is a median of X).
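Carrying out the d/da step shows why the minimizer is a median (a short sketch; the boundary terms vanish by Leibniz's rule):

```latex
% Differentiate E|X-a| term by term.
\[
\frac{d}{da}\, E|X-a|
  = \frac{d}{da}\Bigl[\int_{-\infty}^{a}(a-x)f(x)\,dx
                      + \int_{a}^{\infty}(x-a)f(x)\,dx\Bigr]
  = F(a) - \bigl(1-F(a)\bigr)
  = 2F(a) - 1,
\]
% which is zero when F(a) = 1/2, i.e. when a is a median of X.
```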

 

Factorization Theorem: f(X,θ) = h1(T(X), θ) h2(X) implies T(X) is sufficient. Completeness: E_θ[g(T(X))] = 0 for all θ implies P_θ[g(T(X)) = 0] = 1. Exponential family: f(X|θ) = h1(X) h2(θ) exp{T1(X)w1(θ) + … + Tk(X)wk(θ)}; (T1(X),…,Tk(X)) is complete (and sufficient) if the range of (w1(θ),…,wk(θ)) contains a k-dimensional rectangle.
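As an example of the exponential-family template, the Bernoulli row above can be put in that form, which is what makes Σ x_i sufficient and complete for p (standard manipulation, no new quantities introduced):

```latex
% Bernoulli(p) written in exponential-family form.
\[
f(x \mid p) = p^{x}(1-p)^{1-x}
            = (1-p)\exp\Bigl\{x \log\tfrac{p}{1-p}\Bigr\},
\]
% so h_1(x) = 1, h_2(p) = 1-p, T(x) = x, w(p) = log{p/(1-p)}.
% For an iid sample, T(X) = sum_i x_i is sufficient by factorization; w(p)
% ranges over all of R as p runs over (0,1) (a 1-dimensional rectangle),
% so T(X) is also complete.
```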