
Mar. 23

Moment Generating Functions

Let $X$ be a random variable. The moment generating function (mgf) of $X$, denoted $M_X(t)$, is defined by $M_X(t) = E(e^{tX})$, for all $t$ for which the expectation exists. Note that at $t = 0$, $M_X(0) = E(e^0) = 1$.

Example. Say $X \sim N(0,1)$. Then
$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}(x^2 - 2tx)}\,dx$$
$$= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}\left((x-t)^2 - t^2\right)}\,dx = e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-\frac{1}{2}(x-t)^2}\,dx \qquad \text{($N(t,1)$ density, integrates to 1)}$$
$$= e^{t^2/2}, \quad \text{which exists for all } t \in \mathbb{R}.$$
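As a sanity check (not part of the notes; a minimal sketch assuming NumPy, with an arbitrary seed, sample size, and $t$ values), the empirical average of $e^{tX}$ over simulated $N(0,1)$ draws should be close to $e^{t^2/2}$:

```python
# Sketch (not part of the notes): Monte Carlo check that the N(0,1) mgf is e^{t^2/2}.
# The seed, sample size, and t values are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)        # draws from N(0, 1)

for t in (-1.0, 0.5, 2.0):
    empirical = np.mean(np.exp(t * x))    # estimate of E[e^{tX}]
    exact = np.exp(t**2 / 2)              # M_X(t) = e^{t^2/2}
    print(f"t = {t:+.1f}   empirical = {empirical:.4f}   exact = {exact:.4f}")
```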


Theorem. Suppose that $M_X(t)$ and $M_Y(t)$ are mgf's such that $M_X(t) = M_Y(t)$ for all $t$ in an open interval $(-t_0, t_0)$ for some $t_0 > 0$. Then $F_X(z) = F_Y(z)$ for all $z \in \mathbb{R}$, where $F_X$ and $F_Y$ are the cdf's corresponding to the distributions associated with $M_X(t)$ and $M_Y(t)$.

Remark: Conversely, if $F_X(z) = F_Y(z)$ for all $z \in \mathbb{R}$, then $M_X(t) = M_Y(t)$ for all $t$ for which the expectations exist.

So we can conclude that each random variable $X$ has a unique mgf associated with its distribution. The mgf is another way to represent the distribution of a random variable.

Properties

① $\dfrac{d^n}{dt^n} M_X(t) = \dfrac{d^n}{dt^n} E(e^{tX}) = E\!\left[\dfrac{d^n}{dt^n} e^{tX}\right] = E(X^n e^{tX})$. Then
$$\left.\frac{d^n}{dt^n} M_X(t)\right|_{t=0} = E(X^n).$$
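As a quick illustration of property ① (not part of the notes; a minimal sketch assuming SymPy), differentiating the $N(0,1)$ mgf from the previous example $n$ times and evaluating at $t = 0$ reproduces $E(X^n)$:

```python
# Sketch (not part of the notes; assumes SymPy): property (1) applied symbolically to
# the N(0,1) mgf. The n-th derivative at t = 0 returns E(X^n): 0, 1, 0, 3, 0, 15, ...
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)                      # mgf of N(0,1) derived earlier

for n in range(1, 7):
    moment = sp.diff(M, t, n).subs(t, 0)  # d^n/dt^n M_X(t) evaluated at t = 0
    print(n, moment)
```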


② Say $X_1, \ldots, X_n$ are mutually independent, and suppose $X_i$ has mgf $M_{X_i}(t)$. Let $X = X_1 + \cdots + X_n$. Then the mgf of $X$ is
$$M_X(t) = E(e^{tX}) = E\!\left[e^{t(X_1 + \cdots + X_n)}\right] = E\!\left(\prod_{i=1}^n e^{tX_i}\right) = \prod_{i=1}^n E(e^{tX_i}) = \prod_{i=1}^n M_{X_i}(t).$$

③ Let $X$ be a random variable and let $Y = aX + b$, where $a$ and $b$ are constants. Then the mgf of $Y$ is
$$M_Y(t) = E(e^{tY}) = E(e^{t(aX + b)}) = e^{tb}\,E(e^{taX}) = e^{tb}\,M_X(at).$$

Example. We saw that if $X \sim N(0,1)$, then $M_X(t) = e^{t^2/2}$. We know that if $Y = \sigma X + \mu$, where $\sigma > 0$ and $\mu \in \mathbb{R}$, then $Y \sim N(\mu, \sigma^2)$.


Then by property ③ the mgf of $Y$ is
$$M_Y(t) = e^{t\mu}\,M_X(\sigma t) = e^{t\mu}\,e^{\sigma^2 t^2/2} = e^{t\mu + \sigma^2 t^2/2}.$$
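A numerical check of this formula (not from the notes; a minimal sketch assuming NumPy, with $\mu = 2$, $\sigma = 3$, and $t = 0.3$ chosen arbitrarily):

```python
# Sketch (not part of the notes): numerical check of M_Y(t) = e^{t mu + sigma^2 t^2 / 2}
# via property (3), with arbitrary choices mu = 2, sigma = 3, t = 0.3.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, t = 2.0, 3.0, 0.3
y = sigma * rng.standard_normal(2_000_000) + mu   # Y = sigma*X + mu ~ N(mu, sigma^2)

empirical = np.mean(np.exp(t * y))                # estimate of E[e^{tY}]
exact = np.exp(t * mu + sigma**2 * t**2 / 2)
print(f"empirical = {empirical:.4f}   exact = {exact:.4f}")
```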

Example. Let $X \sim \text{Gamma}(r, \lambda)$. Then
$$M_X(t) = E(e^{tX}) = \int_0^\infty e^{tx}\,\frac{\lambda^r}{\Gamma(r)}\,x^{r-1} e^{-\lambda x}\,dx = \frac{\lambda^r}{\Gamma(r)} \int_0^\infty x^{r-1} e^{-x(\lambda - t)}\,dx$$
$$= \left(\frac{\lambda}{\lambda - t}\right)^{\!r} \int_0^\infty \frac{(\lambda - t)^r}{\Gamma(r)}\,x^{r-1} e^{-x(\lambda - t)}\,dx \qquad \text{(Gamma}(r, \lambda - t)\text{ density, integrates to 1)}$$
$$= \left(\frac{\lambda}{\lambda - t}\right)^{\!r}, \quad t < \lambda.$$

Now suppose $X_1, \ldots, X_n$ are independent, with $X_i \sim \text{Gamma}(r_i, \lambda)$, and let $X = X_1 + \cdots + X_n$. The mgf of $X$ is
$$M_X(t) = \prod_{i=1}^n M_{X_i}(t) = \prod_{i=1}^n \left(\frac{\lambda}{\lambda - t}\right)^{\!r_i} = \left(\frac{\lambda}{\lambda - t}\right)^{\!r_1 + \cdots + r_n} \qquad \text{(mgf of a Gamma}(r_1 + \cdots + r_n, \lambda)\text{)}$$
$$\Rightarrow X \sim \text{Gamma}(r_1 + \cdots + r_n, \lambda).$$


Mar.

Example. Suppose $X_1, \ldots, X_n$ are independent random variables with $X_i \sim \text{Poisson}(\lambda_i)$, and let $X = X_1 + \cdots + X_n$. The pmf of $X_i$ is
$$p_{X_i}(k) = \frac{e^{-\lambda_i} \lambda_i^k}{k!}, \quad k = 0, 1, 2, \ldots$$
The mgf of $X_i$ is
$$M_{X_i}(t) = E(e^{tX_i}) = \sum_{k=0}^\infty e^{tk}\,\frac{e^{-\lambda_i} \lambda_i^k}{k!} = e^{-\lambda_i} \sum_{k=0}^\infty \frac{(\lambda_i e^t)^k}{k!} \qquad \text{(Taylor series expansion of } e^{\lambda_i e^t}\text{)}$$
$$= e^{-\lambda_i} e^{\lambda_i e^t} = e^{\lambda_i (e^t - 1)}, \quad \text{for all } t \in \mathbb{R}.$$
Then by property ② the mgf of $X$ is
$$M_X(t) = \prod_{i=1}^n M_{X_i}(t) = \prod_{i=1}^n e^{\lambda_i (e^t - 1)} = e^{\left(\sum_{i=1}^n \lambda_i\right)(e^t - 1)} \qquad \text{(mgf of a Poisson}\!\left(\textstyle\sum_i \lambda_i\right)\text{)}$$
$$\Rightarrow X \sim \text{Poisson}\!\left(\sum_{i=1}^n \lambda_i\right).$$
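A simulation check of this result (not from the notes; a sketch assuming NumPy and SciPy, with arbitrary rates and sample size); empirical frequencies of the sum should match the Poisson$\left(\sum_i \lambda_i\right)$ pmf:

```python
# Sketch (not part of the notes; assumes NumPy and SciPy): simulate independent
# Poisson(lambda_i) variables and compare the pmf of their sum with Poisson(sum of lambda_i).
# The rates and sample size are arbitrary choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rates = [0.7, 1.3, 2.0]
total = sum(rng.poisson(lam, size=500_000) for lam in rates)

for k in range(8):
    emp = np.mean(total == k)                       # empirical P(X = k)
    exact = stats.poisson.pmf(k, sum(rates))        # Poisson(4.0) pmf
    print(f"k = {k}   empirical = {emp:.4f}   exact = {exact:.4f}")
```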

Example. Let $X \sim N(0,1)$. Find $E(X^n)$ for all $n \geq 1$.

Note that
$$M_X(t) = E(e^{tX}) = E\!\left(\sum_{k=0}^\infty \frac{(tX)^k}{k!}\right) = \sum_{k=0}^\infty \frac{t^k}{k!}\,E(X^k),$$
a power series in $t$ whose $\frac{t^n}{n!}$ term has coefficient $E(X^n)$.


From last time we computed the mgf of $X$ as
$$M_X(t) = e^{t^2/2} = \sum_{k=0}^\infty \frac{(t^2/2)^k}{k!} = \sum_{k=0}^\infty \frac{t^{2k}}{2^k\,k!} = \sum_{k=0}^\infty a_k t^k,$$
where $a_k = 0$ for $k$ odd and $a_k = \dfrac{1}{2^{k/2}(k/2)!}$ for $k$ even. Then
$$\frac{d^n}{dt^n} M_X(t) = \frac{d^n}{dt^n} \sum_{k=0}^\infty a_k t^k = n!\,a_n + (\text{something})\cdot t + (\text{something})\cdot t^2 + \cdots$$
Then
$$\left.\frac{d^n}{dt^n} M_X(t)\right|_{t=0} = n!\,a_n = E[X^n].$$
So
$$E[X^n] = n!\,a_n = \begin{cases} 0 & \text{if } n \text{ is odd} \\[4pt] \dfrac{n!}{2^{n/2}(n/2)!} & \text{if } n \text{ is even.} \end{cases}$$
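A small check of this closed form (not from the notes; a sketch assuming SciPy), comparing it with SciPy's numerically computed moments of the standard normal:

```python
# Sketch (not part of the notes; assumes SciPy): the closed form for the N(0,1) moments
# compared against SciPy's numerically computed moments.
from math import factorial
from scipy import stats

def normal_moment(n):
    """E(X^n) for X ~ N(0,1): 0 for odd n, n! / (2^(n/2) (n/2)!) for even n."""
    if n % 2 == 1:
        return 0.0
    return factorial(n) / (2 ** (n // 2) * factorial(n // 2))

for n in range(1, 9):
    print(n, normal_moment(n), stats.norm.moment(n))
```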

Inequalities

Markov's Inequality. Suppose that $X$ is a random variable with support in $[0, \infty)$, i.e. $X \geq 0$ with probability 1. Then for $t > 0$,
$$P(X \geq t) \leq \frac{E(X)}{t}.$$


Proof (continuous case).
$$E(X) = \int_0^\infty x f_X(x)\,dx \geq \int_t^\infty x f_X(x)\,dx \geq \int_t^\infty t f_X(x)\,dx = t \int_t^\infty f_X(x)\,dx = t\,P(X \geq t)$$
$$\Rightarrow P(X \geq t) \leq \frac{E(X)}{t}.$$

Example. Suppose $X$ is nonnegative and $P(X \geq 10) = \frac{1}{5}$. Then with $t = 10$, Markov's inequality gives
$$E(X) \geq 10 \cdot P(X \geq 10) = 10 \cdot \tfrac{1}{5} = 2.$$

[Sketch omitted: more mass in the tail; $E(X)$ is where the mass balances.]
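An illustration not in the notes: for an Exponential(1) random variable both the exact tail and Markov's bound have closed forms, so the looseness of the bound is easy to see (a minimal sketch assuming NumPy; the $t$ values are arbitrary):

```python
# Sketch (not part of the notes): Markov's bound versus the exact tail probability for an
# Exponential(1) random variable, where both sides are available in closed form.
# The t values are arbitrary.
import numpy as np

mean = 1.0                          # E(X) for Exponential(1)
for t in (1.0, 2.0, 5.0, 10.0):
    exact = np.exp(-t)              # P(X >= t) = e^{-t}
    bound = mean / t                # Markov's bound E(X)/t
    print(f"t = {t:>4}   exact = {exact:.4f}   bound = {bound:.4f}")
```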

Chebyshev's Inequality. Let $X$ be a random variable with finite variance. Then for $t > 0$,
$$P(|X - E(X)| > t) \leq \frac{\text{Var}(X)}{t^2}.$$

Proof.
$$P(|X - E(X)| > t) = P\!\left((X - E(X))^2 > t^2\right) \leq \frac{E\!\left[(X - E(X))^2\right]}{t^2} = \frac{\text{Var}(X)}{t^2},$$
by Markov's inequality applied to $(X - E(X))^2$.


Example. Set $t = k\sqrt{\text{Var}(X)}$, where $k$ is a positive integer; then Chebyshev's inequality gives
$$P\!\left(|X - E(X)| \geq k\sqrt{\text{Var}(X)}\right) \leq \frac{\text{Var}(X)}{k^2\,\text{Var}(X)} = \frac{1}{k^2}.$$
E.g., $k = 2$: for any random variable with finite variance, the amount of mass more than 2 standard deviations from the mean cannot be more than $\frac{1}{4} = 0.25$.
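A quick check of the $k = 2$ case (not from the notes; a sketch assuming SciPy, with three arbitrarily chosen distributions of finite variance):

```python
# Sketch (not part of the notes; assumes SciPy): the k = 2 case checked against exact
# two-sided tail probabilities for a few arbitrary distributions with finite variance.
from scipy import stats

k = 2
for name, dist in [("normal", stats.norm()),
                   ("exponential", stats.expon()),
                   ("uniform(0,1)", stats.uniform())]:
    mu, sigma = dist.mean(), dist.std()
    tail = dist.sf(mu + k * sigma) + dist.cdf(mu - k * sigma)   # P(|X - mu| >= 2 sigma)
    print(f"{name:12s}  exact = {tail:.4f}   Chebyshev bound = {1/k**2:.2f}")
```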

Chernoff's Bound. Let $X$ be a random variable with mgf $M_X(t)$. Then
$$P(X \geq a) \leq \min_{t > 0} e^{-at} M_X(t).$$

Proof. Let $t > 0$. Then
$$P(X \geq a) = P(tX \geq ta) = P(e^{tX} \geq e^{ta}) \leq \frac{E(e^{tX})}{e^{ta}} = e^{-ta} M_X(t),$$
by Markov's inequality. Since this inequality holds for any $t > 0$,
$$P(X \geq a) \leq \min_{t > 0} e^{-at} M_X(t).$$

Mar. 26

Example (illustrating the Markov, Chebyshev, and Chernoff inequalities). Suppose $X \sim N(0,1)$. Bound $P(X \geq a)$ for $a > 0$. The exact probability is


$$P(X \geq a) = \int_a^\infty \frac{1}{\sqrt{2\pi}}\,e^{-u^2/2}\,du.$$

Markov's inequality can be applied to $|X|$:
$$P(|X| \geq a) = P(X \leq -a) + P(X \geq a) = 2\,P(X \geq a),$$
so
$$P(X \geq a) = \tfrac{1}{2}\,P(|X| \geq a) \leq \frac{1}{2} \cdot \frac{E(|X|)}{a} \qquad \text{by Markov's inequality.}$$

Applying Chebyshev's inequality (and noting $E(X) = 0$, so $|X| = |X - E(X)|$) we get
$$P(X \geq a) = \tfrac{1}{2}\,P(|X| \geq a) \leq \frac{1}{2} \cdot \frac{\text{Var}(X)}{a^2} = \frac{1}{2a^2}.$$

Applying Chernoff's bound we obtain
$$P(X \geq a) \leq \min_{t > 0} e^{-at} e^{t^2/2} = \min_{t > 0} e^{t^2/2 - at}.$$
Minimizing the exponent by differentiating gives $t$ as the solution to $t - a = 0$, or $t = a$, and at $t = a$ we get $\frac{a^2}{2} - a^2 = -\frac{a^2}{2}$ in the exponent. So we get
$$P(X \geq a) \leq e^{-a^2/2}.$$
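The three bounds can be compared numerically with the exact tail probability. This sketch is not from the notes; it assumes NumPy and SciPy and fills in the known value $E|X| = \sqrt{2/\pi}$ for a standard normal in the Markov bound, with arbitrary values of $a$:

```python
# Sketch (not part of the notes; assumes NumPy and SciPy): the three bounds compared with
# the exact tail probability for X ~ N(0,1). E|X| = sqrt(2/pi) is a known value filled in
# for the Markov bound; the a values are arbitrary.
import numpy as np
from scipy import stats

for a in (1.0, 2.0, 3.0):
    exact = stats.norm.sf(a)                   # P(X >= a)
    markov = 0.5 * np.sqrt(2 / np.pi) / a      # (1/2) E|X| / a
    chebyshev = 1 / (2 * a**2)                 # 1 / (2 a^2)
    chernoff = np.exp(-a**2 / 2)               # e^{-a^2/2}
    print(f"a = {a}:  exact = {exact:.5f}  Markov = {markov:.5f}  "
          f"Chebyshev = {chebyshev:.5f}  Chernoff = {chernoff:.5f}")
```

The Chernoff bound, which uses the whole mgf rather than just one moment, tightens fastest as $a$ grows.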

Example (Weak Law of Large Numbers). Let $X_1, X_2, \ldots$ be a sequence of i.i.d.


random variables with finite variance, say $\sigma^2$. Let $\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$ be the sample mean, and let $\mu$ denote the common mean of the $X_i$'s. Then for $\varepsilon > 0$,
$$P(|\bar{X}_n - \mu| > \varepsilon) \to 0 \quad \text{as } n \to \infty.$$

Proof. First,
$$E(\bar{X}_n) = E\!\left(\frac{1}{n}\sum_{i=1}^n X_i\right) = \frac{1}{n}\sum_{i=1}^n E(X_i) = \frac{1}{n} \cdot n\mu = \mu,$$
$$\text{Var}(\bar{X}_n) = \text{Var}\!\left(\frac{1}{n}\sum_{i=1}^n X_i\right) = \frac{1}{n^2}\sum_{i=1}^n \text{Var}(X_i) = \frac{1}{n^2} \cdot n\sigma^2 = \frac{\sigma^2}{n}.$$
Then, by Chebyshev's inequality,
$$P(|\bar{X}_n - \mu| > \varepsilon) = P(|\bar{X}_n - E(\bar{X}_n)| > \varepsilon) \leq \frac{\text{Var}(\bar{X}_n)}{\varepsilon^2} = \frac{\sigma^2}{n\varepsilon^2} \to 0 \quad \text{as } n \to \infty.$$
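A simulation illustrating the weak law (not from the notes; a minimal sketch assuming NumPy, with Exponential(1) variables, $\varepsilon = 0.1$, and the replication counts chosen arbitrarily):

```python
# Sketch (not from the notes): the weak law of large numbers seen empirically.
# Assumptions: X_i ~ Exponential(1), so mu = 1 and sigma^2 = 1; eps = 0.1 and the
# replication count are arbitrary choices.
import numpy as np

rng = np.random.default_rng(4)
mu, eps, reps = 1.0, 0.1, 5_000

for n in (10, 100, 1000):
    xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)  # reps sample means of size n
    prob = np.mean(np.abs(xbar - mu) > eps)                   # estimate of P(|Xbar_n - mu| > eps)
    bound = 1 / (n * eps**2)                                  # Chebyshev bound sigma^2/(n eps^2); can exceed 1
    print(f"n={n:>5}  estimate={prob:.4f}  Chebyshev bound={bound:.4f}")
```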

Convergence of Sequences of Random Variables

Recall that a random variable $X$ is a real-valued function from some space $\Omega$ to $\mathbb{R}$. The domain $\Omega$ has a probability measure $P$ associated with it, and together $(\Omega, P)$ is called a probability space. Since random variables have associated distributions, there are several useful ways that we can use the


distributions to say how close a random variable in a sequence is to a limit random variable.

Let $X_1, X_2, \ldots$ be a sequence of random variables, and let $X$ be another random variable. Unless otherwise specified we assume that all the $X_i$'s and $X$ are defined on the same probability space $(\Omega, P)$.

Modes of Convergence

① We say that $\{X_n\}$ converges to the limit $X$ almost surely, written $X_n \xrightarrow{\text{a.s.}} X$ or $X_n \to X$ a.s., if
$$P\!\left(\{\omega \in \Omega : X_n(\omega) \to X(\omega) \text{ as } n \to \infty\}\right) = 1.$$
We also say that $X_n$ converges to $X$ with probability 1, written $X_n \to X$ w.p. 1.

② We say that $\{X_n\}$ converges to $X$ in probability, written $X_n \xrightarrow{P} X$, if for any given $\varepsilon > 0$,
$$P(|X_n - X| > \varepsilon) \to 0 \quad \text{as } n \to \infty.$$
Remark: The weak law of large numbers is saying that $\bar{X}_n \xrightarrow{P} \mu$.

③ We say that $\{X_n\}$ converges to $X$ in the $r$th mean, where $r > 0$, written $X_n \xrightarrow{r} X$, if
$$E\!\left[|X_n - X|^r\right] \to 0 \quad \text{as } n \to \infty.$$


④ We say that $\{X_n\}$ converges to $X$ in distribution, written $X_n \xrightarrow{D} X$, if
$$F_n(x) \to F(x) \quad \text{for all } x \in \mathbb{R} \text{ such that } F \text{ is continuous at } x,$$
where $F_n$ is the cdf of $X_n$ and $F$ is the cdf of $X$. This mode of convergence is also called weak convergence.
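A toy illustration of convergence in probability (not from the notes; a sketch assuming NumPy, where the construction $X_n = X + Z_n/n$ is an arbitrary example chosen so the limit is easy to see):

```python
# Sketch (not part of the notes): a toy illustration of convergence in probability.
# The construction X_n = X + Z_n/n with X, Z_n independent N(0,1) is an arbitrary example;
# the estimated P(|X_n - X| > eps) shrinks to 0 as n grows, matching definition (2).
import numpy as np

rng = np.random.default_rng(5)
reps, eps = 100_000, 0.05
x = rng.standard_normal(reps)                 # one draw of X per replication

for n in (1, 10, 100, 1000):
    xn = x + rng.standard_normal(reps) / n    # X_n = X + Z_n / n
    print(f"n = {n:>4}   P(|X_n - X| > {eps}) ~ {np.mean(np.abs(xn - x) > eps):.4f}")
```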