
1.1- Random walk on Z:

1.1.1- Simple random walk:

Let $\{X_n, n \geq 1\}$ be independent identically distributed (iid) random variables taking only the two values $\{1, -1\}$, such that

$$P(X_1 = 1) = p, \qquad P(X_1 = -1) = q, \qquad p + q = 1, \quad 0 < p, q < 1.$$

Define the simple random walk process $\{S_n, n \geq 0\}$ by

$$S_0 = 0, \qquad S_n = \sum_{k=1}^{n} X_k, \quad n \geq 1.$$

Define

$$T = \inf\{n \geq 1 : S_n = 1\}$$

to be the waiting time of the first visit to state 1.
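A short simulation can make these definitions concrete. The following Python sketch (not part of the original text; numpy, the seed, the step cap and the values of $p$ are illustrative assumptions) draws sample paths of the simple random walk and records whether, and when, each path first visits state 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_visit_time(p, max_steps=10_000):
    """T = inf{n >= 1 : S_n = 1}; returns None if state 1 is not reached within max_steps."""
    s = 0
    for n in range(1, max_steps + 1):
        s += 1 if rng.random() < p else -1
        if s == 1:
            return n
    return None

for p in (0.6, 0.5, 0.4):
    times = [first_visit_time(p) for _ in range(1_000)]
    reached = [t for t in times if t is not None]
    print(f"p={p}: fraction of paths reaching state 1 within the cap = "
          f"{len(reached)/len(times):.3f}, average T among those = {np.mean(reached):.1f}")
```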

A state $i$ is called recurrent if the chain returns to $i$ with probability 1 in a finite number of steps; otherwise the state is transient.

If we define the waiting time variable $T_i = \inf\{n \geq 1 : X_n = i\}$ for the chain started at $i$, then state $i$ is recurrent if

$$P(T_i < \infty \mid X_0 = i) = 1.$$

That is, the returns to state $i$ are sure events. The state $i$ is transient if

$$P(T_i = \infty \mid X_0 = i) > 0.$$

In this case there is positive probability of never returning to state $i$. The state $i$ is positive recurrent if

$$E(T_i \mid X_0 = i) < \infty.$$

Hence if $E(T_i \mid X_0 = i) = \infty$ then the state $i$ is null recurrent.

We say that $i$ leads to $j$, and write $i \to j$, if $P(X_n = j \mid X_0 = i) > 0$ for some $n \geq 0$.

We say that $i$ communicates with $j$, and write $i \leftrightarrow j$, if both $i \to j$ and $j \to i$.

A Markov chain with state space $S$ is said to be irreducible if $i \leftrightarrow j$ for all $i, j \in S$.

If $x$ is recurrent and $x \to y$, then $y$ is recurrent.

If the Markov chain is irreducible, and one of the states is recurrent, then all the states are recurrent.

The simple random walk on $\mathbb{Z}$ is recurrent iff $p = q = \tfrac{1}{2}$.

1) If $x \to y$ and $y \to z$, then $x \to z$.
2) If $x \leftrightarrow y$ and $y \leftrightarrow z$, then $x \leftrightarrow z$.

We define the state (position) of the random walk at time zero as $S_0 = 0$, and the position at time $n$ as $S_n = \sum_{k=1}^{n} X_k$. Then, by the strong law of large numbers,

$$\frac{S_n}{n} \longrightarrow E(X_1) \quad a.s.,$$

because the $X_i$'s are independent identically distributed and $|X_1| = 1$, so $E|X_1| = 1 < \infty$. From the definition, $E(X_1) = p - q$, hence

$$\lim_{n \to \infty} \frac{S_n}{n} = p - q \quad a.s.$$

This means that $S_n \sim n(p - q)$, and as a consequence

$$\lim_{n \to \infty} S_n = \begin{cases} +\infty & \text{if } p > q, \\ -\infty & \text{if } p < q. \end{cases}$$

The state 0 is transient if the number of visits to zero is finite almost surely, which means that with probability 1 the number of visits is finite.

Hence, when $p \neq q$, zero is a transient state. Now if $p = q = \tfrac{1}{2}$, we claim recurrence. It is enough to show that 0 is recurrent (from theorem 1.2.4); the state 0 is recurrent iff $G(0,0) = \infty$.

Define

$$G(0,0) = \sum_{n=1}^{\infty} p_{00}^{(n)}, \qquad \text{where } p_{00}^{(n)} = P(S_n = 0 \mid S_0 = 0).$$

The walk can return to 0 only in an even number of steps, and

$$p_{00}^{(2n)} = \binom{2n}{n} p^{n} q^{n} = \frac{(2n)!}{n!\,n!} \left(\frac{1}{2}\right)^{2n},$$

so

$$G(0,0) = \sum_{n=1}^{\infty} \frac{(2n)!}{n!\,n!} \left(\frac{1}{2}\right)^{2n}.$$

Now, using Stirling's formula

$$n! \sim \sqrt{2\pi n}\left(\frac{n}{e}\right)^{n},$$

we get

$$G(0,0) = \sum_{n=1}^{\infty} \frac{(2n)!}{n!\,n!}\,\frac{1}{4^{n}} \sim \sum_{n=1}^{\infty} \frac{\sqrt{4\pi n}\,(2n/e)^{2n}}{2\pi n\,(n/e)^{2n}}\,\frac{1}{4^{n}} = \sum_{n=1}^{\infty} \frac{1}{\sqrt{\pi n}} = \infty.$$

However, $G(0,0) = \infty$ means that 0 is recurrent, and consequently the simple random walk on $\mathbb{Z}$ with $p = q = \tfrac{1}{2}$ is recurrent. Now we will show that the symmetric simple random walk is null recurrent, which means that $E(N) = \infty$, where

$$N = \inf\{n \geq 1 : S_n = 0\}, \qquad S_0 = 0,$$

is the return time to 0.
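As an informal check of the recurrence dichotomy, the following sketch (an illustration added here, with numpy and all parameter values chosen only for demonstration) estimates the average number of returns to 0, which mirrors $G(0,0)$: it keeps growing when $p = \tfrac{1}{2}$ but levels off when $p \neq \tfrac{1}{2}$.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_visits_to_zero(p, n_steps, n_paths=200):
    """Monte Carlo estimate of the expected number of returns to 0 within n_steps steps."""
    steps = np.where(rng.random((n_paths, n_steps)) < p, 1, -1)
    positions = np.cumsum(steps, axis=1)
    return (positions == 0).sum(axis=1).mean()

for p in (0.5, 0.55):
    for n_steps in (1_000, 5_000, 20_000):
        print(f"p={p}, n={n_steps:6d}: average number of returns to 0 ~ "
              f"{mean_visits_to_zero(p, n_steps):6.1f}")
```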

The probability generating function of $N$ is defined as

$$F(s) = E(s^{N}) = \sum_{k=1}^{\infty} s^{k} P(N = k).$$

Then

$$F(1) = \sum_{k=1}^{\infty} P(N = k) = P(N < \infty).$$

Also

$$F(s) = s\,P(N = 1) + s^{2} P(N = 2) + \cdots,$$

hence

$$F'(s) = P(N = 1) + 2s\,P(N = 2) + \cdots,$$

and

$$F'(1) = \sum_{k=1}^{\infty} k\,P(N = k) = E(N).$$

For the simple random walk,

$$F(s) = 1 - \sqrt{1 - 4pq s^{2}},$$

so

$$P(N < \infty) = F(1) = 1 - \sqrt{1 - 4pq} = 1 - |p - q|.$$

Put $p = q = \tfrac{1}{2}$. Hence

$$F(s) = 1 - \sqrt{1 - s^{2}},$$

and

$$\lim_{s \uparrow 1} F'(s) = E(N) = \infty.$$

Hence zero is a null recurrent state, and the simple symmetric random walk on $\mathbb{Z}$ is null recurrent.

We proved that the simple random walk on $\mathbb{Z}$ is recurrent iff $p = q = \tfrac{1}{2}$. Indeed, from

$$F(s) = 1 - \sqrt{1 - 4pq s^{2}}$$

we get

$$F'(s) = \frac{8pqs}{2\sqrt{1 - 4pq s^{2}}} = \frac{4pqs}{\sqrt{1 - 4pq s^{2}}},$$

which tends to $\infty$ as $s \uparrow 1$ exactly when $4pq = 1$, i.e. when $p = q = \tfrac{1}{2}$.
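The identity $P(N < \infty) = 1 - |p - q|$ can be checked empirically. The sketch below (an added illustration; numpy, the step cap and the values of $p$ are assumptions made only for the demonstration) compares a Monte Carlo estimate of the return probability with the formula.

```python
import numpy as np

rng = np.random.default_rng(2)

def returns_to_zero(p, max_steps=5_000):
    """Does the walk started at 0 return to 0 within max_steps steps?"""
    s = 0
    for _ in range(max_steps):
        s += 1 if rng.random() < p else -1
        if s == 0:
            return True
    return False

for p in (0.5, 0.6, 0.75):
    q = 1 - p
    est = np.mean([returns_to_zero(p) for _ in range(1_000)])
    print(f"p={p}: simulated P(N < infinity) ~ {est:.3f},   1 - |p - q| = {1 - abs(p - q):.3f}")
```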

(Borel-Cantelli lemma.) If $A_1, A_2, \ldots$ is an infinite sequence of events, then:

1. If $\sum_{n} P(A_n) < \infty$, then $P(\limsup_{n} A_n) = 0$.
2. If $\sum_{n} P(A_n) = \infty$, then $P(\limsup_{n} A_n) = 1$, provided that the $A_i$'s are independent events.

It is well known that if $\xi_1, \xi_2, \ldots, \xi_n$ are independent random variables, then

$$E\left(\prod_{i=1}^{n} \xi_i\right) = \prod_{i=1}^{n} E(\xi_i),$$

but this may fail in the case of an infinite product. To show this we introduce the following counterexample.

* Consider $Y_1, Y_2, \ldots$ to be independent random variables such that

$$P(Y_i = 2) = P(Y_i = 0) = \tfrac{1}{2}.$$

That is, we have an infinite sequence of independent identically distributed random variables, and

$$E(Y_1) = 1.$$

We define a new sequence as follows:

$$X_n = \prod_{i=1}^{n} Y_i, \qquad \text{so that } X_n \in \{0, 2^{n}\}.$$

Then

$$E(X_n) = \prod_{k=1}^{n} E(Y_k) = 1.$$

Now define the waiting time

$$T = \inf\{n \geq 1 : X_n = 0\},$$

that is, the first time the sequence equals zero.

On the one hand,

$$P(T > k) = P(Y_1 = 2, Y_2 = 2, \ldots, Y_k = 2) = \frac{1}{2^{k}} \longrightarrow 0,$$

hence $T$ is finite almost surely. Consequently $X_n \to 0$ a.s., and

$$E\left(\lim_{n \to \infty} X_n\right) = E\left(\prod_{k=1}^{\infty} Y_k\right) = 0. \qquad (1)$$

On the other hand,

$$\prod_{k=1}^{\infty} E(Y_k) = 1. \qquad (2)$$

Hence from (1) & (2),

$$E\left(\prod_{k=1}^{\infty} Y_k\right) \neq \prod_{k=1}^{\infty} E(Y_k).$$
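The contrast between $E(X_n) = 1$ for every $n$ and $X_n \to 0$ a.s. is easy to see numerically. The following sketch (an added illustration; numpy and the sample sizes are assumptions) shows that the fraction of zero paths tends to 1, while the exact mean stays equal to 1; for larger $n$ the rare value $2^{n}$ is simply never observed in a finite sample, which is exactly the point.

```python
import numpy as np

rng = np.random.default_rng(3)

n_paths = 50_000
for n in (1, 5, 10, 20):
    # Y_i iid with P(Y = 2) = P(Y = 0) = 1/2, and X_n = Y_1 * ... * Y_n.
    Y = 2 * rng.integers(0, 2, size=(n_paths, n))
    X = Y.prod(axis=1)
    # E(X_n) = prod E(Y_i) = 1 for every n, yet X_n -> 0 almost surely.
    print(f"n={n:2d}:  P(X_n = 0) ~ {np.mean(X == 0):.4f}   "
          f"empirical mean of X_n ~ {X.mean():.3f}   exact E(X_n) = 1")
```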

i. $X$ is a finite random variable iff $P(|X| < \infty) = 1$.
ii. $X$ is bounded iff there exists $M < \infty$ such that $|X| \leq M$ a.s.

Now we introduce the notion of Lebesgue measure and the Borel $\sigma$-field.

A $\sigma$-field $\mathcal{F}$ is a collection of subsets of $\Omega$ such that:

1. $\Omega \in \mathcal{F}$;
2. $A \in \mathcal{F}$ implies $A^{c} \in \mathcal{F}$;
3. $A_i \in \mathcal{F}$ for $i \geq 1$ implies $\bigcup_{i=1}^{\infty} A_i \in \mathcal{F}$.

The elements of $\mathcal{F}$ are called measurable sets, or events.

* The intersection of all $\sigma$-fields that contain all open intervals is called the Borel $\sigma$-field and is denoted by $\mathcal{B}$.

It is known that Lebesgue measure $\lambda$ on $\mathcal{B}$ (the Borel $\sigma$-field) is the only measure that assigns to every interval $(a, b]$ the measure

$$\lambda\big((a, b]\big) = b - a, \qquad a \leq b.$$

Provided that $EX$ and $EY$ are finite,

$$E(X + Y) = E(X) + E(Y).$$

If $E(X_i)$ is finite for $i = 1, 2, \ldots, n$, then

$$E\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} E(X_i).$$

Note (Wald's identity): if the $X_i$'s are independent identically distributed random variables, and $N$ is a positive integer valued random variable independent of the $X_i$'s, then

$$E\left(\sum_{i=1}^{N} X_i\right) = E(N)\,E(X_1).$$

Indeed,

$$E\left(\sum_{i=1}^{N} X_i \,\Big|\, N = n\right) = E\left(\sum_{i=1}^{n} X_i\right) = n\,E(X_1),$$

so $E\big(\sum_{i=1}^{N} X_i \mid N\big) = N\,E(X_1)$, and hence

$$E\left(\sum_{i=1}^{N} X_i\right) = E\left(E\left(\sum_{i=1}^{N} X_i \,\Big|\, N\right)\right) = E\big(N\,E(X_1)\big) = E(N)\,E(X_1).$$
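A quick numerical check of this identity is below (an added sketch, not part of the original text; the choice of a geometric $N$ and exponential $X_i$ with the stated parameters is an illustrative assumption).

```python
import numpy as np

rng = np.random.default_rng(4)

n_trials = 50_000
# N ~ Geometric on {1, 2, ...} with success probability 0.25 (so E(N) = 4),
# drawn independently of the X_i, which are iid exponential with mean 2.
N = rng.geometric(0.25, size=n_trials)
totals = np.empty(n_trials)
for j, n in enumerate(N):
    totals[j] = rng.exponential(2.0, size=n).sum()

print("E(sum_{i<=N} X_i)  (Monte Carlo) ~", round(totals.mean(), 3))
print("E(N) * E(X_1)                     =", 4 * 2.0)
```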

(Monotone convergence.) If $X_n \geq 0$ and $X_n \uparrow X$ a.s., then

$$E(X_n) \uparrow E(X).$$

If $X_n \geq 0$ $(n = 1, 2, \ldots)$ are random variables, then

$$E\left(\sum_{i=1}^{\infty} X_i\right) = \sum_{i=1}^{\infty} E(X_i).$$

To see this, define a new sequence $Y_n = \sum_{k=1}^{n} X_k$; then $Y_n \uparrow Y = \sum_{k=1}^{\infty} X_k$, and now

$$E\left(\sum_{i=1}^{\infty} X_i\right) = E\left(\lim_{n} Y_n\right) = \lim_{n} E(Y_n) = \lim_{n} \sum_{i=1}^{n} E(X_i) = \sum_{i=1}^{\infty} E(X_i).$$

But is this theorem true if $P(X_n \geq 0) \neq 1$? The following example shows that the theorem may fail if not all $X_n \geq 0$.

Let $Y_1, Y_2, \ldots$ be independent identically distributed random variables such that

$$P(Y_i = 1) = P(Y_i = -1) = \tfrac{1}{2},$$

and define

$$S_n = \sum_{k=1}^{n} Y_k \qquad \text{and} \qquad T := \inf\{n \geq 1 : S_n = 1\}.$$

Because we have a symmetric simple random walk, recurrence implies that $T < \infty$ a.s. and $S_T = 1$ a.s., so $E(S_T) = 1$. Define

$$X_n := Y_n\,I_{(T \geq n)}. \qquad (1)$$

Then

$$\sum_{n=1}^{\infty} X_n = \sum_{n=1}^{\infty} Y_n\,I_{(T \geq n)} = S_T = 1 \quad a.s., \qquad \text{hence} \qquad E\left(\sum_{n=1}^{\infty} X_n\right) = 1. \qquad (2)$$

On the other hand, the event $\{T \geq n\}$ occurs or does not occur depending only on $Y_1, Y_2, \ldots, Y_{n-1}$, which means that $\{T \geq n\} \in \sigma(Y_1, Y_2, \ldots, Y_{n-1})$. Hence $Y_n$ and $I_{(T \geq n)}$ are independent random variables, and

$$E(X_n) = E(Y_n)\,E\left(I_{(T \geq n)}\right) = 0, \qquad \text{so} \qquad \sum_{n=1}^{\infty} E(X_n) = 0. \qquad (3)$$

Hence from (2) & (3) we conclude that

$$E\left(\sum_{n=1}^{\infty} X_n\right) \neq \sum_{n=1}^{\infty} E(X_n).$$

The $\sigma$-field generated by a random variable $X$ is

$$\sigma(X) = \{X^{-1}(B) : B \in \mathcal{B}\},$$

where $\mathcal{B}$ is the Borel $\sigma$-field.
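The two sides of the failed interchange can be seen in a simulation of the indicator construction above. The sketch below (an added illustration; numpy, the truncation horizon and the sample size are assumptions) shows that $\sum_n X_n = 1$ on essentially every path, while each individual term has mean approximately 0.

```python
import numpy as np

rng = np.random.default_rng(5)

def terms_up_to_T(horizon=5_000):
    """Return [X_1, X_2, ...] with X_n = Y_n * 1{T >= n}, truncated at `horizon`."""
    xs, s = [], 0
    for _ in range(horizon):
        y = 1 if rng.random() < 0.5 else -1
        xs.append(y)            # Y_n is kept because T >= n still holds at this step
        s += y
        if s == 1:              # T has been reached; all later X_n are 0
            break
    return xs

paths = [terms_up_to_T() for _ in range(4_000)]
sums = np.array([sum(xs) for xs in paths])
first_terms = np.array([xs[0] for xs in paths])

print("P(sum_n X_n = 1) ~", np.mean(sums == 1))   # sum_n X_n = S_T = 1 a.s.
print("E(X_1)           ~", first_terms.mean())    # each X_n has mean 0
```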

A common error repeated in some books about linear correlation is the statement

$$\rho_{xy} = \pm 1 \iff Y = aX + b. \qquad (1)$$

This relation must instead be written as

$$\rho_{xy} = \pm 1 \iff Y = aX + b \quad a.s. \qquad (2)$$

The following example shows that formula (1) may not be true.

* Consider a random variable $X \sim U(0, 1)$. We define the triple $(\Omega, \mathcal{F}, P)$ where $\Omega = (0, 1)$, $\mathcal{F} = \{(0,1) \cap B : B \in \mathcal{B}\}$ is the Borel $\sigma$-field restricted to $(0, 1)$, and $P$ is Lebesgue measure on $(0, 1)$.

Now define

$$Y = \begin{cases} X & \text{if } X \text{ is irrational}, \\ 0 & \text{otherwise}. \end{cases}$$

Then

$$P(Y \neq X) = P(X \text{ is rational}) = \text{Lebesgue measure}\big(\mathbb{Q} \cap (0,1)\big) = 0,$$

so $P(Y = X) = 1$, that is $Y = X$ a.s., and thus $\rho_{xy} = 1$.

Suppose, for the contrary, that $Y = aX + b$ (everywhere) for some $a, b$. Setting $X = \tfrac{1}{2}$ gives $Y = 0$, so $\tfrac{a}{2} + b = 0$ (1); setting $X = \tfrac{1}{3}$ gives $Y = 0$, so $\tfrac{a}{3} + b = 0$ (2). From (1) & (2) we get $a = 0$ and $b = 0$, hence $Y \equiv 0$. This is a contradiction, since $Y = X \neq 0$ whenever $X$ is irrational; hence the assumption is not true.

It is already known that if the moment generating function exists, then all the moments exist; but it can happen that all the moments exist while the moment generating function does not exist. The following counterexample explains this.

We know that if the moment generating function $M(t)$ is defined in a neighbourhood of 0, then

$$E(X^{k}) = M^{(k)}(0), \qquad k \geq 0.$$

Let $X \sim N(0, 1)$.

Define $Y = e^{X}$, i.e. $\log Y = X$; that is, $Y$ follows the lognormal distribution. Then

$$E(Y^{k}) = E(e^{kX}) = e^{k^{2}/2} \qquad (\text{the } k\text{th moment of } Y).$$

Hence all the moments of $Y$ exist, whereas the moment generating function of $Y$ does not exist, as we show now:

$$M_Y(t) = E(e^{tY}) = E\left(e^{t e^{X}}\right) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{t e^{x}}\,e^{-x^{2}/2}\,dx = \infty \qquad \text{for every } t > 0,$$

since $t e^{x} - x^{2}/2 \to \infty$ as $x \to \infty$. Then the moment generating function does not exist.
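Both halves of the statement can be checked numerically. The sketch below (an added illustration; the grid, the value $t = 0.1$ and the truncation points are assumptions chosen only to avoid floating-point overflow) uses simple Riemann sums against the standard normal density: the moment integrals agree with $e^{k^2/2}$, while the partial integrals defining $E(e^{tY})$ grow explosively as the upper limit increases.

```python
import numpy as np

# Standard normal density on a wide grid; Y = exp(X) is lognormal.
x = np.linspace(-40.0, 40.0, 400_001)
dx = x[1] - x[0]
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Every moment E(Y^k) = E(e^{kX}) = e^{k^2/2} is finite:
for k in (1, 2, 3):
    numeric = np.sum(np.exp(k * x) * phi) * dx
    print(f"E(Y^{k}):  numeric ~ {numeric:.4f}   exact e^(k^2/2) = {np.exp(k**2 / 2):.4f}")

# But E(e^{tY}) = E(exp(t * e^X)) diverges for every t > 0: the integral over
# x <= b grows without bound as b increases.
t = 0.1
for b in (4.0, 6.0, 8.0):
    mask = x <= b
    partial = np.sum(np.exp(t * np.exp(x[mask])) * phi[mask]) * dx
    print(f"t={t}: integral of exp(t*e^x)*phi(x) over x <= {b}  ~ {partial:.3e}")
```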

The sequence of moments does not determine the distribution uniquely; the following example shows this. We have the following two distributions with identical sequences of moments:

1) $P(X = 2^{k}) = \dfrac{2}{3^{k}}, \qquad k = 1, 2, \ldots$

2) $P(Y = 0) = \dfrac{1}{3}, \qquad P(Y = 2^{k+1}) = \dfrac{1}{2^{2k-1}}, \qquad k = 1, 2, \ldots$

Now:

a) $E(X) = \sum_{k=1}^{\infty} 2^{k}\,\dfrac{2}{3^{k}} = 2 \sum_{k=1}^{\infty} \left(\dfrac{2}{3}\right)^{k} = 4$, whereas $E(X^{2}) = \sum_{k=1}^{\infty} 2^{2k}\,\dfrac{2}{3^{k}} = 2 \sum_{k=1}^{\infty} \left(\dfrac{4}{3}\right)^{k} = \infty$. Obviously $E(X^{n}) = \infty$ for every $n \geq 2$.

b) Also $E(Y) = 0 \cdot \dfrac{1}{3} + \sum_{k=1}^{\infty} 2^{k+1}\,\dfrac{1}{2^{2k-1}} = \sum_{k=1}^{\infty} 2^{2-k} = 4$, and $E(Y^{n}) = \infty$ for every $n \geq 2$.

Hence $X$ and $Y$ have the same sequence of moments, although their distributions are different.

It is known from the literature that the sequence of moments $m_k = E(X^{k})$ of a random variable uniquely determines the distribution if it satisfies one of the following conditions:

1) $\limsup_{k \to \infty} \dfrac{1}{2k}\,m_{2k}^{1/2k}$ is finite.

2) $\sum_{k=1}^{\infty} m_{2k}^{-1/2k} = \infty$, if $X$ takes values in $(-\infty, \infty)$.

3) $\sum_{k=1}^{\infty} m_{k}^{-1/2k} = \infty$, if $X$ takes values in $(0, \infty)$.

The joint distribution of random variables determines the marginal distributions uniquely, but the converse is not necessarily true.

Consider the joint probability function

           Y = 0    Y = 1
  X = 0     1/4      1/4
  X = 1     1/4      1/4

so that $P(X = i, Y = j) = \tfrac{1}{4}$ for $i, j \in \{0, 1\}$. This is a joint probability function, and each marginal probability is obtained by summing a row or a column; for example $P(Y = 0) = \tfrac{1}{4} + \tfrac{1}{4} = \tfrac{1}{2}$, and similarly $P(X = 0) = \tfrac{1}{2}$.

The marginal distribution of X is:

   x       0      1
  f(x)    1/2    1/2

The marginal distribution of Y is:

   y       0      1
  f(y)    1/2    1/2

These two marginal functions correspond to the joint function above, but they also correspond to other joint functions; for example, the joint function with $P(X=0, Y=0) = P(X=1, Y=1) = \tfrac{1}{2}$ and $P(X=0, Y=1) = P(X=1, Y=0) = 0$ has the same marginals.

Conclusion: the marginal distributions don't determine the joint distribution.

The property of being "positively correlated" is not transitive.

A matrix $A_{n \times n}$ is said to be positive definite if it satisfies one of the following equivalent conditions:

i.   $X' A X > 0$ for every $X \neq 0$.
ii.  All the eigenvalues are positive.
iii. The determinants of all left upper sub-matrices are positive.

Any positive definite matrix can be the variance-covariance matrix of some random vector. Now consider the matrix

$$\Sigma = \begin{pmatrix} 1 & \tfrac{1}{2} & -\tfrac{1}{4} \\ \tfrac{1}{2} & 1 & \tfrac{1}{2} \\ -\tfrac{1}{4} & \tfrac{1}{2} & 1 \end{pmatrix}.$$

We can make sure that $\Sigma$ is positive definite, hence it is the variance-covariance matrix of some random vector $(X, Y, Z)'$. Then

$$\rho_{xy} = \sigma_{xy} = \tfrac{1}{2} > 0, \qquad \rho_{yz} = \sigma_{yz} = \tfrac{1}{2} > 0, \qquad \rho_{xz} = \sigma_{xz} = -\tfrac{1}{4} < 0.$$

Then $X$ and $Y$ are positively correlated, and $Y$ and $Z$ as well, but $X$ and $Z$ are negatively correlated.
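The positive definiteness of $\Sigma$ and the sign pattern of the correlations can be confirmed directly. The following sketch (an added illustration; the use of a Gaussian vector and the sample size are assumptions, any random vector with covariance $\Sigma$ would do) checks the eigenvalues and the sample correlations.

```python
import numpy as np

rng = np.random.default_rng(7)

Sigma = np.array([[ 1.0,  0.5, -0.25],
                  [ 0.5,  1.0,  0.5 ],
                  [-0.25, 0.5,  1.0 ]])

# All eigenvalues positive => Sigma is positive definite, hence a valid covariance matrix.
print("eigenvalues:", np.round(np.linalg.eigvalsh(Sigma), 4))

# Draw a Gaussian vector (X, Y, Z) with this covariance; the sample correlations
# show X~Y and Y~Z positive while X~Z is negative.
sample = rng.multivariate_normal(mean=np.zeros(3), cov=Sigma, size=100_000)
print("sample correlation matrix:\n", np.round(np.corrcoef(sample, rowvar=False), 3))
```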

We say that $X_n$ converges weakly to $X$ if

$$\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$$

at all continuity points of $F_X$. This is denoted by $X_n \xrightarrow{W} X$ or $X_n \xrightarrow{D} X$.

The following example shows that a sequence of continuous random variables does not necessarily converge weakly to a continuous random variable.

Let $X = 1$ a.s. Then $X$ is a degenerate random variable with cumulative distribution function

$$F(x) = P(X \leq x) = \begin{cases} 0 & x < 1, \\ 1 & x \geq 1. \end{cases}$$

Consider a sequence of cumulative distribution functions $F_n(x)$ such that

$$F_n(x) = \begin{cases} 0 & x \leq 0, \\ x^{n} & 0 < x < 1, \\ 1 & x \geq 1. \end{cases}$$

Then

$$\lim_{n \to \infty} F_n(x) = \begin{cases} 0 & x < 1, \\ 1 & x \geq 1. \end{cases}$$

Obviously, each $F_n$ is the cumulative distribution function of a continuous random variable $X_n$, whereas $F$ is the cumulative distribution function of the discrete random variable $X$. Nevertheless, $X_n \xrightarrow{W} X$.

The median of $X$ is any $x$ such that $a \leq x \leq b$, where

$$a = \inf\{x : F(x) \geq \tfrac{1}{2}\}, \qquad b = \sup\{x : F(x) \leq \tfrac{1}{2}\}.$$

If $F$ is the distribution function of a continuous random variable, then the median is unique.

Consider a random variable $X$ such that

   X      -1     0     1     2
  P(X)    1/4   1/4   1/3   1/6

The cumulative distribution function of $X$ is

$$F(x) = \begin{cases} 0 & x < -1, \\ \tfrac{1}{4} & -1 \leq x < 0, \\ \tfrac{1}{2} & 0 \leq x < 1, \\ \tfrac{5}{6} & 1 \leq x < 2, \\ 1 & x \geq 2. \end{cases}$$

Thus the median is any $x \in (0, 1)$, which means that $a = 0$ and $b = 1$. That is, the median is not unique. This is a disadvantage of the median as a measure of location, compared with the mean.

The median does not satisfy the relation

$$\mathrm{Med}(X + Y) = \mathrm{Med}(X) + \mathrm{Med}(Y).$$

If $X \sim \Gamma(n, \lambda)$, then its probability density function is

$$f(x) = \frac{\lambda^{n}}{\Gamma(n)}\,x^{n-1} e^{-\lambda x}, \qquad x > 0.$$

Let $X \sim \Gamma(1, 1)$; then

$$f(t) = e^{-t}, \qquad t > 0.$$

Assume that $Y$ is an independent copy of $X$, so we can find $\mathrm{Med}(X)$ and $\mathrm{Med}(Y)$ as follows.

We have

$$F_X(t) = \int_{0}^{t} e^{-x}\,dx = 1 - e^{-t} = \frac{1}{2} \quad \Longrightarrow \quad t = \log 2,$$

and this implies that

$$\mathrm{Med}(X) = \mathrm{Med}(Y) = \log 2.$$

Now, to calculate $\mathrm{Med}(Z)$ where $Z = X + Y \sim \Gamma(2, 1)$, suppose for the contrary that

$$\mathrm{Med}(X + Y) = \mathrm{Med}(X) + \mathrm{Med}(Y).$$

This means that $\mathrm{Med}(Z) = 2 \log 2$. The probability density function of $Z$ is given next.

We have

$$f_Z(t) = t e^{-t}, \qquad t > 0,$$

and

$$F_Z(t) = \int_{0}^{t} x e^{-x}\,dx = 1 - e^{-t} - t e^{-t}.$$

If the median of $Z$ were $2 \log 2$, then $F_Z(2 \log 2) = \tfrac{1}{2}$. However,

$$F_Z(2 \log 2) = 1 - e^{-2\log 2} - 2\log 2 \cdot e^{-2\log 2} = 1 - \tfrac{1}{4} - \tfrac{2 \log 2}{4} \approx 0.4034 \neq \tfrac{1}{2}.$$

Hence

$$\mathrm{Med}(Z) \neq \mathrm{Med}(X) + \mathrm{Med}(Y).$$
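The calculation is easy to verify, and one can also locate the actual median of $Z$ from the closed form of $F_Z$. The sketch below (an added illustration; the bisection bracket is an assumption) does both.

```python
import math

# X, Y iid Exp(1): Med(X) = Med(Y) = log 2.
med_x = math.log(2)

# Z = X + Y ~ Gamma(2, 1) with F_Z(t) = 1 - e^{-t} - t e^{-t}.
F_Z = lambda t: 1 - math.exp(-t) - t * math.exp(-t)

print("F_Z(2 log 2)    =", round(F_Z(2 * med_x), 4), " (not 1/2)")

# Locate the actual median of Z by bisection on F_Z(t) = 1/2.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = (lo + hi) / 2
    if F_Z(mid) < 0.5:
        lo = mid
    else:
        hi = mid

print("Med(Z)          ~", round((lo + hi) / 2, 4))
print("Med(X) + Med(Y) =", round(2 * med_x, 4))
```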

The mode is the value of $X$ at which the density (or probability mass) function attains its maximum.

A random variable $X$ has the following distribution:

   X      0     1     2
  P(X)   4/9   4/9   1/9

Here the mode is not unique: it is both 0 and 1.

40

Suppose that X&Y follow the distribution above and they are independent.Let Z=X+Y

P(X)

10X

5

3

5

2

P(Z)

210Z

25

9

25

12

25

4

000)()(1)( YModXModZMod

We show in this example that the mode is not a linear operator.This is a disadvantage of the mode.

Page 41: 1 1.1- Random walk on Z: 1.1.1- Simple random walk : Let be independent identically distributed random variables (iid) with only two possible values {1,-1},

41

and  

from (1) & (2) we conclude that the Mode is not linear operator.

Let

0,)(

)1,1(~

texf

Xt

and Y is an independent copy of X, Since f (0)=1 is the maximum value of f(x), hence

)1()(0)( YModXMod

 Now Z = X + Y, then , the probability density function of Z

)1,2(~ Z

0,)( tettf tz

,)( ttz eettf

)2()(10 ZModteet tt
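For the discrete example, the pmf of $Z$ is just the convolution of the pmf of $X$ with itself, which the following short sketch computes (an added illustration; numpy's `convolve` is used only as a convenience).

```python
import numpy as np

# X and Y iid with P(0) = 3/5, P(1) = 2/5; Z = X + Y.
p = np.array([3/5, 2/5])          # pmf of X on {0, 1}
pz = np.convolve(p, p)            # pmf of Z on {0, 1, 2}

print("pmf of Z:", pz)            # [9/25, 12/25, 4/25]
print("Mod(X) =", int(np.argmax(p)), " Mod(Y) =", int(np.argmax(p)),
      " Mod(Z) =", int(np.argmax(pz)))
```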

We defined the simple random walk in chapter (1), and we know that the Markov chain with transition probabilities

$$p(k, k+1) = p, \qquad p(k, k-1) = q,$$

is recurrent iff $p = q = \tfrac{1}{2}$. This walk is called the simple symmetric random walk (SSRW).

The simple symmetric random walk (SSRW) on $\mathbb{Z}$ is irreducible.

Consider a state $i$ and a state $j$ such that $j = i + k$ for some positive integer $k$. Then

$$i \to i+1, \quad i+1 \to i+2, \quad \ldots, \quad j-1 \to j,$$

so by the transitive property $i \to j$, and in the same way $j \to i$; hence $i \leftrightarrow j$. $\;(1)$

If $j = i - k$ for some positive integer $k$, then by the same argument $i \leftrightarrow j$. $\;(2)$

Hence the simple symmetric random walk (SSRW) on $\mathbb{Z}$ is irreducible.

A sequence $\{\xi_n, n \geq 0\}$ of random variables with $E|\xi_n| < \infty$ is a martingale if

$$E(\xi_{n+1} \mid \xi_0, \xi_1, \ldots, \xi_n) = \xi_n \quad a.s.,$$

a sub-martingale if

$$E(\xi_{n+1} \mid \xi_0, \xi_1, \ldots, \xi_n) \geq \xi_n \quad a.s.,$$

and a super-martingale if

$$E(\xi_{n+1} \mid \xi_0, \xi_1, \ldots, \xi_n) \leq \xi_n \quad a.s.$$

Consider independent random variables $\{\xi_n, n \geq 0\}$ such that $E(\xi_i) = 0$. We claim that

$$S_n = \sum_{k=0}^{n} \xi_k$$

is a martingale, for

$$E(S_{n+1} \mid \xi_0, \ldots, \xi_n) = E(S_n + \xi_{n+1} \mid \xi_0, \ldots, \xi_n) = S_n + E(\xi_{n+1}) = S_n.$$

Consider independent random variables $d_n$, $n \geq 1$, with $E(d_n) \neq 0$, and define the sequence

$$Z_n = \prod_{k=1}^{n} d_k, \qquad n \geq 1.$$

We show that

$$W_n = \frac{Z_n}{E(Z_n)}, \qquad n \geq 1,$$

is a martingale:

$$E(W_{n+1} \mid Z_1, \ldots, Z_n) = \frac{E(Z_{n+1} \mid Z_1, \ldots, Z_n)}{E(Z_{n+1})} = \frac{E(Z_n d_{n+1} \mid Z_1, \ldots, Z_n)}{E(Z_{n+1})} = \frac{Z_n\,E(d_{n+1})}{E(Z_n)\,E(d_{n+1})} = \frac{Z_n}{E(Z_n)} = W_n.$$

Consider a family tree where $Z_0 = 1$, let $Z_{n+1}$ be the number of children at the $(n+1)$th generation, and let $d_{nk}$ be the number of children of the $k$th individual of the $n$th generation. Then

$$Z_{n+1} = d_{n1} + d_{n2} + \cdots + d_{nZ_n}.$$

Here $\{Z_n, n \geq 0\}$ with $Z_0 = 1$, and $\{d_{nk}, n \geq 0, k \geq 1\}$ is a doubly indexed family. We assume that the $d_{nk}$ are independent random variables for every $n$ and $k$, and that for fixed $n$ they are independent identically distributed random variables.

Consider $\{d_{nk}, n \geq 0, k \geq 1\}$ as above, so that

$$Z_{n+1} = \sum_{k=1}^{Z_n} d_{nk}.$$

We show that

$$W_n = \frac{Z_n}{E(Z_n)}$$

is a martingale. We use the fact that if the $X_i$'s are independent identically distributed random variables, and $N$ is a nonnegative integer-valued random variable independent of the $X_i$'s, then

$$E\left(\sum_{i=1}^{N} X_i\right) = E(N)\,E(X_1).$$

Since the $d_{nk}$'s are independent random variables, for fixed $n$ they are independent identically distributed, and they are independent of $Z_n$, we have

$$E(Z_{n+1}) = E(d_{11})\,E(Z_n) = \mu\,E(Z_n), \qquad \text{where } \mu = E(d_{11}).$$

And

$$E(W_{n+1} \mid Z_0, \ldots, Z_n) = \frac{E(Z_{n+1} \mid Z_n)}{E(Z_{n+1})} = \frac{E\left(\sum_{k=1}^{Z_n} d_{nk} \,\Big|\, Z_n\right)}{\mu\,E(Z_n)} = \frac{Z_n\,\mu}{\mu\,E(Z_n)} = \frac{Z_n}{E(Z_n)} = W_n.$$

Hence $W_n$ is a martingale.
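The martingale property forces $E(W_n) = E(W_0) = 1$ for every generation, which a small simulation can confirm. The sketch below (an added illustration; the particular offspring distribution, the seed and the sample sizes are assumptions) simulates a branching process and averages $W_n = Z_n / \mu^{n}$.

```python
import numpy as np

rng = np.random.default_rng(11)

# Offspring distribution: P(0) = 0.25, P(1) = 0.4, P(2) = 0.35, so mu = 1.1.
offspring_p = np.array([0.25, 0.4, 0.35])
mu = np.dot(np.arange(3), offspring_p)

def simulate_W(generations=10):
    """One path of W_n = Z_n / E(Z_n) = Z_n / mu^n, starting from Z_0 = 1."""
    z, ws = 1, []
    for n in range(1, generations + 1):
        z = int(rng.choice(3, size=z, p=offspring_p).sum()) if z > 0 else 0
        ws.append(z / mu**n)
    return ws

paths = np.array([simulate_W() for _ in range(20_000)])
print("E(W_n) for n = 1..10 ~", np.round(paths.mean(axis=0), 3))
```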

(Not every martingale has an almost sure limit.) In a symmetric simple random walk on $\mathbb{Z}$ we have

$$P(X_1 = 1) = P(X_1 = -1) = \tfrac{1}{2},$$

and then $S_n = \sum_{k=1}^{n} X_k$ is a martingale, since

$$E(S_{n+1} \mid S_1, \ldots, S_n) = E(S_n + X_{n+1} \mid S_1, \ldots, S_n) = S_n + E(X_{n+1}) = S_n.$$

However, $\lim_{n} S_n$ does not exist, because this symmetric simple random walk (SSRW) is recurrent, which means that it keeps oscillating between all the integers.

If $\{X_n, n \geq 0\}$ is a sub-martingale such that $\sup_n E|X_n| < \infty$, then for every $\varepsilon > 0$

$$P\left(\max_{0 \leq i \leq n} X_i \geq \varepsilon\right) \leq \frac{E|X_n|}{\varepsilon}.$$

We define

$$A_k = \{X_k \geq \varepsilon,\ X_i < \varepsilon \text{ for } i < k\},$$

which means that $A_k$ is the event that the process is greater than or equal to $\varepsilon$ for the first time at time $k$, and

$$A = \bigcup_{k=0}^{n} A_k,$$

where $A$ is the event that the process is greater than or equal to $\varepsilon$ by time $n$. We want to show that

$$\varepsilon\,P(A) \leq E|X_n|.$$

Since $X_k \geq \varepsilon$ on $A_k$, and since the $A_k$'s are disjoint,

$$\varepsilon\,P(A) = \varepsilon \sum_{k=0}^{n} P(A_k) \leq \sum_{k=0}^{n} E\left(X_k I_{A_k}\right).$$

53

Then

nAn

An

An

n

kAn

n

kAn

n

kknn

n

kAnn

n

kAk

XEIXE

IXE

IXE

IXE

IXE

IXEE

IXEE

IXEAp

n

kk

k

k

k

k

0

0

0

0

0

0

In the following example we can see if the sequence is not a sub-martingale, then the last theorem fail.

Consider independent identically distributed random variables $X_i$ such that

$$P(X_i = 2) = P(X_i = 0) = \tfrac{1}{2}.$$

The sequence $X_n$ is not a sub-martingale, since $E(X_{n+1} \mid X_1, \ldots, X_n) = E(X_{n+1}) = 1$, which is smaller than $X_n$ on the event $\{X_n = 2\}$. Now

$$P\left(\max_{0 \leq i \leq n} X_i \geq 2\right) = 1 - P\left(\max_{0 \leq i \leq n} X_i = 0\right) = 1 - \left(\tfrac{1}{2}\right)^{n+1}. \qquad (1)$$

Doob's inequality would require, for $\varepsilon = 2$, that (1) be at most $E|X_n|/2 = \tfrac{1}{2}$, that is

$$1 - \left(\tfrac{1}{2}\right)^{n+1} \leq \tfrac{1}{2},$$

which is not true for large $n$. This failure occurs because the sequence is not a sub-martingale.
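The violation is easy to see numerically. The following sketch (an added illustration; the horizon $n = 20$, the seed and the number of paths are assumptions) estimates the left side of (1) and compares it with the would-be Doob bound.

```python
import numpy as np

rng = np.random.default_rng(12)

n = 20                                               # look at X_0, ..., X_n
n_paths = 100_000
X = 2 * rng.integers(0, 2, size=(n_paths, n + 1))    # iid, values 0 or 2, E(X_i) = 1

p_max = np.mean(X.max(axis=1) >= 2)
print("P(max_{0<=i<=n} X_i >= 2)  ~", p_max)          # ~ 1 - (1/2)^(n+1), close to 1
print("exact 1 - (1/2)^(n+1)       =", 1 - 0.5**(n + 1))
print("Doob-type bound E|X_n|/2    =", 0.5, " (violated: the X_i are not a sub-martingale)")
```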

In this chapter we describe convergence of sequences of random variables: almost sure convergence, because of its relationship to pointwise convergence; convergence in distribution, because it is the easiest to establish; convergence in probability, which is significant for weak laws of large numbers and, in statistics, for certain forms of consistency; and mean convergence, which is used to establish convergence of moments.

Let $X, X_1, X_2, \ldots$ be random variables on $(\Omega, \mathcal{F}, P)$. We have these modes of convergence:

1. Almost sure convergence
2. Convergence in probability
3. Convergence in distribution
4. Convergence in mean

Almost sure convergence, also known as convergence with probability one, is the probabilistic version of pointwise convergence. The sequence $X_n$ converges to $X$ almost surely, denoted by $X_n \xrightarrow{a.s.} X$, if

$$P\{\omega : X_n(\omega) \to X(\omega)\} = 1.$$

The sequence $X_n$ converges to $X$ in probability, denoted by $X_n \xrightarrow{p} X$, if

$$\lim_{n \to \infty} P\left(|X_n - X| > \varepsilon\right) = 0 \qquad \text{for every } \varepsilon > 0.$$

The sequence $X_n$ converges to $X$ in mean, denoted by $X_n \xrightarrow{m} X$, if

$$\lim_{n \to \infty} E|X_n - X| = 0.$$

The sequence $X_n$ converges to $X$ in distribution, denoted by $X_n \xrightarrow{D} X$, if

$$\lim_{n \to \infty} F_{X_n}(t) = F_X(t) \qquad \text{at continuity points } t \text{ of } F_X.$$

This convergence is sometimes called weak convergence. We summarize these modes in the following table.

Table 5.1. Definitions of convergence for random variables

  Mode              Defining condition
  Almost sure       $P\{\omega : X_n(\omega) \to X(\omega)\} = 1$
  In probability    $\lim_n P(|X_n - X| > \varepsilon) = 0$ for all $\varepsilon > 0$
  In mean           $\lim_n E|X_n - X| = 0$
  In distribution   $\lim_n F_{X_n}(t) = F_X(t)$ at continuity points $t$ of $F_X$

Figure 5.2. Implications among forms of convergence: almost sure convergence implies convergence in probability; convergence in mean implies convergence in probability; convergence in probability implies convergence in distribution.

The last figure depicts the implications that are always valid. None of the other implications holds in general.

If $X_n \xrightarrow{a.s.} X$, then $X_n \xrightarrow{p} X$. Suppose that $\varepsilon > 0$, and for each $n$ let $A_n = \{|X_n - X| > \varepsilon\}$. Then convergence in probability follows:

$$\lim_{m \to \infty} P(A_m) \leq \lim_{m \to \infty} P\left(\bigcup_{k \geq m} A_k\right) = P\left(\limsup_{n} A_n\right) = P(A_n \ i.o.) = 0.$$

If $X_n \xrightarrow{m} X$, then $X_n \xrightarrow{p} X$. By Chebyshev's inequality, for each $\varepsilon > 0$,

$$P\left(|X_n - X| > \varepsilon\right) \leq \frac{E|X_n - X|}{\varepsilon}.$$

Therefore $E|X_n - X| \to 0$ implies that $P(|X_n - X| > \varepsilon) \to 0$.

If $X_n \xrightarrow{p} X$, then $X_n \xrightarrow{D} X$. The reader can see the proof in the book of Alan F. Karr, p. 141.

Convergence in probability does not always imply almost sure convergence, as this counterexample shows.

Consider a sequence $\{X_n, n \geq 1\}$ of independent random variables such that

$$P(X_n = 0) = 1 - \frac{1}{n}, \qquad P(X_n = 1) = \frac{1}{n}.$$

We can see that for $0 < \varepsilon < 1$,

$$P\left(|X_n| > \varepsilon\right) = \frac{1}{n} \longrightarrow 0,$$

hence $X_n \xrightarrow{p} 0$. Now we claim that $X_n$ does not converge to zero almost surely. Indeed,

$$\sum_{n} P(X_n = 1) = \sum_{n} \frac{1}{n} = \infty,$$

and since the $X_n$ are independent, the Borel-Cantelli lemma gives

$$P(X_n = 1 \ i.o.) = 1.$$

This means that $X_n$ does not converge to zero almost surely.
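A single simulated path already illustrates the Borel-Cantelli effect: the individual probabilities $1/n$ shrink to zero, yet ones keep appearing along the path. The following sketch (an added illustration; the horizon and the seed are assumptions) counts them.

```python
import numpy as np

rng = np.random.default_rng(13)

N = 100_000
n = np.arange(1, N + 1)
# X_n = 1 with probability 1/n, independently across n.
x = (rng.random(N) < 1.0 / n).astype(int)

print("P(X_n = 1) at n = 10^5                          =", 1.0 / N)
print("number of n <= 10^5 with X_n = 1 on this path    =", x.sum())
print("last index with X_n = 1 on this path             =", n[x == 1].max())
```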

If $\{X_n, n \geq 1\}$ is a sequence of independent random variables and

$$S_n = \sum_{k=1}^{n} X_k \longrightarrow S \quad \text{in probability},$$

then $S_n \xrightarrow{a.s.} S$.

This theorem might suggest that convergence in probability is equivalent to almost sure convergence in the case of martingales. But this is false in general, as we show in the following example.

Consider a sequence $\{Z_n, n \geq 1\}$ of independent random variables such that

$$Z_n = \begin{cases} 1 & \text{with probability } \tfrac{1}{2n}, \\ 0 & \text{with probability } 1 - \tfrac{1}{n}, \\ -1 & \text{with probability } \tfrac{1}{2n}. \end{cases}$$

Define a new sequence $X_n$ as follows: set $X_1 = Z_1$, and for $n \geq 2$

$$X_n = \begin{cases} Z_n & \text{if } X_{n-1} = 0, \\ n\,X_{n-1}\,|Z_n| & \text{if } X_{n-1} \neq 0. \end{cases}$$

We claim that $X_n$ is a martingale, which means that $E(X_n \mid X_1, \ldots, X_{n-1}) = X_{n-1}$. Now

$$E(X_n \mid X_1, \ldots, X_{n-1}) = E\left(X_n I_{\{X_{n-1} = 0\}} \mid X_1, \ldots, X_{n-1}\right) + E\left(X_n I_{\{X_{n-1} \neq 0\}} \mid X_1, \ldots, X_{n-1}\right)$$
$$= E\left(Z_n I_{\{X_{n-1} = 0\}} \mid X_1, \ldots, X_{n-1}\right) + E\left(n X_{n-1} |Z_n|\,I_{\{X_{n-1} \neq 0\}} \mid X_1, \ldots, X_{n-1}\right)$$
$$= I_{\{X_{n-1} = 0\}}\,E(Z_n) + n X_{n-1} I_{\{X_{n-1} \neq 0\}}\,E|Z_n|,$$

since $Z_n$ is independent of $X_1, \ldots, X_{n-1}$. Then, because $E(Z_n) = 0$ and $E|Z_n| = \tfrac{1}{n}$,

$$E(X_n \mid X_1, \ldots, X_{n-1}) = 0 + n X_{n-1} I_{\{X_{n-1} \neq 0\}}\,\frac{1}{n} = X_{n-1} I_{\{X_{n-1} \neq 0\}} = X_{n-1}.$$

Notice that $\{X_n = 0\} = \{Z_n = 0\}$ from the definition, so

$$P(X_n = 0) = P(Z_n = 0) = 1 - \frac{1}{n} \longrightarrow 1.$$

For $0 < \varepsilon < 1$,

$$P\left(|X_n| > \varepsilon\right) \leq P(X_n \neq 0) = \frac{1}{n} \longrightarrow 0,$$

so $X_n \xrightarrow{p} 0$. But

$$\sum_{n} P(X_n \neq 0) = \sum_{n} \frac{1}{n} = \infty,$$

and the events $\{X_n \neq 0\} = \{Z_n \neq 0\}$ are independent, so by the Borel-Cantelli lemma

$$P(X_n \neq 0 \ i.o.) = 1.$$

Since $|X_n| \geq 1$ whenever $X_n \neq 0$, this is equivalent to saying that $X_n$ does not converge to zero almost surely. That is, convergence in probability does not imply almost sure convergence even for martingales.
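A simulation of this martingale (an added sketch, assuming the construction exactly as written above; the horizon, seed and number of paths are illustrative) shows both effects at once: at a fixed large time the path is almost always at 0, yet every path keeps picking up nonzero values.

```python
import numpy as np

rng = np.random.default_rng(14)

def simulate_path(N=5_000):
    """One path of the martingale X_n built from the Z_n defined above."""
    xs, x_prev = [], 0
    for n in range(1, N + 1):
        u = rng.random()
        z = 1 if u < 1 / (2 * n) else (-1 if u < 1 / n else 0)
        x = z if x_prev == 0 else n * x_prev * abs(z)
        xs.append(x)
        x_prev = x
    return xs

paths = [simulate_path() for _ in range(200)]
frac_zero_at_end = np.mean([p[-1] == 0 for p in paths])
nonzero_counts = [sum(1 for v in p if v != 0) for p in paths]

print("P(X_N = 0) at N = 5000                        ~", frac_zero_at_end)
print("average number of n <= N with X_n != 0 per path ~", np.mean(nonzero_counts))
```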

This example shows that convergence in distribution does not imply convergence in probability. Let $\Omega = \{0, 1\}$, where each of 0 and 1 has mass $\tfrac{1}{2}$. Define

$$X(0) = 0,\ X(1) = 1 \qquad \text{and} \qquad X_n(0) = 1,\ X_n(1) = 0 \quad \text{for all } n.$$

Hence the cumulative distribution function of $X_n$ is

$$F_n(x) = \begin{cases} 0 & x < 0, \\ \tfrac{1}{2} & 0 \leq x < 1, \\ 1 & x \geq 1, \end{cases}$$

which is also the cumulative distribution function of $X$. Then $X_n \xrightarrow{D} X$. On the other hand,

$$P\left(|X_n - X| \geq 1\right) = 1$$

does not converge to 0. Hence $X_n$ does not converge to $X$ in probability.

This example shows that convergence in mean does not imply almost sure convergence. Define independent random variables

$$X_n = \begin{cases} 1 & \text{with probability } \tfrac{1}{n}, \\ 0 & \text{with probability } 1 - \tfrac{1}{n}. \end{cases}$$

We can see that

$$E|X_n| = \frac{1}{n} \longrightarrow 0,$$

hence $X_n \xrightarrow{m} 0$. And in example (13) we showed that $X_n$ does not converge to 0 almost surely.

This example shows that the converse implication of the last example is not true in general. Define, on the probability space $(\Omega, \mathcal{F}, P)$ with $\Omega = (0, 1)$, $\mathcal{F}$ the Borel $\sigma$-field on $(0,1)$ and $P$ Lebesgue measure, the sequence of random variables

$$X_n = n\,I_{(0, 1/n)}, \qquad \text{that is,} \qquad X_n(\omega) = \begin{cases} n & \text{if } \omega \in (0, \tfrac{1}{n}), \\ 0 & \text{if } \omega \in [\tfrac{1}{n}, 1). \end{cases}$$

It is clear that $X_n \xrightarrow{a.s.} 0$, since for every $\omega \in (0,1)$ we have $X_n(\omega) = 0$ as soon as $n > 1/\omega$. On the other hand,

$$E|X_n| = n\,P(X_n = n) = n \cdot \frac{1}{n} = 1,$$

hence $X_n$ does not converge to 0 in mean.

$X_n \to X$ in the $p$th mean if

$$E|X_n - X|^{p} \longrightarrow 0 \quad \text{as } n \to \infty, \qquad p > 0.$$

This example shows that convergence in probability does not imply convergence in the $p$th mean. Define, for $n \geq 2$,

$$X_n = \begin{cases} e^{n} & \text{with probability } \tfrac{1}{\log n}, \\ 0 & \text{with probability } 1 - \tfrac{1}{\log n}. \end{cases}$$

For every $\varepsilon > 0$,

$$P\left(|X_n| > \varepsilon\right) \leq P(X_n \neq 0) = \frac{1}{\log n} \longrightarrow 0.$$

This means that $X_n \xrightarrow{p} 0$. And

$$E|X_n|^{p} = \frac{e^{np}}{\log n} \longrightarrow \infty,$$

hence for every $p > 0$, $X_n$ does not converge to 0 in the $p$th mean.

References:

1- Adventures in Stochastic Processes, by Sidney I. Resnick.

2- Introduction to Probability Theory, by Paul G. Hoel, Sidney C. Port and Charles J. Stone.

3- Markov Chains, by J. R. Norris.

4- Probability, by Alan F. Karr.

5- Random Walks and Electric Networks, by Peter G. Doyle and J. Laurie Snell.

6- The Integrals of Lebesgue, Denjoy, Perron, and Henstock, by Russell A. Gordon.