
Markov Processes

Joe Neeman

Notes Max Goldowsky

February 2, 2016

WS2015/16

N.B.: These notes were typed during the lectures. I didn't put much effort into correcting them afterwards, so expect mistakes. I tried to improve some of the formulations to ease reading. If a lecture featured any images, likely not all of them are included.

References

[Ligg] Thomas M. Liggett, Continuous Time Markov Processes, American Mathematical Society, 2010.

Contents

1 Markov Processes
  1.1 Examples
2 Markov Property (pathwise version)
  2.1 Extra Structure
3 The strong Markov Property
  3.1 Stopping Times
  3.2 The strong Markov Property
  3.3 Non-examples
    3.3.1 "Waiting, then constant speed"
    3.3.2 Brownian Motion with a twist
    3.3.3 Càglàd Path
4 Continuous time Markov Chains
  4.1 Basic Setup
  4.2 Examples
  4.3 From Markov Chains to infinitesimal description
  4.4 Convergence
  4.5 Kolmogorov Forward equation
  4.6 Feller Semigroup
  4.7 Generators
  4.8 Continuity
  4.9 More on Generators
  4.10 Martingales
  4.11 Stationary Distributions
  4.12 Feller processes & PDE
  4.13 Duality (Liggett Duality)
5 Spin systems
  5.1 Ergodicity
  5.2 Coupling & Attractiveness
  5.3 Monotonicity
  5.4 Associated Measure
  5.5 Stationary distributions of voter model
  5.6 Contact Process
Index

1 Markov Processes

Setup:

∗ $(S, \mathcal S)$ a measurable space, the "state space"

∗ $\Omega = S^{[0,\infty)} = \{x : [0,\infty) \to S\}$; a path is $X \in \Omega$ with $X(t) \in S$

∗ $(\mathcal F_t)$ a filtration of $\mathcal F$, e.g. $\mathcal F_t = \sigma(x \mapsto x(s),\ s \le t)$, so that for $s \le t$ the map $x \mapsto x(s)$ is $\mathcal F_t$-measurable

∗ $\{P^x,\ x \in S\}$ a collection of probability measures on $(\Omega, \mathcal F)$ with
  ∗ $P^x(X(0) = x) = 1$,
  ∗ for every $A \in \mathcal F$ the map $x \mapsto P^x(A)$ is (Borel-)measurable.

Definition 1.1 (Markov Property). $\{P^x\}$ satisfies the Markov Property if for all $s, t \ge 0$ and all $A \in \mathcal S$:
$$P^x(X(s+t) \in A \mid \mathcal F_s) = P^{X(s)}(X(t) \in A),$$
where the right-hand side means $\varphi(X(s))$ with $\varphi(y) = P^y(X(t) \in A)$.

Proposition 1.2. The following property is equivalent to Definition 1.1: for all $s, t \ge 0$, all $x$, and all bounded measurable $f : S \to \mathbb R$:
$$E^x[f(X(s+t)) \mid \mathcal F_s] = E^{X(s)}[f(X(t))].$$


Proof. Take $f \equiv 1_A$ for one direction; for the other, a general bounded measurable $f$ can be approximated by simple functions. □

1.1 Examples

(i) Independent Variables: let $\mu$ be a probability measure on $S$ and let $P^x$ be the probability measure under which $P^x(X(0) = x) = 1$ and $(X(t),\ t > 0)$ are i.i.d. $\sim \mu$. For $A \in \mathcal S$: if $t = 0$, the left-hand side of Definition 1.1 is $1_{X(s) \in A}$, which is the right-hand side; if $t > 0$, the left-hand side is $\mu(A)$, which in turn is the right-hand side.

(ii) Constant Paths: under $P^x$, $P^x(X(t) = x) = 1$ for all $t$.

(iii) Brownian Motion: $S = \mathbb R$, $\mathcal S = \mathcal B$, $P^x$ Brownian motion started at $x$. Under $P^x$:

a) for all $t_1 < \cdots < t_k$ the increments $\{X(t_i) - X(t_{i-1})\}$ are independent;

b) $s < t \Rightarrow X(t) - X(s) \sim \mathcal N(0,\, t-s)$;

c) $P^x(X(0) = x) = 1$;

d) $P^x(t \mapsto X(t) \text{ is continuous}) = 1$.

$\{P^x\}$ satisfies the Markov Property with respect to $\mathcal F_t = \sigma(X(s),\ s \le t)$. To check this, we want to show
$$\forall A \in \mathcal B,\ \forall B \in \mathcal F_s: \quad E^x\big[1_B\, P^x(X(s+t) \in A \mid \mathcal F_s)\big] = E^x\big[1_B\, \varphi(X(s))\big] \qquad (*)$$

a) Check $(*)$ for $B$ of the form $\{X(t_1) \in B_1, \ldots, X(t_k) \in B_k\}$.

b) Apply the $\pi$-$\lambda$-Theorem.

(iv) Compound Poisson Process: $S = \mathbb R$, $\mathcal S = \mathcal B$, $\mu$ a probability measure on $\mathbb R$ (the jump distribution):
$$P^x(X(t) \in A) = e^{-t} \sum_{k \ge 0} \frac{t^k}{k!}\, \mu^{\otimes k}(A - x).$$
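The compound Poisson formula says $X(t)$ is $x$ plus a Poisson$(t)$-distributed number of i.i.d. $\mu$-jumps, which makes the process easy to sample. A minimal sketch (the rate-1 assumption matches the formula above; the jump law $\mu = \mathcal N(1,1)$ and the function names are illustrative choices, not from the notes):

```python
import numpy as np

def compound_poisson_sample(x0, t, jump_sampler, n_paths, rng):
    """Sample X(t) for a rate-1 compound Poisson process started at x0:
    draw N ~ Poisson(t), then add N i.i.d. jumps drawn from mu."""
    counts = rng.poisson(t, size=n_paths)          # number of jumps by time t
    return np.array([x0 + jump_sampler(rng, k).sum() for k in counts])

rng = np.random.default_rng(0)
samples = compound_poisson_sample(
    x0=0.0, t=2.0,
    jump_sampler=lambda rng, k: rng.normal(1.0, 1.0, size=k),  # mu = N(1, 1), hypothetical
    n_paths=100_000, rng=rng)
# E[X(t)] = x0 + t * E[jump], so the sample mean should be near 2.0 here
print(samples.mean())
```

The sample mean matches $x_0 + t\,E[\text{jump}]$ up to Monte Carlo error, consistent with Wald's identity for the compound sum.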

2 Markov Property (pathwise version)

For all $s \ge 0$ define the shift operator $\theta_s : \Omega \to \Omega$ by $(\theta_s x)(t) := x(t + s)$.

Proposition 2.1. $\{P^x\}$ satisfies the Markov Property iff for all $s > 0$, all $x \in S$ and all measurable, bounded $Y : \Omega \to \mathbb R$:
$$E^x(Y \circ \theta_s \mid \mathcal F_s) = E^{X(s)}(Y). \qquad (1)$$


Proof. (if) Take $Y(x) = f(x(t))$; then $(Y \circ \theta_s)(x) = f(x(s+t))$, and (1) reduces to Proposition 1.2.

(only if) We use the Monotone Class Theorem (MCT): if $\mathcal P$ is a $\pi$-system containing $\Omega$ and $\mathcal H$ is a vector space of random variables satisfying

(i) $1_A \in \mathcal H$ for all $A \in \mathcal P$,

(ii) if $Y_1, Y_2, \ldots \in \mathcal H$ and $0 \le Y_n \nearrow Y$ with $Y$ bounded, then $Y \in \mathcal H$,

then $\mathcal H$ contains all bounded $\sigma(\mathcal P)$-measurable random variables. Employ the MCT with $\mathcal P$ the sets of the form $\{x(t_1) \in B_1, \ldots, x(t_k) \in B_k\}$ (so $\sigma(\mathcal P) = \mathcal F$) and
$$\mathcal H = \{\text{bounded random variables satisfying (1)}\}.$$
$\mathcal H$ is a vector space closed under bounded monotone limits, so it remains to show that $1_A \in \mathcal H$ for all $A \in \mathcal P$. This follows from:

Claim: For all bounded, measurable $f_1, \ldots, f_k : S \to \mathbb R$ and any strictly increasing sequence $t_1 < \cdots < t_k$, the random variable $Y(x) = \prod_{i=1}^{k} f_i(x(t_i))$ satisfies (1).

Proof of the claim, by induction on $k$: the case $k = 1$ is Proposition 1.2. Assume (1) holds for every $Y = \prod_{i=1}^{k-1} f_i(x(t_i))$. Let
$$Z(x) = \prod_{i=1}^{k} f_i(x(t_i)) = Y(x)\, f_k(x(t_k)), \qquad g(y) := E^y\big(f_k(X(t_k - t_{k-1}))\big). \quad (\star)$$
Then
$$\begin{aligned}
E^x(Z \circ \theta_s \mid \mathcal F_s) &= E^x\big((Y \circ \theta_s)\, f_k(x(t_k + s)) \mid \mathcal F_s\big) \\
&= E^x\Big( E^x\big((Y \circ \theta_s)\, f_k(x(t_k + s)) \mid \mathcal F_{s + t_{k-1}}\big) \,\Big|\, \mathcal F_s\Big) \\
&= E^x\Big( (Y \circ \theta_s)\, E^x\big(f_k(x(t_{k-1} + s + t_k - t_{k-1})) \mid \mathcal F_{s + t_{k-1}}\big) \,\Big|\, \mathcal F_s\Big) \\
&\overset{(\star)}{=} E^x\big( (Y \circ \theta_s)\, g(X(t_{k-1} + s)) \mid \mathcal F_s\big) \\
&= E^x(Y' \circ \theta_s \mid \mathcal F_s) = E^{X(s)}(Y'),
\end{aligned}$$
where $Y'(x) = \prod_{i=1}^{k-1} f_i(x(t_i))\, g(x(t_{k-1}))$ and the last step is the inductive hypothesis. Finally, by the Markov Property,
$$\forall y: \quad E^y(Z) = E^y\Big( \prod_{i=1}^{k-1} f_i(X(t_i))\, E^y\big(f_k(X(t_k)) \mid \mathcal F_{t_{k-1}}\big) \Big) = E^y(Y'),$$
so $E^{X(s)}(Y') = E^{X(s)}(Z)$, proving the claim. □


2.1 Extra Structure

∗ right-continuous filtration

∗ càdlàg paths

Definition 2.2 (Right-Continuous Filtration). The filtration $(\mathcal F_t)$ is right-continuous if for all $t$:
$$\mathcal F_t = \bigcap_{s > t} \mathcal F_s.$$

Every filtration $(\mathcal F^0_t)$ can be turned into a right-continuous one by setting
$$\mathcal F_t := \bigcap_{s > t} \mathcal F^0_s.$$

The usual choice of filtration is $\mathcal F^0_t = \sigma(X(s),\ s \le t)$. Assume henceforth that $S$ is a Polish space with its Borel $\sigma$-algebra $\mathcal S$.

Definition 2.3 (Càdlàg Functions). $f : \mathbb R^+_0 \to S$ is càdlàg if it is right-continuous and has left limits on its whole domain.

Furthermore, assume from now on that $\Omega$ is the space of càdlàg functions.

Proposition 2.4. If $\{P^x\}$ is a Markov process with respect to $(\mathcal F^0_s)$, has càdlàg paths, and for every bounded, continuous $f$ and every $t > 0$ the map $(y, h) \mapsto E^y(f(X(t - h)))$ is continuous, then $\{P^x\}$ is a Markov process with respect to $(\mathcal F_s)$.

Examples

Brownian Motion:
$$E^y(f(X(t-h))) = \frac{1}{\sqrt{2\pi(t-h)}} \int f(z)\, e^{-\frac{(z-y)^2}{2(t-h)}}\, dz$$

Compound Poisson Process:
$$E^y(f(X(t-h))) = e^{-(t-h)} \sum_{k=0}^{\infty} \frac{(t-h)^k}{k!} \int f(y + z)\, d\mu^{\otimes k}(z)$$

N.B. In the second example of the last lecture (Compound Poisson Process), $S$ needs to be locally compact.

Proposition 2.5. Suppose that $\{P^x\}_x$ satisfies the Markov Property w.r.t. the natural filtration $(\mathcal F^0_t)_t$. If for every $t > 0$ and every $f \in C_b(S)$ the function
$$(y, h) \mapsto E^y[f(X(t-h))]$$
is continuous for $h \in (0, t)$, then $\{P^x\}$ satisfies the Markov Property w.r.t. $(\mathcal F_t)_{t \ge 0}$. If $S$ is locally compact, it suffices to check the above condition for continuous, compactly supported functions.


Proof. Fix $t > 0$ and $f \in C_b(S)$ and let
$$\varphi(y, h) := E^y[f(X(t-h))].$$
We want to show that for every $A \in \mathcal F_s$, $s \ge 0$:
$$E^x[1_A\, f(X(t+s))] = E^x[1_A\, \varphi(X(s), 0)]. \qquad (2)$$
Since $\mathcal F_s \subset \mathcal F^0_{s+h}$ and the Markov property holds for the filtration $(\mathcal F^0_t)_t$, we have
$$E^x[1_A\, f(X(t+s))] = E^x[1_A\, \varphi(X(s+h), h)] \qquad (3)$$
for every $h \in (0, t)$. Since $X$ has càdlàg paths, $X(s+h) \xrightarrow{h \to 0} X(s)$; since $\varphi$ is continuous,
$$\varphi(X(s+h), h) \xrightarrow{h \to 0} \varphi(X(s), 0).$$
Finally, $\varphi$ is bounded since $f \in C_b$, hence (2) follows from (3) by dominated convergence.

If $S$ is locally compact and $(y, h) \mapsto E^y[f(X(t-h))]$ is continuous for every continuous and compactly supported $f$, then the argument above shows that (2) holds for every $f \in C_c(S)$. But on locally compact spaces, every $f \in C_b$ can be approximated by a sequence $(f_n)_n$ with $f_n \in C_c$. □

Theorem 2.6 (Blumenthal's 0-1-law). If $\{P^x\}_x$ satisfies the Markov property w.r.t. $(\mathcal F_t)_{t \ge 0}$, then for all $A \in \mathcal F_0$ and every $x \in S$: $P^x[A] \in \{0, 1\}$.

Proof. $A$ being $\mathcal F_0$-measurable implies $P^x[A \mid \mathcal F_0] = 1_A$, $P^x$-a.s. By the Markov Property,
$$1_A = P^x[A \mid \mathcal F_0] = P^{X(0)}[A], \quad P^x\text{-a.s.}$$
Since $X(0) = x$, $P^x$-a.s., we have $1_A = P^x[A]$, $P^x$-a.s., which in turn implies $P^x[A] \in \{0, 1\}$. □

3 The strong Markov Property

3.1 Stopping Times

Definition 3.1 (Stopping Time). A random variable $\tau : \Omega \to [0, \infty]$ is a stopping time w.r.t. $(\mathcal F_t)_{t \ge 0}$ if
$$\forall t \in [0, \infty): \{\tau \le t\} \in \mathcal F_t.$$

Since the filtration $(\mathcal F_t)_{t \ge 0}$ is assumed to be right-continuous, it doesn't matter whether we take $\{\tau \le t\} \in \mathcal F_t$ or $\{\tau < t\} \in \mathcal F_t$ in Definition 3.1:

Proposition 3.2. $\tau$ is a stopping time iff for all $t \in [0, \infty)$: $\{\tau < t\} \in \mathcal F_t$.


Proof.
$$\{\tau < t\} = \bigcup_{s \in \mathbb Q,\ s < t} \{\tau \le s\}$$
If $\{\tau \le s\} \in \mathcal F_s \subseteq \mathcal F_t$ for all $s < t$, then $\{\tau < t\} \in \mathcal F_t$. On the other hand, fix $\varepsilon > 0$ and write
$$\{\tau \le t\} = \bigcap_{s \in \mathbb Q,\ t < s < t + \varepsilon} \{\tau < s\}.$$
If $\{\tau < s\} \in \mathcal F_s$ for all $s$, then the right-hand side above belongs to $\mathcal F_{t+\varepsilon}$. Since $\varepsilon > 0$ is arbitrary and $(\mathcal F_t)_{t \ge 0}$ is right-continuous, the claim follows. □

There are several important properties of stopping times that will be stated without proof:

(i) If $G \subset S$ is open, then $\tau = \inf\{t \ge 0 \mid X(t) \in G\}$ is a stopping time of $(\mathcal F_t)$ (but in general not of $(\mathcal F^0_t)$).

(ii) If $\tau_1, \tau_2$ are stopping times, then $\tau_1 \wedge \tau_2$ and $\tau_1 \vee \tau_2$ are also stopping times.

(iii) If $(\tau_n)_{n \ge 1}$ is a sequence of stopping times, then
a) $\sup_n \tau_n$, b) $\inf_n \tau_n$, c) $\limsup_n \tau_n$, d) $\liminf_n \tau_n$, e) $\lim_n \tau_n$ (if it exists)
are stopping times.

N.B. Part (i) is more subtle for general Borel sets. We will come back to this later.

Definition 3.3. Let $\tau$ be a stopping time. Then
$$\mathcal F_\tau := \{A \in \mathcal F_\infty \mid \forall t \ge 0: A \cap \{\tau \le t\} \in \mathcal F_t\}$$
is a $\sigma$-algebra, the $\sigma$-algebra of events before $\tau$.

(i) $\mathcal F_\tau$ is a $\sigma$-algebra.

(ii) One could equivalently ask for $A \cap \{\tau < t\} \in \mathcal F_t$.

(iii) $\tau$ is $\mathcal F_\tau$-measurable.

(iv) $\tau_1 \le \tau_2 \Rightarrow \mathcal F_{\tau_1} \subset \mathcal F_{\tau_2}$.

(v) If $\tau_n \searrow \tau$, then $\mathcal F_\tau = \bigcap_{n \ge 1} \mathcal F_{\tau_n}$.


(vi) If $\{Z(t),\ t \ge 0\}$ is an $(\mathcal F_t)$-adapted, right-continuous process, then $Z(\tau)\, 1_{\tau < \infty}$ is $\mathcal F_\tau$-measurable.

The proof will only cover part (vi).

Proof. We first prove this for $\tau$ taking only countably many values, call them $t_n$. Then
$$\{Z(\tau)\, 1_{\tau < \infty} \le a\} \cap \{\tau \le t\} = \bigcup_{t_k \le t} \{\tau = t_k\} \cap \{Z(t_k) \le a\} \in \mathcal F_t.$$
Thus $\{Z(\tau)\, 1_{\tau < \infty} \le a\} \in \mathcal F_\tau$, hence $Z(\tau)\, 1_{\tau < \infty}$ is $\mathcal F_\tau$-measurable. Now let $\tau$ be arbitrary. We can approximate $\tau$ from above by stopping times $\tau_n$ attaining only countably many values. Indeed, we set
$$\tau_n := \begin{cases} \dfrac{k+1}{2^n} & \tau \in \Big[\dfrac{k}{2^n}, \dfrac{k+1}{2^n}\Big),\ k \in \mathbb N_0 \\[4pt] \infty & \tau = \infty \end{cases}$$
Then $\tau_n \searrow \tau$, and so $Z(\tau_n) \to Z(\tau)$ on $\{\tau < \infty\}$ by right-continuity. Since $\tau_n$ only takes countably many values, $Z(\tau_n)\, 1_{\tau_n < \infty}$ is $\mathcal F_{\tau_n}$-measurable for every $n$, hence also $\mathcal F_{\tau_m}$-measurable for every $m \le n$. Finally,
$$Z(\tau)\, 1_{\tau < \infty} = \lim_{n \to \infty} Z(\tau_n)\, 1_{\tau_n < \infty}$$
is $\mathcal F_\tau = \bigcap_{m \ge 1} \mathcal F_{\tau_m}$-measurable. □
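The dyadic approximation used in the proof is easy to check numerically: $\tau_n$ rounds $\tau$ up to the next grid point $(k+1)/2^n$, so $\tau_n \searrow \tau$ from strictly above. A small sketch (the value of $\tau$ is an arbitrary illustration):

```python
import math

def tau_n(tau, n):
    """Dyadic approximation from the proof:
    tau_n = (k+1)/2^n when tau lies in [k/2^n, (k+1)/2^n); infinity stays infinity."""
    if math.isinf(tau):
        return math.inf
    k = math.floor(tau * 2**n)
    return (k + 1) / 2**n

tau = 0.3  # arbitrary illustrative value
approx = [tau_n(tau, n) for n in range(1, 11)]
assert all(a > tau for a in approx)                     # approximation from strictly above
assert all(a >= b for a, b in zip(approx, approx[1:]))  # non-increasing in n
assert approx[-1] - tau < 2**-10                        # tau_n -> tau
```

Each $\tau_n$ takes values only in the countable grid $\{(k+1)/2^n\} \cup \{\infty\}$, which is exactly what the first half of the proof needs.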

3.2 The strong Markov Property

Convention: From now on, we will suppose that $S$ is locally compact. For a jointly measurable $Y : [0, \infty) \times \Omega \to \mathbb R$, we write $Y_s(\cdot)$ for $Y(s, \cdot)$.

Definition 3.4. $\{P^x\}_x$ satisfies the strong Markov Property if for every jointly measurable $Y$ as above, every stopping time $\tau$ and every $x \in S$:
$$E^x[Y_\tau \circ \theta_\tau \mid \mathcal F_\tau] = E^{X(\tau)}[Y_\tau], \quad P^x\text{-a.s. on } \{\tau < \infty\}. \qquad (4)$$

N.B. Note that the process $X$ appears in three places on the right-hand side, since both $Y$ and $\tau$ are functions of $X$.

If $Y$ is independent of $s$, e.g. $Y_s(X) = f(X(t))$ for some fixed $t$, then $Y_\tau \circ \theta_\tau = f(X(\tau + t))$ and the equation above looks like Proposition 1.2, except with a random time instead of a fixed one. The right-hand side of (4) means the function $(x, t) \mapsto E^x[Y_t]$ evaluated at $(X(\tau), \tau)$; hence it is $\mathcal F_\tau$-measurable.

Theorem 3.5. Suppose that $y \mapsto E^y[f(X(t))]$ is continuous for every $f \in C_c(S)$ and every $t > 0$. Then if $\{P^x\}_x$ is a Markov process, it has the strong Markov Property.

Proof. Exercise. □


3.3 Non-examples

Here are a few examples where the continuity condition on $y \mapsto E^y[f(X(t))]$ fails and the process satisfies the Markov Property but not the strong Markov Property.

3.3.1 "Waiting, then constant speed"

$S = \mathbb R^+$; $P^x$ is defined as follows: for $x > 0$, let $P^x$ be the measure under which $X(t) = x + t$ with probability one. For $x = 0$, let $P^0$ be the measure under which $X$ waits at zero for an exponential time $T$ and then grows with speed one; in other words, under $P^0$ we have $X(t) = 0 \vee (t - T)$. This is a Markov process under $(\mathcal F_t)$. Consider the stopping time
$$\tau := \inf\{t \ge 0 \mid X(t) > 0\}$$
and $Y(X) = X(1)$ under $P^0$. $P^0$-a.s.,
$$X \circ \theta_\tau(1) = 1$$
for the left-hand side of Definition 3.4; on the other hand, $X(\tau) = 0$ and $E^0[X(1)] < 1$ for the right-hand side. Contradiction.

3.3.2 Brownian Motion with a twist

Let $P^x$ be a Brownian motion started at $x$, except for $x = 0$: let $P^0$ assign probability one to the path $X \equiv 0$. This is a Markov process. Consider the stopping time
$$\tau := \inf\{t \ge 0 \mid X(t) = 0\}.$$
Then $X(\tau) = 0$, so the right-hand side of Definition 3.4 always deals with this trivial process, while for $x \ne 0$ the left-hand side of Definition 3.4 is the same as for standard Brownian motion. Contradiction.

3.3.3 Càglàd Path

Finally, we give an example which does satisfy the continuity condition in Theorem 3.5, but which still fails the strong Markov Property, because it does not have right-continuous paths.

Let $X$ be the left-continuous version of a Poisson process, starting under $P^x$ at $x$. Let $\tau := \inf\{t \ge 0 \mid X(t) > 0\}$, which is an $(\mathcal F_t)$-stopping time (but not an $(\mathcal F^0_t)$-stopping time). However, the strong Markov Property fails: since $X(\tau) = 0$, it would require the left-hand side below to equal $E^0[Y]$, while in fact
$$E^0[Y \circ \theta_\tau] = E^1[Y] \ne E^0[Y]$$
for suitable $Y$.

4 Continuous time Markov Chains

The aim of this chapter is to construct continuous time Markov processes from infinitesimal descriptions.


4.1 Basic Setup

∗ $S$ is a countable set

∗ $\Omega := \{\omega : \mathbb R^+_0 \to S \mid \omega \text{ right-continuous with finitely many jumps in any bounded interval}\}$

∗ $X(t, \omega) = \omega(t)$, the evaluation map

∗ $(\theta_s \omega)(t) = \omega(s + t)$, the shift

∗ $\mathcal F$: the smallest $\sigma$-algebra making all maps $\omega \mapsto \omega(t)$ measurable

Definition 4.1 (Markov Chain). A Markov Chain on $S$ consists of

(i) a collection of probability measures $\{P^x \mid x \in S\}$ on $\Omega$,

(ii) a right-continuous filtration $(\mathcal F_t)_{t \ge 0}$ on $(\Omega, \mathcal F)$ to which $X(t)$ is adapted, satisfying $P^x[X(0) = x] = 1$,

(iii) for all $x \in S$ and all bounded measurable $Y$ on $\Omega$:
$$E^x[Y \circ \theta_s \mid \mathcal F_s] = E^{X(s)}[Y], \quad P^x\text{-a.s.} \qquad (5)$$

Definition 4.2 (Transition Function). A transition function is a function $p_t(x, y)$ defined for $t \in \mathbb R^+_0$ and $x, y \in S$ satisfying

(i)
$$p_t(x, y) \ge 0, \qquad \sum_y p_t(x, y) = 1, \qquad \lim_{t \searrow 0} p_t(x, x) = p_0(x, x) = 1 \qquad (6)$$

(ii)
$$p_{s+t}(x, y) = \sum_z p_s(x, z)\, p_t(z, y) \qquad (7)$$

Equation (7) is called the Chapman-Kolmogorov equation (CK).

N.B. A transition function encodes the finite-dimensional distributions of the process/chain with initial state $x$ via
$$P^x(X(t_1) = x_1, \ldots, X(t_n) = x_n) = p_{t_1}(x, x_1)\, p_{t_2 - t_1}(x_1, x_2) \cdots p_{t_n - t_{n-1}}(x_{n-1}, x_n)$$
for $x_i \in S$ and $0 \le t_1 < \cdots < t_n$. Equation (7) is necessary to make this consistent.

The infinitesimal description of the Markov chain is given formally by
$$q(x, y) = \frac{d}{dt} p_t(x, y)\Big|_{t=0}.$$


Definition 4.3 (Q-Matrix). A Q-matrix is a collection $(q(x, y))_{x,y \in S}$ of real numbers indexed by $x, y \in S$ that satisfy $q(x, y) \ge 0$ for all $x \ne y$ and $\sum_y q(x, y) = 0$. We write $c(x) = -q(x, x) \ge 0$.

In general there is no one-to-one correspondence among Markov chains, transition functions and Q-matrices. However, this is nearly the case, as we will see later. Before we go into this, we consider some examples to get a better idea of what is going on.

4.2 Examples

(i) Any $\mathbb Z^d$-valued stochastic process with independent increments is a Markov chain.

(ii) If $P = (p(x, y))_{x,y \in S}$ is a discrete-time Markov chain, it can be turned into a continuous-time Markov chain by letting it take steps at the event times of i.i.d. exponential rate-1 random variables. (Homework)

(iii) For $S = \{0, 1\}$, any Q-matrix has the form
$$\begin{pmatrix} -\lambda & \lambda \\ \delta & -\delta \end{pmatrix}; \qquad \lambda \ge 0,\ \delta \ge 0 \qquad (8)$$

(iv) Birth and Death Chain on $S = \mathbb N_0$ with Q-matrix $q(k, k+1) = \varrho_k$, $q(k, k-1) = \delta_k$, $q(k, k) = -\varrho_k - \delta_k$, and $q(k, l) = 0$ for any other $k, l$, where $\varrho_k \ge 0$, $\delta_k \ge 0$ and $\delta_0 = 0$.
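For the two-state Q-matrix in (iii), the backward equation can be solved in closed form: $p_t = \Pi + e^{-(\lambda+\delta)t}(I - \Pi)$, where $\Pi$ is the rank-one matrix with both rows $(\delta, \lambda)/(\lambda+\delta)$. This closed form is a standard fact, not stated in the notes; below is a quick numerical sanity check of the transition-function axioms (the rate values are arbitrary):

```python
import numpy as np

def p(t, lam, dl):
    """Closed-form transition matrix for the Q-matrix [[-lam, lam], [dl, -dl]]."""
    theta = lam + dl
    e = np.exp(-theta * t)
    return np.array([[dl + lam * e, lam * (1.0 - e)],
                     [dl * (1.0 - e), lam + dl * e]]) / theta

lam, dl = 2.0, 3.0
# Chapman-Kolmogorov: p_{s+t} = p_s p_t
assert np.allclose(p(1.1, lam, dl), p(0.4, lam, dl) @ p(0.7, lam, dl))
# stochastic rows, and p_0 = identity
assert np.allclose(p(0.9, lam, dl).sum(axis=1), 1.0)
assert np.allclose(p(0.0, lam, dl), np.eye(2))
# the derivative at t = 0 recovers the Q-matrix, q(x,y) = d/dt p_t(x,y)|_{t=0}
h = 1e-7
assert np.allclose((p(h, lam, dl) - np.eye(2)) / h,
                   [[-lam, lam], [dl, -dl]], atol=1e-5)
```

The last assertion is exactly the "infinitesimal description" relation from Section 4.1, checked by a finite difference.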

4.3 From Markov Chains to infinitesimal description

Theorem 4.4. Given a Markov chain, let $p_t(x, y) = P^x[X(t) = y]$ for $t \in \mathbb R^+_0$ and $x, y \in S$. Then

(i) $p_t(x, y)$ is a transition function;

(ii) $p_t(x, y)$ determines the measures $P^x$ uniquely.

Proof. (i) $p_t(x, y) \ge 0$ and $\sum_y p_t(x, y) = 1$ are clear. By right-continuity of paths, $\tau = \inf\{t \in \mathbb R^+_0 \mid X(t) \ne X(0)\} > 0$ a.s. w.r.t. any of the measures $P^x$; hence $\lim_{t \searrow 0} p_t(x, x) = 1$ follows from $P^x[\tau > t] \le p_t(x, x)$. The Chapman-Kolmogorov equation follows from the Markov Property by letting $Y = 1_{X(t) = y}$ in equation (5) and observing
$$P^x[X(s+t) = y \mid \mathcal F_s] = P^{X(s)}[X(t) = y] = p_t(X(s), y), \quad P^x\text{-a.s.}$$
Taking the expectation w.r.t. $P^x$ proves (i).


(ii) The Markov Property implies, for an increasing sequence of times $t_1 < \cdots < t_n$:
$$P^x[X(t_1) = x_1, \ldots, X(t_n) = x_n] = p_{t_1}(x, x_1) \cdots p_{t_n - t_{n-1}}(x_{n-1}, x_n).$$
Hence the transition function determines the finite-dimensional distributions of $P^x$. But these determine $P^x$ by the $\pi$-$\lambda$-Theorem. □

Reminder of the setting. Let $S$ be a countable set; there are three objects of interest:

(i) Markov chain $X$:
  ∗ right-continuous filtration
  ∗ càdlàg paths
  ∗ in any bounded time interval there are only finitely many jumps

(ii) transition function, usually denoted $p_t : S \times S \to \mathbb R$:
  ∗ $p_t(x, y) \ge 0$
  ∗ $\sum_y p_t(x, y) = 1$ ("stochastic")
  ∗ $p_t(x, x) \xrightarrow{t \to 0} 1$
  ∗ $p_{t+s}(x, y) = \sum_z p_t(x, z)\, p_s(z, y)$, Chapman-Kolmogorov (CK)

(iii) Q-matrix $q : S \times S \to \mathbb R$:
  ∗ $q(x, y) \ge 0$ for all $x \ne y$
  ∗ $\sum_y q(x, y) = 0$
  ∗ $c(x) = -q(x, x) \ge 0$
  ∗ and something like "$q(x, y) = \frac{d}{dt} p_t(x, y)\big|_{t=0}$"

What we did last time was $X \to p_t$; now we'll do $p_t \to Q$.

Properties of $p_t$:

(i) $p_t(x, x) > 0$ for all $t, x$.

(ii) If $p_t(x, x) = 1$ for some $t > 0$, then $p_t(x, x) = 1$ for all $t \ge 0$ ("$x$ absorbing state").

(iii) $|p_t(x, y) - p_s(x, y)| \le 1 - p_{|t-s|}(x, x)$ for all $x, y, s, t$.

(iv) $c(x) := -\frac{d}{dt} p_t(x, x)\big|_{t=0}$ exists in $[0, \infty]$.

(v) $p_t(x, x) \ge e^{-c(x) t}$.

(vi) If $c(x) < \infty$, then for $y \ne x$,
$$q(x, y) := \frac{d}{dt} p_t(x, y)\Big|_{t=0} \in \mathbb R^+_0$$
exists.

(vii) If $c(x) < \infty$, then $\sum_y q(x, y) \le 0$.

(viii) If $\sum_y q(x, y) = 0$, then $p_t(x, y)$ is continuously differentiable in $t$ and Kolmogorov's Backward Equation (KBE) holds:
$$\frac{d}{dt} p_t(x, y) = \sum_z q(x, z)\, p_t(z, y).$$

Proof. (i) For small enough $t \ge 0$, $p_t(x, x) > 0$ (since $p_t(x,x) \to 1$). For any $t > 0$ and any $k \in \mathbb N$, the Chapman-Kolmogorov equation
$$p_t(x, x) = \sum_{z_1, \ldots, z_{k-1}} p_{t/k}(x, z_1)\, p_{t/k}(z_1, z_2) \cdots p_{t/k}(z_{k-1}, x) \ge p_{t/k}(x, x)^k$$
(keeping only the term $z_1 = \cdots = z_{k-1} = x$) gives $p_t(x, x) > 0$ once $k$ is large enough that $p_{t/k}(x, x) > 0$.

(ii) By CK and $p_t(z, x) \le 1$,
$$p_{s+t}(x, x) = \sum_z p_s(x, z)\, p_t(z, x) \le \sum_{z \ne x} p_s(x, z) + p_s(x, x)\, p_t(x, x) = 1 - p_s(x, x)\big(1 - p_t(x, x)\big).$$
Hence $p_{t+s}(x, x) = 1$ forces $p_t(x, x) = 1$ (using $p_s(x, x) > 0$ from (i)). Thus $I = \{t \mid p_t(x, x) = 1\}$ is an interval. Combined with $p_t(x, x) \ge p_{t/k}(x, x)^k$, we get $t/k \in I \Rightarrow t \in I$, so $I \ne \{0\}$ implies $I = [0, \infty)$.

(iii)
$$p_{t+s}(x, y) - p_t(x, y) = \sum_z p_s(x, z)\, p_t(z, y) - p_t(x, y) = \sum_{z \ne x} p_s(x, z)\, p_t(z, y) + (p_s(x, x) - 1)\, p_t(x, y).$$
The first term lies in $[0,\ 1 - p_s(x, x)]$ and the second in $[-(1 - p_s(x, x)),\ 0]$, hence
$$|p_{t+s}(x, y) - p_t(x, y)| \le 1 - p_s(x, x).$$


(iv) Let $f(t) := -\log p_t(x, x)$; $f$ is continuous and subadditive ($f(s+t) \le f(s) + f(t)$). Thus one obtains that
$$c(x) := \lim_{t \to 0} \frac{f(t)}{t}$$
exists in $[0, \infty]$, and the chain rule provides $c(x) = -\frac{d}{dt} p_t(x, x)\big|_{t=0}$.

(v) Since $f$ is subadditive, $f(t) \le c(x)\, t$, i.e. $p_t(x, x) \ge e^{-c(x) t}$.

(vi) By (v),
$$\frac{1}{t} \sum_{y \ne x} p_t(x, y) = \frac{1}{t}\big(1 - p_t(x, x)\big) \le \frac{1 - e^{-c(x) t}}{t} \le c(x),$$
thus for every $y \ne x$: $\frac{p_t(x, y)}{t} \le c(x)$. Let
$$q(x, y) := \limsup_{t \to 0} \frac{p_t(x, y)}{t} < \infty.$$
Goal: $\liminf_{t \to 0} \frac{p_t(x, y)}{t} \ge q(x, y)$. By CK, for all $\delta, n$ with $\delta n \le t$:
$$p_t(x, y) \ge \sum_{k=0}^{n-1} \underbrace{p_{k\delta}(x, x)}_{\ge\, e^{-c(x) t}}\ p_\delta(x, y)\ \underbrace{p_{t-(k+1)\delta}(y, y)}_{\ge\, \inf_{s \in [0,t]} p_s(y, y)} \ \ge\ e^{-c(x) t}\ \inf_{s \in [0, t]} p_s(y, y)\ n\, p_\delta(x, y),$$
so
$$\frac{p_t(x, y)}{t} \ge e^{-c(x) t}\ \inf_{s \in [0, t]} p_s(y, y)\ \frac{n\delta}{t}\ \frac{p_\delta(x, y)}{\delta}.$$
Take $\delta \to 0$ along a sequence so that $\frac{p_\delta(x, y)}{\delta} \to q(x, y)$, and set $n = \lfloor t/\delta \rfloor$, so that $\frac{n\delta}{t} \to 1$. Then
$$\frac{p_t(x, y)}{t} \ge e^{-c(x) t}\ \inf_{s \in [0, t]} p_s(y, y)\ q(x, y),$$
and since the prefactor tends to $1$ as $t \to 0$, $\liminf_{t \to 0} \frac{p_t(x, y)}{t} \ge q(x, y)$.


(vii) From (vi), $\sum_{z \ne x} \frac{p_t(x, z)}{t} \le c(x)$, hence (Fatou for sums)
$$\sum_{z \ne x} q(x, z) = \sum_{z \ne x} \lim_{t \to 0} \frac{p_t(x, z)}{t} \le \liminf_{t \to 0} \sum_{z \ne x} \frac{p_t(x, z)}{t} \le c(x) = -q(x, x).$$

(viii) We want to show
$$\frac{p_{t+s}(x, y) - p_t(x, y)}{s} \xrightarrow{s \to 0} \sum_z q(x, z)\, p_t(z, y).$$
By CK,
$$\frac{p_{t+s}(x, y) - p_t(x, y)}{s} - \sum_z q(x, z)\, p_t(z, y) = \sum_z \Big( \frac{p_s(x, z) - p_0(x, z)}{s} - q(x, z) \Big)\, p_t(z, y),$$
and each summand tends to $0$. To exchange limit and sum, let $T \ni x$ be a finite set; the claim $(\ast)$ is that
$$\limsup_{s \to 0} \sum_{z \notin T} \left| \frac{p_s(x, z) - p_0(x, z)}{s} - q(x, z) \right| \xrightarrow{T \uparrow S} 0.$$
Indeed, for $z \notin T$ both $\frac{p_s(x, z) - p_0(x, z)}{s} = \frac{p_s(x, z)}{s} \ge 0$ and $q(x, z) \ge 0$, so the sum in $(\ast)$ is at most
$$\sum_{z \notin T} \frac{p_s(x, z)}{s} + \sum_{z \notin T} q(x, z) = \frac{1}{s}\Big(1 - \sum_{z \in T} p_s(x, z)\Big) - \sum_{z \in T} q(x, z) \ \xrightarrow{s \to 0}\ -2 \sum_{z \in T} q(x, z) \ \xrightarrow{T \uparrow S}\ 0,$$
using $\sum_z q(x, z) = 0$ in both limits. □

Blackwell's Example: a transition function $p_t$ such that $c(x) = \infty$ for all $x$. Choose positive sequences $\lambda_i, \delta_i$ and independent Markov chains $X_i(t)$ with
$$Q_i = \begin{pmatrix} -\lambda_i & \lambda_i \\ \delta_i & -\delta_i \end{pmatrix},$$
and set
$$X(t) = (X_1(t), X_2(t), \ldots) \in \{0, 1\}^{\infty}.$$


Thus our state space
$$S = \Big\{(x_1, x_2, \ldots) \in \{0, 1\}^{\infty} \ \Big|\ \sum_{i=1}^{\infty} x_i < \infty\Big\}$$
is countable. For $x, y \in S$ let
$$p_t(x, y) = P^x(X(t) = y), \qquad P^x = \prod_i P^{x_i}, \qquad \text{so } p_t(x, y) = \prod_i P^{x_i}(X_i(t) = y_i).$$

N.B. If $\sum_i \frac{\lambda_i}{\lambda_i + \delta_i} < \infty$, then $p_t$ is a transition function.

Proof. (i) $\sum_{y \in S} p_t(x, y) = 1$, i.e. $P^x(X(t) \in S) = 1$: fix $x$ and let $I = \{i : x_i = 0\}$. Then
$$\Big\{\sum_{i=1}^{\infty} X_i(t) < \infty\Big\} = \Big\{\sum_{i \in I} X_i(t) < \infty\Big\}$$
and
$$P^x\Big(\sum_{i \in I} X_i(t) = \infty\Big) = P^x\big(X_i(t) = 1 \text{ for infinitely many } i \in I\big).$$
For $i \in I$,
$$P^x(X_i(t) = 1) = P^0(X_i(t) = 1) = \frac{\lambda_i}{\lambda_i + \delta_i}\big(1 - e^{-t(\lambda_i + \delta_i)}\big) \le \frac{\lambda_i}{\lambda_i + \delta_i},$$
which is summable by assumption, so Borel-Cantelli provides
$$P^x(X_i(t) = 1 \text{ infinitely often}) = 0.$$

(ii) We want to show $p_t(x, x) \xrightarrow{t \to 0} 1$. Choose $n$ such that $x_i = 0$ for all $i > n$. Then
$$p_t(x, x) \ge \prod_{i=1}^{n} P^{x_i}(X_i(t) = x_i)\ \cdot\ \prod_{i > n}\Big(1 - \frac{\lambda_i}{\lambda_i + \delta_i}\Big),$$
hence
$$\liminf_{t \to 0} p_t(x, x) \ge \prod_{i > n}\Big(1 - \frac{\lambda_i}{\lambda_i + \delta_i}\Big) > 0,$$
and the right-hand side tends to $1$ as $n \to \infty$.

(iii) CK equations: let
$$p^{(n)}_t(x, y) = P^x(\forall i \le n : X_i(t) = y_i).$$
Then $p^{(n)}_t(x, y) \downarrow p_t(x, y)$ as $n \to \infty$, and
$$p_{t+s}(x, y) = \lim_{n \to \infty} p^{(n)}_{t+s}(x, y) = \lim_{n \to \infty} \sum_z p^{(n)}_t(x, z)\, p^{(n)}_s(z, y) = \sum_z p_t(x, z)\, p_s(z, y). \ \Box$$


Theorem 4.5. Suppose $\sum_i \frac{\lambda_i}{\lambda_i + \delta_i} < \infty$ and $\sum_i \lambda_i = \infty$. Then

(i) $c(x) = \infty$ for all $x$;

(ii) for all $\varepsilon > 0$: $P^x(\forall t \in \mathbb Q \cap [0, \varepsilon) : X(t) = x) = 0$;

(iii) there is no Markov chain with transition function $p_t$.

Recall the setting: $\lambda_i, \delta_i$ are positive sequences and $X_i(t)$ are Markov chains with
$$Q_i = \begin{pmatrix} -\lambda_i & \lambda_i \\ \delta_i & -\delta_i \end{pmatrix}, \qquad X(t) = (X_1(t), X_2(t), \ldots), \qquad S = \Big\{x \in \{0, 1\}^{\infty} \ \Big|\ \sum_{i=1}^{\infty} x_i < \infty\Big\},$$
with $p_t(x, y) := P^x(X(t) = y)$ for $x, y \in S$.

Proposition 4.6. If $\sum_i \frac{\lambda_i}{\lambda_i + \delta_i} < \infty$, then $p_t$ is a transition function.

Theorem 4.7. Suppose $\sum_i \frac{\lambda_i}{\lambda_i + \delta_i} < \infty$ and $\sum_i \lambda_i = \infty$. Then

(i) $c(x) = \infty$ for all $x$;

(ii) for all $\varepsilon > 0$: $P^x(\forall t \in \mathbb Q \cap [0, \varepsilon) : X(t) = x) = 0$;

(iii) there is no Markov chain with transition function $p_t$.

Proof. (i) Fix $x \in S$ and choose $m$ such that $x_i = 0$ for all $i \ge m$. For $n \ge m$:
$$p_t(x, x) = \prod_{i=1}^{\infty} P^{x_i}(X_i(t) = x_i) \le \prod_{i=m}^{n} P^0(X_i(t) = 0) = \prod_{i=m}^{n}\Big(1 - \underbrace{\frac{\lambda_i}{\lambda_i + \delta_i}\big(1 - e^{-t(\lambda_i + \delta_i)}\big)}_{\sim\, t\lambda_i \text{ as } t \to 0}\Big) \sim \prod_{i=m}^{n}(1 - t\lambda_i) \le \exp\Big(-t \sum_{i=m}^{n} \lambda_i\Big)$$
as $t \to 0$.


We know $p_t(x, x) \ge e^{-c(x) t}$, and thus
$$c(x) \ge \sum_{i=m}^{n} \lambda_i \quad \text{for every } n\ \Rightarrow\ c(x) = \infty.$$

(ii) Since all $X_i$ have càdlàg paths, one obtains
$$\{\forall t \in \mathbb Q \cap [0, \varepsilon) : X(t) = x\} = \{\forall t \in [0, \varepsilon) : X(t) = x\} =: A,$$
and thus
$$P^x(A) = \prod_{i=1}^{\infty} P^{x_i}(\forall t \in [0, \varepsilon) : X_i(t) = x_i) \le \prod_{i=m}^{\infty} P^0(\forall t \in [0, \varepsilon) : X_i(t) = 0) = \prod_{i=m}^{\infty} e^{-\lambda_i \varepsilon} = \exp\Big(-\varepsilon \sum_{i \ge m} \lambda_i\Big) = 0.$$

(iii) By part (ii) it follows that
$$P^x\Big(\lim_{t \searrow 0} X(t) = x\Big) = 0.$$
Moreover, fixing an enumeration $(t_i)$ of $\mathbb Q \cap [0, \varepsilon)$,
$$\{\forall t \in \mathbb Q \cap [0, \varepsilon) : X(t) = x\} = \bigcap_n \{\forall i \le n : X(t_i) = x\},$$
so $P^x(\forall t \in \mathbb Q \cap [0, \varepsilon) : X(t) = x)$ depends only on $p_t$. Hence for any process $Y$ with transition function $p_t$, the paths of $Y$ are not càdlàg. □
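A concrete pair of sequences meeting both hypotheses of Theorem 4.7 (an illustrative choice, not from the notes) is $\lambda_i = 1$ and $\delta_i = i^2$: then $\sum_i \lambda_i/(\lambda_i + \delta_i) = \sum_i 1/(1 + i^2) < \infty$, while $\sum_i \lambda_i = \infty$. A one-line numerical check of the partial sums:

```python
# Blackwell's example with lambda_i = 1, delta_i = i^2 (illustrative choice)
n = 10**6
ratio_sum = sum(1.0 / (1.0 + i * i) for i in range(1, n + 1))  # sum of lambda_i/(lambda_i+delta_i)
lam_sum = float(n)                                             # sum of the first n lambda_i
assert ratio_sum < 1.08   # converges; the limit is (pi*coth(pi) - 1)/2 ≈ 1.0767
assert lam_sum == 1e6     # grows without bound as n -> infinity
```

So these rates give a perfectly good transition function (Proposition 4.6) that nevertheless corresponds to no Markov chain.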

The goal is to understand the relationships in the following diagram, with arrows in both directions between each pair of the three objects:
$$X \ \longleftrightarrow\ p_t \ \longleftrightarrow\ Q \ \longleftrightarrow\ X.$$

As the next step we'll do $Q \to p_t$. Remember the Kolmogorov Backward Equation (KBE):
$$\frac{d}{dt} p_t(x, y) = \sum_z q(x, z)\, p_t(z, y).$$


Proposition 4.8. Suppose $p_t(x, y) \in [0, 1]$. The following are equivalent (TFAE):

(i) $p_t$ satisfies the KBE with $p_0(x, y) = \delta(x, y)$;

(ii) $p_t$ is continuous in $t$ and
$$p_t(x, y) = \delta(x, y)\, e^{-c(x) t} + \int_0^t e^{-c(x)(t-s)} \sum_{z \ne x} q(x, z)\, p_s(z, y)\, ds.$$

Proof. (i) $\Rightarrow$ (ii): multiply the KBE by $e^{c(x) t}$ and integrate. (ii) $\Rightarrow$ (i): differentiate. □

Let $p^{(0)}_t(x, y) \equiv 0$ and
$$p^{(n+1)}_t(x, y) = \delta(x, y)\, e^{-c(x) t} + \int_0^t e^{-c(x)(t-s)} \sum_{z \ne x} q(x, z)\, p^{(n)}_s(z, y)\, ds.$$

Proposition 4.9.

(i) $p^{(n)}_t(x, y) \ge 0$

(ii) $\sum_y p^{(n)}_t(x, y) \le 1$

(iii) $p^{(n+1)}_t(x, y) \ge p^{(n)}_t(x, y)$

Proof. (i) Every summand is nonnegative, so the total is nonnegative (by induction on $n$).

(ii) By induction:
$$\sum_y p^{(n+1)}_t(x, y) = e^{-c(x) t} + \int_0^t e^{-c(x)(t-s)} \sum_{z \ne x} q(x, z) \underbrace{\sum_y p^{(n)}_s(z, y)}_{\le 1}\, ds \le e^{-c(x) t} + \int_0^t c(x)\, e^{-c(x)(t-s)}\, ds = 1.$$

(iii) For the base case, $p^{(1)}_t(x, y) = e^{-c(x) t}\, \delta(x, y) \ge 0 = p^{(0)}_t(x, y)$. Inductively,
$$p^{(n+1)}_t(x, y) - p^{(n)}_t(x, y) = \int_0^t e^{-c(x)(t-s)} \sum_{z \ne x} q(x, z)\big(p^{(n)}_s(z, y) - p^{(n-1)}_s(z, y)\big)\, ds \ge 0. \ \Box$$

Let
$$p^*_t(x, y) := \lim_{n \to \infty} p^{(n)}_t(x, y).$$
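On a finite state space the iteration defining $p^*$ can be run numerically. The sketch below (assumptions: the two-state Q-matrix from example (iii) with arbitrary rates, and trapezoidal quadrature for the integral) iterates the recursion and compares the result with the closed-form transition function of that chain:

```python
import numpy as np

lam, dl = 1.0, 2.0
Q = np.array([[-lam, lam], [dl, -dl]])
c = -np.diag(Q)                    # c(x) = -q(x, x)
Qoff = Q - np.diag(np.diag(Q))     # off-diagonal rates q(x, z), z != x

ts = np.linspace(0.0, 1.0, 201)    # time grid on [0, 1]
h = ts[1] - ts[0]
m = Q.shape[0]
P = np.zeros((len(ts), m, m))      # p^(0) = 0

for _ in range(40):                # p^(n) increases to p^*
    new = np.zeros_like(P)
    for j, t in enumerate(ts):
        for x in range(m):
            new[j, x, x] = np.exp(-c[x] * t)   # delta(x, y) e^{-c(x) t} term
            if j > 0:
                # integrand f(s) = e^{-c(x)(t-s)} * sum_{z != x} q(x,z) p^(n)_s(z, y)
                f = np.exp(-c[x] * (t - ts[:j + 1]))[:, None] * (Qoff[x] @ P[:j + 1])
                if j == 1:
                    new[j, x] += h * (f[0] + f[1]) / 2
                else:
                    new[j, x] += h * (f[0] / 2 + f[1:-1].sum(axis=0) + f[-1] / 2)
    P = new

# closed-form p_1(0,0) for this chain: dl/(lam+dl) + lam/(lam+dl) e^{-(lam+dl)}
exact = dl / (lam + dl) + lam / (lam + dl) * np.exp(-(lam + dl))
# P[-1, 0, 0] approximates p*_1(0, 0) = exact up to quadrature error
```

Each pass through the loop applies the integral operator once, so after a few dozen passes the iterates have converged (the remainder decays like $(ct)^n/n!$) and only the quadrature error remains.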


Theorem 4.10.

(i) $p^*_t(x, y) \ge 0$

(ii) $\sum_y p^*_t(x, y) \le 1$

(iii) $p^*_{t+s}(x, y) = \sum_z p^*_t(x, z)\, p^*_s(z, y)$ (CK)

(iv) $\frac{\partial}{\partial t} p^*_t(x, y) = \sum_z q(x, z)\, p^*_t(z, y)$ (KBE)

(v) If $p_t \ge 0$ satisfies the KBE, then $p^*_t(x, y) \le p_t(x, y)$.

(vi) If $\sum_y p^*_t(x, y) = 1$ for all $x$, then $p^*_t$ is the unique transition function satisfying the KBE.

Proof. (i), (ii): as above (Proposition 4.9, taking limits).

(iii): later.

(iv): by the definition of $p^{(n)}_t(x, y)$, taking limits:
$$p^*_t(x, y) = \delta(x, y)\, e^{-c(x) t} + \int_0^t e^{-c(x)(t-s)} \sum_{z \ne x} q(x, z)\, p^*_s(z, y)\, ds,$$
which by Proposition 4.8 is equivalent to the KBE.

(v): The claim is that $p^{(n)}_t(x, y) \le p_t(x, y)$ for every $n$, by induction: $p_t(x, y) \ge 0 = p^{(0)}_t(x, y)$, and
$$p^{(n+1)}_t(x, y) = \delta(x, y)\, e^{-c(x) t} + \int_0^t e^{-c(x)(t-s)} \sum_{z \ne x} q(x, z) \underbrace{p^{(n)}_s(z, y)}_{\le\, p_s(z, y)}\, ds \le p_t(x, y).$$

(vi): If $p_t(x, y)$ is another solution and $p_t \not\equiv p^*_t$, then by (v) there exist $x, t$ with
$$1 = \sum_y p_t(x, y) > \sum_y p^*_t(x, y) = 1,$$
providing a contradiction. □

4.3 From Markov Chains to infinitesimal description

Now we'll deal with Q → X, the "embedded discrete-time chain". Given q, c define

p(x,y) = δ(x,y) if c(x) = 0;  q(x,y)/c(x) if c(x) > 0, x ≠ y;  0 if c(x) > 0, x = y.

Then

∗ p(x,y) ≥ 0,

∗ Σ_y p(x,y) = 1.

Let Z_0, Z_1, … be the discrete-time Markov chain with transition function p, and let P^x be the distribution of (Z_0, Z_1, …) given Z_0 = x. Let ξ_0, ξ_1, … be i.i.d. ∼ Exp(1) and

τ_i := ξ_i / c(Z_i),  N(t) := inf{ n ≥ 0 : Σ_{i=0}^n τ_i > t },

which is a "càdlàg clock". Set X(t) = Z_{N(t)} where N(t) < ∞.
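This construction can be simulated directly. Below is a small sketch (my own illustration; the two-state rates a, b are hypothetical) that runs the jump chain with Exp(c(Z_i)) holding times and compares a Monte Carlo estimate of P^0(X(1) = 0) with the known closed form for a two-state chain.

```python
import math
import random

random.seed(0)

# Hypothetical two-state chain: q(0,1) = a, q(1,0) = b, so c(0) = a, c(1) = b.
a, b = 2.0, 3.0
c = [a, b]

def sample_X(x, t):
    """Jump-chain construction: hold for tau_i = xi_i / c(Z_i), then jump."""
    z, clock = x, 0.0
    while True:
        clock += random.expovariate(c[z])  # tau_i ~ Exp(c(Z_i))
        if clock > t:
            return z                       # still in state z at time t
        z = 1 - z                          # with two states, every jump flips the state

n = 20000
est = sum(1 for _ in range(n) if sample_X(0, 1.0) == 0) / n
exact = b / (a + b) + a / (a + b) * math.exp(-(a + b) * 1.0)  # p_1(0,0)
print(est, exact)
```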

Proposition 4.11.

(i) p_t^(n)(x,y) = P^x(X(t) = y, N(t) < n).

(ii) p*_t(x,y) = P^x(X(t) = y, N(t) < ∞).

(iii) Σ_y p*_t(x,y) = P^x(N(t) < ∞).

Proof. (i) implies (ii) by letting n → ∞, and (ii) implies (iii) by summing over y, so it suffices to show (i). We argue by induction on n, conditioning on the first holding time and the first jump:

P^x(X(t) = y, N(t) < n+1 | τ_0 = s, Z_1 = z).


For s < t we have τ_0 < t, hence N(t) ≥ 1, and

P^x(X(t) = y, N(t) < n+1 | τ_0 = s, Z_1 = z)
= Σ_{m=1}^n P^x(Z_m = y, N(t) = m | τ_0 = s, Z_1 = z)
= Σ_{m=1}^n P^x(Z_m = y, Σ_{i=1}^{m−1} τ_i ≤ t−s < Σ_{i=1}^m τ_i | τ_0 = s, Z_1 = z)
= Σ_{m=1}^n P^z(Z_{m−1} = y, Σ_{i=0}^{m−2} τ_i ≤ t−s < Σ_{i=0}^{m−1} τ_i)
= Σ_{m=0}^{n−1} P^z(Z_m = y, N(t−s) = m)
= P^z(X(t−s) = y, N(t−s) < n).

The shift in time then provides

P^x(X(t) = y, N(t) < n+1) = P^x(τ_0 > t) δ(x,y) + ∫_0^t c(x)e^{−c(x)s} [density of τ_0] Σ_{z≠x} (q(x,z)/c(x)) [= P^x(Z_1 = z)] P^z(X(t−s) = y, N(t−s) < n) ds
= δ(x,y) e^{−c(x)t} + ∫_0^t e^{−c(x)s} Σ_{z≠x} q(x,z) p_{t−s}^(n)(z,y) ds
= p_t^(n+1)(x,y). □

Corollary. p*_t satisfies the CK equations and the integrated KBE

p_t(x,y) = e^{−c(x)t} δ(x,y) + ∫_0^t e^{−c(x)(t−s)} Σ_{z≠x} q(x,z) p_s(z,y) ds.

Summary so far: we constructed p_t^(n)(x,y) and p*_t(x,y) = lim_{n→∞} p_t^(n)(x,y), with

∗ p*_t(x,y) ≥ 0,

∗ Σ_y p*_t(x,y) ≤ 1,

∗ p*_t satisfies the CK equations,


∗ p*_t satisfies the KBE.

N.B. If p_t ∈ [0,1] satisfies the integrated KBE, then p_t(x,y) is continuous in t: for s < t,

|p_t(x,y) − p_s(x,y)| ≤ δ(x,y)|e^{−c(x)t} − e^{−c(x)s}| + ∫_s^t e^{−c(x)(t−u)} Σ_{z≠x} q(x,z) p_u(z,y) du
≤ δ(x,y)|e^{−c(x)t} − e^{−c(x)s}| + c(x)|t − s|.

Recall the construction: a discrete-time Markov chain Z_0, Z_1, … with transition matrix

p(x,y) = q(x,y)/c(x) if y ≠ x and c(x) > 0;  δ(x,y) if c(x) = 0;  0 otherwise;

ξ_0, ξ_1, … i.i.d. ∼ Exp(1); τ_i = ξ_i / c(Z_i); N(t) = inf{ n ≥ 0 : Σ_{i=0}^n τ_i > t }; and

p*_t(x,y) = P^x(X(t) = y, N(t) < ∞).

Theorem 4.12. TFAE:

(i) p*_t is stochastic: Σ_y p*_t(x,y) = 1.

(ii) P^x(N(t) < ∞) = 1.

(iii) Σ_{n≥0} τ_n = ∞, P^x-a.s.

(iv) Σ_{n≥0} 1/c(Z_n) = ∞, P^x-a.s.

Proof. (i) ⇔ (ii): Σ_y p*_t(x,y) = P^x(N(t) < ∞). Since

{N(t) < ∞} = {Σ_{n=0}^∞ τ_n > t},

(ii) ⇔ (iii).


(iii) ⇔ (iv): Using E(e^{−µξ}) = 1/(1+µ) for ξ ∼ Exp(1),

E^x( exp(−λ Σ_{i=0}^n τ_i) | Z_0, Z_1, … ) = E^x( exp(−λ Σ_{i=0}^n ξ_i/c(Z_i)) | Z_0, … ) = Π_{i=0}^n c(Z_i)/(λ + c(Z_i)),

and letting n → ∞,

E^x( exp(−λ Σ_{i=0}^∞ τ_i) | Z_0, Z_1, … ) = Π_{i=0}^∞ c(Z_i)/(λ + c(Z_i)).

As λ → 0,

E^x( exp(−λ Σ τ_i) ) → P^x(Σ τ_i < ∞),

while

E^x Π_{i=0}^∞ c(Z_i)/(λ + c(Z_i)) = E^x Π_{i=0}^∞ ( 1 − λ/(λ + c(Z_i)) ) → P^x(Σ 1/c(Z_i) < ∞).

Thus one obtains

P^x( Σ τ_i = ∞ ) = P^x( Σ 1/c(Z_i) = ∞ ). □

Theorem 4.13. If p*_t is stochastic, then it is the unique transition function satisfying the KBE, and there exists a unique Markov chain with transition function p*_t.

Proof. To check that p*_t is a transition function, note

p*_t(x,x) ≥ P^x(τ_0 > t) = e^{−c(x)t} → 1 as t → 0.

That p*_t stochastic implies uniqueness of the transition function was proved above (Theorem 4.10(vi)), and the Markov chain is determined by p*_t. □

Claim: If p*_t is not stochastic, then there exist infinitely many stochastic solutions to the KBE.


Sketch of proof: Fix u ∈ S and take independent copies X^(0), X^(1), … of X, where X^(n) comes with its own clocks τ_i^(n). Given x ∈ S, start X^(0) at x and X^(i) at u for every i ≥ 1. Let

τ^(k) = Σ_{i=0}^∞ τ_i^(k)

be the explosion time of X^(k). For Σ_{j=0}^{i−1} τ^(j) ≤ t < Σ_{j=0}^{i} τ^(j), set

Y(t) = X^(i)( t − Σ_{j=0}^{i−1} τ^(j) ),

i.e. after each explosion the process restarts at u. Let p_t^u(x,y) := P^x(Y(t) = y). Claim: p_t^u is a stochastic transition function satisfying the KBE, and these are distinct for different u.

Assume from now on that p*_t is stochastic, and call it p_t.

The strong Markov property holds for such Markov chains:

∗ right-continuous filtration,

∗ càdlàg paths,

∗ y ↦ E^y(f(X(t))) continuous.

Definition 4.14 (Stationary Measure). A measure π on S is said to be stationary if

∀ y, t: π(y) = Σ_x π(x) p_t(x,y).

Homework: If Σ_x π(x) c(x) < ∞, then π is stationary iff

Σ_{x≠y} π(x) q(x,y) = π(y) c(y).

Definition 4.15 (Reversible Measure). π is reversible if

∀ x, y, t: π(x) p_t(x,y) = π(y) p_t(y,x).


Fact: If Σ_y p_t(x,y) c(y) < ∞, then π is reversible iff

π(x) q(x,y) = π(y) q(y,x)  (detailed balance).

N.B. (i) If π is stationary, set P^π := Σ_{x∈S} π(x) P^x; then E^π f(X(0)) = E^π f(X(t)).

(ii) Reversible implies stationary.

(iii) Reversible implies

P^π(X(0) = x, X(t) = y) = P^π(X(0) = y, X(t) = x).

(iv) If a reversible measure exists, it is easy to find: fix π(x) > 0; for any y with q(x,y) ≠ 0 define

π(y) = π(x) q(x,y) / q(y,x).
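For a birth–death chain this recipe can be checked mechanically. The sketch below (my own illustration; the birth and death rates are hypothetical) builds π along the path 0 → 1 → ⋯ and verifies both detailed balance and the stationarity condition Σ_{x≠y} π(x)q(x,y) = π(y)c(y).

```python
# Hypothetical birth-death chain on {0,...,4}:
# lam[i] is the rate i -> i+1, mu[i] is the rate i+1 -> i.
lam = [1.0, 2.0, 3.0, 4.0]
mu = [2.0, 2.0, 1.0, 5.0]

def q(x, y):
    if y == x + 1 and x < 4:
        return lam[x]
    if y == x - 1 and x >= 1:
        return mu[x - 1]
    return 0.0

# Build pi along the path 0 -> 1 -> ... via pi(y) = pi(x) q(x,y) / q(y,x).
pi = [1.0]
for i in range(4):
    pi.append(pi[-1] * q(i, i + 1) / q(i + 1, i))

# Detailed balance: pi(x) q(x,y) = pi(y) q(y,x) for all pairs.
balanced = all(abs(pi[x] * q(x, y) - pi[y] * q(y, x)) < 1e-12
               for x in range(5) for y in range(5) if x != y)

# Stationarity: sum over x != y of pi(x) q(x,y) equals pi(y) c(y).
def c(y):
    return sum(q(y, z) for z in range(5) if z != y)

stationary = all(abs(sum(pi[x] * q(x, y) for x in range(5) if x != y)
                     - pi[y] * c(y)) < 1e-12 for y in range(5))
print(balanced, stationary)
```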

Definition 4.16 (Irreducible Transition Function). p_t is irreducible if

∀ x, y and ∀ t > 0: p_t(x,y) > 0.

Definition 4.17 (Recurrent/Transient State). x is a recurrent state if

P^x( {t ≥ 0 : X(t) = x} is unbounded ) = 1;

otherwise x is called transient.

Definition 4.18 (Green's function).

G(x,y) = E^x ∫_0^∞ 1_{X(t)=y} dt = ∫_0^∞ p_t(x,y) dt.

Theorem 4.19. x is transient iff G(x,x) < ∞.

Proof. Define the successive visit and departure times

τ_0 = inf{t ≥ 0 | X(t) = x},  s_0 = inf{t ≥ τ_0 | X(t) ≠ x},
τ_{i+1} = inf{t ≥ s_i | X(t) = x},  s_{i+1} = inf{t ≥ τ_{i+1} | X(t) ≠ x}.

Then

G(x,x) = E^x Σ_{i: τ_i<∞} (s_i − τ_i) = Σ_i E^x[ 1_{τ_i<∞} (s_i − τ_i) ].


The strong Markov property gives

E^x( 1_{τ_i<∞}(s_i − τ_i) | F_{τ_i} ) = 1_{τ_i<∞} E^x(s_0 − τ_0)  ⇒  E^x( 1_{τ_i<∞}(s_i − τ_i) ) = P^x(τ_i < ∞) / c(x),

since the holding time at x is Exp(c(x)). The strong Markov property also provides

P^x(τ_{i+1} < ∞ | F_{τ_i}) = P^x(τ_i < ∞, τ_{i+1} < ∞ | F_{τ_i}) = 1_{τ_i<∞} P^x(τ_1 < ∞),

so P^x(τ_{i+1} < ∞) = P^x(τ_i < ∞) P^x(τ_1 < ∞), hence

P^x(τ_i < ∞) = ( P^x(τ_1 < ∞) )^i.

Therefore

G(x,x) = (1/c(x)) Σ_{i=0}^∞ ( P^x(τ_1 < ∞) )^i.

Thus TFAE: (i) x is recurrent; (ii) P^x(∀i: τ_i < ∞) = 1; (iii) P^x(τ_1 < ∞) = 1; (iv) G(x,x) = ∞. □

N.B. Let Z be the embedded discrete-time chain. Then x is recurrent iff x is recurrent for Z.

Theorem 4.20. If X is irreducible, then either

∗ every x is recurrent, or

∗ every x is transient, and ∀ x, y: G(x,y) < ∞.

Proof. p_{2t+s}(x,x) ≥ p_t(x,y) p_s(y,y) p_t(y,x), so

G(x,x) ≥ ∫_0^∞ p_{2t+s}(x,x) ds ≥ p_t(x,y) p_t(y,x) G(y,y).

Thus G(x,x) < ∞ ⇒ G(y,y) < ∞. Let τ_y = inf{t ≥ 0 | X(t) = y}. By the SMP,

G(x,y) = E^x ∫_0^∞ 1_{X(t)=y} dt = P^x(τ_y < ∞) E^y ∫_0^∞ 1_{X(t)=y} dt = P^x(τ_y < ∞) G(y,y) < ∞,

the last step under the assumption that every state is transient. □


Definition 4.21 (Superharmonic Function). A function f: S → ℝ is called superharmonic if

(i) ∀ x, t: E^x(f(X(t))) ≤ f(x),

(ii) ∀ x, t: E^x|f(X(t))| < ∞.

Proposition 4.22. f is superharmonic iff f(X(t)) is a supermartingale.

Proof. E^x( f(X(t+s)) | F_s ) = E^{X(s)}( f(X(t)) ). If f is superharmonic this is ≤ f(X(s)), and by property (ii), E^x|E^{X(s)}f(X(t))| < ∞, so f(X(t)) is a supermartingale. Conversely, if f(X(t)) is a supermartingale, then E^{X(s)}f(X(t)) ≤ f(X(s)) P-a.s.; since P^x(X(s) = x) > 0 this implies E^x(f(X(t))) ≤ f(x), and integrability of the supermartingale implies (ii). □

Theorem 4.23. Assume X is irreducible. Then X is transient iff there exists a non-constant, bounded, superharmonic function.

Proof. If X is transient, fix y and take f(x) = G(x,y). Conversely, suppose f is bounded and superharmonic. Then f(X(t)) converges a.s.; if X is recurrent, this forces f to be constant. □

Suppose p_t has one absorbing state 0 ∈ S, but p_t(x,y) > 0 for all x ≠ 0, y and t > 0. Say this chain "survives" if ∀ x ≠ 0: P^x(∃t: X(t) = 0) < 1.

Theorem 4.24. The chain survives iff there exists a non-constant, bounded, superharmonic function f with f(0) ≥ f(x) for all x.


Proof. Suppose the chain survives, and set

f(x) = P^x(∃t: X(t) = 0).

Then

E^x(f(X(t))) = E^x( P^{X(t)}(∃s: X(s) = 0) ) = P^x(∃s: X(s+t) = 0) = P^x(∃s ≥ t: X(s) = 0) ≤ f(x).

Furthermore f(0) = 1, and since the chain survives, f(x) < 1 for all x ≠ 0. Thus f is bounded, non-constant and superharmonic with f(0) ≥ f(x), proving one direction.

For the other direction, suppose there exists a function f with the stated properties. Fix a state different from 0, say 1, and define

q̃(x,y) = 1 if (x,y) = (0,1);  −1 if (x,y) = (0,0);  q(x,y) otherwise.

Let Y be the Markov chain with Q-matrix q̃; Y is irreducible. With τ = inf{t | X(t) = 0}, one can couple X with Y such that X(t) = Y(t) for all t ≤ τ, and τ = inf{t | Y(t) = 0}. Now, since f(X(t)) = f(0) ≥ f(Y(t)) on {t ≥ τ},

E^x(f(Y(t))) = E^x( 1_{t<τ} f(Y(t)) ) + E^x( 1_{t≥τ} f(Y(t)) ) ≤ E^x( 1_{t<τ} f(X(t)) ) + E^x( 1_{t≥τ} f(X(t)) ) = E^x(f(X(t))) ≤ f(x).

Thus f is a non-constant, bounded, superharmonic function for Y, so Y is transient, which in turn implies that X survives. □

Theorem 4.25. Suppose that X is irreducible. If there exists a stationary distribution, then X is recurrent; and if X is recurrent, then there exists a stationary measure.

Proof. Suppose there exists a stationary distribution π; averaging the stationarity equation,

π(y) = Σ_x π(x) (1/t) ∫_0^t p_s(x,y) ds.


Assume X is transient. Then ∫_0^t p_s(x,y) ds → G(x,y) < ∞, and thus

(1/t) ∫_0^t p_s(x,y) ds → 0,

implying π(y) = 0 — a contradiction.

Now suppose X is recurrent and fix z ∈ S. Let

s = inf{t | X(t) ≠ z},  τ = inf{t ≥ s | X(t) = z},

π(x) := E^z ∫_0^τ 1_{X(t)=x} dt,

where τ < ∞ P^z-a.s. by recurrence. The SMP provides

E^z ∫_0^u 1_{X(t)=x} dt = E^z ∫_τ^{τ+u} 1_{X(t)=x} dt,

and hence

π(x) = E^z ∫_u^{u+τ} 1_{X(t)=x} dt
= ∫_u^∞ P^z(t ≤ u + τ, X(t) = x) dt
= ∫_0^∞ P^z(t ≤ τ, X(t+u) = x) dt
= ∫_0^∞ Σ_y P^z(t ≤ τ, X(t) = y, X(t+u) = x) dt
= ∫_0^∞ Σ_y P^z(t ≤ τ, X(t) = y) p_u(y,x) dt
= Σ_y π(y) p_u(y,x),

so π is stationary. Finally, to see that π is nontrivial and finite pointwise:

π(z) = E^z ∫_0^τ 1_{X(t)=z} dt = E^z( ∫_0^s ⋯ + ∫_s^τ ⋯ ) = E^z(s) = 1/c(z) ∈ (0, ∞),

and thus ∀x: π(x) ∈ (0, ∞). □


N.B. Σ_{x∈S} π(x) = E^z(τ), so we obtain a finite measure iff E^z(τ) < ∞.

Theorem 4.26. Assume irreducible (and recurrent). The stationary measure is unique (up to multiplication by constants).

Corollary. Suppose irreducible and recurrent. Either

∗ ∀z: E^z(τ) = ∞, and all stationary measures are infinite, or

∗ ∀z: E^z(τ) < ∞, and there exists a unique stationary distribution.

Proof. Let π_1, π_2 be stationary for p_t, and define the reversed transition functions

p̂_t^{(i)}(x,y) = (π_i(y)/π_i(x)) p_t(y,x).

These satisfy

Σ_y p̂_t^{(i)}(x,y) = (1/π_i(x)) Σ_y π_i(y) p_t(y,x) = 1

by stationarity, and the CK equations. Moreover p̂_t^{(i)} is recurrent, since

∫ p̂_t^{(i)}(x,x) dt = ∫ p_t(x,x) dt = ∞.

Let α(x) = π_2(x)/π_1(x). Then

Σ_y p̂_t^{(1)}(x,y) α(y) = Σ_y (π_1(y)/π_1(x)) (π_2(y)/π_1(y)) p_t(y,x) = π_2(x)/π_1(x) = α(x).

Let Y_0, Y_1, … be the discrete-time Markov chain with transition matrix p̂_t^{(1)} (for a fixed t); then α(Y_n) is a non-negative martingale, hence α(Y_n) converges. By recurrence this implies that α is constant, which provides that π_1 is a multiple of π_2. □

4.4 Convergence

Theorem 4.27. Suppose the chain is irreducible with stationary distribution π. Then

∀ x, y: |p_t(x,y) − π(y)| → 0 as t → ∞.


Proof. Let X, Y be independent copies of the chain and Z(t) = (X(t), Y(t)). Z is a Markov chain, recurrent, with stationary distribution

π̄(x,y) = π(x) π(y).

Let τ = inf{t | X(t) = Y(t)}; then ∀x, y: τ < ∞ P^{(x,y)}-a.s. Define

W(t) = Y(t) for t < τ,  W(t) = X(t) for t ≥ τ.

W(t) is a Markov chain with transition function p_t. Under P^{δ_x ⊗ π}:

P(X(t) = y) = p_t(x,y),  P(Y(t) = y) = π(y),  P(W(t) = y) = π(y).

Hence

|p_t(x,y) − π(y)| = |P(X(t) = y) − P(W(t) = y)| ≤ 2 P(X(t) ≠ W(t)) ≤ 2 P(t < τ) → 0 as t → ∞. □

Example: the branching chain, called the Galton–Watson process. Take an offspring distribution

{p_k | k ≥ 0},  p_k ≥ 0,  Σ p_k = 1,  p_0 ∈ (0,1),  p_1 = 0,  Σ k p_k = m < ∞,

and let X(t) be the number of individuals in some population. Each individual dies and is replaced by a number of offspring drawn from {p_k}, at rate 1. The rates are

q(k,l) = k p_{l−k+1} for l ≠ k,  q(k,k) = −k.

This chain is not irreducible, because 0 is absorbing.

Claim: This defines a Markov chain (no explosion).

Theorem 4.28. The above construction defines a Markov chain, if

Σ_{n=0}^∞ 1/c(Z_n) = ∞,

where Z_n is the embedded discrete-time chain; we verify this condition. Let ξ_i be independent:


Proof. Let P(ξ_i = k) = p_{k+1} for k ≥ −1, so that

Z_n = Z_0 + Σ_{i=1}^n ξ_i.

The strong law of large numbers provides

(1/n) Z_n → m − 1, P-a.s.

Thus there is an N such that Z_n ≤ mn for all n ≥ N, and then

Σ_{n=0}^∞ 1/c(Z_n) ≥ Σ_{n=N}^∞ 1/(mn) = ∞. □

4.5 Kolmogorov Forward equation

Theorem 4.29. Assume ∀t: Σ_y p_t(x,y) c(y) < ∞. Then the Kolmogorov forward equation (KFE) holds:

d/dt p_t(x,y) = Σ_z p_t(x,z) q(z,y).

Proof: do it yourself. □

Claim: E^k(X(t)) = k e^{(m−1)t}.

Proof. Assume the KFE is valid. First,

Σ_{l≠k} q(k,l) l = k Σ_{l≥0} p_l (l + k − 1) = km + k² − k,

so, including the diagonal term q(k,k) k = −k²,

Σ_l q(k,l) l = k(m − 1).

Then

d/dt E^k(X(t)) = d/dt Σ_l p_t(k,l) l = Σ_l l Σ_z p_t(k,z) q(z,l) = Σ_z p_t(k,z) z (m−1) = (m−1) E^k(X(t)),

and thus E^k(X(t)) = k e^{(m−1)t}.
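The mean-growth formula is easy to check by simulation. The sketch below (my own illustration; the binary offspring law p₀ = 1/4, p₂ = 3/4 is a hypothetical choice with m = 3/2, satisfying p₁ = 0) runs the branching chain and compares the empirical mean of X(1), started from one individual, with e^{m−1}.

```python
import math
import random

random.seed(1)

p2 = 0.75            # offspring law: 0 children w.p. 1 - p2, 2 children w.p. p2
m = 2 * p2           # mean offspring number, here 1.5

def branch(t):
    """Branching chain: total death rate is k; each death changes k by -1 or +1."""
    k, clock = 1, 0.0
    while k > 0:
        clock += random.expovariate(k)
        if clock > t:
            break
        k += 1 if random.random() < p2 else -1
    return k

n = 40000
mean = sum(branch(1.0) for _ in range(n)) / n
print(mean, math.exp(m - 1))   # the claim: E^1 X(1) = e^{m-1}
```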


We also need E^k(X(t)) = Σ_l p_t(k,l) l < ∞ to justify this computation. Let X_n(t) be the chain with the truncated rates

q_n(k,l) = 0 if k ≥ n,  q(k,l) otherwise,

and prove E^k(X_n(t)) ≤ k e^{mt}. □

∗ m < 1 implies no survival.

∗ m = 1 also implies no survival (the proof needs that X(t) is a non-negative martingale).

∗ m > 1: let τ_0 = inf{t | X(t) = 0} and

φ(θ) = Σ_j p_j θ^j − θ = p_0 − θ + p_2 θ² + p_3 θ³ + ⋯

φ is convex, φ(0) > 0, φ(1) = 0 and φ′(1) = m − 1 > 0, so there is a unique θ* ∈ (0,1) with φ(θ*) = 0. Let f(k) = θ*^k.

Claim: f is harmonic, i.e. Σ_y p_t(x,y) f(y) = f(x), which holds iff f(X(t)) is a martingale. Thus f(X(t)) converges, and

θ*^k = f(k) = P^k( f(X(t)) → 1 ) = P^k( X → 0 ),

i.e. θ*^k is the extinction probability starting from k.

Reminder of the setting from the beginning: S is a locally compact Polish space; Ω is the space of càdlàg functions from ℝ₊ to the state space S; F = σ(t ↦ X(t) | t ∈ ℝ₊); and (F_t) is a right-continuous filtration such that X(t) is F_t-measurable.

C(S) = { f ∈ C(S,ℝ) | ∀ε > 0 ∃K compact: x ∉ K ⇒ |f(x)| ≤ ε },

the continuous functions vanishing at infinity. Every f ∈ C(S) is uniformly continuous, and every continuous bounded f can be written as a pointwise limit of functions f_n ∈ C(S). Let ‖f‖ = sup_x |f(x)|; then (C(S), ‖·‖) is a Banach space. Also recall the shift operator θ_s: Ω → Ω,

(θ_s X)(t) = X(s + t).

Definition 4.30 (Feller Process). A Feller process is a collection {P^x} satisfying

(i) P^x(X(0) = x) = 1;

(ii) for any bounded, F-measurable Y: Ω → ℝ, the Markov property is satisfied:

E^x(Y ∘ θ_s | F_s) = E^{X(s)}(Y);


(iii) the Feller property (FP):

∀t ≥ 0, ∀f ∈ C(S): x ↦ E^x(f(X(t))) is continuous.

N.B. (ii) is equivalent to (ii) restricted to Y of the form Y = f(X(t)); and (ii) together with (iii) implies the strong Markov property (SMP).

4.6 Feller Semigroup

Definition 4.31 (Feller Semigroup). A family {T(t) | t ≥ 0} of bounded linear operators from C(S) to C(S) is a Feller semigroup if

(a) ∀f: T(0)f = f;

(b) ∀f: lim_{t→0} T(t)f = f, which means T(t) → T(0) in the strong operator topology (the Feller property);

(c) T(t+s) = T(t)T(s) (semigroup property);

(d) if f ≥ 0, then T(t)f ≥ 0;

(e) there exist f_n ∈ C(S) with sup_n ‖f_n‖ < ∞ such that, for every t ≥ 0, T(t)f_n → 1 pointwise as n → ∞.

N.B. ∗ (c) ⇒ T(s) and T(t) commute.

∗ (d) and the Riesz representation theorem provide, for each x, t, a measure µ_{x,t} representing f ↦ (T(t)f)(x):

T(t)f(x) = ∫ f(y) dµ_{x,t}(y).

∗ (e) ⇒ µ_{x,t} is a probability measure: ∫ f_n(y) dµ_{x,t} → 1 as n → ∞, so µ_{x,t}(S) = 1; write p_t(x,·) = µ_{x,t}.

∗ In (e), if S is compact, one can take f_n ≡ 1.

∗ In general, one can take smoothed indicators: f_n = 1 on K_n, 0 ≤ f_n ≤ 1, f_n ∈ C(S), for compact sets K_n ↑ S.

Proposition 4.32. (d) ∧ (e) ⇒ ‖T(t)f‖ ≤ ‖f‖.


Proof. First assume f ≥ 0 with compact support. For large enough n one obtains pointwise ‖f‖ f_n ≥ f, so

‖f‖ T(t)f_n − T(t)f = T(t)( ‖f‖ f_n − f ) ≥ 0.

Letting n → ∞, one obtains pointwise

‖f‖ ≥ T(t)f  ⇒  T(t)f ≤ ‖f‖  ⇒  ‖T(t)f‖ ≤ ‖f‖.

Now assume f has compact support and write f = f⁺ − f⁻, where f⁺, f⁻ are the usual positive and negative parts, so f⁺(x) ≠ 0 ⇒ f⁻(x) = 0. Then T(t)f = T(t)f⁺ − T(t)f⁻ and

|T(t)f(x)| ≤ max{ |T(t)f⁺(x)|, |T(t)f⁻(x)| } ≤ max{ ‖f⁺‖, ‖f⁻‖ } = ‖f‖.

Since functions of compact support are dense in C(S), this finishes the proof. □

Example: (T(t)f)(x) = Σ_y f(y) p_t(x,y).

Definition 4.33 (Resolvent Operator). Let α > 0; then U(α): C(S) → C(S),

U(α)f = ∫_0^∞ e^{−αt} T(t)f dt.

This is well defined since t ↦ T(t)f is continuous:

‖T(s)f − T(s+t)f‖ = ‖T(s)(f − T(t)f)‖ ≤ ‖f − T(t)f‖ → 0 by (b).

Moreover ‖U(α)f‖ ≤ ‖f‖/α, because

‖U(α)f‖ ≤ ∫_0^∞ e^{−αt} ‖T(t)f‖ dt ≤ ‖f‖/α.

Proposition 4.34. ∀f ∈ C(S): αU(α)f → f as α → ∞.


Proof.

‖αU(α)f − f‖ = ‖ α ∫_0^∞ e^{−αt} (T(t)f − f) dt ‖ ≤ α ∫_0^∞ e^{−αt} ‖T(t)f − f‖ dt.

Take ε > 0; then there exists δ > 0 such that ‖T(t)f − f‖ < ε for every t < δ. Thus the integral is at most

α ∫_0^δ e^{−αt} ε dt + α ∫_δ^∞ e^{−αt} 2‖f‖ dt ≤ ε + 2‖f‖ e^{−δα},

so limsup_{α→∞} ‖αU(α)f − f‖ ≤ ε. □

Proposition 4.35 (Resolvent identity).

U(α) − U(β) = (β − α) U(α) U(β).

Corollary: U(α) and U(β) commute.

Proof.

U(α)U(β)f = ∫_0^∞ e^{−αt} T(t) ∫_0^∞ e^{−βs} T(s)f ds dt
= ∫_0^∞ e^{−αt} ∫_0^∞ e^{−βs} T(s+t)f ds dt
= ∫_0^∞ T(r)f ∫_0^r e^{−αt − β(r−t)} dt dr
= (1/(β−α)) ∫_0^∞ T(r)f ( e^{−αr} − e^{−βr} ) dr
= ( U(α)f − U(β)f ) / (β − α). □
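On a finite state space T(t) = e^{tQ} and U(α) = (αI − Q)^{−1}, so the resolvent identity can be verified with elementary matrix arithmetic. The sketch below (my own illustration; the two-state Q-matrix is a hypothetical choice) checks U(α) − U(β) = (β − α)U(α)U(β) entrywise.

```python
# Hypothetical two-state Q-matrix: rates 1 (0 -> 1) and 2 (1 -> 0).
Q = [[-1.0, 1.0], [2.0, -2.0]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_sub(A, B):
    return [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]

def resolvent(alpha):
    """U(alpha) = (alpha*I - Q)^{-1}, by the 2x2 inverse formula."""
    A = [[alpha - Q[0][0], -Q[0][1]], [-Q[1][0], alpha - Q[1][1]]]
    d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / d, -A[0][1] / d], [-A[1][0] / d, A[0][0] / d]]

alpha, beta = 1.5, 4.0
lhs = mat_sub(resolvent(alpha), resolvent(beta))
prod = mat_mul(resolvent(alpha), resolvent(beta))
rhs = [[(beta - alpha) * prod[i][j] for j in range(2)] for i in range(2)]
gap = max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2))
print(gap)
```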

Example (Lévy processes): let β ∈ ℝ, σ ≥ 0 and ν be a measure on ℝ with

∫_{−∞}^∞ x² / (1 + x²) dν(x) < ∞.

Let

Ψ(u) = iβu − σ²u²/2 + ∫_{−∞}^∞ ( e^{iux} − 1 − iux/(1 + x²) ) dν(x).


Proposition 4.36. e^{Ψ(u)} is the characteristic function of a probability measure, i.e.

∃µ: Ψ(u) = log ∫ e^{iux} dµ(x).

Proof. First assume ν = 0: take the Gaussian with mean β and variance σ². Next assume ν is finite, ν ≢ 0. Take λ = ν(ℝ) and µ = ν/λ as a probability measure. Let N ∼ Poisson(λ) and Y_1, Y_2, … be i.i.d. ∼ µ, and set

Z = Σ_{i=1}^N Y_i.

Then

log E(e^{iuZ}) = ∫_{−∞}^∞ ( e^{iux} − 1 ) dν(x),

which is of the form Ψ with the same ν, σ = 0 and β = ∫_{−∞}^∞ x/(1+x²) dν(x). Since independent sums multiply characteristic functions, e^{Ψ(u)} is a characteristic function for any β, σ and finite ν.

For general β, σ, ν, define ν_ε by ν_ε(A) = ν(A ∖ [−ε, ε]), which is finite, and set

Ψ_ε(u) := iβu − σ²u²/2 + ∫_{−∞}^∞ ( e^{iux} − 1 − iux/(1+x²) ) dν_ε(x);

one obtains Ψ_ε(u) → Ψ(u) pointwise. There is a theorem stating that a pointwise limit of characteristic functions (continuous at 0) is the characteristic function of the limit in distribution of the corresponding measures. This provides the claim. □

Corollary: e^{tΨ(u)} is a characteristic function for every t ≥ 0.

N.B. If X has characteristic function e^{Ψ(u)}, then X is infinitely divisible, i.e. for every n there are i.i.d. random variables Y_1, …, Y_n such that

X ∼ Σ_i Y_i,

by taking Y with characteristic function e^{Ψ(u)/n}.

For any t, let ξ_t be the random variable with characteristic function e^{tΨ(u)}.

Definition 4.37 (Lévy Semigroup). Let T(t) be

(T(t)f)(x) = E( f(x + ξ_t) ).


Claim: T(t) is a Feller semigroup.

N.B. (i) There is a process here.

(ii) Brownian motion is a Lévy process (β = 0, σ = 1, ν ≡ 0).

(iii) The compound Poisson process is one as well.

(iv) A Feller process with stationary independent increments is equivalent to a Lévy process.

4.7 Generators

Let D(·) denote the domain and R(·) the range.

Definition 4.38 (Generator). For D(L) ⊂ C(S), a linear operator L: D(L) → C(S) is called a generator if

(a) D(L) is dense in C(S);

(b) for f ∈ D(L) and λ ≥ 0:

inf_x f(x) ≥ inf_x ( (I − λL)f )(x);

(c) for any sufficiently small λ > 0:

R(I − λL) = C(S);

(d) if S is compact: 1 ∈ D(L) and L1 = 0. Otherwise, for any sufficiently small λ > 0 there exist f_n ∈ D(L), with g_n := (I − λL)f_n, such that

sup ‖f_n‖ < ∞;  sup ‖g_n‖ < ∞;  f_n → 1 pointwise;  g_n → 1 pointwise.

N.B. If f has a unique minimum at x, then (Lf)(x) ≥ 0.

Proposition 4.39. (b) implies, for any f ∈ D(L),

‖(I − λL)f‖ ≥ ‖f‖  (equivalently, with (c), ‖(I − λL)^{−1}‖ ≤ 1).

Proof. Applying (b) to f and to −f gives

inf_x (I − λL)f(x) ≤ inf_x f(x),  sup_x (I − λL)f(x) ≥ sup_x f(x). □

The goal is to prove:

process ⟺ semigroup ⟺ generator.


Process to Semigroup

Theorem 4.40. Given a Feller process, define

(T(t)f)(x) = E^x f(X(t)).

This defines a Feller semigroup.

Proof. (a) is clear.

(c):

(T(s+t)f)(x) = E^x f(X(s+t)) = E^x( E^{X(s)}(f(X(t))) ) = E^x( (T(t)f)(X(s)) ) = ( T(s)(T(t)f) )(x).

(d) is clear.

(e) Take increasing compact sets K_n with ∪ K_n = S, set f_n = 1_{K_n}, smoothed, with f_n ↑ 1 pointwise. By the monotone convergence theorem,

(T(t)f_n)(x) = E^x( f_n(X(t)) ) → 1.

(b) E^x(f(X(t))) → f(x) as t → 0 pointwise, by right-continuity of the paths of X(t); norm convergence needs more work. Define

Ũ(α): L^∞(S,ℝ) → L^∞(S,ℝ),  Ũ(α)f(x) := ∫_0^∞ e^{−αt} (T(t)f)(x) dt

(T(t) can be extended to L^∞(S,ℝ)). Some facts:

∗ for any f ∈ C(S): αŨ(α)f → f pointwise as α → ∞;

∗ Ũ(α) − Ũ(β) = (β − α) Ũ(α) Ũ(β);

∗ ‖Ũ(α)‖ ≤ 1/α.

Let R = Ũ(α)C(S), which is independent of α by the resolvent identity. Take f ∈ R, f = Ũ(α)g with g ∈ C(S):

(T(t)f)(x) = (T(t)Ũ(α)g)(x) = ∫_0^∞ e^{−sα} T(s+t)g(x) ds = ∫_t^∞ e^{−(r−t)α} T(r)g(x) dr,

while

f(x) = ∫_0^∞ e^{−rα} T(r)g(x) dr;

comparing the two gives ‖T(t)f − f‖ → 0 as t → 0.


Claim: R is dense in C(S).

Proof: every f ∈ C(S) can be written as a pointwise limit of f_n ∈ R with sup ‖f_n‖ < ∞ (take f_n = nŨ(n)f). Thus R is weakly dense in C(S) (because every continuous linear functional on C(S) is of the form f ↦ ∫ f(x) dµ(x)). Finally, every strongly closed subspace is weakly closed; thus R is dense in C(S), and (b) extends from R to all of C(S) by density and contractivity. □

Semigroup to Generator

Theorem 4.41. Given {T(t)}, define

Lf = lim_{t→0} ( T(t)f − f ) / t

on D(L) = {f | the limit exists}. Then

(i) L is a generator;

(ii) ∀α > 0: αU(α) = ( I − (1/α)L )^{−1};

(iii) if f ∈ D(L), then T(t)f ∈ D(L) and

d/dt T(t)f = L T(t)f = T(t) Lf;

(iv) ∀f ∈ C(S), t > 0: ( I − (t/n)L )^{−n} f → T(t)f as n → ∞; "T(t) = e^{tL}".

Proof. Suppose f = αU(α)g. Then

( T(t)f − f ) / t = (α/t) ∫_0^∞ e^{−αs} T(s+t)g ds − (α/t) ∫_0^∞ e^{−αs} T(s)g ds
= α (e^{αt} − 1)/t ∫_t^∞ e^{−αs} T(s)g ds − (α/t) ∫_0^t e^{−αs} T(s)g ds
→ α²U(α)g − αg = αf − αg as t → 0.

Thus f ∈ D(L) and Lf = αf − αg.

(i) (a): done, because R(U(α)) ⊂ D(L) is dense.


(b): Let g_t = f − λ ( T(t)f − f ) / t, so that

f ( 1 + λ/t ) = g_t + (λ/t) T(t)f.

Then

inf_x f(x) ( 1 + λ/t ) ≥ inf_x g_t(x) + (λ/t) inf_x T(t)f(x) ≥ inf_x g_t(x) + (λ/t) inf_x f(x),

⇒ inf_x f(x) ≥ lim_{t→0} inf_x g_t(x) = inf_x ( I − λL )f(x).

(c): For any g ∈ C(S) one obtains

( I − (1/α)L ) αU(α)g = g  ⇒  R( I − (1/α)L ) = C(S).

(d): Choose g_n with sup_n ‖g_n‖ < ∞ and T(t)g_n → 1 pointwise, and set

f_n = αU(α)g_n  ⇒  g_n = ( I − (1/α)L ) f_n.

Claim: f_n → 1 pointwise:

f_n(x) = α ∫_0^∞ e^{−αs} T(s)g_n(x) ds → 1 as n → ∞.

(ii) We know ( I − (1/α)L ) αU(α) = I. The claim is that also

∀f ∈ D(L): αU(α) ( I − (1/α)L ) f = f.

Let g = ( I − (1/α)L )f and h = αU(α)g; then g = ( I − (1/α)L )h, and I − (1/α)L is injective (by Proposition 4.39), so f = h.

(iii) For f ∈ D(L):

( T(t+s)f − T(t)f ) / s = T(t) ( T(s)f − f ) / s = ( T(s)(T(t)f) − T(t)f ) / s;

as s → 0, the left-hand side tends to d/dt T(t)f, the middle expression to T(t)Lf, and the right to L T(t)f.


(iv) ( I − (1/α)L )^{−n} f = α^n U(α)^n f. Let ξ_i be i.i.d. Exp(1) variables; then

αU(α)f = E( T(ξ_1/α) f ),  α^n U(α)^n f = E( T( (1/α) Σ_{i=1}^n ξ_i ) f ),

so with α = n/t and Y_n = (1/n) Σ_{i=1}^n ξ_i,

( I − (t/n)L )^{−n} f = E( T(tY_n) f ).

By the law of large numbers, ∀ε > 0: P(|Y_n − 1| > ε) → 0, and by strong continuity, ∀ε ∃δ: ∀s ∈ [(1−δ)t, (1+δ)t]: ‖T(t)f − T(s)f‖ < ε. Hence

‖E(T(tY_n)f) − T(t)f‖ ≤ E‖T(tY_n)f − T(t)f‖ ≤ E[ 1_{Y_n ∈ [1−δ,1+δ]} ] ε + 2 E[ 1_{Y_n ∉ [1−δ,1+δ]} ] ‖f‖,

so limsup_{n→∞} ‖E(T(tY_n)f) − T(t)f‖ ≤ ε. □

Examples:

(i) Brownian motion:

(T(t)f)(x) = (1/√(2πt)) ∫ f(x−y) e^{−y²/(2t)} dy = E^x( f(X(t)) ),

Lf = ½Δf,  D(L) = { f ∈ C(S) | Δf ∈ C(S) }.

(ii) Cauchy process: the Cauchy distribution has density

x ↦ (1/π) · t / (x² + t²).

Define

(T(t)f)(x) := (t/π) ∫ f(x+y) / (t² + y²) dy.

Claim: T(t) is a Feller semigroup. Take φ ∈ C², odd, with φ′(0) = 1; by oddness the φ-term integrates to zero, so

( T(t)f − f ) / t = (1/π) ∫ ( f(x+y) − f(x) − f′(x)φ(y) ) / (t² + y²) dy.

Then if f ∈ C², one obtains as t → 0

( T(t)f − f ) / t → (1/π) ∫ ( f(x+y) − f(x) − f′(x)φ(y) ) / y² dy.


Today: if L is a generator, then there exists a Feller semigroup T such that

Lf = lim_{t→0} ( T(t)f − f ) / t;

and if T is a Feller semigroup, then there exists a Feller process such that

(T(t)f)(x) = E^x( f(X(t)) ).

Motivation: one starts with an infinitesimal description, e.g.

∂u/∂t = ½Δu.

What does the infinitesimal description imply? Here

u(x,t) = (1/√(2πt)) ∫_{−∞}^∞ u(x−y, 0) e^{−y²/(2t)} dy.

This can be hard, as you can see from the Navier–Stokes equation for incompressible flows,

∂u/∂t = Δu − (u·∇)u + f.

Feller processes allow one to characterize processes by their infinitesimal descriptions.

Generator to semigroup: Let L be a generator. (I − εL)^{−1} exists for small ε > 0, and it is a contraction. Define

L_ε = L ( I − εL )^{−1}.

Note: if f ∈ R( (I − εL)^{−1} ), then there exists a g such that (I − εL)f = g, and f ∈ D(L).

Proposition 4.42.

‖L_ε‖ ≤ 2/ε.

Proof. Take g ∈ C(S) and f = (I − εL)^{−1}g, so εLf = f − g. Thus

‖L_ε g‖ = ‖Lf‖ = ‖f − g‖/ε ≤ ( ‖f‖ + ‖g‖ )/ε ≤ 2‖g‖/ε. □

Define

T_ε(t) = Σ_{k≥0} t^k L_ε^k / k!  (= e^{tL_ε}, well defined since L_ε is bounded).

Homework: L_ε is a generator and T_ε is a Feller semigroup with

L_ε f = lim_{t→0} ( T_ε(t)f − f ) / t.
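For a finite state space the whole Yosida scheme is concrete: L is a Q-matrix, (I − εL)^{−1} is a matrix inverse, and T_ε(t) = e^{tL_ε} can be summed as a power series. The sketch below (my own illustration; the 2×2 rates are hypothetical) checks that e^{tL_ε} approaches e^{tL} as ε → 0.

```python
# Hypothetical two-state Q-matrix L: rates 1 (0 -> 1) and 2 (1 -> 0).
L = [[-1.0, 1.0], [2.0, -2.0]]
I = [[1.0, 0.0], [0.0, 1.0]]

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv(A):
    d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / d, -A[0][1] / d], [-A[1][0] / d, A[0][0] / d]]

def expm(A, terms=60):
    """Truncated power series for e^A; enough terms for these small matrices."""
    E, P = [row[:] for row in I], [row[:] for row in I]
    for k in range(1, terms):
        P = mul(P, [[A[i][j] / k for j in range(2)] for i in range(2)])
        E = [[E[i][j] + P[i][j] for j in range(2)] for i in range(2)]
    return E

def yosida(eps):
    """L_eps = L (I - eps L)^{-1}, a bounded approximation of L."""
    return mul(L, inv([[I[i][j] - eps * L[i][j] for j in range(2)]
                       for i in range(2)]))

t = 1.0
exact = expm([[t * L[i][j] for j in range(2)] for i in range(2)])
errs = []
for eps in (0.1, 0.01, 0.001):
    Le = yosida(eps)
    approx = expm([[t * Le[i][j] for j in range(2)] for i in range(2)])
    errs.append(max(abs(approx[i][j] - exact[i][j])
                    for i in range(2) for j in range(2)))
print(errs)
```

The error should shrink roughly linearly in ε, mirroring L_ε f → Lf.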


Proposition 4.43. If λ < ε/2, then

R( I − λL_ε ) = C(S).

Proof. For g ∈ C(S), solve ( I − λL_ε )f = g for f:

f = Σ_k λ^k L_ε^k g

converges if λ‖L_ε‖ < 1. □

Proposition 4.44. (i) T_ε(t), T_δ(s), L_ε, L_δ all commute.

(ii) (I − εL)^{−1} f → f as ε → 0.

(iii) ∀f ∈ D(L): L_ε f → Lf.

Proof. (i) Note εL_ε = (I − εL)^{−1} − I.

∗ (I − εL)^{−1} and (I − δL)^{−1} commute: if g = (I − εL)^{−1}(I − δL)^{−1} f, then

f = g − δLg − εLg + εδL²g,

which is symmetric in ε and δ.

∗ Thus L_ε and L_δ commute.

∗ Also T_ε(s) and T_δ(t) commute, and they commute with L_ε and L_δ.

(ii) For f ∈ D(L),

(I − εL)^{−1} f − f = ε (I − εL)^{−1} Lf,  so ‖(I − εL)^{−1}f − f‖ ≤ ε‖Lf‖ → 0 as ε → 0,

which then holds for any f ∈ C(S) by density.

(iii) For f ∈ D(L): L_ε f = (I − εL)^{−1} Lf → Lf as ε → 0. □

Theorem 4.45 (Hille–Yosida). (i) ∀f ∈ C(S): lim_{ε→0} T_ε(t)f exists, uniformly for bounded t, i.e. for every C there is a limit g(·) with

lim_{ε→0} sup_{t∈[0,C]} ‖T_ε(t)f − g(t)‖ = 0.

(ii) T(t)f := lim_{ε→0} T_ε(t)f defines a Feller semigroup.

(iii) L is the generator of T.


Proof. (i)

T_ε(t)f − T_δ(t)f = ∫_0^t d/ds [ T_ε(s) T_δ(t−s) f ] ds
= ∫_0^t ( L_ε T_ε(s) T_δ(t−s) − T_ε(s) L_δ T_δ(t−s) ) f ds
= ∫_0^t T_ε(s) T_δ(t−s) ( L_ε − L_δ ) f ds,

so

‖T_ε(t)f − T_δ(t)f‖ ≤ t ‖( L_ε − L_δ )f‖.

Thus, for f ∈ D(L), T_ε(t)f is Cauchy as ε → 0, uniformly on bounded t-intervals; this implies that the limit exists. For f ∈ C(S), density and the fact that T_ε(t) is a contraction provide the claim.

(ii) (a) obvious.

(b) s ↦ T_ε(s)f is continuous on [0,t] and converges uniformly to s ↦ T(s)f, which in turn is continuous.

(c) T(s+t)f = lim_{ε→0} T_ε(s+t)f = lim_{ε→0} T_ε(s)T_ε(t)f, and

‖T_ε(s)T_ε(t)f − T(s)T(t)f‖ ≤ ‖T_ε(s)T_ε(t)f − T_ε(s)T(t)f‖ + ‖T_ε(s)T(t)f − T(s)T(t)f‖
≤ ‖T_ε(t)f − T(t)f‖ + ‖T_ε(s)T(t)f − T(s)T(t)f‖ → 0 as ε → 0,

⇒ lim_{ε→0} T_ε(s)T_ε(t)f = T(s)T(t)f.

(d) obvious.

(e) ∗ Define

U_ε(α)f = ∫_0^∞ e^{−αt} T_ε(t)f dt,  U(α)f = ∫_0^∞ e^{−αt} T(t)f dt.

∗ ∀f ∈ C(S): U_ε(α)f → U(α)f. Thus taking the limit in

αU_ε(α) = ( I − (1/α)L_ε )^{−1}

yields

αU(α) = ( I − (1/α)L )^{−1}.

(Corollary: ∀λ > 0: R(I − λL) = C(S).)


Take f_n ∈ D(L) and g_n = ( I − (1/α)L )f_n with sup_n ‖g_n‖ < ∞ and f_n, g_n → 1 pointwise. Then
  f_n = αU(α)g_n = α ∫_0^∞ e^{−αt} T(t)g_n dt,
so pointwise
  1 ←(n→∞)← f_n(x) = α ∫_0^∞ e^{−αt} (T(t)g_n)(x) dt.
Let µ_{x,t} be the measure such that (T(t)f)(x) = ∫ f dµ_{x,t} for every f. Then
  1 = α lim_{n→∞} ∫_0^∞ e^{−αt} ( ∫ g_n dµ_{x,t} ) dt = α ∫_0^∞ e^{−αt} µ_{x,t}(S) dt,
where µ_{x,t}(S) is continuous in t: the Feller property implies µ_{x,s} → µ_{x,t} distributionally as s → t, so
  lim_{s→t} µ_{x,s}(S) = µ_{x,t}(S).
Since the identity above holds for every α > 0, it follows that µ_{x,t}(S) ≡ 1, which in turn implies (T(t)g_n)(x) → 1.

(iii) Take f ∈ D(L). Then
  T_ε(t)f − f = ∫_0^t T_ε(s)L_ε f ds  →(ε→0)  T(t)f − f = ∫_0^t T(s)Lf ds,
and hence
  (T(t)f − f)/t = (1/t) ∫_0^t T(s)Lf ds → Lf  as t → 0.
Thus D(L) ⊆ { f | lim_{t→0} (T(t)f − f)/t exists }, i.e. the generator L̃ of T is an extension of L. Since both L and L̃ are dissipative with full range I − λ(·), this implies L = L̃, so L is the generator of T. ∎

Semigroup to Process. Let T(t) be a Feller semigroup. For every finite collection of times t_1 < ⋯ < t_k and every x ∈ S, there exists a probability measure P^x_{t_1⋯t_k} on S^k such that
  E^x_{t_1⋯t_k} ∏_i f_i(X_i) = T(t_1)[ f_1 · T(t_2−t_1)[ f_2 · T(t_3−t_2)[ ⋯ ] ] ](x).


Note: this is a consistent family, in the sense that if
  Π_i : (x_1, …, x_k) ↦ (x_1, …, x_{i−1}, x_{i+1}, …, x_k),
then
  Π_i P^x_{t_1⋯t_k} = P^x_{t_1⋯t_{i−1}, t_{i+1}⋯t_k}.
The Kolmogorov extension theorem therefore provides the existence of a probability measure P^x on S^{Q_+} such that
  ∀f ∈ C(S), ∀t ∈ Q_+ :  E^x f(X(t)) = (T(t)f)(x).   (∗)

Proposition 4.46. If (M(s), s ∈ Q_+) is a bounded super-/submartingale, then a.s., along s ∈ Q_+,
  lim_{s↗t} M(s) exists for every t > 0,  and  lim_{s↘t} M(s) exists for every t ≥ 0.

Proof. There are not many upcrossings close to t. ∎

Proposition 4.47. Let (M(s), s ∈ Q_+ ∩ [0,t]) be a non-negative supermartingale. Then
  P( M(t) > 0, inf_{s∈[0,t]} M(s) = 0 ) = 0.

Proof. Take s_1, …, s_n ∈ [0,t] ∩ Q_+. The following sets are pairwise disjoint:
  A_{k,ε} = { M(s_i) ≥ ε for i ∈ [k−1], M(s_k) < ε }.
Then
  E[ M(t) 1_{A_{k,ε}} ] ≤ E[ M(s_k) 1_{A_{k,ε}} ] ≤ ε P(A_{k,ε}),
so
  E[ M(t) 1_{∪_k A_{k,ε}} ] ≤ ε.
Take s_1, s_2, … to be an enumeration of [0,t] ∩ Q_+; then
  ∪_{k=1}^n A_{k,ε} ↗ { inf_{s∈[0,t]} M(s) < ε },
which implies
  E[ M(t) 1_{{inf_{s∈[0,t]} M(s) < ε}} ] ≤ ε,   and hence   E[ M(t) 1_{{inf_s M(s) = 0}} ] = 0. ∎


Let (Y(t) : t ∈ Q_+) be a process satisfying (∗). Then for any f ∈ C(S),
  e^{−αt} T(t)U(α)f = e^{−αt} ∫_0^∞ e^{−αs} T(s+t)f ds = ∫_t^∞ e^{−αs} T(s)f ds ≤ U(α)f,
so
  e^{−αt} E^x[ (U(α)f)(Y(t)) ] ≤ (U(α)f)(x),
where the left-hand side is just the first line rewritten. Thus e^{−αt} U(α)f(Y(t)) is a supermartingale, which in turn implies that it a.s. has left and right limits at every t ∈ R.

Let F ⊆ C(S) be countable and dense (possible because C(S) is assumed to be Polish). Then a.s.,
  for every f ∈ F and every n ∈ N, e^{−nt} U(n)f(Y(t)) has left and right limits everywhere.
Also nU(n)f → f as n → ∞, so every f ∈ C(S) can be approximated by functions of the form nU(n)f_n with f_n ∈ F. Hence a.s., for every f ∈ C(S), e^{−nt} U(n)f(Y(t)) has left and right limits everywhere.

Now suppose that for every t, with probability one, there exists a compact set K such that Y(s) ∈ K for all s ∈ [0,t]. If there were sequences s_n ↘ s and s̃_n ↘ s with Y(s_n) → y, Y(s̃_n) → ỹ and y ≠ ỹ, then choosing f ∈ C(S) with f(y) = 1, f(ỹ) = 0 would imply that
  e^{−nt} U(n)f(Y(r))
has no limit as r ↘ s — a contradiction.

Claim: for every t, P( ∃K compact : Y(s) ∈ K for all s ∈ [0,t] ) = 1.

Proof. Take f > 0, f ∈ C(S). Then
  M(t) = e^{−t} U(1)f(Y(t))
is a positive supermartingale. By one of the previous lemmas about martingales, we obtain with probability one
  inf_{s∈[0,t]} M(s) > 0.
Take ε := inf_{s∈[0,t]} M(s) and g = e^{−t} U(1)f; then g(Y(s)) ≥ ε for all s ∈ [0,t]. There exists a compact K such that x ∉ K implies |g(x)| < ε, which in turn implies Y(s) ∈ K for all s ∈ [0,t]. ∎

Define X(s) = lim_{t↘s, t∈Q_+} Y(t); this defines the process X on some abstract probability space. In this way we can define X on the space of all càdlàg functions
  Ω := { f : [0,∞) → S | f càdlàg }. ∎


4.8 Continuity

Theorem 4.48. Suppose that X(t) is an S-valued stochastic process, ρ a metric on S, and
  E[ ρ(X(t), X(s))^γ ] ≤ c |s−t|^{1+δ}
for some γ, δ, c > 0. Then the càdlàg paths are actually continuous.

Proof. Fix N ∈ N and let D = { j2^{−n} | j, n ∈ N } be the dyadic rationals. We want to show continuity on [0, N] ∩ D. Let
  Δ_m = max_{0 ≤ j ≤ N2^m − 1} ρ( X(j2^{−m}), X((j+1)2^{−m}) ).
Then
  E[ Δ_m^γ ] ≤ Σ_{j=0}^{N2^m−1} E[ ρ(X(j2^{−m}), X((j+1)2^{−m}))^γ ] ≤ N2^m c 2^{−m(1+δ)} = cN 2^{−mδ},
so by Markov's inequality
  P( Δ_m ≥ 2^{−mδ/(2γ)} ) ≤ cN 2^{−mδ/2}.
Borel–Cantelli provides a.s. the existence of an m* such that Δ_m ≤ 2^{−mδ/(2γ)} for all m ≥ m*; in particular Σ_m Δ_m < ∞ a.s. Take s, t ∈ D ∩ [0,N] with |s−t| ≤ 2^{−n}, and set
  s_m = ⌊s2^m⌋ 2^{−m},   t_m = ⌊t2^m⌋ 2^{−m}.
For sufficiently large m one obtains s_m = s and t_m = t, and |s_n − t_n| ≤ 2^{−n}. Then
  ρ(X(s), X(t)) ≤ ρ(X(s_n), X(t_n)) + Σ_{m≥n} ρ(X(s_m), X(s_{m+1})) + Σ_{m≥n} ρ(X(t_m), X(t_{m+1}))
               ≤ Δ_n + 2 Σ_{m≥n} Δ_m → 0  as n → ∞. ∎
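N.B. The moment condition of Theorem 4.48 can be checked empirically for Brownian motion, where E ρ(X(t), X(s))^4 = 3|t−s|^2, i.e. γ = 4, δ = 1, c = 3. A small Monte Carlo sketch (sample sizes and tolerances are illustrative, not part of the lecture):

```python
import numpy as np

# Empirical check of the Kolmogorov-type moment bound of Theorem 4.48
# for Brownian motion: E|B(t) - B(s)|^4 = 3|t - s|^2, i.e. gamma = 4,
# delta = 1, c = 3 in  E rho(X(t), X(s))^gamma <= c |t - s|^(1+delta).
rng = np.random.default_rng(0)

def increment_fourth_moment(dt, n_samples=200_000):
    """Monte Carlo estimate of E|B(t+dt) - B(t)|^4 for Brownian motion."""
    incr = rng.normal(0.0, np.sqrt(dt), size=n_samples)
    return np.mean(incr**4)

for dt in [0.5, 0.1, 0.02]:
    est = increment_fourth_moment(dt)
    print(dt, est, 3 * dt**2)  # estimate should be close to 3*dt^2
```

Since the exponent 1+δ = 2 exceeds 1, the theorem applies and Brownian paths are continuous — in contrast to the compound Poisson non-example below, whose fourth moment decays only linearly in dt.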


Non-example. The compound Poisson process with jump measure µ. By stationarity of increments,
  E^x |X(t) − X(t+s)|^γ = E^0 |X(s)|^γ,
and (for unit jump rate)
  E^0 f(X(s)) = e^{−s} Σ_{k≥0} (s^k / k!) E[ f(ξ_1 + ⋯ + ξ_k) ],
where the ξ_i are i.i.d. with law µ. Keeping only the k = 1 term,
  E^0 |X(s)|^γ ≥ e^{−s} s E|ξ_1|^γ,
which is of order s, not s^{1+δ}, for small s. So the continuity theorem does not apply — which is consistent with the construction of the CPP, whose paths jump.
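A numerical sketch of the failure of the moment condition (the jump law µ = N(0,1) and γ = 4 are illustrative choices, not from the lecture):

```python
import numpy as np

# Small-time moments of a compound Poisson process (rate 1), illustrating
# why the continuity criterion fails: E^0 |X(s)|^gamma decays like s, not
# like s^(1+delta).  Jump law mu = N(0,1) is an illustrative choice.
rng = np.random.default_rng(1)

def cpp_moment(s, gamma=4, n_samples=400_000):
    """Monte Carlo estimate of E^0 |X(s)|^gamma for a rate-1 CPP."""
    counts = rng.poisson(s, size=n_samples)
    # conditionally on k jumps, X(s) is a sum of k iid N(0,1), i.e. N(0, k)
    vals = rng.normal(0.0, 1.0, size=n_samples) * np.sqrt(counts)
    return np.mean(np.abs(vals)**gamma)

for s in [0.2, 0.1, 0.05]:
    print(s, cpp_moment(s) / s)  # the ratio stays bounded away from 0
```

For this choice E^0|X(s)|^4 = 3(s + s^2), so the ratio tends to 3 as s → 0: there is no bound of order s^{1+δ}.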

4.9 More on Generators

Motivation: is it enough to specify the generator on a subset of its domain?

Definition 4.49 (closed linear operator). A linear operator L : C(S) → C(S) is closed if its graph
  graph(L) := { (f, Lf) | f ∈ D(L) }
is closed.

Definition 4.50 (closure of a linear operator). If L, M are linear operators on C(S) and graph(M) is the closure of graph(L), then M is called the closure of L, written M = L̄.

N.B. ∗ The closure is unique.
∗ It may not exist.

Homework: L has a closure iff (f_n → 0, Lf_n → g) ⇒ g = 0.

Recall the definition of generators (properties (i)–(iv)). Some more characteristics:

Proposition 4.51.
  (i) (i)+(ii) ⇒ L̄ exists and L̄ satisfies (i) and (ii).
  (ii) (i)+(ii)+(iii) ⇒ L is closed.
  (iii) (ii)+(iii) ⇒ for every λ > 0, R(I − λL) = C(S).
  (iv) (ii)+closed ⇒ R(I − λL) is closed.

(iv) (ii)+closed) R(I � �L) closed.

Proof. (i) Supposed fn

! 0, Lfn

! h. Want h = 0. Choose g 2 D(L) “g = h”

||(I � �L)(fn

+ �g)|| � ||fn

+ �||#

||�g � �h� �2Lg|| � �||g||

51

Page 52: Markov Processes - uni-bonn.de · 2016-02-09 · Markov Processes Joe Neeman Notes Max Goldowsky February 2, 2016 WS2015/16 N.B.: These notes were typed during the lectures. I didn’t

4 CONTINUOUS TIME MARKOV CHAINS Joe Neeman

so one obtains for any � > 0

||g � h� �Lg|| � ||g||g ! h) ||h|| 0

) h ⌘ 0

L satisfies (i). Want: If f 2 D(L) ) ||(I � �L)f || � ||f ||. Take D(L) 3 fn

!f, Lf

n

! Lf , then

(I � �L)fn

! (I � �L)f||(I � �L)f

n

|| � ||fn

||||(i� �L)f || � lim inf ||f

n

||= ||f ||

(ii) f 2 D(L),� > 0 small, then there exists an h 2 D(L), s.t.

(I � �L)h = (I � �L)f) (I � �L)(f � h) = 0

) f = h

) f 2 D(L)

(iii) Fix g 2 C(S), take � > 0 small, � > 0. Want to find an f , s.t. (I � �L)f = g,which holds, iff

�(I � �L)f = �g + (� � �)fwhich in turn holds, iff

�f = �(I � �L)�1g + (� � �)(I � �L)�1f

Now let

� : C(S)! C(S); ��h = �(I � �L)�1g + (� � �)(I � �L)�1h

then

�||�h1 � �h2|| (� � �)||(I � �L)�1(h1 � h2)||

(� � �)||h1 � h2||thus by Banach’s fixed point Thm. there exists an f 2 C(S) : �f = f

(iv) gn

2 R(I��L), gm

! g. We want g 2 R(I��L). There exists an fn

: gn

(I��L)fn

,s.t. f

n

! f , say. L closed implies f 2 D(L), (I � �L)f = gm
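The Banach fixed-point step in part (iii) is constructive. A finite-state sketch (the 3-state rate matrix Q below stands in for L and is an arbitrary illustrative choice): given the resolvent (I − γL)^{−1} for a small γ, the iteration of Φ converges to the solution of (I − λL)f = g for a larger λ.

```python
import numpy as np

# The fixed-point step in Prop. 4.51(iii): given the resolvent
# (I - gamma L)^{-1} for a small gamma, solve (I - lambda L) f = g for a
# larger lambda by iterating
#   Phi(h) = (1/lambda) * (I - gamma L)^{-1} (gamma g + (lambda - gamma) h).
# Finite-state sketch: an illustrative 3-state rate matrix Q plays L.
Q = np.array([[-1.0, 1.0, 0.0],
              [0.5, -1.0, 0.5],
              [0.0, 2.0, -2.0]])
g = np.array([1.0, -2.0, 0.5])
gamma, lam = 0.1, 1.0

R = np.linalg.inv(np.eye(3) - gamma * Q)  # (I - gamma L)^{-1}
f = np.zeros(3)
for _ in range(200):  # contraction factor (lam - gamma)/lam = 0.9
    f = (R @ (gamma * g + (lam - gamma) * f)) / lam

print(f, (np.eye(3) - lam * Q) @ f)  # second vector should reproduce g
```

Here (I − γQ)^{−1} has sup-norm 1 (it is the resolvent of a Markov semigroup), so the iteration contracts with factor (λ − γ)/λ, exactly as in the proof.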

Question: given a generator L, is L determined by L|_D for some dense D ⊆ D(L)?


Warning: knowing what L does on a dense subset of D(L) does not determine L!

Corollary. If L is a generator and D ⊆ D(L), then
  closure(L|_D) = L  ⟺  (I − λL)D is dense.

Proof. L|_D satisfies every axiom of a generator except possibly (iii), and hence so does closure(L|_D). If (I − λL)D is dense then, since R(I − λ closure(L|_D)) is closed by Proposition 4.51(iv), R(I − λ closure(L|_D)) = C(S), so closure(L|_D) is a generator and must coincide with L. If (I − λL)D is not dense, then
  R( I − λ closure(L|_D) ) ⊆ closure( (I − λL)D ) ≠ C(S),
so closure(L|_D) ≠ L. ∎

Definition 4.52 (core of a generator). D ⊆ D(L) is a core for L if
  closure(L|_D) = L.

Example (absorbed Brownian motion). Let X(t) be Brownian motion and
  τ := inf{ t ≥ 0 | X(t) = 0 }.
Let
  X_a(t) = X(t ∧ τ) = X(t) if t < τ,  0 if t ≥ τ,
on the state space R_0^+.

∗ X_a(t) is a Markov process.
∗ X_a(t) is a Feller process: for any f ∈ C(R_0^+) define the odd reflection
  f_0(x) = f(x) if x > 0,  2f(0) − f(−x) if x ≤ 0.
Then
  E^x f(X_a(t)) = E^x f_0(X(t)),   (T_a(t)f)(x) = E^x f(X_a(t)),   (T_a(t)f)_0 = T(t)f_0.
Therefore X_a(t) is a Feller process.

Recall: L is the generator of X(t), with Lf = f″/2 and D(L) = { f ∈ C(S) | f′, f″ ∈ C(S) }. On R_0^+,
  (T_a(t)f − f)/t = (T(t)f_0 − f_0)/t → f″/2  as t → 0
if f_0 ∈ D(L), which holds iff f′, f″ ∈ C(R_0^+) and f″(0) = 0 (continuity of f_0″ at 0 forces f″(0) = −f″(0)). So
  L_a f = f″/2,   D(L_a) = { f | f′, f″ ∈ C(R_0^+), f″(0) = 0 }.
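The reflection identity E^x f(X_a(t)) = E^x f_0(X(t)) can be checked by simulation. A sketch (the test function f(y) = e^{−y}, the Euler grid, and the tolerances are illustrative; the Euler scheme slightly under-detects hitting of 0, so agreement is approximate):

```python
import numpy as np

# Check of the reflection identity for absorbed Brownian motion:
#   E^x f(X_a(t)) = E^x f_0(X(t)),
# where f_0 is the odd reflection of f about (0, f(0)).
rng = np.random.default_rng(2)

def absorbed_bm_mean(x, t, f, n_paths=100_000, n_steps=400):
    """E^x f(X(t ^ tau)) by Euler simulation, tau = hitting time of 0."""
    dt = t / n_steps
    pos = np.full(n_paths, float(x))
    alive = np.ones(n_paths, dtype=bool)
    for _ in range(n_steps):
        pos[alive] += rng.normal(0.0, np.sqrt(dt), size=alive.sum())
        hit = alive & (pos <= 0.0)
        pos[hit] = 0.0          # absorbed paths stay at 0
        alive &= ~hit
    return np.mean(f(pos))

def reflection_formula_mean(x, t, f, n_samples=400_000):
    """E^x f_0(X(t)) with f_0(y) = f(y) for y > 0, 2 f(0) - f(-y) else."""
    y = x + rng.normal(0.0, np.sqrt(t), size=n_samples)
    return np.mean(np.where(y > 0, f(y), 2 * f(0.0) - f(-y)))

f = lambda y: np.exp(-y)
print(absorbed_bm_mean(1.0, 1.0, f), reflection_formula_mean(1.0, 1.0, f))
```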


Example (reflected Brownian motion). X_r(t) = |X(t)|.
∗ X_r(t) is a Markov process.
∗ X_r(t) is a Feller process: for any f ∈ C(R_0^+) let the even reflection be
  f_e(x) = f(|x|) = f(x) if x > 0,  f(−x) if x ≤ 0.
Then
  E^x f(X_r(t)) = E^x f_e(X(t)),   T_r(t)f = T(t)f_e for all t ≥ 0,
and
  (T_r(t)f − f)/t = (T(t)f_e − f_e)/t → f_e″/2 = f″/2
if f_e ∈ D(L). One obtains
  L_r f = f″/2,   D(L_r) = { f ∈ C(R_0^+) | f′, f″ ∈ C[0,∞), f′(0) = 0 }.

N.B. D(L_r) ∩ D(L_a) = { f | f′, f″ ∈ C(R_0^+), f′(0) = 0, f″(0) = 0 } is not a core for L_a or L_r.

Example (Brownian motion with sticky boundary). Fix c > 0 and set
  L_c f = f″/2,   D(L_c) = { f | f′, f″ ∈ C(R_0^+), f′(0) = c f″(0) }.
For c = 0 we obtain L_r; as c → ∞ one obtains L_a.

Proposition 4.53. L_c is a generator.

Proof. (i) easy.

(ii) We check inf_x ( f(x) − (λ/2) f″(x) ) ≤ inf_x f(x). Suppose x_0 is the global minimum of f.
  ∗ If x_0 > 0, then f″(x_0) ≥ 0.
  ∗ If x_0 = 0, then f′(x_0) ≥ 0, so f″(x_0) = f′(x_0)/c ≥ 0.
In either case L_c f(x_0) ≥ 0, so (I − λL_c)f(x_0) ≤ f(x_0) = inf f. The case where there is no global minimum is left as an exercise.

(iii) R(I − λL_c) = C(S): fix g ∈ C(S) and let f_a ∈ D(L_a), f_r ∈ D(L_r) satisfy
  (I − λL_a)f_a = g,   (I − λL_r)f_r = g.
Let β satisfy
  β f_a′(0) = c(1 − β) f_r″(0).
Then f = βf_a + (1 − β)f_r ∈ D(L_c) (note f′(0) = βf_a′(0) and f″(0) = (1 − β)f_r″(0), since f_r′(0) = 0 and f_a″(0) = 0) and (I − λL_c)f = g.

(iv) The constant function 1 satisfies 1′(0) = c · 1″(0) = 0, so 1 ∈ D(L_c) and L_c 1 = 0. ∎

If T(t) is the Brownian semigroup, then
  (T(t)f − f)/t → f″/2  for every f : R → R with f″ ∈ C(R).
With
  L_c f = f″/2,   D(L_c) = { f | f′, f″ ∈ C(R_0^+), f′(0) = c f″(0) },
the equation (I − λL_c)f = g can be solved by taking
  (I − λL_a)f_a = g,   (I − λL_r)f_r = g,
and f = βf_a + (1 − β)f_r.

Proposition 4.54. β = 2c / ( 2c + √(2λ) ).

Proof. h = f_a − f_r satisfies
  h − (λ/2)h″ = 0  ⇒  h(x) = h(0) e^{−√(2/λ) x}.
Since f_r′(0) = 0 and f_a″(0) = 0,
  f_a′(0) = h′(0) = −√(2/λ) h(0),   −f_r″(0) = h″(0) = (2/λ) h(0).
Plugging into β f_a′(0) = c(1 − β) f_r″(0) gives
  β √(2/λ) = c(1 − β)(2/λ),   i.e.   β √(2λ) = 2c(1 − β),
so β = 2c / ( 2c + √(2λ) ). ∎
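The algebra of Proposition 4.54 is easy to verify numerically (the parameter values below are arbitrary):

```python
import numpy as np

# Check of Prop. 4.54: with h = f_a - f_r = h(0) exp(-sqrt(2/lam) x),
# the boundary data are  f_a'(0) = -sqrt(2/lam) h(0)  and
# f_r''(0) = -(2/lam) h(0).  The matching condition
#   beta f_a'(0) = c (1 - beta) f_r''(0)
# then forces  beta = 2c / (2c + sqrt(2 lam)).
for c, lam, h0 in [(0.5, 1.0, 1.3), (2.0, 0.25, -0.7)]:
    fa_prime = -np.sqrt(2.0 / lam) * h0
    fr_second = -(2.0 / lam) * h0
    beta = 2 * c / (2 * c + np.sqrt(2 * lam))
    print(beta * fa_prime - c * (1 - beta) * fr_second)  # should be ~0
```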

Proposition 4.55. For every t > 0, P^0( X_c(t) = 0 ) ∈ (0, 1).

Proof. Take g ∈ C(R_0^+) and α > 0, and let ( I − (1/α)L )f = g. Then (with λ = 1/α in Proposition 4.54)
  f(x) = [ 2c f_a(x) + √(2/α) f_r(x) ] / ( 2c + √(2/α) ),
where
  f_a(x) = α ∫_0^∞ e^{−αt} E^x g(X_a(t)) dt,   f_r(x) = α ∫_0^∞ e^{−αt} E^x g(X_r(t)) dt.


Take g ↗ 1_{(0,∞)}. This implies f_a(0) = 0 (started at 0, the absorbed process stays at 0), while f_r(0) ↗ 1, and
  f(0) ↗ α ∫_0^∞ e^{−αt} P^0( X_c(t) > 0 ) dt.
Hence
  α ∫_0^∞ e^{−αt} P^0( X_c(t) > 0 ) dt = √(2/α) / ( 2c + √(2/α) ) = 1 / ( 1 + √(2α) c ),
which lies strictly in (0,1) for every α > 0; this gives the claim.

Claim: lim_{t→0} P^0( X_c(t) = 0 ) = 1. Equivalently, lim_{t→0} P^0( X_c(t) > 0 ) = 0, which follows (by an Abelian argument) from
  lim_{α→∞} α ∫_0^∞ e^{−αt} P^0( X_c(t) > 0 ) dt = lim_{α→∞} 1 / ( 1 + √(2α) c ) = 0. ∎

Proposition 4.56. Let Y(t) be a Feller process on R_0^+ with continuous paths, and
  τ = inf{ t > 0 | Y(t) > 0 }.
Then P^0( 0 < τ < ∞ ) = 0.

Proof. Let Z_ε = 1_{{Y(t) > 0 for all 0 < t < ε}}. By the strong Markov property, on {τ < ∞} (where Y(τ) = 0 by continuity) one obtains
  E^0 Z_ε = E^0( Z_ε ∘ θ_τ | F_τ ).
We know Z_ε ∘ θ_τ → 1 as ε → 0 on {τ < ∞}, so the whole term tends to 1 as ε → 0. On the other hand,
  lim_{ε→0} Z_ε ≤ 1_{{τ = 0}},
so E^0 1_{{τ=0}} ≥ 1, still on {τ < ∞}; i.e. P^0( 0 < τ < ∞ ) = 0. ∎

N.B. X_c can be constructed as a Brownian motion time-changed using the local time L(t) at 0: with a suitable constant ρ = ρ(c),
  N(t) = t − ρL(t),   X_c(t) = X(N(t)).

4.10 Martingales

Let X(t) be a Feller process and L its generator.

Theorem 4.57. For every f ∈ D(L) and every x ∈ S, the following is a P^x-martingale:
  M(t) = f(X(t)) − ∫_0^t Lf(X(s)) ds.
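Theorem 4.57 can be sanity-checked for Brownian motion: with Lf = f″/2 and f(x) = x², one has Lf ≡ 1, so M(t) = X(t)² − t should have constant expectation f(x) = x². A sketch (parameters illustrative):

```python
import numpy as np

# Sanity check of Theorem 4.57 for Brownian motion: with L f = f''/2 and
# f(x) = x^2 we get Lf = 1, so M(t) = X(t)^2 - t should have constant
# expectation f(x) under P^x.
rng = np.random.default_rng(3)

def dynkin_mean(x, t, n_paths=500_000):
    """E^x[ X(t)^2 - t ] for Brownian motion started at x."""
    xt = x + rng.normal(0.0, np.sqrt(t), size=n_paths)
    return np.mean(xt**2) - t

for t in [0.5, 1.0, 2.0]:
    print(t, dynkin_mean(1.5, t))  # should all be near f(x) = 2.25
```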


N.B. P^x( M(0) = f(x) ) = 1.

Proof.
  E^x M(t) = E^x f(X(t)) − ∫_0^t E^x( Lf(X(s)) ) ds
           = (T(t)f)(x) − ∫_0^t (LT(s)f)(x) ds
           = (T(t)f)(x) − ∫_0^t d/ds (T(s)f)(x) ds
           = f(x) = M(0).
Call this last equality (∗). For s < t,
  E^x( M(t) | F_s ) = E^x( f(X(t−s)) ∘ θ_s − ∫_0^{t−s} Lf(X(r)) ∘ θ_s dr − ∫_0^s Lf(X(r)) dr | F_s )
                    = E^{X(s)} f(X(t−s)) − ∫_0^{t−s} E^{X(s)} Lf(X(r)) dr − ∫_0^s Lf(X(r)) dr
                    = E^{X(s)} M(t−s) − ∫_0^s Lf(X(r)) dr
                   (∗)= f(X(s)) − ∫_0^s Lf(X(r)) dr
                    = M(s).
Moreover |M(t)| ≤ ‖f‖ + t‖Lf‖, so M is integrable. ∎

Theorem 4.58. If P is a probability measure on Ω = { càdlàg functions [0,∞) → S } satisfying
  ∗ P( X(0) = x ) = 1,
  ∗ for every f ∈ D(L), f(X(t)) − ∫_0^t Lf(X(s)) ds is a martingale,   (∗∗)
then P = P^x.

N.B. Two approaches to construct {P^x} from L:
  (i) L → L_ε → T_ε → T → {P^x} by Hille–Yosida;
  (ii) given L, prove that there exists a unique {P^x} satisfying (∗∗).
"Martingale problem: given L, find {P^x} satisfying (∗∗)."


Proof. Take α > 0 and g ∈ C(S), and let ( I − (1/α)L )f = g. By (∗∗),
  E( f(X(t)) − ∫_s^t Lf(X(r)) dr − f(X(s)) | F_s ) = 0    [the bracket is M(t) − M(s)].
Multiply by αe^{−αt} and integrate over t ∈ [s, ∞); using
  ∫_s^∞ αe^{−αt} ∫_s^t Lf(X(r)) dr dt = ∫_s^∞ ∫_r^∞ αe^{−αt} Lf(X(r)) dt dr = ∫_s^∞ e^{−αr} Lf(X(r)) dr,
we obtain
  E( ∫_s^∞ [ αe^{−αt} f(X(t)) − e^{−αt} Lf(X(t)) ] dt | F_s ) = ∫_s^∞ αe^{−αt} dt · f(X(s)) = e^{−αs} f(X(s)).
Since αf − Lf = αg, the left-hand side equals
  E( ∫_s^∞ αe^{−αt} g(X(t)) dt | F_s ).
Thus we obtain, for every A ∈ F_s,
  E[ 1_A ∫_s^∞ αe^{−αt} g(X(t)) dt ] = E[ 1_A e^{−αs} f(X(s)) ],
as well as
  E[ 1_A ∫_0^∞ αe^{−αt} g(X(s+t)) dt ] = E[ 1_A f(X(s)) ].
For A = Ω and s = 0 one obtains
  ∫_0^∞ αe^{−αt} E g(X(t)) dt = f(x).
The same computation with P^x instead of P provides
  ∫_0^∞ αe^{−αt} E^x g(X(t)) dt = f(x).
Uniqueness of the Laplace transform provides
  E g(X(t)) = E^x g(X(t))  for all g ∈ C(S) and t ≥ 0,
thus for every t, X(t) has the same distribution under P and P^x. Suppose now that for some finite collection of times t_i, (X(t_1), …, X(t_k)) has the same distribution under P and P^x. Take A depending only on (X(t_1), …, X(t_k)), s = t_k, t = t_{k+1} − t_k; then
  E[ 1_A ∫_0^∞ αe^{−αt} g(X(s+t)) dt ] = E( 1_A f(X(t_k)) ) = E^x( 1_A f(X(t_k)) )


by the induction hypothesis. One obtains
  E^x[ 1_A ∫_0^∞ αe^{−αt} g(X(s+t)) dt ].
Take g_1, …, g_{k+1} ∈ C(S) and s = t_k; then
  E[ ∏_{i=1}^k g_i(X(t_i)) ∫_0^∞ e^{−αt} g_{k+1}(X(t_k + t)) dt ] = E[ ∏_{i=1}^k g_i(X(t_i)) f_{k+1}(X(t_k)) ],
where ( I − (1/α)L ) f_{k+1} = g_{k+1}. This also holds if E is replaced by E^x. Thus
  ∫_0^∞ e^{−αt} E[ ∏_{i=1}^k g_i(X(t_i)) g_{k+1}(X(t_k + t)) ] dt = ∫_0^∞ e^{−αt} E^x[ ∏_{i=1}^k g_i(X(t_i)) g_{k+1}(X(t_k + t)) ] dt,
and by uniqueness of the Laplace transform,
  E ∏_{i=1}^{k+1} g_i(X(t_i)) = E^x ∏_{i=1}^{k+1} g_i(X(t_i)).
Thus P and P^x have the same finite-dimensional distributions. F is generated by the maps X ↦ X(t). Now use the monotone class theorem: consider
  { Y : Ω → R bounded, measurable, EY = E^x Y };
this class is a vector space, closed under monotone convergence, and contains
  Y = ∏_{i=1}^k g_i(X(t_i)).
Hence EY = E^x Y for every bounded measurable Y, thus P^x = P. ∎

4.11 Stationary Distributions

Definition 4.59 (stationary measure). A probability measure µ on S is stationary for T(t) if
  ∫ f dµ = ∫ T(t)f dµ  for all f ∈ C(S) and t ≥ 0.

Definition 4.60. Write µT(t) for the probability measure satisfying
  ∫ f d(µT(t)) = ∫ T(t)f dµ.

N.B. ∗ This does define a probability measure.
∗ µ is stationary iff µ = µT(t) for all t.


∗ A functional analyst would write µT(t) = T(t)*µ, where the star denotes the adjoint operator.

Theorem 4.61. Let D be a core of L (closure(L|_D) = L). Then
  µ is stationary  ⟺  ∫ Lf dµ = 0 for all f ∈ D.

Proof. Suppose µ is stationary and f ∈ D. Then for every t,
  ∫ (T(t)f − f)/t dµ = 0,
thus
  ∫ Lf dµ = ∫ lim_{t→0} (T(t)f − f)/t dµ = 0.
Conversely, suppose ∫ Lf dµ = 0 for all f ∈ D. Take f ∈ D(L) and f_n ∈ D with f_n → f, Lf_n → Lf; then
  0 = ∫ Lf_n dµ → ∫ Lf dµ,
so ∫ Lf dµ = 0 for all f ∈ D(L). For f ∈ D(L),
  T(t)f − f = ∫_0^t d/ds T(s)f ds = ∫_0^t L T(s)f ds,
so
  ∫ T(t)f dµ − ∫ f dµ = ∫ ∫_0^t L T(s)f ds dµ = ∫_0^t ∫ L T(s)f dµ ds = 0,
using T(s)f ∈ D(L). Thus ∫ T(t)f dµ = ∫ f dµ for all f ∈ D(L). For f ∈ C(S), take f_n ∈ D(L) converging to f, so that T(t)f_n converges to T(t)f; then
  ∫ T(t)f dµ = ∫ f dµ. ∎
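On a finite state space Theorem 4.61 reduces to a linear system: with (Lf)(x) = Σ_y q(x,y)f(y), the condition ∫ Lf dµ = 0 for all f is µᵀQ = 0. A sketch with an arbitrary illustrative 3-state rate matrix:

```python
import numpy as np

# Finite-state illustration of Theorem 4.61: on a finite S the condition
# "integral of Lf dmu = 0 for all f" is the linear system mu^T Q = 0,
# where Q is the rate matrix ((Lf)(x) = sum_y q(x, y) f(y)).
Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -3.0, 2.0],
              [0.5, 0.5, -1.0]])  # arbitrary illustrative rate matrix

# Solve mu^T Q = 0 together with sum(mu) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
mu, *_ = np.linalg.lstsq(A, b, rcond=None)

# Check stationarity through the semigroup T(t) = exp(tQ), approximated
# here by a truncated power series.
def expm_series(M, terms=30):
    out = np.eye(M.shape[0]); P = np.eye(M.shape[0])
    for k in range(1, terms):
        P = P @ M / k
        out = out + P
    return out

Tt = expm_series(0.7 * Q)
print(mu, mu @ Tt)  # the two vectors should agree
```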

Stationary distributions may not exist: the process can "go to infinity".

Example (drift to the left). P^x( X(t) = x − t ) = 1. Any stationary distribution would have to be translation invariant, but no such probability distribution on R exists.


Theorem 4.62. If S is compact, then there exists a stationary distribution.

Proof. S compact implies that the probability measures on S are compact with respect to weak convergence (Prokhorov's theorem). Take any probability measure µ on S and set
  ν_n = (1/n) ∫_0^n µT(t) dt,
which is equivalent to
  ∫ f dν_n = (1/n) ∫_0^n ∫ T(t)f dµ dt.
Now there exists a sequence n_k such that ν_{n_k} ⇀ ν. Compute
  ∫ T(t)f dν_n − ∫ f dν_n = (1/n) ∫_0^n ∫_S T(s+t)f dµ ds − (1/n) ∫_0^n ∫_S T(s)f dµ ds
    = ∫ (1/n) ( ∫_t^{n+t} T(s)f ds − ∫_0^n T(s)f ds ) dµ
    = (1/n) ∫ ( ∫_n^{n+t} T(s)f ds − ∫_0^t T(s)f ds ) dµ
    = (1/n) ( ∫_n^{n+t} ∫ T(s)f dµ ds − ∫_0^t ∫ T(s)f dµ ds ),
so
  | ∫ T(t)f dν_n − ∫ f dν_n | ≤ (1/n) · 2t ‖f‖.
Since ν_{n_k} ⇀ ν implies
  ∫ T(t)f dν_{n_k} → ∫ T(t)f dν   and   ∫ f dν_{n_k} → ∫ f dν,
we conclude ∫ T(t)f dν = ∫ f dν. ∎
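The Cesàro construction in the proof can be seen concretely for deterministic rotation on the circle (the example below): for µ = δ_x, µT(t) keeps rotating, but the time averages ν_n converge to the uniform (stationary) measure. A sketch (the test function f = cos and the grids are illustrative):

```python
import numpy as np

# Cesaro averages from the proof of Theorem 4.62, for deterministic
# rotation on the circle with speed 2*pi: X(t) = x + 2*pi*t (mod 2*pi).
# For mu = delta_x, (mu T(t))(f) = f(x + 2*pi*t) keeps oscillating, but
#   nu_n(f) = (1/n) int_0^n f(x + 2*pi*t) dt
# converges to the uniform average of f (which is 0 for f = cos).
def nu_n(f, x, n, grid=200_000):
    t = np.linspace(0.0, n, grid, endpoint=False)  # left Riemann sum
    return np.mean(f(x + 2 * np.pi * t))

f = np.cos
x = 0.3
print(nu_n(f, x, 0.5), nu_n(f, x, 10.5), nu_n(f, x, 100.5))
# the values shrink like O(1/n) toward the uniform average 0
```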

N.B. We really do need Cesàro averages: take e.g. ν_n = µT(n). Compactness still applies, but
  ν = lim_{k→∞} ν_{n_k}
may not be stationary.

Example. Clockwise drift on S¹ with speed 2π: δ_x T(1) = δ_x, so along integer times δ_x looks invariant. On the other hand, δ_x is not stationary, since
  δ_x T(1/2) = δ_{−x}.


N.B. The set of stationary distributions is convex, since the defining condition ∫ T(t)f dµ = ∫ f dµ is linear in µ.

Example (Fisher–Wright diffusion). Heuristic derivation: consider a population of N individuals of two types A and a. Let Z_n be the number of A's in generation n. Each individual in generation n+1 has type A with probability Z_n/N, independently, i.e.
  Z_{n+1} ∼ Binom( N, Z_n/N ).
This defines a finite-state Markov chain. Writing x = Z_0/N and Taylor expanding,
  f(Z_1/N) = f(x) + (Z_1/N − x) f′(x) + (1/2)(Z_1/N − x)² f″(x) + o( (Z_1/N − x)² ),
so
  E^k( f(Z_1/N) − f(Z_0/N) ) = f′(x) E^k( Z_1/N − x ) + (1/2) f″(x) E^k( Z_1/N − x )² + o(1/N).
Since E^k( Z_1/N − x ) = 0 and E^k( Z_1/N − x )² = x(1−x)/N,
  N E^k( f(Z_1/N) − f(x) ) → (1/2) f″(x) x(1−x).
For f ∈ C²([0,1]), consider E^k( f(Z_1/N) − f(Z_0/N) ), which "= (T(1)f)(k/N) − f(k/N)"; with x = k/N, thus
  lim_{N→∞} [ E^k f(Z_1/N) − f(x) ] / (1/N) = (1/2) f″(x) x(1−x).

N.B. It is possible to show convergence of processes from convergence of generators. Let
  (Lf)(x) = (1/2) f″(x) x(1−x)  on  D = { polynomials f : [0,1] → R }.
L is called the Fisher–Wright generator.

Theorem 4.63. (i) L is a generator.
(ii) X(t) has continuous paths.
(iii) Let τ = inf{ t | X(t) = 0 or X(t) = 1 }. Then
  a) P^x( X_τ = 1 ) = x;
  b) E^x ∫_0^∞ X(t)(1 − X(t)) dt = x(1 − x);
  c) E^x τ = 2x log(1/x) + 2(1−x) log(1/(1−x)).
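Part (iii)(a) already holds for the finite-N chain of the heuristic derivation, since Z_n/N is a bounded martingale. A simulation sketch (N, sample sizes, and tolerances are illustrative):

```python
import numpy as np

# Fixation probability in the Wright-Fisher chain of the heuristic
# derivation: Z_{n+1} ~ Binom(N, Z_n / N).  Since Z_n / N is a bounded
# martingale, P(fixation at 1 | Z_0 / N = x) = x, matching part (iii)(a)
# of Theorem 4.63 for the limiting diffusion.
rng = np.random.default_rng(4)

def fixation_fraction(N, x, n_chains=20_000, max_gens=10_000):
    """Fraction of chains absorbed at N when started from Z_0 = x*N."""
    z = np.full(n_chains, int(round(x * N)))
    for _ in range(max_gens):
        active = (z > 0) & (z < N)
        if not active.any():
            break
        z[active] = rng.binomial(N, z[active] / N)
    return np.mean(z == N)

print(fixation_fraction(50, 0.3))  # should be close to x = 0.3
```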


Proof. a) D is dense.

b) ‖(I − λL)f‖ ≥ ‖f‖ for all f ∈ D. Suppose x_0 is a global minimum of f.
  ∗ If x_0 ∈ (0,1), then f″(x_0) ≥ 0 and also (Lf)(x_0) = ( f″(x_0)/2 ) x_0(1−x_0) ≥ 0, and thus
    (I − λL)f(x_0) ≤ f(x_0) = inf_x f(x).
  ∗ If x_0 ∈ {0,1}, then Lf(x_0) = x_0(1−x_0) f″(x_0)/2 = 0, thus
    (I − λL)f(x_0) ≤ f(x_0).
(The analogous bounds hold at maxima.)

c) It is enough to show that (I − λL)D is dense in C[0,1]. The claim is: (I − λL)D contains all polynomials. Let g(x) = Σ_{k=0}^n a_k x^k; we want to find f(x) = Σ_{k=0}^n b_k x^k such that (I − λL)f = g. With b_{n+1} = 0,
  (I − λL)f(x) = Σ_{k=0}^n x^k ( b_k − (λ/2)( k(k+1) b_{k+1} − k(k−1) b_k ) ),
so we can solve recursively
  b_k − (λ/2)( k(k+1) b_{k+1} − k(k−1) b_k ) = a_k,
starting from b_{n+1} = 0 and working downwards.

d) 1 ∈ D and L1 = 0.

(ii) Claim: E^x( X(t) − X(s) )⁴ ≤ (3/16)(t−s)². By our theorem about continuity (Theorem 4.48), X(t) then has continuous paths. Fix x ∈ [0,1] and let f(y) = (y−x)², so (Lf)(y) = y(1−y). Then
  (X(t) − x)² − ∫_0^t X(s)(1 − X(s)) ds
is a martingale. Hence
  E^x( X(t) − x )² = ∫_0^t E^x( X(s)(1 − X(s)) ) ds ≤ t/4.
Let f(y) = (y−x)⁴, so (Lf)(y) = 6(y−x)² y(1−y). Then
  (X(t) − x)⁴ − 6 ∫_0^t (X(s) − x)² X(s)(1 − X(s)) ds


is a martingale. This implies
  E^x( X(t) − x )⁴ = 6 ∫_0^t E^x( (X(s) − x)² X(s)(1 − X(s)) ) ds
                  ≤ (6/4) ∫_0^t E^x( X(s) − x )² ds
                  ≤ (3/2) ∫_0^t (s/4) ds = (3/16) t².
Finally, if s < t,
  E^x( X(t) − X(s) )⁴ = E^x E^{X(s)}( X(t−s) − X(0) )⁴ ≤ E^x( 3(t−s)²/16 ) = (3/16)(t−s)².

(iii) a) For f(x) = x we have Lf = 0, so X(t) is a bounded martingale and X(t) has a limit a.s. For
  f(x) = x(1−x),   (Lf)(x) = −x(1−x),
the process
  X(t)(1 − X(t)) + ∫_0^t X(s)(1 − X(s)) ds
is a non-negative martingale, so it also has a finite limit a.s. If X(t) → a ∈ (0,1), then
  ∫_0^∞ X(s)(1 − X(s)) ds = ∞,
which is a contradiction. Hence lim_{t→∞} X(t) ∈ {0,1}, and
  x = lim_{t→∞} E^x X(t) = E^x lim_{t→∞} X(t)  ⇒  P^x( X(t) → 1 ) = x.
Once we prove (c) (which gives τ < ∞ a.s.), this implies P^x( X_τ = 1 ) = x.

b) Since X(t)(1 − X(t)) + ∫_0^t X(s)(1 − X(s)) ds is a martingale,
  x(1−x) = E^x( X(t)(1 − X(t)) ) + ∫_0^t E^x( X(s)(1 − X(s)) ) ds.
The first summand on the right converges to 0 by (a), and thus
  x(1−x) = ∫_0^∞ E^x( X(s)(1 − X(s)) ) ds.


c) Let f(x) = 2x log(1/x) + 2(1−x) log(1/(1−x)). Careful: f ∉ D(L), because
  (1/2) f″(x) = −1/( x(1−x) )
blows up at the endpoints. For ε > 0, take f_ε ∈ D(L) with f_ε = f on [ε, 1−ε] (possible because D(L) ⊇ C²[0,1]). Set
  τ_ε := inf{ t | X(t) ∉ [ε, 1−ε] }.
Then
  f_ε(X(t)) − ∫_0^t L f_ε(X(s)) ds   is a martingale,
and hence so is the stopped process
  f_ε(X(t ∧ τ_ε)) − ∫_0^{t∧τ_ε} L f_ε(X(s)) ds.
If x ∈ [ε, 1−ε], then X(t ∧ τ_ε) ∈ [ε, 1−ε] a.s., and thus
  f_ε(X(t ∧ τ_ε)) = f(X(t ∧ τ_ε)),   L f_ε(X(s)) = −1 for s ≤ τ_ε,
because (1/2) f″(x) x(1−x) = −1 on [ε, 1−ε]. For x ∈ [ε, 1−ε] one obtains
  E^x( f(X(t ∧ τ_ε)) + t ∧ τ_ε ) = f(x),
and letting t → ∞,
  E^x( f(X(τ_ε)) + τ_ε ) = f(x).
As ε → 0, τ_ε ↗ τ, and
  X(τ_ε) ∈ {ε, 1−ε}  ⇒  f(X(τ_ε)) → 0,
implying
  E^x τ = f(x). ∎

N.B. If X(t) is BM and Y(t) = X(ct) for c > 0, then Y(t) moves c times as fast as X(t):
  lim_{t→0} [ E f(Y(t)) − f(x) ] / t = c lim_{t→0} [ E f(X(ct)) − f(x) ] / (ct) = c lim_{t→0} [ E f(X(t)) − f(x) ] / t,
so L_Y = c L_X. Intuitively, if I want Y to move c(x) times as fast as X when Y is near x, then
  (L_Y f)(x) = c(x) (L_X f)(x) = ( c(x)/2 ) f″(x).


N.B. Another way to do the same thing is the SDE
  dY(t) = √( c(Y(t)) ) dX(t)
(the square root comes from Brownian scaling: √c B(t) equals B(ct) in distribution).

N.B. Suppose that c ∈ C(R) and c(x) ∈ [0, K] for all x ∈ R. If
  (Lf)(x) = c(x) f″(x) / 2
is a generator, then its process has continuous paths.

Proof. First, pretend that f(y) = (y−x)² and g(y) = (y−x)⁴ are in D(L). Then the following are martingales:
  (X(t) − x)² − ∫_0^t c(X(s)) ds,
  (X(t) − x)⁴ − 6 ∫_0^t (X(s) − x)² c(X(s)) ds.
Taking expectations provides
  E^x( X(t) − x )² = ∫_0^t E^x c(X(s)) ds ≤ tK,
  E^x( X(t) − x )⁴ = 6 ∫_0^t E^x( c(X(s)) (X(s) − x)² ) ds
                  ≤ 6K ∫_0^t E^x( X(s) − x )² ds
                  ≤ 6K² ∫_0^t s ds = 3K² t².
To make this rigorous, choose f_n, g_n such that
  f_n(y) → (y−x)² for all y,  f_n, f_n″ ∈ C(R),  f_n″(y) ≤ 2,
  g_n(y) → (y−x)⁴ for all y,  g_n, g_n″ ∈ C(R),  g_n″(y) ≤ 12 f_n(y).
It is possible to choose these because of Bernstein polynomials, which allow approximating a function and its derivatives uniformly on compact sets. Doing the last few calculations analogously for f_n, g_n, we only used
  f″(y) ≤ 2  [for f(y) = (y−x)²],   g″(y) ≤ 12 f(y)  [for g(y) = (y−x)⁴].
Thus E^x g_n(X(t)) ≤ 3K²t², and by Fatou E^x( X(t) − x )⁴ ≤ 3K²t²; by the Markov property,
  E^x( X(t) − X(s) )⁴ = E^x E^{X(s)}( X(t−s) − X(0) )⁴ ≤ 3K²(t−s)²
for any s < t. Thus X has continuous paths. ∎


4.12 Feller processes & PDE

Theorem 4.64. Let L be a generator and T(t) its semigroup. The unique solution u(t,x) to
  (∗)  ∂u/∂t (t,x) = ( L u(t,·) )(x),   u(0,x) = f(x),   sup_{t,x} |u(t,x)| < ∞
is given by
  u(t,x) = T(t)f(x).

Example. Lf = f″/2:
  ∂u/∂t = (1/2) ∂²u/∂x²,   u(0,x) = f(x).
The unique bounded solution is
  u(t,x) = ( 1/√(2πt) ) ∫ e^{−y²/(2t)} f(x+y) dy.
But if we forget boundedness, the solution is not unique.
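The two descriptions of the bounded solution — the Gaussian convolution formula and the probabilistic one, u(t,x) = E f(x + B(t)) = T(t)f(x) — can be compared numerically. A sketch (the bounded initial condition f(y) = 1/(1+y²) and the discretizations are illustrative):

```python
import numpy as np

# The bounded solution u(t, x) = T(t) f(x) of du/dt = (1/2) u_xx, computed
# two ways: by the Gaussian convolution formula of the example, and
# probabilistically as u(t, x) = E f(x + B(t)).
rng = np.random.default_rng(5)
f = lambda y: 1.0 / (1.0 + y**2)

def u_convolution(t, x, grid=200_001, width=12.0):
    """(1/sqrt(2 pi t)) * integral of exp(-y^2/(2t)) f(x+y) dy."""
    y = np.linspace(-width, width, grid)
    kern = np.exp(-y**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
    return np.sum(kern * f(x + y)) * (y[1] - y[0])

def u_monte_carlo(t, x, n_samples=400_000):
    """E f(x + B(t)) with B(t) ~ N(0, t)."""
    return np.mean(f(x + rng.normal(0.0, np.sqrt(t), size=n_samples)))

print(u_convolution(1.0, 0.5), u_monte_carlo(1.0, 0.5))
```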

Proof. (Remark: this version of the proof is incorrect; please refer to the lecture notes.) u(t,x) = T(t)f(x) is a solution because
  ∂u/∂t = ∂/∂t T(t)f(x) = L T(t)f(x) = Lu.
For uniqueness, it is enough to consider the case f ≡ 0. Suppose u(t,x) is a solution to (∗) with u(0,x) = 0 for all x. Then
  u(t,x) = ∫_0^t Lu(s,x) ds,
and
  ∫_0^N αe^{−αt} u(t,x) dt = ∫_0^N αe^{−αt} ∫_0^t Lu(s,x) ds dt
                           = ∫_0^N Lu(s,x) ∫_s^N αe^{−αt} dt ds
                           = ∫_0^N Lu(s,x) [ e^{−αs} − e^{−αN} ] ds,
so
  ∫_0^N e^{−αt} [ αu(t,x) − Lu(t,x) ] dt = −e^{−αN} ∫_0^N Lu(t,x) dt = −e^{−αN} u(N,x) → 0  as N → ∞,
the convergence holding since u is bounded. Hence one obtains, for every α > 0,
  ∫_0^∞ αe^{−αt} ( I − (1/α)L ) u(t,x) dt = 0;
by uniqueness of the Laplace transform this yields
  ( I − (1/α)L ) u(t,·) ≡ 0,
which in turn implies u ≡ 0 (I − (1/α)L being injective). ∎

4.13 Duality (Liggett Duality)

Let (X_1(t), T_1(t), L_1) and (X_2(t), T_2(t), L_2) be Feller processes on S_1, S_2. Also let H : S_1 × S_2 → R be bounded and measurable.

Definition 4.65 (dual processes). X_1 and X_2 are dual with respect to H if
  E_1^{x_1} H( X_1(t), x_2 ) = E_2^{x_2} H( x_1, X_2(t) )  for all x_1 ∈ S_1, x_2 ∈ S_2, t ≥ 0.

Theorem 4.66. Suppose that for all x_1, x_2, t,
  (T_1(t)H)(x_1, ·) ∈ D(L_2),   (T_2(t)H)(·, x_2) ∈ D(L_1).
If ( L_1 H(·, x_2) )(x_1) = ( L_2 H(x_1, ·) )(x_2), then X_1 and X_2 are dual with respect to H.

N.B. The converse is obvious, by differentiating the definition of duality.


Proof. We will write L_1 H(x_1, x_2) instead of ( L_1 H(·, x_2) )(x_1). Note that T_1(s)T_2(t)H = T_2(t)T_1(s)H, by Fubini. Hence T_1(t)L_2 H = L_2 T_1(t)H. Define u(t, x_1, x_2) = T_1(t)H(x_1, x_2); the goal is to show u(t, x_1, x_2) = T_2(t)H(x_1, x_2). Now
  ∂u/∂t (t, x_1, x_2) = T_1(t) L_1 H(x_1, x_2)
                      = T_1(t) L_2 H(x_1, x_2)
                      = L_2 T_1(t) H(x_1, x_2)
                      = L_2 u(t, x_1, x_2).
Fix x_1 and set v(t, x_2) = u(t, x_1, x_2); then ∂v/∂t = L_2 v, and
  sup_{t, x_2} |v(t, x_2)| < ∞
because sup_{x_2} |H(x_1, x_2)| < ∞ and T_1(t) is a contraction. Applying Theorem 4.64 with v(0, x_2) = H(x_1, x_2) yields
  v(t, x_2) = T_2(t) H(x_1, x_2),
while the left-hand side is u(t, x_1, x_2). This is all there was to show, since
  T_1(t)H(x_1, x_2) = E_1^{x_1} H( X_1(t), x_2 )   and   T_2(t)H(x_1, x_2) = E_2^{x_2} H( x_1, X_2(t) ). ∎

The definition of duality still makes sense if X_1 is a Markov chain and X_2 is a Feller process:
  E_1^{x_1} H( X_1(t), x_2 ) = E_2^{x_2} H( x_1, X_2(t) ).
The theorem still works in this case, with
  (T_1(t)f)(x) = Σ_{y∈S_1} p_t(x,y) f(y) = E^x f(X_1(t)),
  (L_1 f)(x) = Σ_{y∈S_1} q(x,y) f(y)
on D(L_1) = { f | for every x ∈ S_1 the above sum converges }.

Theorem 4.67. If T_1(t)H ∈ D(L_2), T_2(t)H ∈ D(L_1) and L_1 H = L_2 H, then X_1 and X_2 are dual with respect to H.

Proof. The same as the previous proof, noting:
  ∗ T_1 and T_2 still commute (by Fubini);
  ∗ T_1 still commutes with L_2 ( L_2 = d/dt|_{t=0} T_2(t) );
  ∗ ∂u/∂t = T_1(t) L_1 H, which follows from the KBE.


Claim: for any bounded f : S_1 → R with f ∈ D(L_1), one gets
  T_1(t) L_1 f(x) = ∂/∂t T_1(t) f(x) = ∂/∂t Σ_y p_t(x,y) f(y),
and
  T_1(t) L_1 f(x) = Σ_y p_t(x,y) Σ_z q(y,z) f(z).
These are equal by the KBE (Kolmogorov backward equation). ∎

5 Spin systems

Denote by S a countable set, whose elements are called the sites. The state space is {0,1}^S; elements of the state space are called configurations. For η ∈ {0,1}^S and x ∈ S, η(x) denotes the coordinate of η at x. Also define the flipped configuration
  η_x(y) = η(y) if y ≠ x,   1 − η(x) if y = x.
A spin system is a Feller process on 𝒮 := {0,1}^S that only changes one site at a time. Let c : S × 𝒮 → R_+ be a bounded function.

Definition 5.1 (generator). Given c, define L by
  (Lf)(η) = Σ_x c(x, η)( f(η_x) − f(η) )
on
  D = { f ∈ C(𝒮) | Σ_x sup_η |f(η_x) − f(η)| < ∞ }.
Meaning of c: in configuration η, c(x, η) is the rate at which site x flips.

N.B. ∗ If S is finite, this defines a Markov chain, with c(x, η) as above:
  q(η, ξ) = c(x, η) if ξ = η_x;  0 if ξ ≠ η and ξ ≠ η_x for every x;  −Σ_x c(x, η) if ξ = η.
∗ Define |||f||| = Σ_x sup_η |f(η_x) − f(η)|.
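For finite S the spin system can be simulated directly from the rates q(η, η_x) = c(x, η) by the Gillespie algorithm. A sketch on |S| = 3 sites (the flip rates — 2 to leave state 1, 1 to leave state 0 — are an arbitrary illustrative choice):

```python
import numpy as np

# For finite S the spin system is a continuous-time Markov chain with
# q(eta, eta_x) = c(x, eta).  Gillespie simulation on |S| = 3 sites;
# the flip rates below are an arbitrary illustrative choice.
rng = np.random.default_rng(6)

def c(x, eta):
    return 2.0 if eta[x] == 1 else 1.0

def simulate(eta0, t_max):
    """Gillespie: wait Exp(total rate), then flip a site chosen ~ c(x, eta).
    Returns the time-averaged occupation (fraction of time each site is 1)."""
    eta = list(eta0)
    t = 0.0
    occupation = np.zeros(len(eta))  # time-weighted sum of eta
    while True:
        rates = np.array([c(x, eta) for x in range(len(eta))])
        total = rates.sum()
        dt = rng.exponential(1.0 / total)
        if t + dt > t_max:
            occupation += (t_max - t) * np.array(eta)
            return occupation / t_max
        occupation += dt * np.array(eta)
        t += dt
        x = rng.choice(len(eta), p=rates / total)
        eta[x] = 1 - eta[x]

print(simulate([0, 1, 0], 2_000.0))  # each entry near 1/(1+2) = 1/3
```

With these rates the sites evolve independently, each flipping 0 → 1 at rate 1 and 1 → 0 at rate 2, so the long-run occupation of each site should be 1/3.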


Claim: for f ∈ C(𝒮), f ∈ D iff there exists α : S → R_+ with Σ_x α(x) < ∞ such that f is Lipschitz with respect to
  ρ(η, ξ) = Σ_x α(x) |η(x) − ξ(x)|.
If this is a metric, it metrizes the topology.

Proof. Suppose there exists such an α with f ρ-Lipschitz. Then
  |f(η) − f(η_x)| ≤ ρ(η, η_x) = α(x),
so sup_η |f(η) − f(η_x)| ≤ α(x), and
  |||f||| ≤ Σ_x α(x) < ∞.
Now suppose |||f||| < ∞. Set α(x) = sup_η |f(η_x) − f(η)|; then Σ_x α(x) = |||f||| < ∞. Given η, ξ ∈ 𝒮, let x_1, x_2, … be an enumeration of S and let
  η_i(x_j) = η(x_j) if j ≥ i,   ξ(x_j) if j < i,
so that η_1 = η and η_i → ξ as i → ∞. f continuous implies
  f(ξ) − f(η) = Σ_{i≥1} ( f(η_{i+1}) − f(η_i) ),
so
  |f(ξ) − f(η)| ≤ Σ_{i≥1} |f(η_{i+1}) − f(η_i)|
               ≤ Σ_{i : η(x_i) ≠ ξ(x_i)} α(x_i)
               = Σ_i α(x_i) |η(x_i) − ξ(x_i)|
               = ρ(η, ξ). ∎

∗ |||f||| < ∞ does not imply that f is continuous. A suitable counterexample is
  f(η) = 1 if η(x) = 1 for infinitely many x,   0 otherwise.
Then |f(η_x) − f(η)| = 0 for all x, η, so |||f||| = 0. But f is not continuous: with
  η_i = (0, …, 0 [i zeros], 1, 1, …)
we have η_i → (0, 0, …) as i → ∞, yet f(η_i) = 1 for all i while f((0, 0, …)) = 0.
∗ If f ∈ D, then
  Σ_x c(x, η)( f(η_x) − f(η) )
converges uniformly in η; this implies that Lf is continuous.
∗ If c(x, ·) ≡ 0 except for finitely many x, then we can define L on all of C(𝒮).

Theorem 5.2. Let Γ(x, u) = sup_η |c(x, η) − c(x, η^u)| and M = sup_x Σ_{u≠x} Γ(x, u). If M < ∞, then L is a generator.

Motivation: We will show that L is a generator by approximating the process with finite versions. Control on Γ will imply convergence.

N.B. M < ∞ is not necessary.

Proof. (i) D is dense in C(S). We use Stone–Weierstrass:

∗ D is an algebra.
  – Clearly, D is closed under sums.
  – D is closed under products:

    |||fg||| = Σ_x sup_η |f(η^x)g(η^x) − f(η)g(η)|
             ≤ Σ_x ( sup_η |f(η^x)g(η^x) − f(η)g(η^x)| + sup_η |f(η)g(η^x) − f(η)g(η)| )
             ≤ ‖g‖ |||f||| + ‖f‖ |||g||| < ∞

∗ D separates points: if η ≠ ξ, there exists an x with η(x) ≠ ξ(x); take f(η) = η(x). Then f ∈ D and f(η) ≠ f(ξ).

∗ S is compact.

∗ D contains the constant functions (f constant implies |||f||| = 0).

Thus D is dense in C(S) by Stone–Weierstrass.

(ii) inf_η (I − λL)f(η) ≤ inf_η f(η). Take f ∈ D. Since S is compact, there exists an η ∈ S such that f(η) = inf_ξ f(ξ). Then

    Lf(η) = Σ_x c(x, η)(f(η^x) − f(η)) ≥ 0

since c(x, η) ≥ 0 and f(η^x) − f(η) ≥ 0 at a minimizer. Hence

    inf_η (I − λL)f(η) ≤ (I − λL)f(η) ≤ f(η) = inf_ξ f(ξ)

(iv) 1 ∈ D and L1 = 0.                                   □

Definition 5.3. ∗ ℓ¹(S) is the space of functions α : S → ℝ satisfying

    Σ_{x∈S} |α(x)| =: ‖α‖_{ℓ¹} < ∞

∗ For f : S → ℝ, define Γ_f by

    Γ_f(x) := sup_η |f(η^x) − f(η)|

N.B. ‖Γ_f‖_{ℓ¹} = |||f|||

∗ ε = inf_{u,η} (c(u, η) + c(u, η^u)). Here ε > 0 means that no site gets "stuck", which is useful for proving uniqueness of stationary measures.

∗ Γ(x, u) = sup_η |c(x, η^u) − c(x, η)|

∗ M = sup_x Σ_{u≠x} Γ(x, u)

∗ If M < ∞, define Γ : ℓ¹(S) → ℓ¹(S) by

    (Γα)(u) = Σ_{x≠u} α(x) Γ(x, u)

Then

    ‖Γ‖ := sup_{α≠0} ‖Γα‖_{ℓ¹} / ‖α‖_{ℓ¹} = M

Proposition 5.4. Suppose that either

(i) f ∈ D, or

(ii) c(x, ·) ≡ 0 except for finitely many x ∈ S

holds. If (I − λL)f = g with λ > 0 and λM < 1 + ελ, then

    Γ_f ≤ ((1 + λε)I − λΓ)⁻¹ Γ_g

coordinatewise, where

    ((1 + λε)I − λΓ)⁻¹ α = 1/(1 + λε) Σ_{k=0}^∞ (λ/(1 + λε))^k Γ^k α

(the series converges since λ‖Γ‖/(1 + λε) < 1).

Proof.

    g(η^u) − g(η) = f(η^u) − f(η) − λLf(η^u) + λLf(η)
                  = f(η^u) − f(η) − λ Σ_x [ c(x, η^u)(f((η^u)^x) − f(η^u)) − c(x, η)(f(η^x) − f(η)) ]
                  = (f(η^u) − f(η))(1 + λ(c(u, η) + c(u, η^u)))
                    − λ Σ_{x≠u} [ c(x, η^u)(f((η^u)^x) − f(η^u)) − c(x, η)(f(η^x) − f(η)) ]

since (η^u)^u = η. This implies

    (f(η^u) − f(η))(1 + λc(u, η) + λc(u, η^u)) = g(η^u) − g(η)
        + λ Σ_{x≠u} [ c(x, η^u)(f((η^u)^x) − f(η^u)) − c(x, η)(f(η^x) − f(η)) ]

Fix u. Then f(η^u) − f(η) is continuous in η, which implies the existence of a ξ such that for any η

    f(η^u) − f(η) ≤ f(ξ^u) − f(ξ)

Then

    f((ξ^u)^x) − f(ξ^u) = f((ξ^x)^u) − f(ξ^u)
                        = f((ξ^x)^u) − f(ξ^x) + f(ξ^x) − f(ξ^u)
                        ≤ f(ξ^u) − f(ξ) + f(ξ^x) − f(ξ^u)
                        = f(ξ^x) − f(ξ)

Now

    Γ_f(u)(1 + λε) ≤ (f(ξ^u) − f(ξ))(1 + λc(u, ξ) + λc(u, ξ^u))
                   ≤ Γ_g(u) + λ Σ_{x≠u} ( c(x, ξ^u)(f(ξ^x) − f(ξ)) − c(x, ξ)(f(ξ^x) − f(ξ)) )
                   ≤ Γ_g(u) + λ Σ_{x≠u} |c(x, ξ^u) − c(x, ξ)| Γ_f(x)
                   ≤ Γ_g(u) + λ(ΓΓ_f)(u)

Thus (1 + λε)Γ_f ≤ Γ_g + λΓΓ_f coordinatewise.

N.B. Γ is monotone in the sense that if α ≤ β coordinatewise, then also Γα ≤ Γβ coordinatewise.

Hence

    Γ_f ≤ Γ_g/(1 + λε) + λΓΓ_f/(1 + λε)

and by induction

    Γ_f ≤ 1/(1 + λε) Σ_{k=0}^n (λ/(1 + λε))^k Γ^k Γ_g + (λ/(1 + λε))^{n+1} Γ^{n+1} Γ_f

Letting n → ∞ provides

    ‖ (λ/(1 + λε))^{n+1} Γ^{n+1} Γ_f ‖_{ℓ¹} ≤ (λM/(1 + λε))^{n+1} ‖Γ_f‖_{ℓ¹} → 0

This implies

    Γ_f ≤ ((1 + λε)I − λΓ)⁻¹ Γ_g                                □

Theorem 5.5. If M < ∞, then L is a generator and

    |||T(t)f||| ≤ e^{t(M−ε)} |||f|||

Spoiler: if M < ε, this estimate will be very useful.

N.B. ∗ f ∈ D ⇒ T(t)f ∈ D

∗ Compare this to: ‖T(t)f‖ ≤ ‖f‖

Proof. (i) L is a generator. We only need to show that R(I − λL) is dense in C(S). Let S_n, n ≥ 1, be finite sets increasing towards S. Then

    L_n f(η) = Σ_{x∈S_n} c(x, η)(f(η^x) − f(η))

is defined and bounded on all of C(S); thus R(I − λL_n) = C(S). Now take g ∈ D. We will show that g lies in the closure of R(I − λL). Take f_n solving

    (I − λL_n) f_n = g

and set g_n = (I − λL) f_n, so that g_n ∈ R(I − λL). The claim is that g_n → g as n → ∞:

    ‖g_n − g‖_∞ = λ‖(L_n − L)f_n‖_∞
                = λ sup_η |(L_n − L)f_n(η)|
                = λ sup_η | Σ_{x∉S_n} c(x, η)(f_n(η^x) − f_n(η)) |
                ≤ λK Σ_{x∉S_n} sup_η |f_n(η^x) − f_n(η)|        (K = sup_{x,η} c(x, η))
                = λK Σ_{x∉S_n} Γ_{f_n}(x)

Applying the previous proposition (using its second assumption) with f = f_n and L = L_n, one obtains

    Γ_{f_n} ≤ ((1 + λε)I − λΓ)⁻¹ Γ_g

(∗)    ‖g_n − g‖_∞ ≤ λK Σ_{x∉S_n} ( ((1 + λε)I − λΓ)⁻¹ Γ_g )(x) → 0    (n → ∞)

since ((1 + λε)I − λΓ)⁻¹ Γ_g ∈ ℓ¹(S). This implies that R(I − λL) is dense in D, which in turn implies that R(I − λL) is dense in C(S).

(ii) First, we claim that R(I − λL) ⊇ D. Take g ∈ D and let f solve (I − λL)f = g. Note that (I − λL_n)f_n → g; since (I − λL) is an expansion, this gives

    f_n → f
    ⇒ ∀x : Γ_{f_n}(x) → Γ_f(x)

and sup_n ‖Γ_{f_n}‖_{ℓ¹(S)} < ∞, hence Γ_f ∈ ℓ¹(S), hence f ∈ D and g ∈ R(I − λL). By (∗) and Γ_{f_n} → Γ_f pointwise, one obtains

    Γ_f ≤ ((1 + λε)I − λΓ)⁻¹ Γ_g

Consequently,

    Γ_{(I − (t/n)L)^{−n} g} ≤ ((1 + (t/n)ε)I − (t/n)Γ)^{−n} Γ_g

by induction, using (I − λL)⁻¹g ∈ D and, for α ≤ β pointwise,

    ((1 + λε)I − λΓ)⁻¹ α ≤ ((1 + λε)I − λΓ)⁻¹ β

Letting n → ∞,

    (I − (t/n)L)^{−n} g → T(t)g
    ((1 + (t/n)ε)I − (t/n)Γ)^{−n} Γ_g → e^{−εt} e^{tΓ} Γ_g

and ‖e^{−εt} e^{tΓ} Γ_g‖_{ℓ¹(S)} ≤ e^{−εt} e^{tM} ‖Γ_g‖_{ℓ¹(S)}. Therefore

    Γ_{T(t)g} ≤ e^{−εt} e^{tΓ} Γ_g
    ⇒ ‖Γ_{T(t)g}‖_{ℓ¹(S)} ≤ e^{(M−ε)t} ‖Γ_g‖_{ℓ¹(S)}
    ⇒ |||T(t)g||| ≤ e^{(M−ε)t} |||g|||

for any g ∈ D.                                   □

5.1 Ergodicity

Definition 5.6 (Ergodicity). A spin system is called ergodic if

∗ it has a unique stationary distribution µ, and

∗ ∀f ∈ C(S), η ∈ S : T(t)f(η) → ∫ f dµ as t → ∞.

N.B. A spin system being ergodic is equivalent to

    ∀ν : νT(t) → µ    (in distribution)

Theorem 5.7. If M < ε, then T(t) is ergodic.

Recall the definitions of M and ε:

    M = sup_x Σ_{u≠x} sup_η |c(x, η^u) − c(x, η)|
    ε = inf_{u,η} (c(u, η) + c(u, η^u))

Proof. Recall that

    |f(η) − f(ξ)| ≤ Σ_{x : η(x)≠ξ(x)} Γ_f(x)

This was the "Lipschitz condition" from last time. Hence

    sup_{η,ξ} |f(η) − f(ξ)| ≤ Σ_{x∈S} Γ_f(x) = |||f|||
    sup_{η,ξ} |T(t)f(η) − T(t)f(ξ)| ≤ |||T(t)f||| ≤ e^{(M−ε)t} |||f|||

where the last step follows from the previous theorem. By an earlier theorem, there exists a stationary distribution µ (since S is compact). Take f ∈ D and a probability measure ν; then

    | ∫ f dµ − ∫ f dνT(t) | = | ∫ T(t)f dµ − ∫ T(t)f dν |
                            = | ∫∫ (T(t)f(η) − T(t)f(ξ)) dµ(η) dν(ξ) |
                            ≤ ∫∫ |T(t)f(η) − T(t)f(ξ)| dµ(η) dν(ξ)
                            ≤ e^{(M−ε)t} |||f||| → 0    (t → ∞)

    ⇒ ∀f ∈ D : ∫ f dνT(t) → ∫ f dµ    (t → ∞)
    ⇒ νT(t) → µ

where the last step uses the density of D.                    □

N.B. ∗ This is very useful but not very sharp.

∗ Ergodicity, and the lack of it, is of great interest.

Example. Some previous examples of processes, and whether they are ergodic:

∗ Clockwise motion on S¹ is not ergodic.

∗ For the Fisher–Wright diffusion, δ₀ and δ₁ are stationary distributions, so it is not ergodic.

∗ "Nice" processes are ergodic.

∗ Contact process

Let S be a graph; write x ∼ y if x and y are neighbours. Denote the degree of a node n by deg(n) and assume the degree function is bounded by D. Let

    c(x, η) = { 1                          η(x) = 1
              { λ |{y ∼ x : η(y) = 1}|    η(x) = 0

Intuition: S is a set of people and

    η(x) = { 1    x has the flu
           { 0    otherwise

If you have the flu, you get cured at rate 1. If you are healthy, you get infected at a rate proportional to the number of ill friends you have. Since

    |c(x, η) − c(x, η^u)| ≤ { λ    u ∼ x
                            { 0    otherwise

one obtains

    M = sup_x Σ_{u≠x} sup_η |c(x, η) − c(x, η^u)| ≤ λ sup_x deg(x) ≤ λD

Moreover,

    ε = inf_{x,η} (c(x, η) + c(x, η^x))

and for any x and η, either c(x, η) = 1 or c(x, η^x) = 1; thus ε ≥ 1 (in fact ε = 1).

Theorem 5.8. This is a Feller process for any λ > 0. It is also ergodic if λ < 1/D.

Note that δ₀ is always stationary.
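The ergodicity criterion M < ε of Theorem 5.7 specializes here to λD < 1. A small sanity check of the bounds M ≤ λ · (max degree) and ε = 1 on a concrete graph (illustrative code, not from the lecture):

```python
# Ergodicity check M < eps for the contact process on a given graph.
# M <= lambda * max degree (each neighbour u of x changes c(x, .) by at most lambda),
# eps = 1 (at every site, the recovery rate in eta or in eta^x equals 1).

def contact_process_bounds(adjacency, lam):
    max_deg = max(len(nbrs) for nbrs in adjacency.values())
    M = lam * max_deg
    eps = 1.0
    return M, eps, M < eps

# 4-cycle: every vertex has degree 2, so D = 2
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
M, eps, ergodic = contact_process_bounds(cycle4, lam=0.4)
print(M, eps, ergodic)   # 0.8 1.0 True  -> lambda < 1/D guarantees ergodicity
```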

∗ Ising model

S = ℤ^d, d ≥ 1, β > 0; x ∼ y if Σ_i |x_i − y_i| = 1. Call 1/β the "temperature".

    c(x, η) = exp( −β Σ_{y∼x} (−1)^{η(x)+η(y)} )
            = exp( −β Σ_{y∼x} (2η(x) − 1)(2η(y) − 1) )

Motivation: a toy model for magnetization in a solid. η(x) is the "spin" of molecule x. If η(x) has the same spin as most η(y) with y ∼ x, then c(x, η) is small; if η(x) has the opposite spin, c(x, η) is large. Large β magnifies the effect. If u ≁ x and u ≠ x, then

    c(x, η) − c(x, η^u) = 0

For k = Σ_{y∼x} (−1)^{η(x)+η(y)} one obtains

    u ∼ x ⇒ c(x, η) − c(x, η^u) = e^{−βk} − e^{−β(k±2)} = e^{−βk}(1 − e^{±2β})

so that

    sup_η |c(x, η) − c(x, η^u)| ≤ exp(2dβ)(1 − exp(−2β))    for u ∼ x

and hence M ≤ 2d exp(2dβ)(1 − exp(−2β)) < ∞. For ε,

    ε/2 = 1/2 inf_{η,x} [ exp( −β Σ_{y∼x} (−1)^{η(y)} ) + exp( β Σ_{y∼x} (−1)^{η(y)} ) ]
        ≥ √(exp(0)) = 1

where the last line follows from (a + b)/2 ≥ √(ab); hence ε ≥ 2.

Theorem 5.9. This is a Feller process for any β. It is also ergodic if β ≤ β*(d).

N.B. ∗ If d = 1, it is ergodic for any β > 0 (we will prove this later).

∗ If d = 2, it is ergodic iff β ≤ ½ log(1 + √2).

∗ If d ≥ 3, it is non-ergodic for large β.

∗ The critical value of β in the case d ≥ 3 is still unknown.
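The estimate ε ≥ 2 above rests on the identity c(x, η) · c(x, η^x) = 1 (flipping x negates the exponent), combined with AM–GM. A brute-force check on a small torus (illustrative code, not from the lecture; the 3 × 3 torus and β = 0.7 are arbitrary choices):

```python
import math, itertools

def ising_rate(eta, x, beta, n):
    """Flip rate c(x, eta) for the Ising spin system on the n x n torus."""
    i, j = x
    nbrs = [((i + 1) % n, j), ((i - 1) % n, j), (i, (j + 1) % n), (i, (j - 1) % n)]
    k = sum((-1) ** (eta[x] + eta[y]) for y in nbrs)
    return math.exp(-beta * k)

n, beta = 3, 0.7
for bits in itertools.product([0, 1], repeat=n * n):
    eta = {(i, j): bits[i * n + j] for i in range(n) for j in range(n)}
    for x in eta:
        flipped = dict(eta); flipped[x] = 1 - flipped[x]
        # flipping x negates the exponent, so the two rates are reciprocal
        assert abs(ising_rate(eta, x, beta, n) * ising_rate(flipped, x, beta, n) - 1) < 1e-9
        # hence c(x, eta) + c(x, eta^x) >= 2 by AM-GM, i.e. eps >= 2
        assert ising_rate(eta, x, beta, n) + ising_rate(flipped, x, beta, n) >= 2 - 1e-9
print("eps >= 2 verified on the", n, "x", n, "torus")
```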

5.2 Coupling & Attractiveness

Definition 5.10 (Coupling of measures). A coupling of probability measures µ and ν is a random variable (X, Y) such that

    X ∼ µ,  Y ∼ ν

Definition 5.11. For η, ξ ∈ {0, 1}^S, write η ≤ ξ if ∀x ∈ S : η(x) ≤ ξ(x). This induces a partial order on S.

Theorem 5.12. Let η_t, ξ_t be spin systems with rates c₁, c₂ respectively. Suppose that for any η ≤ ξ the following holds:

    c₁(x, η) ≤ c₂(x, ξ)    if η(x) = ξ(x) = 0
    c₁(x, η) ≥ c₂(x, ξ)    if η(x) = ξ(x) = 1

Theorem 5.13. ∀η₀ ≤ ξ₀ there exists a coupling of η_t, ξ_t such that P(∀t : η_t ≤ ξ_t) = 1.

Here only a sketch of the proof is of interest. Define a Feller process with state space {(0,0), (0,1), (1,1)}^S and, at a site x with current pair (η(x), ξ(x)), the transition rates

    (0,0) ↦ { (1,1)    with rate c₁(x, η)
            { (0,1)    with rate c₂(x, ξ) − c₁(x, η)

    (0,1) ↦ { (0,0)    with rate c₂(x, ξ)
            { (1,1)    with rate c₁(x, η)

    (1,1) ↦ { (0,0)    with rate c₂(x, ξ)
            { (0,1)    with rate c₁(x, η) − c₂(x, ξ)

For γ ∈ {(0,0), (0,1), (1,1)} let

    (η, ξ)^{x,γ}(y) = { (η(y), ξ(y))    y ≠ x
                      { γ               y = x

Define c(x, γ, η, ξ) according to the table, e.g. if (η(x), ξ(x)) = (0,0) then c(x, (1,1), η, ξ) = c₁(x, η) and c(x, (0,1), η, ξ) = c₂(x, ξ) − c₁(x, η), and set

    (Lf)(η, ξ) = Σ_{x,γ} c(x, γ, η, ξ) [f((η, ξ)^{x,γ}) − f(η, ξ)]

The part of the proof that this argument skips is that L is a generator; assume this as a fact.

∗ If f₁(η, ξ) = g(η), then (Lf₁)(η, ξ) = (L₁g)(η).

∗ If f₂(η, ξ) = g(ξ), then (Lf₂)(η, ξ) = (L₂g)(ξ).

Hence

    T(t)f₁(η, ξ) = T₁(t)g(η)
    T(t)f₂(η, ξ) = T₂(t)g(ξ)

by uniqueness of solutions of d/dt u(t, x) = Lu. Also one gets

    E^{(η₀,ξ₀)} g(η_t) = E₁^{η₀} g(η_t)

which means that η_t has the same distribution under P^{(η₀,ξ₀)} and P₁^{η₀}, which implies that {η_t | t ≥ 0} is identically distributed w.r.t. P^{(η₀,ξ₀)} and P₁^{η₀}. So one obtains that P^{(η₀,ξ₀)} is a coupling of P₁^{η₀} and P₂^{ξ₀}. Also P^{(η₀,ξ₀)}(∀t : η_t ≤ ξ_t) = 1.

Definition 5.14 (Attractive spin system). The spin system with rates c is attractive if for any η ≤ ξ

    c(x, η) ≤ c(x, ξ)    if η(x) = ξ(x) = 0
    c(x, η) ≥ c(x, ξ)    if η(x) = ξ(x) = 1

Corollary. If η_t, ξ_t are spin systems with the same attractive rates c and η₀ ≤ ξ₀, then there exists a coupling (η_t, ξ_t) such that η_t ≤ ξ_t holds for all t.
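The rate table above can be implemented directly. The following sketch (my own illustration, not from the lecture) couples two contact processes with the same attractive rates on a cycle, as in the corollary, and checks that η_t ≤ ξ_t is preserved at every event:

```python
import random

def coupled_contact_step(eta, xi, lam, adj, rng, t):
    """One Gillespie step of the coupled contact process (same rates for
    both marginals), following the rate table; preserves eta <= xi."""
    def c(cfg, x):
        if cfg[x] == 1:
            return 1.0                                 # recovery
        return lam * sum(cfg[y] for y in adj[x])       # infection

    events = []  # (rate, site, new pair at that site)
    for x in eta:
        pair = (eta[x], xi[x])
        if pair == (0, 0):
            events.append((c(eta, x), x, (1, 1)))
            events.append((c(xi, x) - c(eta, x), x, (0, 1)))  # >= 0 since eta <= xi
        elif pair == (0, 1):
            events.append((c(xi, x), x, (0, 0)))
            events.append((c(eta, x), x, (1, 1)))
        else:  # (1, 1): both rates equal 1, so the (0, 1) rate vanishes
            events.append((c(xi, x), x, (0, 0)))
    total = sum(rate for rate, _, _ in events)
    if total == 0:
        return None                                    # both processes died out
    t += rng.expovariate(total)
    r = rng.random() * total
    for rate, x, (a, b) in events:
        r -= rate
        if r < 0:
            eta[x], xi[x] = a, b
            break
    return t

rng = random.Random(1)
n = 6
adj = {x: [(x - 1) % n, (x + 1) % n] for x in range(n)}   # cycle graph
eta = {x: 0 for x in range(n)}; eta[0] = 1                # eta_0 <= xi_0
xi = {x: 1 for x in range(n)}
t = 0.0
while t is not None and t < 5.0:
    assert all(eta[x] <= xi[x] for x in range(n))         # order is preserved
    t = coupled_contact_step(eta, xi, 0.8, adj, rng, t)
print("coupling kept eta <= xi")
```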

Examples

∗ Contact process

Recall: S a graph, x ∼ y if neighbours, degrees bounded by D, and

    c(x, η) = { 1                          η(x) = 1
              { λ |{y ∼ x : η(y) = 1}|    η(x) = 0

If η(x) = ξ(x) = 1, then c(x, η) = c(x, ξ) = 1. If η(x) = ξ(x) = 0 and η ≤ ξ, then

    c(x, η) = λ |{y ∼ x : η(y) = 1}| ≤ λ |{y ∼ x : ξ(y) = 1}| = c(x, ξ)

so the contact process is attractive.

∗ Ising model

Recall: S = ℤ^d, d ≥ 1, β > 0, x ∼ y if Σ_i |x_i − y_i| = 1, temperature 1/β, and

    c(x, η) = exp( −β Σ_{y∼x} (−1)^{η(x)+η(y)} )

Suppose η ≤ ξ and η(x) = ξ(x) = 0. Note deg(x) = 2d. Denote k = |{y ∼ x : η(y) = 1}| and m = |{y ∼ x : ξ(y) = 1}|; our assumption gives k ≤ m. Now we can conclude

    c(x, η) = exp(β(k − (2d − k))) = exp(2β(k − d))
    c(x, ξ) = exp(2β(m − d)) ≥ c(x, η)

thus the process is attractive (the case η(x) = ξ(x) = 1 is symmetric).

∗ Noisy voter model (homework)

5.3 Monotonicity

Definition 5.15 (Increasing function on spin systems). A function f ∈ C(S) is increasing if

    η ≤ ξ ⇒ f(η) ≤ f(ξ)

Denote the set of all increasing functions by M.

Definition 5.16. Let µ, ν be probability measures on S; write µ ≤ ν if

    ∀f ∈ M : ∫ f dµ ≤ ∫ f dν

N.B. These definitions make sense if S is replaced by any partially ordered set. In the case of ℝ, this gives the "usual" notion of increasing and yields stochastic domination:

    µ ≤ ν ⟺ ∀t : µ[t, ∞) ≤ ν[t, ∞)

Let δ₀ be the unit mass on the configuration ≡ 0 and δ₁ the unit mass on the configuration ≡ 1. Then for any µ one obtains δ₀ ≤ µ ≤ δ₁.

Theorem 5.17. Let T(t) be the semigroup of an attractive spin system.

(i) f ∈ M ⇒ T(t)f ∈ M

(ii) µ ≤ ν ⇒ µT(t) ≤ νT(t)

(iii) ∀s ≤ t : δ₀T(s) ≤ δ₀T(t) and δ₁T(s) ≥ δ₁T(t)

(iv) ν̲ := lim_{t→∞} δ₀T(t) and ν̄ := lim_{t→∞} δ₁T(t) exist and are stationary

(v) ∀µ, t : δ₀T(t) ≤ µT(t) ≤ δ₁T(t)

(vi) if µT(t_k) → ν for some t_k → ∞, then ν̲ ≤ ν ≤ ν̄

(vii) the system is ergodic iff ν̲ = ν̄

Application: the Ising model in dimension d = 1, i.e. S = ℤ, β > 0, x ∼ y if |x − y| = 1, temperature 1/β,

    c(x, η) = exp( −β Σ_{y∼x} (−1)^{η(x)+η(y)} )

Theorem 5.18 (Ising, 1925). For d = 1 and all β > 0 the model is ergodic.

Proof. The Ising model is attractive. The goal is to construct µ such that

    µ ≥ ν̄ and µ ≤ ν̲,  which forces ν̲ = µ = ν̄

Define the following transition matrix (on {0, 1}):

    P = 1/(e^β + e^{−β}) ( e^β     e^{−β} )
                         ( e^{−β}  e^β    )

Define µ by

    µ{η | ∀ k ≤ x ≤ l : η(x) = ξ(x)} = 1/2 ∏_{x=k}^{l−1} P(ξ(x), ξ(x+1))

That is, µ is the distribution of the stationary, doubly-infinite Markov chain with transition matrix P: P gives a discrete-time Markov chain with stationary distribution π(0) = π(1) = 1/2, so start a Markov chain X with X(0) ∼ π and run another one backwards; this defines a stationary process

    {X(i) | i ∈ ℤ} ∼ µ

Claim: µ ≥ ν̄ (µ ≤ ν̲ follows by symmetry). For m ∈ ℕ define

    c_m(x, η) = { c(x, η)    |x| < m
                { 0          |x| ≥ m and η(x) = 1
                { e^{2β}     |x| ≥ m and η(x) = 0

Now e^{2β} ≥ c(x, η) for all x, η; thus for any η ≤ ξ

    c(x, η) ≤ c_m(x, ξ)    if η(x) = ξ(x) = 0
    c(x, η) ≥ c_m(x, ξ)    if η(x) = ξ(x) = 1

Start with η₀ = ξ₀ ≡ 1 on ℤ and couple (η_t, ξ_t) so that η_t ≤ ξ_t for all t, where c is the rate of η_t and c_m is the rate of ξ_t.

∗ ξ_t is constant outside of (−m, m).

∗ ξ_t is effectively a finite-state Markov chain. This implies that there exists a stationary measure µ_m.

∗ µ_m{η | ∀ −m ≤ x ≤ m : η(x) = ξ(x)} = µ{η | ∀ −m ≤ x ≤ m : η(x) = ξ(x) | η(x) = 1 for all |x| ≥ m}, which can be checked using reversibility.

∗ η_t → ν̄ in distribution.

∗ ξ_t → µ_m in distribution.

∗ ⇒ µ_m ≥ ν̄.

∗ µ_m → µ as m → ∞.

This proves the claim.                                 □

5.4 Associated Measure

Definition 5.19 ((Positively) associated measure). µ is (positively) associated if

    ∀f, g ∈ M : ∫ fg dµ ≥ ∫ f dµ ∫ g dµ

Example ("Chebyshev's rearrangement inequality"): if f, g : [0, 1] → ℝ are increasing, then

    ∫₀¹ f(x)g(x) dx ≥ ∫₀¹ f(x) dx ∫₀¹ g(x) dx

N.B. Every unit mass δ_η is associated:

    ∫ fg dδ_η = f(η)g(η) = ∫ f dδ_η ∫ g dδ_η

Theorem 5.20 (Harris; Fortuin, Kasteleyn, Ginibre (FKG)). Every product measure is associated.

Theorem 5.21. The following are equivalent:

(i) η_t is attractive.

(ii) For every associated µ, µT(t) is associated.

Corollary. η_t attractive implies that ν̲ and ν̄ are associated.

Proof. W.l.o.g. consider δ₀.

∗ δ₀ is associated.

∗ By the theorem, δ₀T(t) is associated.

∗ δ₀T(t) → ν̲.

∗ Thus ν̲ is associated.                               □

The following proves the theorem.

Proof. Consider (i) ⇒ (ii).

Proposition 5.22. If η_t is attractive and f, g ∈ M, then

    T(t)(fg) ≥ T(t)f · T(t)g

Proof. Take f, g ∈ M ∩ D. Then

    ∀η, x : [f(η^x) − f(η)][g(η^x) − g(η)] ≥ 0

and

    L(fg)(η) − f(η)Lg(η) − g(η)Lf(η)
        = Σ_x c(x, η)[f(η^x)g(η^x) − f(η)g(η)]
          − Σ_x c(x, η)[f(η)g(η^x) − f(η)g(η)]
          − Σ_x c(x, η)[f(η^x)g(η) − f(η)g(η)]
        = Σ_x c(x, η)[f(η^x) − f(η)][g(η^x) − g(η)] ≥ 0

Thus

    d/ds T(s)(T(t−s)f · T(t−s)g)
        = T(s)L(T(t−s)f · T(t−s)g) − T(s)(LT(t−s)f · T(t−s)g) − T(s)(T(t−s)f · LT(t−s)g)
        = T(s)[ L(T(t−s)f · T(t−s)g) − T(t−s)f · LT(t−s)g − T(t−s)g · LT(t−s)f ]

Since f, g ∈ M we have T(t−s)f, T(t−s)g ∈ M, and thus

    d/ds T(s)(T(t−s)f · T(t−s)g) ≥ 0
    ⇒ T(t)f · T(t)g ≤ T(t)(fg)

Claim: for every f ∈ M there exist f_n ∈ M ∩ D with f_n ↘ f. This will prove the lemma for any f, g ∈ M.

Proof. Take S_n finite, an increasing sequence of sets with S_n → S. Define φ_n(η) ∈ S by

    φ_n(η)(x) = { η(x)    x ∈ S_n
                { 1       otherwise

and set f_n(η) = f(φ_n(η)).

∗ f_n ∈ M because f ∈ M.

∗ f_n ∈ D because it depends only on η(x) for x ∈ S_n.

∗ f_n(η) is non-increasing in n because φ_n(η) is non-increasing in n.

∗ f_n(η) → f(η) as n → ∞, because φ_n(η) → η in the product topology of S and f ∈ C(S).

This proves the claim.                                 □

The claim proves the proposition.                      □

Take µ associated and f, g ∈ M:

    ∫ fg dµT(t) = ∫ T(t)(fg) dµ
                ≥ ∫ T(t)f · T(t)g dµ        (by the proposition)
                ≥ ∫ T(t)f dµ ∫ T(t)g dµ     (by µ being associated)
                = ∫ f dµT(t) ∫ g dµT(t)     □

The following is a sketch of the proof of Harris' theorem:

Proof. ∗ For every product measure µ on S, there exists a spin system η_t that is ergodic with stationary measure µ.

∗ η_t has independent coordinates, which implies that η_t is attractive.

Thus ν̄ is associated, so µ is associated.             □

Proof. The proof of (ii) ⇒ (i) is homework:

∗ if for every product measure µ, µT(t) is associated, then η_t is attractive.

∗ Note that "for every associated µ, µT(t) is associated" implies in particular that for every product measure µ, µT(t) is associated.

□

Application of associated measures: if η_t is attractive, then

    ν̄{η | η(x₁) = η(x₂) = 1} ≥ ν̄{η | η(x₁) = 1} ν̄{η | η(x₂) = 1}

Proof. Apply associatedness of ν̄ to

    f(η) = η(x₁),  g(η) = η(x₂)                        □

Application: the voter model. Take q : S × S → ℝ with

∗ ∀x ≠ y : q(x, y) ≥ 0

∗ q(x, x) = −Σ_{y≠x} q(x, y) =: −c(x)

∗ sup_x c(x) < ∞

Together these form a sufficient condition for the existence of a Markov chain with Q-matrix q. The voter model is the spin system with

    c(x, η) = Σ_{y : η(y)≠η(x)} q(x, y)

Interpretation: the x ∈ S are "voters" holding binary opinions. Voter x changes its opinion to agree with voter y at rate q(x, y). Alternatively, one can think of this scenario as a model of competing species.

Proposition 5.23. The voter model defines a spin system, i.e.

    M := sup_x Σ_u sup_η |c(x, η) − c(x, η^u)| < ∞

since sup_η |c(x, η) − c(x, η^u)| ≤ q(x, u) and Σ_{u≠x} q(x, u) = c(x), so M ≤ sup_x c(x) < ∞.
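The estimate behind Proposition 5.23 — flipping a single site u moves c(x, ·) by at most q(x, u), so M ≤ sup_x c(x) — can be checked by brute force on a small finite kernel (illustrative code, not from the lecture; the kernel q below is an arbitrary example):

```python
import itertools

def voter_rate(q, eta, x):
    """Voter-model flip rate: c(x, eta) = sum of q[x][y] over y with eta(y) != eta(x)."""
    return sum(q[x][y] for y in q[x] if y != x and eta[y] != eta[x])

# an arbitrary small kernel on S = {0, 1, 2, 3} (illustrative values)
q = {x: {y: 0.5 for y in range(4) if y != x} for x in range(4)}

for eta_bits in itertools.product([0, 1], repeat=4):
    eta = dict(enumerate(eta_bits))
    for x in range(4):
        for u in range(4):
            if u == x:
                continue
            flipped = dict(eta); flipped[u] = 1 - flipped[u]
            # changing site u moves c(x, .) by at most q(x, u)
            assert abs(voter_rate(q, eta, x) - voter_rate(q, flipped, x)) <= q[x][u] + 1e-12
print("sup_eta |c(x, eta) - c(x, eta^u)| <= q(x, u) verified")
```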

Coalescing Markov chains. Consider the Markov chain on

    {A ⊆ S : |A| < ∞}

with rates, for x ∈ A and y ∉ A,

    q(A, (A \ {x}) ∪ {y}) = q(x, y)
    q(A, A \ {x}) = Σ_{y∈A, y≠x} q(x, y)

Proposition 5.24. This defines a Markov chain.

Proof.

    c(A) = Σ_{B≠A} q(A, B)
         = Σ_{x∈A, y∉A} q(x, y) + Σ_{x∈A, y∈A, y≠x} q(x, y)
         = Σ_{x∈A} Σ_{y≠x} q(x, y)
         ≤ Σ_{x∈A} sup_x Σ_{y≠x} q(x, y)    (=: M < ∞)
         ≤ M |A|

Let A_k be the embedded discrete-time chain. By the construction of the process, |A_k| is non-increasing. By the previous analysis of Markov chains,

    Σ_{k=0}^∞ 1/c(A_k) = ∞  ⟹  the Markov chain exists

Now it is possible to conclude:

    Σ_{k=0}^∞ 1/c(A_k) ≥ Σ_{k=0}^∞ 1/(M|A_k|) ≥ Σ_{k=0}^∞ 1/(M|A₀|) = ∞      □

Let A_t be this Markov chain.

Alternative construction: let {Y_x(t) | t ≥ 0, x ∈ S} be defined by

∗ for any x ∈ S, Y_x(t) is a Markov chain on S with Q-matrix q(x, y), starting from x;

∗ Y_x(t) and Y_y(t) are independent until they meet, at which point they stay equal to one another forever.
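The embedded discrete-time dynamics of the coalescing chain can be sketched as follows (my own illustration, with q the simple-random-walk kernel on ℤ): one walker moves per step, walkers merge on collision, and |A_k| is non-increasing by construction.

```python
import random

def coalescing_step(A, rng):
    """Move one uniformly chosen walker by +-1; if it lands on another
    walker, the two coalesce (the set simply absorbs the duplicate)."""
    A = set(A)
    x = rng.choice(sorted(A))
    A.remove(x)
    A.add(x + rng.choice((-1, 1)))
    return A

rng = random.Random(2)
A = {0, 1, 2, 5}                 # A_0: starting sites of the walkers
sizes = [len(A)]
for _ in range(200):
    A = coalescing_step(A, rng)
    sizes.append(len(A))
assert all(a >= b for a, b in zip(sizes, sizes[1:]))   # |A_k| is non-increasing
print("final number of walkers:", len(A))
```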

Then A_t =^d {Y_x(t) | x ∈ A₀}. Considering the set dynamics: Y_x(t) moves to y with rate q(Y_x(t), y). If y is not currently occupied, then {Y_x(t) | x ∈ A₀} moves to ({Y_x(t) | x ∈ A₀} \ {Y_x(t)}) ∪ {y}; on the other hand, if y is occupied, then the set moves to {Y_x(t) | x ∈ A₀} \ {Y_x(t)}.

N.B. Regarding the voter model: δ₀ and δ₁ are stationary, so the model is not ergodic, and for any λ ∈ [0, 1], λδ₁ + (1 − λ)δ₀ is also stationary. The question is whether there are any other stationary distributions.

Theorem 5.25. η_t and A_t are dual with respect to

    H(η, A) = ∏_{x∈A} η(x) = 1{η ≡ 1 on A}

Reminder: this means that

    ∀t, η₀, A₀ : E^{η₀} H(η_t, A₀) = E^{A₀} H(η₀, A_t)

N.B. To prove the theorem, it is enough to show:

(i) E^{η₀} H(η_t, ·) ∈ D(L_A)

(ii) E^{A₀} H(·, A_t) ∈ D(L_η)

(iii) ∀η, A : L_η H(η, A) = L_A H(η, A), where the l.h.s. and r.h.s. respectively mean (L_η H(·, A))(η) and (L_A H(η, ·))(A)

Proof. Let L_η be the generator of η_t,

    (L_η f)(η) = Σ_x c(x, η)(f(η^x) − f(η))

for f ∈ D = {g ∈ C(S) : Σ_x sup_η |g(η^x) − g(η)| < ∞}, and let

    (L_A f)(A) = Σ_{B⊂S, |B|<∞} q(A, B)(f(B) − f(A))

on D_A = {g | ∀B : (L_A g)(B) converges}. Refer to the remark above for the three steps needed to complete the proof.

(i)

    Σ_B q(A, B) |E^{η₀}(H(η_t, B) − H(η_t, A))| ≤ Σ_B q(A, B)
        = Σ_{x∈A} Σ_{y≠x} q(x, y)
        ≤ |A| sup_x c(x) < ∞

(ii)

    Σ_x sup_η |E^{A₀}(H(η^x, A_t) − H(η, A_t))| ≤ E^{A₀} Σ_x sup_η |H(η^x, A_t) − H(η, A_t)|

Since x ∉ A_t implies H(η^x, A_t) = H(η, A_t), this equals

    E^{A₀} Σ_{x∈A_t} sup_η |H(η^x, A_t) − H(η, A_t)| ≤ E^{A₀} |A_t| ≤ |A₀| < ∞

(iii)

    L_η H(η, A) = Σ_x c(x, η)[H(η^x, A) − H(η, A)]
                = Σ_{x∈A, y∈S : η(x)≠η(y)} q(x, y)[H(η^x, A) − H(η, A)]

∗ If η ≢ 1 on A \ {x}, then H(η, A) = H(η^x, A) = 0.

∗ If η ≡ 1 on A \ {x}, then H(η, A) = η(x) and H(η^x, A) = 1 − η(x).

∗ If η ≡ 1 on A \ {x} and η(x) ≠ η(y), then H(η^x, A) = H(η, (A \ {x}) ∪ {y}).

This implies that

    L_η H(η, A) = Σ_{x∈A, y∈S} 1{η(x)≠η(y)} q(x, y)[H(η^x, A) − H(η, A)]
                = Σ_{x∈A, y∈S, y≠x} q(x, y)[H(η, (A \ {x}) ∪ {y}) − H(η, A)]
                = Σ_B q(A, B)[H(η, B) − H(η, A)]
                = L_A H(η, A)                          □

Probabilistic interpretation: imagine keeping track of which voters cause the other voters to change. Run for time t₀ and fix x₀ ∈ S. Let t₁ < t₀ be the last time that x₀ changed, and let x₁ be the influencer. Let t₂ < t₁ be the last time that x₁ changed, and let x₂ be the influencer. Continuing inductively, one obtains a decreasing sequence of times t_n and a sequence x_n in S.

Claim: define {Z_x(t₀ − s) | s ∈ [0, t₀]} by Z_x(s) = x_k if s ∈ [t_{k+1}, t_k), so that

    Z_x(0) = x₀,  Z_x(t₀) = x_k

Then {Z_x(t) | x ∈ A, t ∈ [0, t₀]} =^d {Y_x(t) | x ∈ A, t ∈ [0, t₀]}. In other words, A_{t−s} "is" the set of people who at time s affected the opinions that the voters x ∈ A₀ hold at time t.

[Figure 1: Basic Voter Scenario]

N.B. For actual calculations, use the theorem above together with the equivalent formula from the remark.

Application: consider P^η(η_t(x) = η_t(y) = 1). Pick A₀ = {x, y}; then

    P^η(η_t(x) = η_t(y) = 1) = E^η H(η_t, A₀)
                             = E^{A₀} H(η, A_t)
                             = P(η(Y_x(t)) = η(Y_y(t)) = 1)

5.5 Stationary distributions of the voter model

From now on:

∗ S = ℤ^d

∗ ∀x, y : q(x, y) = q(0, y − x)

∗ q is irreducible

Enumerate the elements of S as x_n and pick X^{x_n}(t) independent Markov chains starting from x_n respectively. Set Z(t) = X^{x₁}(t) − X^{x₂}(t).

Theorem 5.26. Suppose Z(t) is recurrent.

(i) ∀η₀ and ∀x, y : P^{η₀}(η_t(x) = η_t(y)) → 1 as t → ∞.

(ii) The set of measures {λδ₁ + (1 − λ)δ₀ | λ ∈ [0, 1]} contains the only stationary measures of the voter model.

(iii) Suppose that p_t(x, y) is the transition function of X^x(t). Then the following are equivalent:

a) µT(t) → λδ₁ + (1 − λ)δ₀

b) Σ_y p_t(x, y) µ{η | η(y) = 1} → λ as t → ∞

N.B. Z is recurrent iff

    ∀x, y : X^x(t) = X^y(t) infinitely often

Proof. (i) P^{η₀}(η_t(x) = η_t(y) = 1) = P^{A₀}(η₀ ≡ 1 on A_t) for A₀ = {x, y}. Similarly, one finds

    P^{η₀}(η_t(x) = η_t(y) = 0) = P^{A₀}(η₀ ≡ 0 on A_t)

Define τ = inf{t > 0 | X^x(t) = X^y(t)}. Recurrence yields

    P(τ > t) → 0    (t → ∞)

and also

    P^{η₀}(η_t(x) = η_t(y)) = P^{A₀}(η₀ constant on A_t)
                            ≥ P^{A₀}(|A_t| = 1)
                            = P(τ ≤ t) → 1    (t → ∞)

(ii) Take µ stationary and fix x, y. Then

    µ{η(x) ≠ η(y)} = µT(t){η(x) ≠ η(y)}
                   = ∫ P^η(η_t(x) ≠ η_t(y)) dµ(η) → 0    (t → ∞)

where the last step follows from part (i) and dominated convergence. Thus for any x, y one obtains

    µ{η(x) ≠ η(y)} = 0

which yields

    µ{η not constant} ≤ Σ_{x,y∈S} µ{η(x) ≠ η(y)} = 0

Hence, with λ := µ{η | η(x) = 1} (which is independent of x),

    µ = λδ₁ + (1 − λ)δ₀

(iii) Take any µ. By the monotone class theorem (no rigorous verification given),

    µT(t) → λδ₁ + (1 − λ)δ₀    (t → ∞)

holds if and only if

    ∀ finite A : µT(t){η | η ≡ 1 on A} → λ    (t → ∞)

If A₀ = {x}, then A_t = {X^x(t)} and

    µT(t){η | η(x) = 1} = ∫ P^η(η_t(x) = 1) dµ(η)
                        = ∫ P^{A₀}(η ≡ 1 on A_t) dµ(η)
                        = ∫ P(η(X^x(t)) = 1) dµ(η)
                        = Σ_y p_t(x, y) µ{η | η(y) = 1}    (Fubini)

This proves that the following are equivalent for |A| = 1:

∗ µT(t){η | η ≡ 1 on A} → λ

∗ Σ_y p_t(x, y) µ{η | η(y) = 1} → λ

Now take A₀ = {x₁, …, x_k}. By part (i),

    ∀η : P^η(η_t(x₁) = ⋯ = η_t(x_k)) → 1    (t → ∞)

Also

    µT(t){η | η ≡ 1 on A} − µT(t){η | η(x₁) = 1}
        = ∫ [ P^η(η_t ≡ 1 on A) − P^η(η_t(x₁) = 1) ] dµ(η) → 0    (t → ∞)    □

Exercise: construct η₀ such that δ_{η₀}T(t) has no limit.

The next goal is to treat the case where Z(t) = X^x(t) − X^y(t) is transient. Let ν_ρ, ρ ∈ [0, 1], be the product measure on S such that

    ∀x : ν_ρ{η | η(x) = 1} = ρ

Proposition 5.27. µ_ρ := lim_{t→∞} ν_ρ T(t) exists and satisfies ∀x : µ_ρ{η | η(x) = 1} = ρ.

Proof. Recall that |A_t| is non-increasing. This means that for any finite A₀, the limit |A_∞| := lim_{t→∞} |A_t|

exists almost surely. Then

    ν_ρ T(t){η | η ≡ 1 on A₀} = ∫ P^η(η_t ≡ 1 on A₀) dν_ρ(η)
                              = ∫ P^{A₀}(η ≡ 1 on A_t) dν_ρ(η)    (duality)
                              = Σ_B P^{A₀}(A_t = B) ν_ρ{η | η ≡ 1 on B}
                              = Σ_B P^{A₀}(A_t = B) ρ^{|B|}
                              = E^{A₀} ρ^{|A_t|} → E^{A₀} ρ^{|A_∞|}    (t → ∞)

Thus ν_ρ T(t) → µ_ρ, where µ_ρ{η | η ≡ 1 on A} = E^A ρ^{|A_∞|}. If A = {x}, then

    µ_ρ{η | η(x) = 1} = E^{{x}} ρ^{|A_∞|} = ρ                        □

Theorem 5.28. The measures µ_ρ are "the extreme points" of the set of all stationary measures.

Definition 5.29.

    g(A) := P^A(|A_∞| < |A|) = P^A(∃t : |A_t| < |A|)

Proposition 5.30.

(i) A ⊂ B ⇒ g(A) ≤ g(B)

(ii) ∀|A| ≥ 2 : g(A) ≤ Σ_{B⊂A, |B|=2} g(B)

(iii) lim_{x→∞} g({0, x}) = 0

(iv) ∀{x₁, …, x_k} : lim_{t→∞} g({X^{x₁}(t), …, X^{x_k}(t)}) = 0 a.s.

(v) ∀A₀ : g(A_t) → 0 a.s. as t → ∞

Proof. (i) Clear. However, note that

    g({x₁, …, x_k}) = P(∃i, j, t > 0 : X^{x_i}(t) = X^{x_j}(t))

(ii)

    g({x₁, …, x_k}) ≤ Σ_{i<j} P(∃t > 0 : X^{x_i}(t) = X^{x_j}(t)) = Σ_{i<j} g({x_i, x_j})

(iii) Note that Z(t) is symmetric and translation-invariant, i.e.

    P^z(Z(t) = y) = P^y(Z(t) = z)
    P^z(Z(t) = y) = P^0(Z(t) = y − z)

Hence

    P^x(Z(2t) = y) = Σ_z P^x(Z(t) = z) P^z(Z(t) = y)
                   ≤ [ Σ_z P^x(Z(t) = z)² ]^{1/2} [ Σ_z P^z(Z(t) = y)² ]^{1/2}    (Cauchy–Schwarz)
                   = P^0(Z(2t) = 0)    (∗)

Here the first sum equals P^x(Z(2t) = x) and the second equals P^y(Z(2t) = y) by symmetry; the last step uses translation invariance. The following "Green's function" is finite because Z is transient:

    G(x, y) = ∫₀^∞ P^x(Z(t) = y) dt = E^x ∫₀^∞ 1{Z(t)=y} dt < ∞

By (∗), the strong Markov property (SMP), and the fact that G(0, 0) = G(y, y), one obtains

    P^x(Z(s) = y for some s ≥ t) G(y, y) = ∫_t^∞ P^x(Z(s) = y) ds    (SMP)
        ≤ ∫_t^∞ P^0(Z(s) = 0) ds
        = P^0(Z(s) = 0 for some s ≥ t) G(0, 0)
    ⇒ P^x(Z(s) = y for some s ≥ t) ≤ P^0(Z(s) = 0 for some s ≥ t)

Therefore

    g({0, x}) = P^x(Z(s) = 0 for some s ≥ 0)
              ≤ P^x(Z hits 0 at some s ≤ t) + P^x(Z hits 0 at some s > t)
              ≤ P^x(Z hits 0 at some s ≤ t) + P^0(Z hits 0 at some s > t)

where the second summand → 0 as t → ∞. For the first, recall that M := sup_x c(x) < ∞. Thus one obtains

    E|{jumps of X^x(s) up to time t}| ≤ Mt
    E|{jumps of Z(s) up to time t}| ≤ 2Mt
    Σ_x P^0(Z hits x before t) = E|{distinct sites visited before t}|
                               ≤ E|{jumps before t}| ≤ 2Mt < ∞
    ⇒ lim_{x→∞} P^0(Z hits x before t) = 0

Fix ε > 0; choose t large enough, then x large enough.

(iv)

    g({X^{x₁}(t), …, X^{x_k}(t)}) ≤ Σ_{i<j} g({X^{x_i}(t), X^{x_j}(t)})    (by (ii))
                                  = Σ_{i<j} g({0, X^{x_j}(t) − X^{x_i}(t)}) → 0    (by (iii))

a.s. as t → ∞, where X^{x_j}(t) − X^{x_i}(t) → ∞ since Z(t) started at x_j − x_i and Z is transient.

(v) One can couple A_t with {X^x(t) | x ∈ A₀} in such a way that

    A_t ⊂ {X^x(t) | x ∈ A₀}

Exercise: write down a Q-matrix for (A_t, {X^x(t) | x ∈ A₀}).

    g(A_t) ≤ g({X^x(t) | x ∈ A₀}) → 0    a.s.    (by (i) and (iv))     □

Properties of µ⇢

= lim

t!1⌫⇢

T (t)

Definition 5.31 (Mixing Probability Measure). A probability measure $\mu$ on $S$ is called mixing if for all $|A|, |B| < \infty$
\[
\lim_{x\to\infty} \mu\{\eta \mid \eta \equiv 1 \text{ on } A \cup (B + x)\} = \mu\{\eta \equiv 1 \text{ on } A\}\, \mu\{\eta \equiv 1 \text{ on } B\}.
\]

Proposition 5.32. (i) $\forall\, |A| < \infty$: $\ 0 \le \mu_\rho\{\eta \equiv 1 \text{ on } A\} - \rho^{|A|} \le g(A)$.

(ii) $\mu_\rho$ is translation invariant and mixing.

(iii)
\[
\operatorname{Cov}(\eta(x), \eta(y)) = \mu_\rho\{\eta(x) = \eta(y) = 1\} - \underbrace{\mu_\rho\{\eta(x) = 1\}\, \mu_\rho\{\eta(y) = 1\}}_{\rho^2}
= \rho(1 - \rho)\, \underbrace{\frac{G(x, y)}{G(0, 0)}}_{\le 1} \le \rho(1 - \rho).
\]

Proof. (i) Recall $\mu_\rho\{\eta \equiv 1 \text{ on } A\} = E^A \rho^{|A_\infty|}$. Since $0 \le |A_\infty| \le |A|$,
\[
0 \le \mu_\rho\{\eta \equiv 1 \text{ on } A\} - \rho^{|A|} = E^A\bigl(\rho^{|A_\infty|} - \rho^{|A|}\bigr) \le E^A 1_{\{|A_\infty| < |A|\}} = g(A).
\]

(ii) $\mu_\rho = \lim_{t\to\infty} \nu_\rho T(t)$, where $\nu_\rho T(t)$ is already translation-invariant; hence $\mu_\rho$ is translation-invariant. Let $A_t, B_t, C_t$ be coupled coalescing Markov chains such that

∗ $A_t, B_t$ are independent,
∗ $C_0 = A_0 \cup B_0$,
∗ with $\tau := \inf\{s \ge 0 \mid A_s \cap B_s \ne \emptyset\}$, one has $C_t = A_t \cup B_t$ for all $t \le \tau$.

It follows that
\[
\mu_\rho\{\eta \equiv 1 \text{ on } A_0 \cup B_0\} - \mu_\rho\{\eta \equiv 1 \text{ on } A_0\}\, \mu_\rho\{\eta \equiv 1 \text{ on } B_0\}
= E^{C_0}\rho^{|C_\infty|} - E^{A_0}\rho^{|A_\infty|}\, E^{B_0}\rho^{|B_\infty|}
\]
\[
= E^{(A_0, B_0, C_0)}\bigl(\rho^{|C_\infty|} - \rho^{|A_\infty| + |B_\infty|}\bigr)
\le E\, 1_{\{\tau < \infty\}} = P(\tau < \infty) \le \sum_{u \in A_0,\, v \in B_0} g(\{u, v\}).
\]
Now take $\tilde B_0 = B_0 + x$; then
\[
\mu_\rho\{\eta \equiv 1 \text{ on } A_0 \cup \tilde B_0\} - \mu_\rho\{\eta \equiv 1 \text{ on } A_0\}\, \mu_\rho\{\eta \equiv 1 \text{ on } B_0\} \le \sum_{u \in A_0,\, v \in B_0} g(\{u, v + x\}),
\]
where the right-hand side goes to zero as $x \to \infty$. While this is just one direction, the same argument applied to the absolute value of the difference gives the mixing property.

(iii)
\[
\operatorname{Cov}(\eta(x), \eta(y)) = \mu_\rho\{\eta(x) = \eta(y) = 1\} - \rho^2
= E^{\{x, y\}}\rho^{|A_\infty|} - \rho^2
= \rho(1 - \rho)\, P^{\{x, y\}}(|A_\infty| = 1)
\]
\[
= \rho(1 - \rho)\, P^{x - y}(Z \text{ hits } 0)
= \rho(1 - \rho)\, \frac{G(x, y)}{G(0, 0)},
\]
where the last step follows by using
\[
G(x, y) = \int_0^\infty P^x(Z(s) = y)\, ds \overset{\text{SMP}}{=} P^x(Z \text{ hits } y)\, G(y, y) = P^{x - y}(Z \text{ hits } 0)\, G(0, 0). \qquad \square
\]

The goal for now is to prove the following assertion.

Theorem 5.33. (i) For any stationary $\mu$ there exists a probability measure $\lambda$ on $[0, 1]$ such that
\[
\mu = \int \mu_\rho\, d\lambda(\rho).
\]
(ii) If $\mu_\rho = \int \mu_r\, d\lambda(r)$, then $\lambda = \delta_\rho$.

A non-precise description of this would be the following: "the $\mu_\rho$ are the extreme points of the set of all stationary measures".

Tools:

Theorem 5.34 (Moment problem on $[0, 1]$). Given $c(0), c(1), \dots \ge 0$, the following assertions are equivalent:

(i) There exists a probability measure $\lambda$ on $[0, 1]$ such that
\[
c(n) = \int x^n\, d\lambda(x).
\]
(ii) $c(0) = 1$ and for all $m, n \ge 0$:
\[
\sum_{k=0}^n \binom{n}{k} (-1)^k c(m + k) \ge 0.
\]
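As a quick numerical sanity check (not part of the lecture), condition (ii) of the moment problem can be verified for a sequence that is known to be a moment sequence. Here we take the moments of the uniform measure on $[0,1]$, $c(n) = 1/(n+1)$; the choice of measure is our own illustration.

```python
from math import comb, factorial

# Moments of the uniform distribution on [0, 1]: c(n) = integral of x^n dx = 1/(n+1).
def c(n):
    return 1.0 / (n + 1)

# Condition (ii) of Theorem 5.34: every alternating difference
# sum_{k=0}^n C(n,k) (-1)^k c(m+k) must be nonnegative.
def alt_diff(c, m, n):
    return sum(comb(n, k) * (-1) ** k * c(m + k) for k in range(n + 1))

for m in range(8):
    for n in range(8):
        assert alt_diff(c, m, n) >= 0.0

# For the uniform measure the difference is a Beta integral:
# integral of x^m (1-x)^n dx = m! n! / (m+n+1)!.
assert abs(alt_diff(c, 2, 3) - factorial(2) * factorial(3) / factorial(6)) < 1e-12
```

The alternating difference equals $\int x^m (1-x)^n\, d\lambda(x)$, which makes the nonnegativity transparent for any genuine moment sequence.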

Definition 5.35 (Harmonic Function). $\alpha$ is a harmonic function for $X(t)$ if
\[
\forall x, t:\quad E^x\bigl(\alpha(X(t))\bigr) = \alpha(x).
\]

Theorem 5.36. If $X(t)$ is an irreducible, translation-invariant Markov chain on $\mathbb{Z}^d$, then all bounded harmonic functions are constant.

Theorem 5.37 (Hewitt--Savage 0--1 Law). If $\xi_1, \xi_2, \dots$ are i.i.d. and $A$ is "exchangeable" (i.e. for any bijection $\sigma : \mathbb{N} \to \mathbb{N}$ permuting only finitely many coordinates, one has $\{(\xi_1, \xi_2, \dots) \in A\} = \{(\xi_{\sigma(1)}, \xi_{\sigma(2)}, \dots) \in A\}$), then $P(A) \in \{0, 1\}$.

The following proof belongs to Theorem 5.36.

Proof. Let $\alpha$ be bounded and harmonic. Then $\alpha(X(n))$ is a bounded martingale, which implies that for any $x$, $\alpha(X(n))$ has a limit a.s. and in $L^1(P^x)$, and
\[
E^x\bigl(\alpha(X(n))\bigr) = \alpha(x).
\]
Since $X(n)$ is translation-invariant, one obtains
\[
X(n) = x + \sum_{i=1}^n \xi_i
\]
with $\xi_i$ i.i.d. and $\xi_i \overset{d}{=} X(1) - X(0)$. The limit $\lim_{n\to\infty} \alpha\bigl(x + \sum_{i=1}^n \xi_i\bigr)$ is an exchangeable function of $(\xi_1, \xi_2, \dots)$, so by Hewitt--Savage it is a.s. constant; taking expectations, this constant is $\alpha(x)$. Also
\[
\alpha(X_n) = \alpha\Bigl((x + \xi_1) + \sum_{i=2}^n \xi_i\Bigr) \to \alpha(x + \xi_1) \quad P^x\text{-a.s.}
\]
Thus $\alpha(x) = \alpha(x + \xi_1)$ a.s. This means that $\alpha$ has to be constant, by irreducibility. $\square$
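A toy illustration of harmonicity (our own example, not from the lecture): for the asymmetric simple random walk on $\mathbb{Z}$ with up-probability $p$, the function $\alpha(x) = ((1-p)/p)^x$ satisfies $E^x \alpha(X(1)) = \alpha(x)$. It is unbounded for $p \ne 1/2$, which is consistent with Theorem 5.36: only bounded harmonic functions are forced to be constant.

```python
from fractions import Fraction

# Asymmetric simple random walk on Z: step +1 w.p. p, step -1 w.p. 1-p.
# alpha(x) = ((1-p)/p)^x is harmonic:
#   E^x[alpha(X(1))] = p*alpha(x+1) + (1-p)*alpha(x-1) = alpha(x).
p = Fraction(2, 3)
r = (1 - p) / p          # the ratio (1-p)/p; here r = 1/2

def alpha(x):
    return r ** x

# Exact check of the harmonicity equation on a range of sites.
for x in range(-5, 6):
    assert p * alpha(x + 1) + (1 - p) * alpha(x - 1) == alpha(x)

# alpha is unbounded (it blows up as x -> -infinity), so it does not
# contradict Theorem 5.36.
assert alpha(-20) > 10 ** 5
```

Using exact rational arithmetic avoids any floating-point slack in the martingale identity.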

Let $A \subset S$ be finite. Recall that $\mu$ is determined by the values
\[
\mu\{\eta \mid \eta \equiv 1 \text{ on } A\}.
\]

Definition 5.38.
\[
h(A) := \mu\{\eta \mid \eta \equiv 1 \text{ on } A\}, \qquad
U_t f(A) := E f(\{X^x(t) \mid x \in A\}), \qquad
V_t f(A) := E^A f(A_t).
\]

Suppose $\mu$ is stationary; then, by duality and Fubini,
\[
h(A) = \mu\{\eta \mid \eta \equiv 1 \text{ on } A\} = \mu T(t)\{\eta \mid \eta \equiv 1 \text{ on } A\} = \int P^\eta(\eta_t \equiv 1 \text{ on } A)\, d\mu(\eta)
\]
\[
= \int P^A(\eta \equiv 1 \text{ on } A_t)\, d\mu(\eta)
\overset{\text{Fubini}}{=} \sum_B P^A(A_t = B)\, \underbrace{\mu\{\eta \equiv 1 \text{ on } B\}}_{h(B)} = E^A h(A_t) = V_t h(A).
\]
Recall that $A_t$ may be coupled with $\{X^x(t) \mid x \in A_0\}$ such that $A_t = \{X^x(t) \mid x \in A_0\}$ for all $t < \inf\{s \mid |A_s| < |A_0|\}$. Thus
\[
|U_t h(A) - V_t h(A)| = \bigl| E\bigl[h(\{X^x(t) \mid x \in A\}) - h(A_t)\bigr] \bigr| \le E\, 1_{\{|A_\infty| < |A|\}} = g(A).
\]
Thus, since $V_t h = h$, one obtains
\[
|h(A) - U_t h(A)| \le g(A)
\quad\Rightarrow\quad
|U_s h(A) - U_{s+t} h(A)| \le U_s |h - U_t h|(A) \le U_s g(A) = E\, g(\{X^x(s) \mid x \in A\}).
\]
The last term goes to zero by Proposition 5.30 part (iv). Thus $U_s h(A)$ converges as $s \to \infty$; call $\varphi(A) := \lim_{s\to\infty} U_s h(A)$.

Claim: $\varphi(A)$ depends only on $|A|$. For this purpose, fix $A = \{x_1, \dots, x_k\}$; then
\[
(X^{x_1}, \dots, X^{x_k})
\]
is a translation-invariant, irreducible Markov chain on $\mathbb{Z}^{kd}$. The function
\[
(x_1, \dots, x_k) \mapsto \varphi(\{x_1, \dots, x_k\})
\]
is harmonic and bounded (as $|h| \le 1$):
\[
E\, \varphi(\{X^{x_1}(t), \dots, X^{x_k}(t)\}) = U_t \varphi(\{x_1, \dots, x_k\})
= U_t \lim_{s\to\infty} U_s h(\{x_1, \dots, x_k\})
= \lim_{s\to\infty} U_{s+t} h(\{x_1, \dots, x_k\})
\]
\[
= \lim_{s\to\infty} U_s h(\{x_1, \dots, x_k\}) = \varphi(\{x_1, \dots, x_k\}).
\]
By Theorem 5.36, this implies that $\varphi(\{x_1, \dots, x_k\})$ doesn't depend on $x_1, \dots, x_k$; thus
\[
\varphi(\{x_1, \dots, x_k\}) =: c(k).
\]

Summary so far: if $\mu$ is stationary, then
\[
h(A) = \mu\{\eta \equiv 1 \text{ on } A\}, \qquad V_t h = h, \qquad |V_t h(A) - U_t h(A)| \le g(A), \qquad U_t h(A) \to c(|A|)
\]
\[
\Rightarrow\quad |h(A) - c(|A|)| \le g(A).
\]

Claim: $c$ is a moment sequence, i.e. there exists $\lambda$ such that
\[
c(k) = \int_0^1 x^k\, d\lambda(x).
\]
Fix $m, n$. By Proposition 5.30 part (iv), there exist $A_i$ with $|A_i| = m + n$ and $g(A_i) \xrightarrow{i\to\infty} 0$.

N.B. This is because transience of $Z$ implies that eventually the $X^{x_i}(t)$ are distinct, and thus for any large $t$
\[
|\{X^{x_1}(t), \dots, X^{x_{m+n}}(t)\}| = m + n.
\]
Let $A_i = B_i \cup C_i$ where $|B_i| = m$, $|C_i| = n$. By the pairwise bound in part (ii), $g(B_i), g(C_i) \to 0$ as well. By inclusion-exclusion, one obtains
\[
\mu\{\eta \equiv 1 \text{ on } B_i \wedge \eta \equiv 0 \text{ on } C_i\} = \sum_{F \subset C_i} (-1)^{|F|}\, \underbrace{\mu\{\eta \equiv 1 \text{ on } B_i \cup F\}}_{\xrightarrow{i\to\infty}\, c(m + |F|)}.
\]
Indeed $\mu\{\eta \equiv 1 \text{ on } B_i \cup F\} = h(B_i \cup F)$ and $|h(B_i \cup F) - c(m + |F|)| \le g(B_i \cup F) \to 0$. Grouping the subsets $F$ by their size, the sum converges:
\[
\sum_{k=0}^n \binom{n}{k} (-1)^k c(m + k) = \lim_{i\to\infty} \mu\{\eta \equiv 1 \text{ on } B_i \wedge \eta \equiv 0 \text{ on } C_i\} \ge 0,
\]
and thus, by Theorem 5.34, there exists $\lambda$ such that
\[
\forall k:\quad c(k) = \int_0^1 x^k\, d\lambda(x).
\]
Let $\mu^* = \int_0^1 \mu_\rho\, \lambda(d\rho)$ and $h^*(A) = \mu^*\{\nu \mid \nu \equiv 1 \text{ on } A\}$. Then $\mu^*$ is stationary (because $\mu_\rho$ is stationary for every $\rho$), and thus $V_t h^* = h^*$.

By Proposition 5.32 (i) it follows that
\[
0 \le \underbrace{\int \bigl(\mu_\rho\{\nu \mid \nu \equiv 1 \text{ on } A\} - \rho^{|A|}\bigr)\, d\lambda(\rho)}_{=\, \mu^*\{\nu \mid \nu \equiv 1 \text{ on } A\} - c(|A|) \,=\, h^*(A) - c(|A|)} \le g(A),
\]
so $|h^*(A) - c(|A|)| \le g(A)$ and therefore $|h(A) - h^*(A)| \le 2g(A)$. Hence
\[
|h(A) - h^*(A)| = |V_t h(A) - V_t h^*(A)| \le V_t |h - h^*|(A) \le 2 V_t g(A) = 2 E^A g(A_t) \xrightarrow{t\to\infty} 0.
\]
Thus $h = h^*$ and $\mu = \mu^*$.

Now suppose $\mu_\rho = \int_0^1 \mu_r\, d\lambda(r)$; then for any $x$
\[
\rho = \mu_\rho\{\eta(x) = 1\} = \int_0^1 \mu_r\{\eta(x) = 1\}\, d\lambda(r) = \int_0^1 r\, d\lambda(r).
\]
Now for any $x, y$,
\[
\mu_\rho\{\eta(x) = \eta(y) = 1\} = \int_0^1 \mu_r\{\eta(x) = \eta(y) = 1\}\, d\lambda(r),
\]
where by Proposition 5.32 (iii) the left-hand side equals $\rho^2 + \rho(1-\rho)\frac{G(x,y)}{G(0,0)}$ and the right-hand side equals $\int_0^1 \bigl(r^2 + r(1-r)\frac{G(x,y)}{G(0,0)}\bigr)\, d\lambda(r)$. Comparing the parts proportional to $\frac{G(x,y)}{G(0,0)}$ gives
\[
\rho(1 - \rho) = \int_0^1 r(1 - r)\, d\lambda(r).
\]
The function $\psi(r) = r(1 - r)$ is strictly concave, and
\[
\psi\Bigl(\int r\, d\lambda(r)\Bigr) = \psi(\rho) = \int_0^1 \psi(r)\, d\lambda(r),
\]
which, by the equality case of Jensen's inequality, implies $\lambda = \delta_x$ for some $x$. Since the mean of the measure is $\rho$, it has to be $\delta_\rho$. $\square$

5.6 Contact Process

Let $S$ be a graph; write $x \sim y$ if $x, y$ are neighbours. Assume that $S$ is connected and that the degree function is bounded by $D$. Assume there exists a parameter $\lambda > 0$ such that
\[
c(x, \eta) = \begin{cases} 1 & \eta(x) = 1, \\ \lambda\, |\{y \sim x \mid \eta(y) = 1\}| & \eta(x) = 0. \end{cases}
\]
Recall that this defines a spin system. Interpretation: $\eta(x) = 1$ means $x$ has some property (is infected).

∗ Every infected site heals (loses the property) at rate 1.
∗ Every infected site transmits the property (infection) to each neighbour at rate $\lambda$.

Facts:

∗ $\eta_t$ is attractive,
∗ $\delta_0$ is stationary,

thus $\nu := \lim_{t\to\infty} \delta_1 T(t)$ exists and is stationary. Question: Is $\nu = \delta_0$? This would be equivalent to the question whether the process is ergodic, as well as whether "the disease dies out".

The dual process: a Markov chain on finite subsets of $S$ with rates
\[
\forall x \in A:\quad q(A, A \setminus \{x\}) = 1, \qquad
\forall x \notin A:\quad q(A, A \cup \{x\}) = \lambda\, |\{y \in A \mid y \sim x\}|.
\]

Proposition 5.39. This defines a Markov chain.

Proof. Consider the embedded discrete-time chain $A_n$. Note: $|A_{n+1}| \le |A_n| + 1$. The total jump rate is
\[
c(A) = \sum_{B \ne A} q(A, B) = \sum_{x \in A} q(A, A \setminus \{x\}) + \sum_{x \notin A} q(A, A \cup \{x\})
= |A| + \lambda \sum_{x \notin A} \sum_{y \in A} 1_{\{y \sim x\}} \le |A| + \lambda \sum_{y \in A} D = |A|(1 + \lambda D),
\]
where $D$ is the maximum degree. We want to show that $\sum_{n \ge 0} \frac{1}{c(A_n)} = \infty$ a.s. for every $A_0$. So consider:
\[
\sum_{n \ge 0} \frac{1}{c(A_n)} \ge \frac{1}{1 + \lambda D} \sum_{n \ge 0} \frac{1}{|A_n|} \ge \frac{1}{1 + \lambda D} \sum_{n \ge 0} \frac{1}{n + |A_0|} = \infty. \qquad \square
\]
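The rate bound $c(A) \le |A|(1 + \lambda D)$ can be checked numerically on any finite graph. The sketch below (our own illustration; the grid graph and helper names are assumptions, not from the lecture) evaluates the total jump rate of the dual chain on random subsets of a $3 \times 3$ grid.

```python
import random

random.seed(0)
lam = 0.7

# Example graph: a 3x3 grid; adjacency stored as a dict of neighbour sets.
nodes = [(i, j) for i in range(3) for j in range(3)]
nbrs = {v: {w for w in nodes if abs(v[0] - w[0]) + abs(v[1] - w[1]) == 1}
        for v in nodes}
D = max(len(nbrs[v]) for v in nodes)   # maximum degree (here 4)

def total_rate(A):
    # c(A): each x in A is removed at rate 1; each x outside A is added
    # at rate lam * (number of neighbours of x inside A).
    removal = len(A)
    addition = lam * sum(len(nbrs[x] & A) for x in nodes if x not in A)
    return removal + addition

# The bound c(A) <= |A|(1 + lam*D) from the proof holds for every subset.
for _ in range(200):
    A = frozenset(random.sample(nodes, random.randint(1, len(nodes))))
    assert total_rate(A) <= len(A) * (1 + lam * D) + 1e-12
```

The bound follows because the addition rate counts boundary edges of $A$, and $A$ has at most $|A| D$ incident edges.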

Theorem 5.40. Call $\eta_t$ the contact process and $A_t$ the continuous-time Markov chain with $Q$-matrix as above. Then $\eta$ and $A$ are dual with respect to
\[
H(\eta, A) = 1_{\{\sum_{x \in A} \eta(x) > 0\}},
\]
where
\[
\Bigl\{ \sum_{x \in A} \eta(x) > 0 \Bigr\} = \{A \cap \{x \mid \eta(x) = 1\} \ne \emptyset\}\ \text{``}= \{A \cap \eta \ne \emptyset\}\text{''}.
\]

Proof. Exercise. $\square$

This means that $E^\eta H(\eta_t, A) = E^A H(\eta, A_t)$, so
\[
P^\eta(\exists x \in A : \eta_t(x) = 1) = P^\eta(\text{someone in } A \text{ is infected at time } t)
= P^A(\text{someone in } A_t \text{ is infected at time } 0)
= P^A(\exists x \in A_t : \eta(x) = 1).
\]
Fact: $A_t$ also follows the contact process (identifying $A \subset S$ with $1_A \in S$)! This means that the "contact process is self-dual".

N.B. $\eta_t$ is a Feller process and $A_t$ is a Markov chain.

N.B. One can actually prove that $\eta_t$ is dual to itself.

Graphical Representation. For each $x \in S$, let $N_x$ be a rate-1 Poisson process; call these the "healing times". For each pair $x \sim y$, let $N_{x,y}$ be a rate-$\lambda$ Poisson process; these are the "infection times". A path is active if

∗ it only goes up (forward in time),
∗ it doesn't cross a healing time,
∗ it follows the direction of the infection arrows.

A path doesn't need to be of infinite or interval length; it may also end at a given healing point. Given $\eta_0$, set
\[
\eta_t = \{x \mid \text{there is some } y \in \eta_0 \text{ with an active path from } (y, 0) \text{ to } (x, t)\}.
\]

Figure 2: Active Paths Example

Claim: $\eta_t$ as defined above follows the contact process. Now reverse time and reverse the infection arrows. Nothing really changes, because $N_{x,y} \overset{d}{=} N_{y,x}$, and $N_{x,y}$ and $N_x$ have the same distribution as their time reversals. Fix $t$, $A_0$ and let
\[
A_s = \{x \mid \text{there is some } y \in A_0 \text{ and a backwards path from } (y, t) \text{ to } (x, t - s)\}
= \{\text{people at time } t - s \text{ that could have infected someone in } A_0\}.
\]
This implies that $\eta_t$ and $A_t$ are "the same" in distribution.

Survival:

Definition 5.41 (Survival). $\eta_t$ survives if
\[
\forall \eta_0 \not\equiv 0:\quad P^{\eta_0}(\forall t : \eta_t \not\equiv 0) > 0.
\]
Note: attractiveness, $\xi_0 \le \eta_0$ and
\[
P^{\xi_0}(\forall t : \eta_t \not\equiv 0) > 0
\]
imply
\[
P^{\eta_0}(\forall t : \eta_t \not\equiv 0) > 0.
\]
Hence, survival holds iff
\[
\forall x \in S:\quad \eta_0 = 1_{\{x\}} \Rightarrow P^{\eta_0}(\forall t : \eta_t \not\equiv 0) > 0.
\]
By duality,
\[
\delta_1 T(t)\{\eta \mid \eta \cap A_0 \ne \emptyset\} = P^1(\eta_t \cap A_0 \ne \emptyset) = P^{A_0}(1 \cap A_t \ne \emptyset) = P^{A_0}(A_t \ne \emptyset).
\]

Since $A_t = \emptyset$ implies $A_s = \emptyset$ for all $s > t$,
\[
\lim_{t\to\infty} \delta_1 T(t)\{\eta \mid \eta \cap A_0 \ne \emptyset\} = \nu\{\eta \mid \eta \cap A_0 \ne \emptyset\} = \lim_{t\to\infty} P^{A_0}(A_t \ne \emptyset) = P^{A_0}(\forall t : A_t \ne \emptyset).
\]
Moreover, $\nu = \delta_0$ iff this limit vanishes for every $A_0$, because on the one hand
\[
\delta_0\{\eta \mid \eta \cap A_0 \ne \emptyset\} \le \delta_0\{\eta \mid \eta \not\equiv 0\} = 0,
\]
and on the other hand $\nu \ne \delta_0$ implies there exists $x \in S$ with $\nu\{\eta(x) = 1\} > 0$, thus
\[
\nu\{\eta \mid \eta \cap \{x\} \ne \emptyset\} = \nu\{\eta(x) = 1\} > 0.
\]

Proposition 5.42. $\eta_t$ is ergodic, iff $\eta_t$ doesn't survive, iff
\[
\forall A_0:\quad P^{A_0}(\forall t : A_t \ne \emptyset) = 0.
\]

Proof. The first part is done above. Let $\eta_0 = 1_{A_0}$; then for all $A_0$
\[
P^{A_0}(\forall t : A_t \ne \emptyset) = P^{\eta_0}(\forall t : \eta_t \not\equiv 0).
\]
So $P^{\eta_0}(\forall t : \eta_t \not\equiv 0) = 0$ for all $\eta_0$ with $|\{x \mid \eta_0(x) = 1\}| < \infty$, which is equivalent to
\[
\forall \eta_0:\quad P^{\eta_0}(\forall t : \eta_t \not\equiv 0) = 0
\]
by attractiveness. $\square$
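On a finite graph the contact process always reaches the absorbing state $\eta \equiv 0$ almost surely, so survival is only interesting on infinite graphs. The sketch below (our own illustration, with an assumed 5-cycle graph and a subcritical rate $\lambda = 0.3$) simulates the embedded jump chain and observes extinction in every run.

```python
import random

random.seed(1)
lam = 0.3
n = 5
nbrs = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}   # 5-cycle

def run(max_events=10_000):
    infected = set(range(n))          # start from eta_0 == 1
    for _ in range(max_events):
        if not infected:
            return True               # absorbed at the empty configuration
        # Each infected site heals at rate 1 and infects each healthy
        # neighbour at rate lam; pick the next event of the embedded
        # jump chain proportionally to its rate.
        heal = list(infected)
        infect = [(x, y) for x in infected for y in nbrs[x] if y not in infected]
        total = len(heal) + lam * len(infect)
        if random.random() * total < len(heal):
            infected.remove(random.choice(heal))
        else:
            infected.add(random.choice(infect)[1])
    return False

# With these parameters extinction happens quickly in every run.
assert all(run() for _ in range(20))
```

Only the jump chain is simulated; the holding times are irrelevant for whether the chain is eventually absorbed.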

Definition 5.43. $\varphi(A) = P^A(\forall t : A_t \ne \emptyset)$. Thus $\eta_t$ survives iff $\exists A : \varphi(A) > 0$.

Proposition 5.44.
\[
\exists A : \varphi(A) > 0 \quad\Leftrightarrow\quad \forall A \ne \emptyset : \varphi(A) > 0.
\]

Proof. Since $S$ is connected, $A_t$ is irreducible on the non-empty finite sets: one can go from $\{x_1, \dots, x_k\}$ to $\{y_1, \dots, y_m\}$ via $\{x_1, \dots, x_k\} \to \{x_1\}$ and then finding paths
\[
x_1 \cdots y_1, \quad x_1 \cdots y_2, \quad x_1 \cdots y_3, \ \dots
\]

∗ add the elements along the paths one-by-one,
∗ delete everything but $\{y_1, \dots, y_m\}$, one-by-one.

Now take $A$ with $\varphi(A) > 0$. For any $B \ne A$,
\[
\varphi(B) = P^B(A_t \text{ survives}) \ge \underbrace{P^B(A_1 = A)}_{*_1}\, \underbrace{P^A(A_t \text{ survives})}_{*_2} > 0,
\]
where $*_1 > 0$ by irreducibility and $*_2 > 0$ by the choice of $A$. $\square$

Critical Values: If $\lambda_1 < \lambda_2$ and $\eta^{\lambda_1}_0 = \eta^{\lambda_2}_0$, then one can couple $\eta^{\lambda_1}_t$ and $\eta^{\lambda_2}_t$ such that
\[
\forall t:\quad \eta^{\lambda_1}_t \le \eta^{\lambda_2}_t
\]
(Exercise: check the condition on $c$). Hence if $\eta^{\lambda_1}_t$ survives, then $\eta^{\lambda_2}_t$ survives.

Definition 5.45 (Critical Value). $\lambda_c$ is called the Critical Value, where
\[
\lambda_c = \inf\{\lambda \mid \eta^\lambda_t \text{ survives}\} = \sup\{\lambda \mid \eta^\lambda_t \text{ doesn't survive}\}.
\]
Then
\[
\lambda < \lambda_c \Rightarrow \eta^\lambda_t \text{ doesn't survive}, \qquad
\lambda > \lambda_c \Rightarrow \eta^\lambda_t \text{ survives}.
\]
Main question: What is $\lambda_c$? Is $\lambda_c$ finite?

Secondary question: What if $\lambda = \lambda_c$?

∗ The answer depends on $S$.
∗ Don't expect formulas, only bounds and asymptotics, e.g. for $S = \mathbb{Z}^d$, $d \to \infty$.

Proposition 5.46. $\lambda_c \ge \frac{1}{D}$.

Proof. Take $M = \lambda D$ and $\varepsilon = 1$. $M < \varepsilon$ implies ergodicity, in turn implying non-survival. Thus $\lambda < \frac{1}{D}$ implies non-survival, and so
\[
\lambda_c \ge \frac{1}{D}. \qquad \square
\]

Today: Consider the $(d+1)$-regular tree. (A regular graph is a graph where all the nodes have the same degree.) [Figure: the $(d+1)$-regular tree for $d = 2$.] Concretely,
\[
S = \{\text{finite, possibly empty sequences over } \{0, \dots, d\} \text{ in which only the first element can be } d\}
= \{\emptyset, 0, 1, \dots, 00, 01, \dots, 010, \dots\},
\]
\[
x \sim y \quad\Leftrightarrow\quad \{x, y\} = \{x_0 x_1 \cdots x_k,\ x_0 x_1 \cdots x_k x_{k+1}\}.
\]

Theorem 5.47. On $(d+1)$-regular trees,
\[
\frac{1}{d+1} \le \lambda_c \le \frac{1}{d-1}.
\]

Recall Definition 4.21. For sets it reads as follows.

Definition 5.48 (Superharmonic Function). A function $f$ is called superharmonic w.r.t. a Markov chain $A_t$ if

(i) $\forall A, t:\ E^A(f(A_t)) \le f(A)$,
(ii) $\forall A, t:\ E^A|f(A_t)| < \infty$.

Theorem 5.49. Suppose $A_t$ is a Markov chain that is irreducible except for a single absorbing state $\emptyset$. Then $A_t$ survives, i.e. $\forall A \ne \emptyset : P^A(\forall t : A_t \ne \emptyset) > 0$, iff
\[
\exists \text{ a bounded, non-constant, superharmonic } f \text{ with } \forall A : f(\emptyset) \ge f(A).
\]

Goal: For some $\lambda$, construct a bounded, non-constant, superharmonic $f$ for $A_t$; then

∗ $A_t$ survives for this $\lambda$,
∗ $\varphi(A) > 0$,
∗ $\eta_t$ survives,
∗ $\lambda_c \le \lambda$.

Use the Kolmogorov forward equation: Let $p_t$ be the transition function of $A_t$. If
\[
\forall A:\quad \sum_B p_t(A, B)\, c(B) < \infty \tag{$*$}
\]
then
\[
\frac{d}{dt} p_t(A, B) = \sum_C p_t(A, C)\, q(C, B).
\]

Proposition 5.50. $(*)$ holds for the $A_t$ in question.

Proof. Fix $A$. Let $f(\eta) = 1_{\{\eta \equiv 0 \text{ on } A\}}$. By duality,
\[
T(t)f(\eta) = P^\eta(\eta_t \equiv 0 \text{ on } A) = P^A(\eta \equiv 0 \text{ on } A_t),
\]
\[
|(T(t)f)(\eta) - (T(t)f)(\eta^x)| \le P^A(x \in A_t), \qquad
\sup_\eta |T(t)f(\eta) - T(t)f(\eta^x)| = P^A(x \in A_t),
\]
\[
|||T(t)f||| = \sum_x \sup_\eta |T(t)f(\eta) - T(t)f(\eta^x)| = E^A |A_t|.
\]
Recall: $|||T(t)f||| \le e^{t(M - \varepsilon)} |||f|||$. Now substitute $M = \lambda D$ and $\varepsilon = 1$ (and note $|||f||| = |A|$) to obtain
\[
E^A |A_t| \le e^{t(\lambda D - 1)} |A|.
\]
Last time: $c(A) \le (1 + \lambda D)|A|$. Hence
\[
\sum_B p_t(A, B)\, c(B) = E^A c(A_t) \le (1 + \lambda D)\, E^A |A_t| \le (1 + \lambda D)\, e^{t(\lambda D - 1)}\, |A| < \infty. \qquad \square
\]

Proposition 5.51. If $\sum_B p_t(A, B)\, c(B) < \infty$ uniformly for $t \in [0, T]$ and
\[
\forall A:\quad \underbrace{\sum_B f(B)\, q(A, B)}_{=\, \frac{d}{dt} E^A f(A_t)\big|_{t=0}} \le 0,
\]
then $E^A f(A_t)$ is decreasing in $t$ (i.e. $f(A_t)$ is a supermartingale).

Proof. Take $s < t$. Then
\[
p_t(A, B) - p_s(A, B) = \int_s^t \sum_C p_r(A, C)\, q(C, B)\, dr,
\]
\[
E^A f(A_t) - E^A f(A_s) = \sum_B \bigl(p_t(A, B) - p_s(A, B)\bigr) f(B)
= \int_s^t \sum_C \underbrace{p_r(A, C)}_{\ge 0}\, \underbrace{\sum_B q(C, B) f(B)}_{\le 0}\, dr \le 0. \qquad \square
\]

For $0 < \rho < 1$ let $f(A) := \rho^{|A|}$. This $f$ is

∗ non-constant,
∗ satisfies $\forall A : f(\emptyset) \ge f(A)$.

The goal is to show that $f$ is superharmonic:
\[
\sum_B q(A, B) f(B) = \sum_{B \ne A} q(A, B)\bigl(f(B) - f(A)\bigr)
= \sum_{x \in A} q(A, A \setminus \{x\})\, \rho^{|A|-1}(1 - \rho) + \sum_{x \notin A} q(A, A \cup \{x\})\, \rho^{|A|-1}(\rho^2 - \rho)
\]
\[
= \rho^{|A|-1}(1 - \rho) \Bigl[ \sum_{x \in A} 1 \;-\; \rho \sum_{x \notin A} \lambda \sum_{y \in A,\, y \sim x} 1 \Bigr].
\]

Proposition 5.52.
\[
\sum_{x \notin A} \sum_{y \in A,\, y \sim x} 1 \ \ge\ |A|(d - 1).
\]

Proof. Counting all edges incident to $A$,
\[
\sum_{x \in A} \sum_{y \in S,\, y \sim x} 1 = \sum_{x \in A} |\{y \mid y \sim x\}| = |A|(d + 1).
\]
Because $A$ has no cycles (it is a subgraph of a tree),
\[
\sum_{x \in A} \sum_{y \in A,\, y \sim x} 1 = 2\, |\{\text{edges in } A\}| \le 2|A|.
\]
Therefore
\[
\sum_{x \notin A} \sum_{y \in A,\, y \sim x} 1
= \sum_{\substack{x \in A,\, y \in S \\ x \sim y}} 1 - \sum_{\substack{x \in A,\, y \in A \\ x \sim y}} 1
\ge |A|(d + 1) - 2|A| = |A|(d - 1). \qquad \square
\]
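The edge-counting bound of Proposition 5.52 can be checked numerically on a finite piece of the tree. In the sketch below (our own construction, not from the lecture) we build a $(d+1)$-regular tree to depth 4 and only sample subsets $A$ from "interior" vertices, so that every vertex of $A$ has all of its $d + 1$ neighbours present and the count agrees with the infinite tree.

```python
import random

random.seed(2)
d = 3

# Vertices are tuples: the root () has d+1 children, every other vertex d children
# (plus its parent), so every non-leaf vertex has degree d+1.
def children(v):
    k = d + 1 if v == () else d
    return [v + (i,) for i in range(k)]

verts, frontier = [()], [()]
for _ in range(4):
    frontier = [c for v in frontier for c in children(v)]
    verts += frontier
vset = set(verts)

def nbrs(v):
    out = set(children(v)) & vset
    if v != ():
        out.add(v[:-1])               # parent
    return out

# Interior vertices (depth <= 2) have all their neighbours inside vset.
interior = [v for v in verts if len(v) <= 2]
for _ in range(100):
    A = set(random.sample(interior, random.randint(1, len(interior))))
    boundary_edges = sum(len(nbrs(x) & A) for x in vset - A)
    assert boundary_edges >= len(A) * (d - 1)
```

Restricting $A$ to the interior matters: a truncated tree has leaves of degree 1, for which the count would undershoot.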

Combining,
\[
\sum_B q(A, B) f(B) = \rho^{|A|-1}(1 - \rho) \Bigl[ |A| - \lambda\rho \sum_{\substack{x \notin A,\, y \in A \\ x \sim y}} 1 \Bigr]
\le \rho^{|A|-1}(1 - \rho)\, |A|\, \bigl[ 1 - \lambda\rho(d - 1) \bigr].
\]
Suppose $\lambda > \frac{1}{d-1}$; then there exists $\rho < 1$ with $1 - \lambda(d-1)\rho \le 0$, thus implying that $f(A) = \rho^{|A|}$ is a bounded, non-constant, superharmonic function.

∗ $A_t$ survives,
∗ $\varphi(A) > 0$,
∗ $\eta_t$ survives;

that is, $\lambda > \frac{1}{d-1} \Rightarrow \eta_t$ survives, so
\[
\lambda_c = \inf\{\lambda \mid \eta_t \text{ survives}\} \le \frac{1}{d-1}.
\]
This concludes the proof for the $(d+1)$-regular tree.

N.B. For a 2-regular tree ($\mathbb{Z}^1$) this gives no bound. On $\mathbb{Z}^1$ one can show: $\lambda_c \le 2$.

Strong Survival

Definition 5.53 (Strong Survival). Say $\eta_t$ survives strongly if
\[
\forall x \in S:\quad P^{1_x}(\eta_t(x) = 1 \text{ for a sequence of } t \to \infty) > 0.
\]
Survival means the disease doesn't go extinct; strong survival means "I keep getting it". Clearly strong survival implies survival.

Definition 5.54. $\lambda_s = \inf\{\lambda \mid \eta_t \text{ strongly survives}\}$. Then $\lambda_c \le \lambda_s$.

Theorem 5.55. On the $(d+1)$-regular tree,
\[
\lambda_s \ge \frac{1}{2\sqrt{d}}.
\]

Proof. [Figure: the tree drawn by levels $\dots, -2, -1, 0, 1, \dots$.] Let $\ell(x)$ denote the level of the vertex $x$ and set
\[
f(A) := \sum_{x \in A} \rho^{\ell(x)}.
\]
Note that $f$ is not bounded.

Claim: $f$ is superharmonic.
\[
\sum_B q(A, B) f(B) = \sum_{B \ne A} q(A, B)\bigl(f(B) - f(A)\bigr)
= \sum_{x \in A} q(A, A \setminus \{x\}) \underbrace{\bigl(f(A \setminus \{x\}) - f(A)\bigr)}_{-\rho^{\ell(x)}}
+ \sum_{x \notin A} q(A, A \cup \{x\}) \underbrace{\bigl(f(A \cup \{x\}) - f(A)\bigr)}_{\rho^{\ell(x)}}
\]
\[
= -\sum_{x \in A} \rho^{\ell(x)} + \lambda \sum_{x \notin A} \sum_{y \in A,\, y \sim x} \rho^{\ell(x)}
\le -f(A) + \lambda \sum_{y \in A} \sum_{x :\, x \sim y} \rho^{\ell(x)}.
\]
By the structure of the tree, each $y$ has one neighbour of level $\ell(y) - 1$ and $d$ neighbours of level $\ell(y) + 1$, so this equals
\[
-f(A) + \lambda \sum_{y \in A} \bigl[ d\rho^{\ell(y)+1} + \rho^{\ell(y)-1} \bigr]
= -f(A) + \lambda \Bigl( d\rho + \frac{1}{\rho} \Bigr) \sum_{y \in A} \rho^{\ell(y)}
= f(A) \Bigl[ \lambda \Bigl( d\rho + \frac{1}{\rho} \Bigr) - 1 \Bigr].
\]
Take $\rho = \frac{1}{\sqrt{d}}$ and $\lambda = \frac{1}{2\sqrt{d}}$; then
\[
\lambda \Bigl( d\rho + \frac{1}{\rho} \Bigr) = 1 \quad\Rightarrow\quad \sum_B q(A, B) f(B) \le 0.
\]
Thus $f$ is superharmonic, so $f(A_t)$ is a non-negative supermartingale and has an a.s. limit. Since $x$ heals with rate 1,
\[
\forall T:\quad P(\forall t > T : \eta_t(x) = 1) = 0,
\]
so on the event that $\eta_t(x) = 1$ for a sequence of $t \to \infty$, the value $\eta_t(x)$ goes from 0 to 1 infinitely often. By self-duality (with $A_t$ the contact process started at $\{x\}$),
\[
\{\eta_t(x) \text{ goes from } 0 \text{ to } 1 \text{ infinitely often}\} \subset \{f(A_t) \text{ doesn't converge}\}, \qquad
P^{1_x}(f(A_t) \text{ doesn't converge}) = 0.
\]
Thus there is no strong survival, and
\[
\lambda_s \ge \frac{1}{2\sqrt{d}}. \qquad \square
\]
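The choice $\rho = 1/\sqrt{d}$ in the proof is optimal: it minimises $d\rho + 1/\rho$, whose minimum value is $2\sqrt{d}$ (by AM--GM), so $\lambda = 1/(2\sqrt{d})$ is the largest rate for which $\lambda(d\rho + 1/\rho) \le 1$ can be achieved. A quick numerical check (our own, not from the lecture):

```python
import math

for d in range(2, 10):
    g = lambda rho: d * rho + 1 / rho
    best = 1 / math.sqrt(d)
    # The minimum of g is 2*sqrt(d), attained at rho = 1/sqrt(d).
    assert abs(g(best) - 2 * math.sqrt(d)) < 1e-12
    # Grid search: no other rho does better.
    assert all(g(best) <= g(rho) + 1e-12 for rho in (i / 1000 for i in range(1, 2000)))
```

So with $\lambda = 1/(2\sqrt{d})$ and $\rho = 1/\sqrt{d}$ the supermartingale condition holds with equality.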

Corollary. If $d \ge 6$, then
\[
\underbrace{\lambda_c}_{\le \frac{1}{d-1}} < \underbrace{\lambda_s}_{\ge \frac{1}{2\sqrt{d}}}.
\]
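The threshold $d \ge 6$ in the corollary comes from the inequality $\frac{1}{d-1} < \frac{1}{2\sqrt{d}}$, i.e. $2\sqrt{d} < d - 1$, which for integer $d$ holds exactly from $d = 6$ on (since $d - 2\sqrt{d} - 1 > 0 \Leftrightarrow \sqrt{d} > 1 + \sqrt{2}$, i.e. $d > 3 + 2\sqrt{2} \approx 5.83$). A quick check:

```python
import math

# 1/(d-1) < 1/(2*sqrt(d))  <=>  2*sqrt(d) < d - 1: fails for d <= 5, holds for d >= 6.
for d in range(2, 6):
    assert not 1 / (d - 1) < 1 / (2 * math.sqrt(d))
for d in range(6, 50):
    assert 1 / (d - 1) < 1 / (2 * math.sqrt(d))
```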

Some facts for any $d \ge 2$ on a $(d+1)$-regular tree:

∗ $\lambda_c < \lambda_s \le \frac{1}{\sqrt{d} - 1}$
∗ $\lambda = \lambda_c \Rightarrow$ no survival (ergodic)
∗ $\lambda = \lambda_s \Rightarrow$ no strong survival
∗ $\lambda > \lambda_s \Rightarrow$ every stationary measure has the form $\alpha\delta_0 + (1 - \alpha)\nu$
∗ $\lambda_c < \lambda < \lambda_s \Rightarrow$ there exist other stationary measures

Facts on $\mathbb{Z}^d$:

∗ $\frac{1 + \sqrt{37}}{6} \le \lambda_c \le 2$ for $d = 1$
∗ $\forall d:\ \frac{1}{2d - 1} \le \lambda_c \le \frac{2}{d}$
∗ $\lambda_c = \lambda_s$
∗ $\lambda = \lambda_c \Rightarrow$ no survival
∗ $\lambda_c \sim \frac{1}{2d}$ for $d \to \infty$
∗ $\lambda > \lambda_c \Rightarrow$ stationary distributions have the form $\alpha\delta_0 + (1 - \alpha)\nu$

Page 113: Markov Processes - uni-bonn.de · 2016-02-09 · Markov Processes Joe Neeman Notes Max Goldowsky February 2, 2016 WS2015/16 N.B.: These notes were typed during the lectures. I didn’t

Index

Blackwell’s Example, 15Blumenthal’s 0-1-law, 6Brownian Motion, 3, 5, 43

Absorbed ⇠, 53Reflected ⇠, 54with sticky boundary, 54with twist, 9

Càdlàg Functions, 5, 34Càglàd Functions, 9Chapman-Kolmogorov Equations, 10, 12Compound Poisson Process, 3, 5Configuration, 70Coupling, 80Critical Value, 106

Ergodicity, 77

Feller Property, 35Fischer-Wright

Diffusion, 62Fisher-Wright

Generator, 62Function

harmonic ⇠, 98

Generatorof a Markov Process, 39Core, 53of a Spin System, 70

Green’s Function, 26

infinitely divisible, 38Ising Model, 79, 82, 83

Kolmogorov Backward Equation, 13Kolmogorov Forward Equation, 33

Lévy Semigroup, 38Liggett Duality, 68Linear Operator

closed, 51Closure of ⇠, 51

Markov Chain, 10Birth and Death, 11Coalescing Markov Chain ⇠s, 88

Markov Propertystrong, 8weak, 2

MartingaleSuper⇠, 28

Measureassociated ⇠, 84mixing ⇠, 96positively associated ⇠, 84reversible, 25stationary (Chain), 25stationary (Feller), 59

ProcessCauchy, 43Contact ⇠, 79, 82, 101Dual of a ⇠, 68Feller, 34Galton-Watson, 32Lévy, 37

Q-Matrix, 11

Resolvent Operator, 36Right-Continuous Filtration, 5

Shift Operator ✓, 3, 34Sites, 70Spin System, 70

attractive, 81State

absorbing, 12, 28recurrent, 26transient, 26

Stopping Time, 6Superharmonic Function, 28, 107

Transition Function pt

, 10irreducible, 26

Voter Model, 87

113