
Note on martingales

May 29, 2004

Contents

1 Conditional expectation
2 Optional sampling
3 Optional sampling – continuous time
4 More on stopping times
5 Martingale convergence theorem
6 Submartingale inequalities
7 Local martingales and uniform integrability
8 Quadratic variation
9 Semimartingale defined by Riemann-Stieltjes integration
10 Product of semimartingales is a semimartingale
11 Stochastic integration – beyond Riemann-Stieltjes
12 Semimartingale and Itô's formula
13 Burkholder-Davis-Gundy inequality
14 Sample path regularity of submartingale
15 Natural increasing process and martingale
16 Local martingale with finite variation path


17 Predictable process and martingale with finite variation path
18 Doob-Meyer decomposition theorem
19 Compensator of bounded increasing process
20 Stable spaces of square integrable martingales
21 General square integrable martingale
22 Predictable process and square integrable martingale
23 Square integrable martingale as integrator
24 Predictable processes
25 Semigroup of measure kernel and Markov process
26 Strong Markov property and quasi left continuity
27 Orthogonality of martingales
28 Integrands for stochastic integration
29 Quasi-left continuous increasing process


1 Conditional expectation

Let (Ω,F , P ) be a complete probability space. We write R̄ := R ∪ {−∞, +∞}.

In this note a random variable is an F -measurable function X : Ω → R̄ such that P (X = −∞) = 0. Given two random variables X and Y we write

(X + Y )(ω) := X(ω) + Y (ω) if X(ω) > −∞ and Y (ω) > −∞, and (X + Y )(ω) := −∞ otherwise.

Note that X + Y remains a random variable.

1.1 Definition. An F -measurable function X : Ω → R̄ is said to be lower semi-integrable if E[max{−X, 0}] < +∞. (If so then P (X = −∞) = 0.)

1.2 Remark. If X and Y are lower semi-integrable random variables then so is X + Y .

1.3 Lemma. Suppose that G is a sub σ-field of F , Y and Z are G-measurable and lower semi-integrable random variables. If E[Y ;A] ≤ E[Z ;A] for all A ∈ G then Y ≤ Z a.s.

Proof. We exploit the relation {Z < Y } = ⋃_{n=1}^∞ {Z ≤ n, Z + 1/n ≤ Y }. Let n ∈ N and set A := {Z ≤ n, Z + 1/n ≤ Y }. Then we have that A ∈ G, E[|Z| ;A] < +∞ and E[Z + 1/n ;A] ≤ E[Y ;A]. It therefore follows that

E[Z ;A] + P (A)/n ≤ E[Y ;A] ≤ E[Z ;A].

This implies that P (A) = 0. Thus we get the claim.

1.4 Definition. Given a lower semi-integrable random variable X and a sub σ-field G of F ,

E≤[X|G] := {Y : Y G-measurable, lower semi-integrable, E[Y ;A] ≤ E[X ;A] ∀A ∈ G},
E[X|G] := {Y : Y G-measurable, lower semi-integrable, E[Y ;A] = E[X ;A] ∀A ∈ G}.

We write E≥[X|G] := −E≤[−X|G] provided X is upper semi-integrable.

Suppose ξ is a signed measure on a measurable space (S,B) and µ is a measure on (S,B). The Radon-Nikodym theorem asserts that if µ is σ-finite and ξ is µ-absolutely continuous then D(ξ/µ) ≠ ∅ where

D(ξ/µ) := {f : f B-measurable, ∫_S max{−f, 0} dµ < +∞, ∫_A f dµ = ξ(A) ∀A ∈ B}.

A lower semi-integrable random variable X defines a signed measure F → R̄, A ↦ E[X ;A], which we denote by XP . Let G be a sub σ-field of F . Then

E[X|G] = D((XP )/P |G)

where we write D(ξ/µ|C) := D(ξ|_C / µ|_C) for a sub σ-field C of B.
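On a finite probability space this density description is completely concrete: a version of E[X|G] is obtained by averaging X over the atoms of G, which is exactly the Radon-Nikodym density of XP with respect to P on G. The following Python sketch is an illustration added here, not part of the note.

import numpy as np

# A version of E[X|G] on a finite Omega: take the density (XP)(A)/P(A)
# on every atom A of G, i.e. average X over each atom.
rng = np.random.default_rng(0)
p = np.full(12, 1 / 12)                       # P uniform on Omega = {0,...,11}
X = rng.normal(size=12)
atoms = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]  # atoms of G

Y = np.empty(12)
for A in atoms:
    Y[A] = (X[A] * p[A]).sum() / p[A].sum()   # constant value of E[X|G] on A

# defining property of Definition 1.4: E[Y ;A] = E[X ;A] for every atom A
for A in atoms:
    assert np.isclose((Y[A] * p[A]).sum(), (X[A] * p[A]).sum())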


In the rest of this section G is a sub σ-field of F .

1.5 Theorem. Suppose X is a lower semi-integrable random variable. Then E[X|G] ≠ ∅ and E≤[X|G] = {Y : Y G-measurable, lower semi-integrable and Y ≤ Z a.s. for all Z ∈ E[X|G]}.

1.6 Lemma. Suppose that X is an integrable random variable. Then E[−X|G] = −E[X|G], all Y ∈ E≤[X|G] are integrable, and E[X|G] = {Y ∈ E≤[X|G] : E[Y ] = E[X]}.

1.7 Lemma. Let X1 and X2 be lower semi-integrable random variables.
(i) If a ∈ R>0 then aE[X1|G] = E[aX1|G] and aE≤[X1|G] = E≤[aX1|G].
(ii) E[X1|G] + E[X2|G] ⊂ E[X1 + X2|G] and E≤[X1|G] + E≤[X2|G] ⊂ E≤[X1 + X2|G]. If either X1 or X2 is integrable then both inclusions ⊂ become equalities.

Proof. (ii) Let Y1 ∈ E[X1|G], Y2 ∈ E[X2|G], Z1 ∈ E≤[X1|G] and Z2 ∈ E≤[X2|G]. We have that Y1 + Y2 ∈ E[X1 + X2|G]. Indeed Y1 + Y2 is lower semi-integrable and G-measurable, and

E[Y1 + Y2 ;A] = E[Y1 ;A] + E[Y2 ;A] = E[X1 ;A] + E[X2 ;A] = E[X1 + X2 ;A] ∀A ∈ G.

The second equality changes to ≤ when the Yi are replaced by the Zi. Suppose that X2 is integrable and Z ∈ E≤[X1 + X2|G]. Then, since Y2 is integrable and Z ≤ Y1 + Y2 a.s., it follows that Z − Y2 is lower semi-integrable and Z − Y2 ≤ Y1 a.s. This implies that Z − Y2 ∈ E≤[X1|G].

1.8 Lemma. Suppose X1, X2 are lower semi-integrable random variables and X1 ≤ X2 a.s. Then E≤[X1|G] ⊂ E≤[X2|G]. If Y1 ∈ E≤[X1|G] and Y2 ∈ E[X2|G] then Y1 ≤ Y2 a.s.

Proof. We have that E[Y1 ;A] ≤ E[X1 ;A] ≤ E[X2 ;A] = E[Y2 ;A] for all A ∈ G.

1.9 Corollary. If X ≥ 0 a.s. then Y ≥ 0 a.s. for all Y ∈ E[X|G].

1.10 Theorem. Suppose that X1 is lower semi-integrable and Xn ≤ Xn+1 a.s. for all n ∈ N.
(i) If Yn ∈ E[Xn|G] for all n ∈ N then sup_{n∈N} Yn ∈ E[sup_{n∈N} Xn|G].
(ii) If Zn ∈ E≤[Xn|G] for all n ∈ N then sup_{n∈N} Zn ∈ E≤[sup_{n∈N} Xn|G].

Proof. Observe that sup_{n∈N} Yn is G-measurable. According to Lemma 1.8, Xn ≤ Xn+1 a.s. implies that Yn ≤ Yn+1 a.s. On the other hand sup_{n∈N} Zn ≤ sup_{n∈N} Yn a.s. because Zn ≤ Yn a.s. Since E[X1] = E[Y1] > −∞, invoking the monotone convergence theorem, we infer that

E[sup_{n∈N} Zn ;A] ≤ E[sup_{n∈N} Yn ;A] = sup_{n∈N} E[Yn ;A] = sup_{n∈N} E[Xn ;A] = E[sup_{n∈N} Xn ;A] ∀A ∈ G.

Consequently sup_{n∈N} Yn ∈ E[sup_{n∈N} Xn|G] and sup_{n∈N} Zn ∈ E≤[sup_{n∈N} Xn|G].

1.11 Corollary. Suppose inf_{n∈N} Xn is lower semi-integrable and Yn ∈ E[Xn|G] for n ∈ N.
(i) Z ≤ liminf_{n→∞} Yn a.s. for all Z ∈ E[liminf_{n→∞} Xn|G].
(ii) If liminf_{n→∞} E[Xn] < +∞ then liminf_{n→∞} Xn is integrable, liminf_{n→∞} Yn is integrable and liminf_{n→∞} Yn ∈ E≥[liminf_{n→∞} Xn|G].

Proof. (i) Let Zn ∈ E[inf_{k≥n} Xk|G]. Then we have that sup_{n∈N} Zn ∈ E[liminf_{n→∞} Xn|G] by Theorem 1.10(i). On the other hand, since E≤[inf_{k≥n} Xk|G] ⊂ E≤[Xk|G] for all k ≥ n by Lemma 1.8, Zn ≤ Yk a.s. for all k ≥ n. This implies that sup_{n∈N} Zn ≤ liminf_{n→∞} Yn a.s.

(ii) We see that liminf_{n→∞} Xn is integrable since E[liminf_{n→∞} Xn] ≤ liminf_{n→∞} E[Xn] by Fatou's lemma. It then follows that inf_{n∈N} Xn is integrable and so is Z1. We have that Z1 ≤ Yk a.s. for all k ≥ 1. We get E[liminf_{n→∞} Yn] ≤ liminf_{n→∞} E[Yn] by invoking Fatou's lemma. The right hand side equals liminf_{n→∞} E[Xn] < +∞.


1.12 Lemma. (i) Suppose that f : R → R ∪ {+∞} is convex. Then f(x) = sup_{(a,b)∈F} (ax + b) for all x ∈ R where F is a countable dense subset of {(a, b) ∈ R² : ax + b ≤ f(x) ∀x ∈ R}.
(ii) If f : R ∪ {+∞} → R ∪ {+∞} is non-decreasing and convex then f(x) = sup_{(a,b)∈F} (ax + b) for all x ∈ R where F is a countable dense subset of {(a, b) ∈ R≥0 × R : ax + b ≤ f(x) ∀x ∈ R}.

1.13 Theorem. (i) Suppose X is integrable and f : R → R ∪ {+∞} is a convex function. Then f(X) is lower semi-integrable and f(Y ) ∈ E≤[f(X)|G] for all Y ∈ E[X|G].
(ii) Suppose X is lower semi-integrable and f : R ∪ {+∞} → R ∪ {+∞} is non-decreasing and convex. Then f(X) is lower semi-integrable and f(Y ) ∈ E≤[f(X)|G] for all Y ∈ E[X|G].

Proof. We choose a countable dense subset F of {(a, b) ∈ R² : ax + b ≤ f(x) ∀x ∈ R}. Let (a, b) ∈ F . Then, since aX + b ≤ f(X), it follows that E≤[aX + b|G] ⊂ E≤[f(X)|G] by Lemma 1.8. On the other hand aY + b ∈ E[aX + b|G] by Lemma 1.7 and Lemma 1.6. Consequently aY + b ∈ E≤[f(X)|G] for all (a, b) ∈ F . The set F being countable, we infer that sup_{(a,b)∈F} (aY + b) ∈ E≤[f(X)|G] by Theorem 1.10(ii). Thus Lemma 1.12 yields that f(Y ) ∈ E≤[f(X)|G].

1.14 Lemma. Suppose that µ is a measure on (Ω,F) which is P -absolutely continuous on G and ρ ∈ D(µ/P |G). Let A ∈ G and f : Ω → R be a G-measurable function.
(i) µ(A) = 0 if and only if P (A ∩ {ρ > 0}) = 0.
(ii) f ≥ 0 µ-a.e. if and only if fρ ≥ 0 P -a.s. If so then ∫_A f dµ = E[fρ ;A].
(iii) f is µ-integrable if and only if fρ is P -integrable. If so then ∫_A f dµ = E[fρ ;A].

1.15 Theorem. (i) Suppose that X ≥ 0 a.s. and E[X ;Z < 0] = 0. Then XZ ≥ 0 P -a.s. If Y ∈ E[X|G] and Z is G-measurable then Y Z ≥ 0 P -a.s. and Y Z ∈ E[XZ|G].
(ii) Suppose that X is integrable and XZ is integrable. If Y ∈ E[X|G] and Z is G-measurable then Y Z is integrable and Y Z ∈ E[XZ|G].

1.16 Lemma. Let X be a lower semi-integrable random variable and N := Null(P ). Then E[X|G] = E[X|G ∨ N ] ∩ {Y : Y G-measurable} and E≤[X|G] = E≤[X|G ∨ N ] ∩ {Y : Y G-measurable}.

1.17 Theorem. Suppose that X is lower semi-integrable and H is a sub σ-field of G.
(i) If Y ∈ E≤[X|G] and Z ∈ E≤[Y |H] then Z ∈ E≤[X|H].
(ii) If Y ∈ E[X|G] and Z ∈ E[Y |H] then Z ∈ E[X|H].

Proof. Let A ∈ H. Then, since A ∈ G as well, we have that E[X ;A] ≥ E[Y ;A] ≥ E[Z ;A]. This means Z ∈ E≤[X|H] because Z is H-measurable.

1.18 Lemma. Suppose that X is a non-negative random variable, A ∈ F and c ∈ R≥0. Then E[X ;A] ≤ E[X ;X > c] + cP (A).

1.19 Corollary. (i) Let Λ be a family of random variables. Then the following holds:

inf_{c>0} sup_{X∈Λ} E[|X| ; |X| > c] = 0 ⇔ sup_{X∈Λ} E[|X|] < +∞ and inf_{δ>0} sup_{X∈Λ} sup_{A∈F: P(A)<δ} E[|X| ;A] = 0.

(ii) If X is integrable then inf_{δ>0} sup_{A∈F: P(A)<δ} E[|X| ;A] = 0.


Proof. (i) ⇒: Lemma 1.18 shows that sup_{X∈Λ} E[|X|] ≤ sup_{X∈Λ} E[|X| ; |X| > c] + c and

inf_{δ>0} sup_{X∈Λ} sup_{A∈F: P(A)<δ} E[|X| ;A] ≤ sup_{X∈Λ} E[|X| ; |X| > c] for all c > 0.

⇐: Since P (|X| > c) ≤ sup_{X∈Λ} E[|X|]/c for X ∈ Λ and c > 0, it follows that

sup_{X∈Λ} E[|X| ; |X| > c] ≤ sup_{X∈Λ} sup_{A∈F: P(A)≤M/c} E[|X| ;A] for all c ∈ R>0

where M := sup_{X∈Λ} E[|X|] < +∞. The infimum of the right hand side over c > 0 is 0.

(ii) We see that E[|X| ; |X| > c] converges to E[|X| ; |X| = +∞] = 0 as c tends to +∞ by the monotone convergence theorem.

1.20 Remark. Let P and Q be mutually absolutely continuous probability measures on (Ω,F). We infer by Corollary 1.19(ii) that convergence in probability with respect to P and with respect to Q are equivalent.

1.21 Definition. Let Λ be a family of random variables. It is said to be uniformly integrable if inf_{c>0} sup_{X∈Λ} E[|X| ; |X| > c] = 0.

1.22 Example. Let Λ be a family of random variables. If there exists an integrable dominating random variable X, that is P (|Y | ≤ |X|) = 1 for all Y ∈ Λ, then Λ is uniformly integrable. Indeed E[|Y | ; |Y | > c] ≤ E[|X| ; |X| > c] for all Y ∈ Λ.
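A further example, added here for illustration: any family Λ bounded in L² is uniformly integrable, since E[|X| ; |X| > c] ≤ E[|X|² ; |X| > c]/c ≤ sup_{X∈Λ} E[|X|²]/c for every X ∈ Λ, and the right hand side tends to 0 as c → +∞.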

1.23 Theorem. Let Λ be a uniformly integrable family of random variables and Θ be the collection of all sub σ-fields of F . Then the family ⋃_{X∈Λ, G∈Θ} E[X|G] is uniformly integrable.

Proof. Let X ∈ Λ, G ∈ Θ and Y ∈ E[X|G]. Since |Y | ∈ E≤[|X| |G] by Theorem 1.13,

P (|Y | > c) ≤ E[|Y |]/c ≤ E[|X|]/c and E[|Y | ; |Y | > c] ≤ E[|X| ; |Y | > c] for all c > 0.

Here we used that {|Y | > c} ∈ G. Invoking Lemma 1.18, we see that

E[|X| ; |Y | > c] ≤ E[|X| ; |X| > √c] + √c P (|Y | > c).

The second term in the right hand side is dominated by E[|X|]/√c. It therefore follows that

E[|Y | ; |Y | > c] ≤ E[|X| ; |Y | > c] ≤ E[|X| ; |X| > √c] + E[|X|]/√c.

The infimum of the right hand side over c ∈ R>0 is 0.

1.24 Lemma. Suppose that X is a random variable and Λ is a family of random variables such that P (|Y | = ∞) = 0 ∀Y ∈ Λ and inf_{Y∈Λ} P (|X − Y | > ε) = 0 for all ε ∈ R>0. Then

E[|X| ; |X| > c] ≤ liminf_{δ↓0} inf_{Y∈Λ: P(|X−Y|>δ)<δ} E[|Y | ; |Y | > c] for all c ∈ R≥0.

Proof. Let c ∈ R≥0, δ ∈ R>0, M ∈ R>0 and Y ∈ Λ. It follows that

E[|X| ; |X| > c + δ, |X| ≤ |Y | + δ] ≤ E[|Y | + δ ; |Y | > c] ≤ E[|Y | ; |Y | > c] + δ

since {|X| > c + δ, |X| ≤ |Y | + δ} ⊂ {|Y | > c, |X| ≤ |Y | + δ}. On the other hand we have that {|X| > c + δ, |X| > |Y | + δ} ⊂ {|X| > |Y | + δ} ⊂ {|X − Y | > δ}. Consequently

E[min{|X|, M} ; |X| > c + δ] ≤ E[|Y | ; |Y | > c] + δ + M P (|X − Y | > δ).

Observe that {Y ∈ Λ : P (|X − Y | > δ) < δ} ≠ ∅ for all δ ∈ R>0. We thus get

E[min{|X|, M} ; |X| > c + δ] ≤ inf_{Y∈Λ: P(|X−Y|>δ)<δ} E[|Y | ; |Y | > c] + (1 + M)δ.

Tending δ to 0 we see by the monotone convergence theorem that

E[min{|X|, M} ; |X| > c] ≤ liminf_{δ↓0} inf_{Y∈Λ: P(|X−Y|>δ)<δ} E[|Y | ; |Y | > c].

The number M being arbitrary, the monotone convergence theorem yields the claim.

1.25 Corollary. Suppose that Λ is a family of random variables. If there exists a uniformly integrable family of random variables Ξ such that inf_{Y∈Ξ} P (|X − Y | > ε) = 0 for all X ∈ Λ and ε ∈ R>0 then Λ is uniformly integrable.

Proof. Lemma 1.24 shows E[|X| ; |X| > c] ≤ sup_{Y∈Ξ} E[|Y | ; |Y | > c] for X ∈ Λ, c ∈ R≥0.

1.26 Lemma. Let Λ1 and Λ2 be families of random variables. If both Λ1 and Λ2 are uniformly integrable then so is {X + Y : X ∈ Λ1, Y ∈ Λ2}.

Proof. Let X ∈ Λ1 and Y ∈ Λ2. Since |X + Y | ≤ 2 max{|X|, |Y |}, we have that

E[|X + Y | ; |X + Y | > c] ≤ E[2 max{|X|, |Y |} ; 2 max{|X|, |Y |} > c].

The right hand side reads E[2|X| ; |X| ≥ |Y |, 2|X| > c] + E[2|Y | ; |X| < |Y |, 2|Y | > c], which is dominated by 2 sup_{X∈Λ1} E[|X| ; |X| > c/2] + 2 sup_{Y∈Λ2} E[|Y | ; |Y | > c/2].

1.27 Corollary. Suppose that Xn is a uniformly integrable sequence of random variables. If Xn converges to a random variable X in probability then E[|Xn − X|] converges to 0.

Proof. Since E[|X| ; |X| > c] ≤ liminf_{n→∞} E[|Xn| ; |Xn| > c] for all c ∈ R≥0 by Lemma 1.24, we see that the sequence Xn − X is also uniformly integrable by Lemma 1.26. Suppose that c, ε ∈ R>0 and ε < c. In general we have that

E[|Y |] ≤ εP (|Y | ≤ ε) + cP (ε < |Y | ≤ c) + E[|Y | ; |Y | > c] ≤ ε + cP (|Y | > ε) + E[|Y | ; |Y | > c]

where Y is a random variable. It therefore follows that

E[|Xn − X|] ≤ ε + cP (|Xn − X| > ε) + sup_{k∈N} E[|Xk − X| ; |Xk − X| > c].

Tending n to ∞ first and ε to 0 next we get

limsup_{n→∞} E[|Xn − X|] ≤ sup_{k∈N} E[|Xk − X| ; |Xk − X| > c].

The number c being arbitrary, we get the claim due to the uniform integrability.
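Corollary 1.27 is easy to observe numerically. The following Python script is a minimal sketch added here (not from the note); it contrasts a uniformly integrable sequence with the classical non-uniformly-integrable counterexample Zn = n 1{U < 1/n}, which converges to 0 in probability but not in L¹.

import numpy as np

# U uniform on (0,1). Xn := U + U/n is dominated by 2U, hence uniformly
# integrable (Example 1.22), and Xn -> U in probability, so E|Xn - U| -> 0.
# Zn := n*1{U < 1/n} -> 0 in probability, but E|Zn| = n*P(U < 1/n) = 1.
rng = np.random.default_rng(1)
U = rng.uniform(size=1_000_000)
for n in (10, 100, 1000):
    Xn = U + U / n
    Zn = n * (U < 1.0 / n)
    print(n, np.abs(Xn - U).mean(), np.abs(Zn).mean())
# E|Xn - U| shrinks like 1/(2n); E|Zn| stays near 1 for every n, so Zn does
# not converge in L1 -- uniform integrability is exactly what fails.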


2 Optional sampling

Let (Ω,F , P ) be a complete probability space. In this section a stochastic process is simply a family of random variables with index set Z≥0. We shall discuss submartingales. To describe this notion we need to introduce a filtration.

F· is a filtration of F with index set Z≥0, i.e., it is a family of sub σ-fields of F with index set Z≥0 such that Fm ⊂ Fn for all m,n ∈ Z≥0 with m < n.

2.1 Example. Let τ, σ be Z≥0 ∪ {+∞}-valued random variables and X· a stochastic process. We discuss the following random variable:

Xτ := Xn on {τ = n}, Xτ := 0 on {τ = +∞},

which coincides with X0 + ∑_{n=0}^{τ−1} (Xn+1 − Xn) on Λ := {τ < +∞, |Xn| < +∞ ∀n ∈ Z≥0}. Suppose that E[|Xn|] < +∞ for all n ∈ Z≥0 and there exists T ∈ Z≥0 such that τ ≤ T a.s. Note that Xτ is integrable. Suppose that σ ≤ τ a.s. Then we see that

Xτ − Xσ =_{a.s.} ∑_{n: σ≤n<τ} (Xn+1 − Xn) 1Λ = ∑_{n=0}^{T−1} (Xn+1 − Xn) 1{σ≤n} 1{τ>n} 1Λ.

Let A ∈ F . If E[Xn|Fn] ⊂ E≤[Xn+1|Fn] for all n ∈ Z≥0, {τ > n} ∈ Fn ∨ Null(P ) for all n ∈ Z≥0 and A ∩ {σ ≤ n} ∈ Fn ∨ Null(P ) for all n ∈ Z≥0 then E[Xτ ;A] ≥ E[Xσ ;A].

2.2 Definition. (i) A mapping σ : Ω → Z≥0 ∪ {+∞} such that {σ = n} ∈ Fn for all n ∈ Z≥0 (equivalently {σ ≤ n} ∈ Fn for all n ∈ Z≥0) is called an F·-stopping time.
(ii) Given an F·-stopping time σ, set

Fσ := {A ∈ F : A ∩ {σ = n} ∈ Fn ∀n ∈ Z≥0} (= {A ∈ F : A ∩ {σ ≤ n} ∈ Fn ∀n ∈ Z≥0}).

We see that Fσ is a sub σ-field of F since (Ac ∩ {σ ≤ n})c = (A ∩ {σ ≤ n}) ∪ {σ > n}. As for a constant stopping time, say σ, we have that Fσ = Fm where σ(Ω) = {m}.

2.3 Definition. (i) A stochastic process X· such that Xn is integrable for all n ∈ Z≥0 and Xn ∈ E≤[Xn+1|Fn] for all n ∈ Z≥0 is called an F·-submartingale.
(ii) A stochastic process X· such that −X· is an F·-submartingale, i.e., Xn is integrable for all n ∈ Z≥0 and Xn ∈ E≥[Xn+1|Fn] for all n ∈ Z≥0, is called an F·-supermartingale.
(iii) A stochastic process X· such that Xn is integrable for all n ∈ Z≥0 and Xn ∈ E[Xn+1|Fn] for all n ∈ Z≥0 is called an F·-martingale.

2.4 Definition. A stochastic process X· is said to be F·-adapted if the random variable Xn is Fn-measurable for each n ∈ Z≥0.

2.5 Remark. All F·-submartingales are F·-adapted due to the condition Xn ∈ E≤[Xn+1|Fn].

2.6 Theorem. Suppose that X· is an F·-submartingale and σ and τ are F· ∨ Null(P )-stopping times such that σ ≤ τ ≤ T a.s. for some T ∈ R>0.
(i) Xσ and Xτ are integrable and Xσ ∈ E≤[Xτ |(F· ∨ Null(P ))σ].
(ii) If σ is an F·-stopping time then Xσ ∈ E≤[Xτ |Fσ].


Proof. According to the discussion in Example 2.1, the random variables Xσ and Xτ are integrable and E[Xτ ;A] ≥ E[Xσ ;A] for all A ∈ (F· ∨ Null(P ))σ; note that Fσ ⊂ (F· ∨ Null(P ))σ. The Fσ-measurability of Xσ for F·-stopping times σ shall be verified in Lemma 2.7.
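As a quick numerical illustration of Theorem 2.6 (a Monte Carlo sketch added here, not part of the note), take the simple symmetric random walk, which is a martingale, and a bounded stopping time; optional sampling applied to X and −X forces E[Xτ] = E[X0].

import numpy as np

# Simple symmetric random walk X_0 = 0, X_n = s_1 + ... + s_n with s_i = +/-1.
# tau := min{ first n with |X_n| >= 3, T } is a bounded stopping time, so
# optional sampling gives E[X_tau] = E[X_0] = 0.
rng = np.random.default_rng(2)
T, n_paths = 20, 200_000
steps = rng.choice([-1, 1], size=(n_paths, T))
X = np.concatenate([np.zeros((n_paths, 1)), steps.cumsum(axis=1)], axis=1)
hit = np.argmax(np.abs(X) >= 3, axis=1)             # first index with |X| >= 3
tau = np.where(np.abs(X).max(axis=1) >= 3, hit, T)  # cap at T if never hit
print(X[np.arange(n_paths), tau].mean())            # ~ 0 up to sampling error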

2.7 Lemma. Let X· be an F·-adapted stochastic process and σ be an F·-stopping time. Then Xσ is Fσ-measurable. In particular σ is Fσ-measurable.

Proof. Let a ∈ R. Then {Xσ < a} ∩ {σ = n} = {Xn < a} ∩ {σ = n} ∈ Fn for all n ∈ Z≥0.

2.8 Lemma. If σ and τ are F·-stopping times then so are σ ∧ τ , σ ∨ τ and σ + τ .

Proof. {σ ∧ τ ≤ n} = {σ ≤ n} ∪ {τ ≤ n}, {σ ∨ τ ≤ n} = {σ ≤ n} ∩ {τ ≤ n} and {σ + τ = n} = ⋃_{k=0}^n {σ = k} ∩ {τ = n − k}.

2.9 Remark. Suppose that σ and τ are F·-stopping times and σ ≤ τ a.s.
(i) The F·-stopping times σ ∧ τ and σ ∨ τ satisfy σ ∧ τ = σ a.s. and σ ∨ τ = τ a.s. respectively.
(ii) Suppose in addition that there exists T ∈ Z≥0 such that τ ≤ T a.s. Then the F·-stopping times σ ∧ τ ∧ T and (σ ∨ τ) ∧ T satisfy σ ∧ τ ∧ T = σ a.s. and (σ ∨ τ) ∧ T = τ a.s. respectively.

2.10 Lemma. Suppose that σ and τ are F·-stopping times.
(i) Fσ∧τ = Fσ ∩ Fτ . If σ ≤ τ then Fσ ⊂ Fτ .
(ii) {τ ≤ σ}, {σ ≤ τ}, {τ < σ}, {σ < τ} and {τ = σ} belong to Fσ∧τ .
(iii) A ∩ {σ ≤ τ} ∈ Fτ and A ∩ {σ < τ} ∈ Fτ for all A ∈ Fσ∨τ . Fσ∨τ = Fσ ∨ Fτ .
(iv) A ∩ {σ ≤ τ} ∈ Fσ∧τ and A ∩ {σ < τ} ∈ Fσ∧τ for all A ∈ Fσ.

Proof. Suppose B ∈ Fσ∧τ . Then it follows that B ∈ Fτ . Indeed

B ∩ {τ ≤ n} = (B ∩ {σ ∧ τ ≤ n}) ∩ ({σ > n} ∪ {σ ≤ n, τ ≤ n}) ∈ Fn for all n ∈ Z≥0.

Similarly we get B ∈ Fσ. Conversely if B ∈ Fσ ∩ Fτ then B ∈ Fσ∧τ since

B ∩ {σ ∧ τ ≤ n} = (B ∩ {σ ≤ n}) ∪ (B ∩ {τ ≤ n}) ∈ Fn for all n ∈ Z≥0.

Let A ∈ Fσ∨τ . We infer that A ∩ {σ ≤ τ} ∈ Fτ by using

(A ∩ {σ ≤ τ}) ∩ {τ = n} = (A ∩ {σ ∨ τ ≤ n}) ∩ {τ = n} ∈ Fn for all n ∈ Z≥0.

Since σ = σ ∨ (σ ∧ τ) and {σ ≤ τ} = {σ ≤ σ ∧ τ}, we get C ∩ {σ ≤ τ} ∈ Fσ∧τ for all C ∈ Fσ. It then follows that {σ ≤ τ} ∈ Fσ∧τ and {σ < τ} = {τ ≤ σ}c ∈ Fσ∧τ . We finally get A ∩ {σ < τ} = (A ∩ {σ ≤ τ}) ∩ {σ < τ} ∈ Fτ because Fσ∧τ ⊂ Fτ .

2.11 Theorem. Suppose X· is an F·-submartingale and σ(1), σ(2) and τ are F· ∨ Null(P )-stopping times such that σ(1) ≤ σ(2) a.s. and σ(2) ∧ τ ≤ T a.s. for some T ∈ R>0.
(i) Xσ(1)∧τ ∈ E≤[Xσ(2)∧τ |(F· ∨ Null(P ))σ(1)].
(ii) If σ(1) and τ are F·-stopping times then Xσ(1)∧τ ∈ E≤[Xσ(2)∧τ |Fσ(1)].

Proof. We write F̄t := Ft ∨ Null(P ). Let A ∈ F̄σ(1). Then A ∩ {σ(1) ≤ τ} ∈ F̄σ(1)∧τ by Lemma 2.10(iv). Therefore, invoking Theorem 2.6, we see that

E[Xσ(1)∧τ ;A ∩ {σ(1) ≤ τ}] ≤ E[Xσ(2)∧τ ;A ∩ {σ(1) ≤ τ}].

Adding E[Xτ ;A ∩ {τ < σ(1)}] we get

E[Xσ(1)∧τ ;A] ≤ E[Xσ(2)∧τ ;A].

On the other hand F̄σ(1)∧τ ⊂ F̄σ(1) according to Lemma 2.10(i). Therefore Xσ(1)∧τ is F̄σ(1)-measurable by Lemma 2.7. Consequently we get Xσ(1)∧τ ∈ E≤[Xσ(2)∧τ |F̄σ(1)].


2.12 Corollary. If X· is an F·-submartingale and τ is an F·-stopping time then the stopped process X·∧τ is an F·-submartingale.
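Corollary 2.12 can likewise be checked by simulation (again an added sketch, not part of the note): for a submartingale X and a stopping time τ, the means E[X_{n∧τ}] must be non-decreasing in n.

import numpy as np

# X is a random walk with upward drift (a submartingale); tau is the first
# time X exceeds 2. The stopped process freezes each path at X_tau, and its
# mean n |-> E[X_{n and tau}] stays non-decreasing, as Corollary 2.12 demands.
rng = np.random.default_rng(3)
K, n_paths = 60, 100_000
X = (rng.choice([-1.0, 1.0], size=(n_paths, K)) + 0.1).cumsum(axis=1)
tau = np.where((X > 2).any(axis=1), np.argmax(X > 2, axis=1), K - 1)
frozen = X[np.arange(n_paths), tau]
stopped = np.where(np.arange(K)[None, :] <= tau[:, None], X, frozen[:, None])
means = stopped.mean(axis=0)
print(np.all(np.diff(means) >= -0.01))  # True: non-decreasing up to MC noise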

2.13 Lemma. Let X· be an F·-submartingale with index set Z≤0. If inf E[X·] > −∞ then X· is uniformly integrable.

Proof. Suppose that λ ∈ R>0, k,m ∈ Z≤0 and k < m. We see that

E[|Xk| ; |Xk| ≥ λ] = E[Xk ;Xk ≥ λ] + E[Xk ;Xk > −λ] − E[Xk].

Due to the submartingale property, the first term and the second term in the right hand side are dominated by E[Xm ;Xk ≥ λ] and E[Xm ;Xk > −λ] respectively. Therefore

E[|Xk| ; |Xk| ≥ λ] ≤ E[Xm ;Xk ≥ λ] − E[Xm ;Xk ≤ −λ] + E[Xm] − E[Xk]
≤ E[|Xm| ; |Xk| ≥ λ] + E[Xm] − inf E[X·].

Since n ↦ E[Xn] is non-decreasing, it follows that lim_{n→−∞} E[Xn] = inf E[X·] > −∞. Given ε > 0, there exists m ∈ Z≤0 such that

E[Xm] − inf E[X·] < ε.

By Corollary 1.19(i) we can choose δ > 0 such that

if A ∈ F and P (A) < δ then E[|Xk| ;A] < ε for all k ∈ Z[m,0].

Using Theorem 1.13 we observe that

E[|Xk|] = 2E[max{Xk, 0}] − E[Xk] ≤ 2E[max{X0, 0}] − inf E[X·].

This implies that there exists λ ∈ R>0 such that

P (|Xk| ≥ λ) < δ for all k ∈ Z≤0.

It follows that

E[|Xk| ; |Xk| ≥ λ] < ε for all k ∈ Z[m,0] and E[|Xm| ; |Xk| ≥ λ] < ε for all k ∈ Z<m.

Thus we get E[|Xk| ; |Xk| ≥ λ] < 2ε for all k ∈ Z≤0.

We call N an exceptional family if ∅ ∈ N and {A : A ⊂ B} ⊂ N for all B ∈ N . An exceptional family N is said to be σ-consistent if ⋃_n A(n) ∈ N for all A ∈ Seq(N ).

Let N be a σ-consistent exceptional family of Sbset(Ω). For example N = Null(P ).

We denote the symmetric difference of two sets A and B by A △ B.

2.14 Lemma. If σ is an F·-stopping time then (F· ∨N )σ = Fσ ∨N .

Proof. Clearly we have that Fσ ⊂ (F· ∨ N )σ. On the other hand Sbset(A) ⊂ N for A ∈ N , which implies that N ⊂ (F· ∨ N )σ. Conversely suppose that A ∈ (F· ∨ N )σ. There exists a sequence of sets B· such that Bn ∈ Fn and (A ∩ {σ = n}) △ Bn ∈ N for all n ∈ Z≥0. We see that Bn ∩ {σ = n} ∈ Fn. It therefore follows that

B := (A ∩ {σ = +∞}) ∪ ⋃_{n=0}^∞ (Bn ∩ {σ = n}) ∈ Fσ.

Since A △ B ⊂ ⋃_{n=0}^∞ ((A ∩ {σ = n}) △ Bn) ∈ N , we get A ∈ Fσ ∨ N .


2.15 Lemma. Suppose σ and τ are F· ∨ N -stopping times.
(i) If {τ < σ} ∈ N then (F· ∨ N )σ ⊂ (F· ∨ N )τ . If {τ ≠ σ} ∈ N then (F· ∨ N )σ = (F· ∨ N )τ .
(ii) There exists an F·-stopping time σ̄ such that {σ̄ ≠ σ} ∈ N .

Proof. (i) We see by Lemma 2.10 that A ∩ {σ ≤ τ} ∈ (F· ∨ N )τ for all A ∈ (F· ∨ N )σ.
(ii) For each n ∈ Z≥0 there exists An ∈ Fn such that

{σ = n} △ An ∈ N .

We set N := {σ ∉ Z≥0 ∪ {+∞}} ∪ ⋃_{n=0}^∞ ({σ = n} △ An). Then we have that

An ⊂ {σ = n} ∪ N, {σ = n} ⊂ An ∪ N for all n ∈ Z≥0 and N ∈ N .

It follows that

An ∩ (⋃_{k=0}^{n−1} Ak) ⊂ N and (⋃_{n=0}^∞ An)c △ {σ = +∞} ⊂ N .

Define σ̄(ω) := inf{n ∈ Z≥0 : ω ∈ ⋃_{k=0}^n Ak}. Then, since

{σ̄ = n} = An ∩ (⋃_{k=0}^{n−1} Ak)c and {σ̄ = +∞} = (⋃_{n=0}^∞ An)c,

we see that {σ̄ = n} ∈ Fn for all n ∈ Z≥0 and {σ̄ ≠ σ} ∈ N .

2.16 Remark. Let σ̄ be as above. Suppose in addition that there exists T ∈ Z≥0 such that σ ≤ T a.s. Then the F·-stopping time σ̄ ∧ T satisfies σ̄ ∧ T = σ a.s.

We slightly extend the class of submartingales.

2.17 Definition. A stochastic process X· such that Xn is lower semi-integrable for all n ∈ Z≥0 and Xn ∈ E≤[Xn+1|Fn] for all n ∈ Z≥0 is called an F·-submartingale.

2.18 Lemma. Suppose that X· is an F·-submartingale and f : R ∪ {+∞} → R ∪ {+∞} is a non-decreasing convex function. Then f(X·) is an F·-submartingale.

Proof. Let Y ∈ E[Xn+1|Fn]. Then it follows by Theorem 1.13 that f(Xn+1) is lower semi-integrable and f(Y ) ∈ E≤[f(Xn+1)|Fn]. Since Xn ≤ Y a.s. and the function f is non-decreasing, we have that f(Xn) ≤ f(Y ) a.s. Thus f(Xn) ∈ E≤[f(Xn+1)|Fn].

2.19 Remark. If X· is an F·-martingale then f(X·) is an F·-submartingale without f being non-decreasing.

2.20 Lemma. Let X· be an F·-submartingale. If σ and τ are bounded F·-stopping times such that P ({τ = σ} ∪ {τ = σ + 1}) = 1 then Xτ is lower semi-integrable and Xσ ∈ E≤[Xτ |Fσ].

Proof. There exists T ∈ Z≥0 such that σ ∨ τ ≤ T . Let A ∈ Fσ. Then

E[Xτ ;A] = ∑_{n=0}^T E[Xτ ;A, σ = n],

where A ∩ {σ = n} ∈ Fn for each n. Observe that {σ = n, τ ≠ n} = {σ = n, τ = n + 1} up to Null(P ) and the left hand side belongs to Fn. It then follows that

E[Xτ ;A, σ = n, τ ≠ n] = E[Xn+1 ;A, σ = n, τ ≠ n] ≥ E[Xn ;A, σ = n, τ ≠ n].

The right hand side reads E[Xσ ;A, σ = n, τ ≠ n]. We thus get E[Xτ ;A] ≥ E[Xσ ;A] for all A ∈ Fσ by adding E[Xτ ;A, σ = n, τ = n] = E[Xσ ;A, σ = n, τ = n]. Since Xσ is Fσ-measurable by Lemma 2.7, we reach the statement.


2.21 Corollary. Let X· be an F·-submartingale. If σ and τ are bounded F·-stopping times with σ ≤ τ then Xτ is lower semi-integrable and Xσ ∈ E≤[Xτ |Fσ].

Proof. By applying Lemma 2.20 to the pair of stopping times τ ∧ (σ + n) and τ ∧ (σ + n + 1) we get Xτ∧(σ+n) ∈ E≤[Xτ∧(σ+n+1)|Fτ∧(σ+n)] for all n ∈ Z≥0. There exists T ∈ Z≥0 such that τ ≤ T . Since τ ∧ σ = σ and τ ∧ (σ + T ) = τ , we get the conclusion by Theorem 1.17.

3 Optional sampling–continuous time

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. We switch to submartingales with continuous parameter, so the index set is R≥0.

3.1 Lemma. Given A ∈ Sbset(E) define τ : Map(R≥0, E) → R≥0 ∪ {+∞} as follows: τ(ω) := inf{s ∈ R>0 : ω(s) ∈ A}, the hitting time of the path ω to the set A. Let t ∈ R>0.
(i) {ω : τ(ω) < t} = ⋃_{0<s<t} {ω : ω(s) ∈ A} = projΩ {(s, ω) : 0 < s < t, ω(s) ∈ A}.
(ii) A open ⇒ {ω ∈ RC(R≥0, E) : τ(ω) < t} = ⋃_{s∈Q: 0<s<t} {ω ∈ RC(R≥0, E) : ω(s) ∈ A}.

3.2 Definition. A mapping σ : Ω → R≥0 ∪ {+∞} such that {σ < t} ∈ Ft for all t ∈ R≥0 is called an F·-optional time. Given an F·-optional time σ, set

Fσ+ := {A ∈ F : A ∩ {σ < t} ∈ Ft ∀t ∈ R≥0}.

3.3 Remark. If σ is a constant optional time then Fσ+ = ⋂{Ft : t ∈ R>s} where σ(Ω) = {s}. Thus we shall write Fs+ := ⋂{Ft : t ∈ R>s} where there is no fear of confusion. Note that s ↦ Fs+ is a right continuous filtration of sub σ-fields with index set R≥0.

3.4 Example. Let X· be an F·-adapted process and D be an open set. If every sample path (almost every sample path) is right continuous then inf{s ∈ R>0 : Xs ∈ D} as well as inf{s ∈ R≥0 : Xs ∈ D}, called the entry time of the sample path t ↦ Xt to the set D, are F·-optional times (F· ∨ Null(P )-optional times) by Lemma 3.1(ii).

3.5 Definition. A mapping σ : Ω → R≥0 ∪ {+∞} such that {σ ≤ t} ∈ Ft for all t ∈ R≥0 is called an F·-stopping time. Given an F·-stopping time σ, set

Fσ := {A ∈ F : A ∩ {σ ≤ t} ∈ Ft ∀t ∈ R≥0}.

3.6 Lemma. (i) Any F·-stopping time is an F·-optional time. Both Fσ and Fσ+ are defined for an F·-stopping time σ, and Fσ ⊂ Fσ+.
(ii) σ is an F·-optional time if and only if it is an F·+-stopping time. If the filtration F· is right continuous then any F·-optional time σ is an F·-stopping time and Fσ = Fσ+ holds.
(iii) The notions F·-optional time, F·+-stopping time and F·+-optional time are equivalent. Moreover Fσ+ = (F·+)σ = (F·+)σ+ for an F·-optional time σ.

We shall later show that the hitting time of a continuous process to a closed set is a stopping time under some mild assumption on the state space, see Theorem 4.4.

3.7 Lemma. Let Xn be a uniformly integrable sequence of random variables.
(i) liminf_{n→∞} |Xn| is integrable and limsup_{n→∞} E[|Xn|] ≤ E[limsup_{n→∞} |Xn|].
(ii) If liminf_{n→∞} Xn = limsup_{n→∞} Xn a.s. then liminf_{n→∞} Xn is integrable and X· converges to liminf_{n→∞} Xn in L¹-sense.


Proof. (i) Invoking Fatou's lemma, we get E[liminf_{n→∞} |Xn|] ≤ liminf_{n→∞} E[|Xn|]. Therefore liminf_{n→∞} |Xn| is integrable. On the other hand

E[|Xk|] ≤ E[min{|Xk|, c}] + E[|Xk| ; |Xk| > c] for all c > 0.

Since limsup_{k→∞} min{|Xk|, c} = min{limsup_{k→∞} |Xk|, c}, we see by Fatou's lemma that

limsup_{k→∞} E[|Xk|] ≤ E[min{limsup_{k→∞} |Xk|, c}] + sup_{n∈N} E[|Xn| ; |Xn| > c].

Tending c → +∞, we get (i).

(ii) It follows that |liminf_{n→∞} Xn| = liminf_{n→∞} |Xn| a.s. Hence Y := liminf_{n→∞} Xn is integrable by (i). Since −∞ < Y < +∞ a.s., we have that limsup_{n→∞} |Xn − Y | = 0 a.s. The sequence Xn − Y is uniformly integrable by Lemma 1.26. Applying (i) we infer that limsup_{n→∞} E[|Xn − Y |] ≤ E[limsup_{n→∞} |Xn − Y |] = 0.

3.8 Lemma. (i) Suppose that X· is an F·-adapted process. If almost every sample path (every sample path) is right continuous and σ is an F· ∨ Null(P )-optional time (F·-optional time) then Xσ 1{σ<+∞} is (F· ∨ Null(P ))σ+-measurable (Fσ+-measurable).
(ii) Suppose that X· is an F·-submartingale. If almost every sample path is right continuous and σ is a bounded F· ∨ Null(P )-optional time then Xσ is integrable.

Proof. (i) To save space we write F̄· for the filtration F· ∨ Null(P ). Let n ∈ Z≤0. Denote by F̄^n_· the filtration q ↦ F̄q with index set 2^n Z≥0. We set

σ(n) := min{q ∈ 2^n Z≥0 : q > σ}.

Then σ(n) is an F̄^n_·-stopping time since σ(n) ∈ 2^n Z≥0 ∪ {+∞} and

{σ(n) = q} = {σ < q} ∩ {σ ≥ q − 2^n} ∈ F̄q for all q ∈ 2^n Z≥0.

Since σ(n) ↓ σ as n tends to −∞ and X· is right continuous a.s., it follows that

liminf_{n→−∞} Xσ(n) =_{a.s.} Xσ and limsup_{n→−∞} Xσ(n) =_{a.s.} Xσ on {σ < +∞}.

The random variable Xσ(n) 1{σ(n)<+∞} is F̄^n_{σ(n)}-measurable by Lemma 2.7. We have that

(⋆) {σ(n) ≤ t} = ⋃{{σ(n) ≤ q} : q ∈ 2^n Z≥0, q ≤ t} = ⋃{{σ < q} : q ∈ 2^n Z≥0, q ≤ t}

for all t ∈ R≥0. This implies that σ(n) is an F̄·-stopping time and F̄σ(n) = F̄^n_{σ(n)}. In particular Xσ(n) 1{σ(n)<+∞} is F̄σ(n)-measurable. We show that

F̄σ+ = ⋂_{n∈Z≤0} F̄σ(n).

Indeed let A ∈ F̄σ+. The relation (⋆) implies that

A ∩ {σ(n) ≤ q} = A ∩ {σ < q} ∈ F̄q for q ∈ 2^n Z≥0,

which means A ∈ F̄^n_{σ(n)} = F̄σ(n). Conversely, since σ = inf_{n∈Z≤0} σ(n), it follows that

A ∩ {σ < t} = ⋃_{n∈Z≤0} (A ∩ {σ(n) < t}) ∈ F̄t for all A ∈ ⋂_{n∈Z≤0} F̄σ(n) and t ∈ R>0.

On the other hand, taking σ(n − 1) ≤ σ(n) into account, we see that

F̄σ(n−1) ⊂ F̄σ(n) for all n ∈ Z≤0.

Consequently liminf_{n→−∞} Xσ(n) 1{σ(n)<+∞} is F̄σ+-measurable and so is Xσ 1{σ<+∞} because {σ(n) < +∞} = {σ < +∞} for all n ∈ Z≤0 and Null(P ) ⊂ F̄σ+.

(ii) We shall discuss the uniform integrability of Xσ(·). Observe that q ↦ Xq with index set 2^{n−1} Z≥0 is an F̄^{n−1}_·-submartingale. On the other hand σ(n) is an F̄^{n−1}_·-stopping time as well because {σ(n) = q} = ∅ if q ∉ 2^n Z≥0. Moreover σ(n) is bounded and

σ(n − 1) = σ(n) or σ(n − 1) = σ(n) − 2^{n−1}.

Therefore we see by Theorem 2.6 that both Xσ(n) and Xσ(n−1) are integrable and

Xσ(n−1) ∈ E≤[Xσ(n)|F̄^{n−1}_{σ(n−1)}] and X0 ∈ E≤[Xσ(n)|F̄^{n−1}_0].

The latter implies that E[Xσ(n)] ≥ E[X0] > −∞ for all n ∈ Z≤0. Thus we get an F̄σ(·)-submartingale Xσ(·) with index set Z≤0. Since inf E[Xσ(·)] > −∞, it follows by Lemma 2.13 that Xσ(·) is uniformly integrable. We infer by Lemma 3.7 that liminf_{n→−∞} Xσ(n) is integrable and Xσ(·) converges to liminf_{n→−∞} Xσ(n) in L¹-sense.
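The dyadic discretization σ(n) used in this proof is easy to visualize; the following Python fragment is a small sketch added here, not part of the note.

import numpy as np

# sigma(n) := min{q in 2^n Z>=0 : q > sigma} is the strict round-up of sigma
# to the dyadic grid; the gap sigma(n) - sigma is positive and at most 2^n,
# so sigma(n) decreases to sigma as n -> -infinity.
rng = np.random.default_rng(4)
sigma = rng.exponential(size=5)
for n in (0, -1, -2, -5, -10):
    h = 2.0 ** n
    sigma_n = (np.floor(sigma / h) + 1) * h   # smallest grid point > sigma
    print(n, sigma_n - sigma)                 # entrywise in (0, 2**n]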

3.9 Theorem. Suppose that X· is an F·-submartingale which is right continuous a.s. and σ and τ are F· ∨ Null(P )-optional times such that σ ≤ τ ≤ T a.s. for some T ∈ R>0.
(i) Xτ is F-measurable and integrable and Xσ ∈ E≤[Xτ |(F· ∨ Null(P ))σ+].
(ii) If every sample path is right continuous and σ is an F·-optional time (respectively F·-stopping time) then Xσ 1{σ<+∞} ∈ E≤[Xτ |Fσ+] (Xσ 1{σ<+∞} ∈ E≤[Xτ |Fσ]).

Proof. To save space we write F̄· for the filtration F· ∨ Null(P ). Let n ∈ Z≤0. Denote by F̄^n_· the filtration q ↦ F̄q with index set 2^n Z≥0. We set

σ(n) := min{q ∈ 2^n Z≥0 : q > σ}, τ(n) := min{q ∈ 2^n Z≥0 : q > τ}.

As we have seen in the proof of Lemma 3.8, both σ(n) and τ(n) are bounded F̄^n_·-stopping times. Since q ↦ Xq with index set 2^n Z≥0 is an F̄^n_·-submartingale and σ(n) ≤ τ(n) a.s., we see by Theorem 2.6 that both Xσ(n) and Xτ(n) are integrable and

Xσ(n) ∈ E≤[Xτ(n)|F̄^n_{σ(n)}].

We verified in the proof of Lemma 3.8 that F̄σ+ ⊂ F̄^n_{σ(n)}. It therefore follows that

E[Xσ(n) ;A] ≤ E[Xτ(n) ;A] for all A ∈ F̄σ+.

We also verified in the proof of Lemma 3.8 that Xσ(·) converges to Xσ in L¹-sense. The same claim holds for the pair Xτ(·) and Xτ. Tending n to −∞, we obtain that

E[Xσ ;A] ≤ E[Xτ ;A] for all A ∈ F̄σ+.

According to Lemma 3.8, Xσ is F̄σ+-measurable. Thus we get (i). Finally Lemma 3.12 and Lemma 3.13 show the parenthetical part of (ii).

3.10 Remark. Suppose that σ and τ are F·-stopping times and σ ≤ τ . Given A ∈ Fσ, the function S such that S = σ on A and S = τ on Ac is an F·-stopping time. Clearly S ≤ τ . Let X· be an F·-adapted process with a.s. right continuous sample paths. Then E[XS] ≤ E[Xτ ] implies E[Xσ ;A] ≤ E[Xτ ;A] provided the integrals make sense.


In what follows we are concerned with the optional stopping theorem.

3.11 Definition. A stochastic process X· with state space E is said to be F·-progressively measurable if R[0,t] × Ω → E, (s, ω) ↦ Xs(ω) is Borel(R[0,t]) ⊗ Ft-measurable for each t ≥ 0.

3.12 Lemma. Let X· be an F·-adapted process. If every sample path is right continuous (or every sample path is left continuous) then X· is F·-progressively measurable.

Proof. Suppose that f : I × Ω → R where I is an interval of R, and G is a sub σ-field of F . If f(·, ω) is right continuous for every ω ∈ Ω (or left continuous for every ω ∈ Ω) and f(t, ·) is G-measurable for every t ∈ I then f is Borel(I) ⊗ G-measurable.

3.13 Lemma. Let X· be an F·-progressively measurable process and τ an F·-optional time.
(i) The stopped process X·∧τ is F· ∩ Fτ+-progressively measurable. If τ is an F·-stopping time then X·∧τ is F· ∩ Fτ -progressively measurable.
(ii) 1{τ<+∞}Xτ is Fτ+-measurable, while if τ is an F·-stopping time then it is Fτ -measurable.
(iii) If σ is an F·-stopping (optional) time then 1{σ<+∞}Xσ∧τ is Fσ-(Fσ+-)measurable.

Proof. Let t ∈ R>0. Since {t ∧ τ ≤ a} = {τ ≤ a} for all a ∈ R[0,t), the random variable t ∧ τ is Ft-measurable. It is Fτ+-measurable as well, for {τ ≤ a} ∩ {τ < s} reads {τ < s} if s ≤ a and {τ ≤ a} if s > a. It follows that the mapping R[0,t] × Ω → R[0,t], (s, ω) ↦ min{s, t ∧ τ(ω)} is Borel(R[0,t]) ⊗ (Ft ∩ Fτ+)-measurable. Consequently the self mapping

R[0,t] × Ω → R[0,t] × Ω, (s, ω) ↦ (s ∧ τ(ω), ω)

is measurable with respect to Borel(R[0,t]) ⊗ (Ft ∩ Fτ+). We thus infer that the mapping

R[0,t] × Ω → E, (s, ω) ↦ Xs∧τ(ω)(ω)

is Borel(R[0,t]) ⊗ (Ft ∩ Fτ+)-measurable. In particular Xt∧τ is (Ft ∩ Fτ+)-measurable.

3.14 Corollary. Suppose that X· is an F·-adapted process whose almost every sample path is right continuous and τ is an F· ∨ Null(P )-optional time. If σ is an F· ∨ Null(P )-stopping (optional) time then 1{σ<+∞}Xσ∧τ is (F· ∨ Null(P ))σ-measurable ((F· ∨ Null(P ))σ+-measurable).

Proof. There exists Ω0 ∈ F such that P (Ω0) = 1 and X·(ω) is right continuous for all ω ∈ Ω0. Being F· ∨ Null(P )-adapted and right continuous, the modification 1Ω0X· is F· ∨ Null(P )-progressively measurable. (Nevertheless X· itself may fail to be F· ∨ Null(P )-progressively measurable.) Since Null(P ) ⊂ (F· ∨ Null(P ))σ, we get the claim by Lemma 3.13.

3.15 Lemma. Suppose that σ and τ are F·-stopping times.
(i) σ ∧ τ and σ ∨ τ are F·-stopping times.
(ii) Fσ∧τ = Fσ ∩ Fτ . If σ ≤ τ then Fσ ⊂ Fτ .
(iii) {τ ≤ σ}, {σ ≤ τ}, {τ < σ}, {σ < τ} and {τ = σ} belong to Fσ∧τ .
(iv) A ∩ {σ ≤ τ} ∈ Fτ and A ∩ {σ < τ} ∈ Fτ for all A ∈ Fσ∨τ . Fσ∨τ = Fσ ∨ Fτ .
(v) A ∩ {σ ≤ τ} ∈ Fσ∧τ and A ∩ {σ < τ} ∈ Fσ∧τ for all A ∈ Fσ.

Proof. The random variable τ ∧ t is Ft-measurable for all t ∈ R≥0. It then follows that A ∩ {σ ≤ τ} ∈ Fτ for all A ∈ Fσ∨τ . Indeed

(A ∩ {σ ≤ τ}) ∩ {τ ≤ t} = (A ∩ {σ ∨ τ ≤ t}) ∩ {σ ∧ t ≤ τ ∧ t} ∈ Ft for all t ∈ R≥0.

The rest of the discussion is the same as in the proof of Lemma 2.10.


3.16 Remark. (i) Suppose that σ and τ are F·-optional times, equivalently F·+-stopping times. Recall that (F·+)σ = Fσ+ etc. Thus Lemma 3.15 reads as follows: σ ∧ τ and σ ∨ τ are F·-optional times. F(σ∧τ)+ = Fσ+ ∩ Fτ+. If σ ≤ τ then Fσ+ ⊂ Fτ+. And so on.
(ii) If σ is an F·-optional time and τ is an F·-stopping time then A ∩ {σ < τ} ∈ Fτ ∀A ∈ Fσ+. Indeed (A ∩ {σ < τ}) ∩ {τ ≤ t} = (A ∩ {σ < t}) ∩ {τ ≤ t} ∩ {σ ∧ t < τ ∧ t} ∈ Ft ∀t ≥ 0.

3.17 Corollary. Suppose that X· is an F·-submartingale such that almost every sample path (every sample path) is right continuous, and σ(1), σ(2) and τ are F· ∨ Null(P )-optional times (F·-optional times) such that σ(1) ≤ σ(2) a.s. and σ(2) ∧ τ ≤ T a.s. for some T ∈ R>0.
(i) Xσ(1)∧τ ∈ E≤[Xσ(2)∧τ |(F· ∨ Null(P ))σ(1)+] (1{σ(1)∧τ<+∞}Xσ(1)∧τ ∈ E≤[Xσ(2)∧τ |Fσ(1)+]).
(ii) The stopped process X·∧τ is an F· ∨ Null(P )-submartingale (F·-submartingale).
(iii) If every sample path is right continuous, σ(1) is an F·-stopping time with σ(1) < +∞ a.s. and τ is an F·-optional time then 1{σ(1)<+∞}Xσ(1)∧τ ∈ E≤[Xσ(2)∧τ |Fσ(1)].

Proof. (i) Invoking Corollary 3.14 and Lemma 3.15(ii), we see that 1{σ(1)∧τ<+∞}Xσ(1)∧τ is (F· ∨ Null(P ))σ(1)+-measurable. On the other hand A ∩ {σ(1) ≤ τ} ∈ (F· ∨ Null(P ))(σ(1)∧τ)+ for all A ∈ (F· ∨ Null(P ))σ(1)+ by Lemma 3.15(v). The rest of the discussion is the same as in the proof of Theorem 2.11. (ii) X·∧τ is F· ∨ Null(P )-adapted by Corollary 3.14.

3.18 Question. Suppose that X· is a measurable process and σ is a σX·≤·-stopping time. When is X· σX·≤·-progressively measurable? Does the relation σX·∧σ≤t = σX·≤t∧σ hold?

4 More on stopping times

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. We discuss hitting times of continuous processes to closed subsets. So the index set is R≥0.

4.1 Lemma. Suppose that E is a topological space, Dn is a sequence of open subsets with D̄n+1 ⊂ Dn for all n ∈ N, F is a closed subset and ⋂_{n∈N} Dn = F . Given ω ∈ C(R≥0 → E) define σn := inf{s ∈ R≥0 : ω(s) ∈ Dn}. Then sup_{n∈N} σn = inf{s ∈ R≥0 : ω(s) ∈ F}.

Proof. Since F ⊂ Dn, we have that σn ≤ τ := inf{s ∈ R≥0 : ω(s) ∈ F}. In particular

sup_{n∈N} σn ≤ τ .

If sup_{n∈N} σn = +∞ then it follows that τ = +∞. Thus we assume that sup_{n∈N} σn < +∞ in addition. Since ω is right continuous,

ω(σn) ∈ D̄n ⊂ D̄m+1 for all n ∈ Z>m.

(Note that σn is an optional time.) The (quasi-) left continuity implies that

ω(sup_{n∈N} σn) ∈ D̄m+1 ⊂ Dm for all m ∈ N and hence ω(sup_{n∈N} σn) ∈ F .

The latter means sup_{n∈N} σn ≥ τ . Thus we get sup_{n∈N} σn = τ .

4.2 Lemma. Suppose that E is a topological space, D is an open subset, F is a closed subset and F ⊂ D. Given ω ∈ C(R≥0 → E) define

S := inf{s ∈ R≥0 : ω(s) ∈ D} and T := inf{s ∈ R≥0 : ω(s) ∈ F}.

If S = +∞ then T = +∞. If T = 0 then S = 0. If S < +∞ and T > 0 then S < T .


Proof. Since F ⊂ D, we have that S ≤ T . Hence the first two statements are obvious. Suppose that S < +∞ and T > 0. If T = +∞ or S = 0 then it follows that S < T . Thus we assume that T < +∞ and S > 0 in addition. We see that

ω(T ) ∈ F and ω(S) ∈ Dc.

The former is because ω is right continuous and F is closed while the latter is because ω is left continuous and Dc is closed. On the other hand F ∩ Dc = ∅. Therefore we get T ≠ S.

4.3 Lemma. Suppose that σn ∈ Map(Ω,R≥0 ∪ {+∞}) for n ∈ N. If σk < sup_{n∈N} σn on {σk < +∞, sup_{n∈N} σn > 0} for all k ∈ N then {sup_{n∈N} σn ≤ t} = ⋂_{n∈N} {σn < t} for t ∈ R>0.

4.4 Theorem. Suppose that E is metrizable, X· is an F·-adapted process and F is a closed set. If every sample path is continuous then inf{s ∈ R≥0 : Xs ∈ F} is an F·-stopping time.

Proof. Example 3.4, Lemma 4.1, Lemma 4.2 and Lemma 4.3.

Since inf{s ∈ R≥1/n : Xs ∈ F} ↓ inf{s ∈ R>0 : Xs ∈ F} as n tends to ∞, the hitting time inf{s ∈ R>0 : Xs ∈ F} is an F·-optional time.
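The approximation scheme of Lemma 4.1 can be tried on a sampled path; the following Python fragment is a numerical sketch added here (not from the note; discretization stands in for genuine continuity): with F = {1} and the open sets Dn = {x : |x − 1| < 1/n}, the entry times of the Dn increase toward the hitting time of F.

import numpy as np

# A discretized Brownian-like path started at 0; F = {1} is closed and
# D_n = {x : |x - 1| < 1/n} are open sets shrinking to F.
rng = np.random.default_rng(5)
dt, n_steps = 1e-4, 200_000
W = np.concatenate([[0.0], rng.normal(0.0, np.sqrt(dt), n_steps).cumsum()])
dist = np.abs(W - 1.0)                        # distance of W_t to F
for n in (1, 2, 10, 100):
    reached = dist < 1.0 / n
    entry = reached.argmax() * dt if reached.any() else np.inf
    print(n, entry)                           # entry times increase with n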

4.5 Lemma. Suppose that E is metrizable, ρ is a compatible metric on E, ω : R≥0 → E is right continuous and admits left-hand limits everywhere, and F is a closed set. Set τ(n) := inf{t ∈ R≥0 : ρ(ω(t), F ) < 1/n} for n ∈ N. Then τ(n) ≤ τ(n + 1) and

sup_{n∈N} τ(n) = min{ inf{t ∈ R≥0 : ω(t) ∈ F}, inf{t ∈ R>0 : ω(t−) ∈ F} }.

Proof. We may assume that F is not void. We denote the right hand side by ζ. Let n ∈ N. Clearly τ(n) ≤ τ(n + 1). If t ∈ R>0 and ω(t−) ∈ F then ρ(ω(t−), F ) < 1/n and, x ↦ ρ(x, F ) being continuous, there exists s ∈ R[0,t) such that ρ(ω(s), F ) < 1/n. This implies τ(n) ≤ ζ for all n ∈ N. Suppose that t ∈ R, δ ∈ R>0 and sup_{n∈N} τ(n) < t − δ. Then

{s ∈ Q>0 : s < t − δ, ρ(ω(s), F ) < 1/n} ≠ ∅ for each n ∈ N

due to the right continuity of ω. Therefore we can find a sequence s(·) such that s(n) < t − δ and ρ(ω(s(n)), F ) < 1/n for all n ∈ N (the countability of Q>0 spares us the axiom of choice). Recall that ω is right continuous and admits left-hand limits everywhere. On the other hand ρ(·, F ) is continuous and, F being closed, {x ∈ E : ρ(x, F ) = 0} = F . If s ∈ clust s(·) then, since s ≤ t − δ and either ρ(ω(s), F ) ≤ 0 or ρ(ω(s−), F ) ≤ 0 holds, ζ ≤ s ≤ t − δ < t. The sequence s(·) being bounded, we have that clust s(·) ≠ ∅, which implies that ζ < t. This holds whenever sup_{n∈N} τ(n) < t.

4.6 Lemma. Suppose that σ and τ are F·-optional times.
(i) {τ > 0, σ > 0, τ + σ ≤ t} ∈ Ft for all t ∈ R≥0.
(ii) If σ is an F·-stopping time then {σ > 0, τ + σ ≤ t} ∈ Ft for all t ∈ R≥0.
(iii) τ + s is an F·-stopping time for all s ∈ R>0.
(iv) τ + σ is an F·-optional time. If σ and τ are F·-stopping times then so is τ + σ.

Proof. If 0 < τ ≤ t < τ + σ then max{t − σ, 0} < min{τ, t}. It follows that

{0 < τ < t, τ + σ > t} ⊂ ⋃_{s∈Q: 0<s<t} ({0 < τ < t} ∩ {τ > s, σ > t − s}).

The converse inclusion clearly holds. Consequently

{0 < τ < t, τ + σ ≤ t} = {0 < τ < t} ∩ ⋂_{s∈Q: 0<s<t} ({τ ≤ s} ∪ {σ ≤ t − s}) for all t ∈ R≥0.

We have that {τ > 0, σ > 0, τ + σ ≤ t} = {0 < τ < t, τ + σ ≤ t} ∩ {σ > 0}. Since the right hand side is Ft-measurable when t ∈ R>0, we get (i). On the other hand {σ > 0, τ + σ ≤ t} equals {τ > 0, σ > 0, τ + σ ≤ t} ∪ {τ = 0, 0 < σ ≤ t}, which is Ft-measurable by (i) provided t ∈ R>0 and σ is an F·-stopping time. This shows (ii). Finally {τ + σ ≤ t} coincides with {σ > 0, τ + σ ≤ t} ∪ {σ = 0, τ ≤ t}, which is Ft-measurable by (ii) provided both σ and τ are F·-stopping times. Thus we reach the second half of (iv). Since an F·-optional time is an F·+-stopping time and vice versa, we obtain the first half of (iv).

Let N be a σ-consistent exceptional family of Sbset(Ω). For example N = Null(P ).

4.7 Lemma. inf{r 1[0,r)(t) : r ∈ Q>0} = max{t, 0} for all t ∈ R.

4.8 Lemma. (i) If σ is an F·-optional time then (F· ∨ N )σ+ = Fσ+ ∨ N .
(ii) If N ⊂ F0, σ and τ are F·-stopping times and {τ < σ} ∈ N then Fσ ⊂ Fτ .
(iii) Every F· ∨ N -optional time τ has an F·-optional time modification τ̄.

Proof. (i) Clearly we have that Fσ+ ⊂ (F· ∨ N )σ+. On the other hand Sbset(A) ⊂ N for A ∈ N , which implies that N ⊂ (F· ∨ N )σ+. Conversely suppose that A ∈ (F· ∨ N )σ+. There exists a mapping B0 : Q>0 → F such that

B0(q) ∈ Fq and (A ∩ {σ < q}) △ B0(q) ∈ N for all q ∈ Q>0.

Since B0(r) ∩ (A ∩ {σ < q})c ⊂ B0(r) ∩ (A ∩ {σ < r})c ∈ N for r ∈ Q(0,q], it follows that

(A ∩ {σ < q}) △ B1(q) ∈ N for all q ∈ Q>0

where B1(q) := ⋃_{r∈Q: 0<r≤q} B0(r) ∈ Fq. We set

B := (A ∩ {σ = +∞}) ∪ ⋂_{r∈Q: r>0} ((B1(r) ∪ {σ ≥ r}) ∩ {σ < +∞}).

Let q ∈ Q>0. Since {σ ≥ r} ∩ {σ < q} = ∅ and B1(q) ⊂ B1(r) for all r ∈ Q≥q, we have that

B ∩ {σ < q} = ⋂_{r∈Q: 0<r<q} ((B1(r) ∪ {σ ≥ r}) ∩ {σ < q}) ∩ (B1(q) ∩ {σ < q}).

(This relation may fail off Q>0, which is why this claim cannot be adjusted to F·-stopping times.) Consequently B ∩ {σ < q} ∈ Fq for all q ∈ Q>0 and hence B ∈ Fσ+. The above relation also implies that (A ∩ Bc) ∩ {σ < n} ∈ N for n ∈ N. Indeed

(A ∩ (B1(r) ∪ {σ ≥ r})c) ∩ {σ < n} ⊂ (A ∩ {σ < r}) ∩ B1(r)c ∈ N for all r ∈ Q(0,n].

On the other hand (B ∩ Ac) ∩ {σ < n} ∈ N for n ∈ N derives from

((B1(n) ∪ {σ ≥ n}) ∩ Ac) ∩ {σ < n} ⊂ B1(n) ∩ (A ∩ {σ < n})c ∈ N .

Since (A △ B) ∩ {σ = +∞} = ∅, it follows that A △ B = ⋃_{n=1}^∞ ((A △ B) ∩ {σ < n}) ∈ N .

(ii) According to Lemma 3.15, σ ∧ τ is an F·-stopping time and Fσ∧τ = Fσ ∩ Fτ . We show that Fσ∧τ = Fσ. Let A ∈ Fσ and t ∈ R≥0. We see that

A ∩ {σ ∧ τ ≤ t} = (A ∩ {σ ≤ t}) ∪ (A ∩ {τ ≤ t, σ > t}).

Since A ∩ {τ ≤ t, σ > t} ⊂ {τ < σ} and N is an exceptional family, A ∩ {τ ≤ t, σ > t} ∈ N . It follows that A ∩ {σ ∧ τ ≤ t} ∈ Ft ∨ N = Ft. Consequently we get Fσ ⊂ Fσ∧τ . Incidentally we have also verified that Fσ∨τ = Fτ .

(iii) There exists a mapping B : Q>0 → F such that

B(q) ∈ Fq, {τ < q} △ B(q) ∈ N for all q ∈ Q>0 and B(q) ⊂ B(r) for all r ∈ Q≥q.

We set τ̄ := inf{q 1B(q) : q ∈ Q>0} on B(∞) := ⋃_{q∈Q: q>0} B(q) and τ̄ := +∞ off B(∞). Then

{τ̄ ≠ τ, τ < +∞} ∪ ({τ < +∞} △ B(∞)) ⊂ ⋃_{q∈Q>0} ({τ < q} △ B(q)) ∈ N

and {τ̄ < t} = ⋃_{q∈Q: 0<q<t} B(q) ∈ Ft for all t ∈ R>0.

4.9 Corollary. (i) Suppose that Ft+ ⊂ Ft ∨ N for all t ∈ R≥0 and σ is an F·-optional time. Then F· ∨ N is a right continuous filtration and (F· ∨ N )σ = (F· ∨ N )σ+ = Fσ+ ∨ N .
(ii) If σ is an F·-optional time then (F·+ ∨ N )σ = (F·+ ∨ N )σ+ = Fσ+ ∨ N = (F· ∨ N )σ+.

Proof. (i) Let t ∈ R≥0. Since Ft ⊂ Ft+, the given condition implies that Ft+ ∨ N = Ft ∨ N . It then follows by Lemma 4.8(i) that (F· ∨ N )t+ = Ft+ ∨ N = Ft ∨ N , which means the right continuity of F· ∨ N . Invoking Lemma 3.6(ii), we get (F· ∨ N )σ = (F· ∨ N )σ+. The right hand side coincides with Fσ+ ∨ N by Lemma 4.8(i).

(ii) Since (F·+)t+ = Ft+, we see by (i) that F·+ ∨ N is right continuous. Finally we have that (F·+)σ+ = Fσ+ by Lemma 3.6(iii).

5 Martingale convergence theorem

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. We discuss submartingales with discrete parameter. So the index set is Z≥0.

5.1 Lemma. Suppose that f ∈ Map(Z≥0,R) and a, b are a pair of real numbers with a < b. Then liminf_{n→∞} f(n) < a and limsup_{n→∞} f(n) > b if and only if

sup{♯∆ : ∆ a family of disjoint intervals s.t. f(min J) < a, f(max J) > b ∀J ∈ ∆} = +∞.

5.2 Lemma. Let X· be an F·-submartingale and a, b a pair of real numbers with a < b. Set

N := sup{♯∆ : ∆ a family of disjoint intervals s.t. Xmin J < a, Xmax J > b ∀J ∈ ∆}.

Then there exists a random variable N∗ such that N ≤ N∗ and

E[N∗] ≤ (sup_{k∈N} E[max{Xk, a}] − E[max{X0, a}])/(b − a).


Proof. We set S(0) := inf{i ∈ Z≥0 : Xi < a} with the convention inf ∅ = +∞. For n ∈ N we successively define

T (n) := inf{i ∈ Z≥0 : i > S(n − 1), Xi > b}, S(n) := inf{i ∈ Z≥0 : i > T (n), Xi < a}.

Clearly S(n − 1) < T (n) provided S(n − 1) < +∞ and T (n) < S(n) provided T (n) < +∞. Introduce the following random variables so that we have N = sup_{k∈N} N(k):

N(k) := ♯{n ∈ N : T (n) ≤ k}, k ∈ Z≥0.

Since N(k) ≤ k for all k ∈ N, we see that S(n) ≥ T (n) ≥ k when n > k. It follows that

max{Xk, a} = ∑_{n=1}^k (max{XT(n)∧k, a} − max{XS(n−1)∧k, a}) + max{XS(0)∧k, a}
+ ∑_{n=1}^k (max{XS(n)∧k, a} − max{XT(n)∧k, a}).

Observe that T (N(k)) ≤ k < T (N(k) + 1). The first summation in the right hand side reads

∑_{n=1}^{N(k)} (max{XT(n), a} − max{XS(n−1), a}) + (max{Xk, a} − max{XS(N(k)), a}) 1{S(N(k))≤k}.

Since XS(n−1) < a provided S(n − 1) < +∞ and XT(n) > b provided T (n) < +∞,

max{XT(n), a} − max{XS(n−1), a} ≥ b − a for all n ∈ N with n ≤ N(k), and

(max{Xk, a} − max{XS(N(k)), a}) 1{S(N(k))≤k} = (max{Xk, a} − a) 1{S(N(k))≤k} ≥ 0.

Consequently we have that

max{Xk, a} ≥ (b − a)N(k) + max{XS(0)∧k, a} + ∑_{n=1}^k (max{XS(n)∧k, a} − max{XT(n)∧k, a}).

The random variables T (n) and S(n) are F·-stopping times. Indeed S(0) is an F·-stopping time since {S(0) ≤ i} = ⋃_{j≤i} {Xj < a}. We then see that

{T (n) ≤ i} = ⋃_{j≤i} {S(n − 1) < j, Xj > b}, {S(n) ≤ i} = ⋃_{j≤i} {T (n) < j, Xj < a}.

Recall that T (n) ≤ S(n). On the other hand, the function x ↦ max{x, a} being convex and non-decreasing, the process max{X·, a} is an F·-submartingale by Lemma 2.18. Invoking Theorem 2.6 we infer that

E[max{X0, a}] ≤ E[max{XS(0)∧k, a}], E[max{XT(n)∧k, a}] ≤ E[max{XS(n)∧k, a}]

for all n ∈ N, k ∈ Z≥0. We thus obtain that

E[max{Xk, a}] ≥ (b − a)E[N(k)] + E[max{X0, a}].

Since N(k) ≤ N(k + 1) for all k ∈ N and N = sup_{k∈N} N(k), we get the conclusion by the monotone convergence theorem.
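A Monte Carlo check of the upcrossing bound (a sketch added here, not part of the note): for a random walk with small upward drift, which is a submartingale, the mean number of upcrossings of [a, b] stays below the bound of Lemma 5.2.

import numpy as np

# Count upcrossings of [a, b] = [-1, 1] by a drifting random walk and compare
# the sample mean of N with (sup_k E[max{X_k, a}] - E[max{X_0, a}])/(b - a).
rng = np.random.default_rng(6)
a, b, K = -1.0, 1.0, 200
X = (rng.choice([-1.0, 1.0], size=(20_000, K)) + 0.05).cumsum(axis=1)

def upcrossings(path):
    count, below = 0, False
    for x in path:
        if x < a:
            below = True
        elif x > b and below:
            count, below = count + 1, False
    return count

N = np.array([upcrossings(p) for p in X])
bound = (np.maximum(X, a).mean(axis=0).max()
         - np.maximum(X[:, 0], a).mean()) / (b - a)
print(N.mean(), "<=", bound)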


5.3 Lemma. Let X· be an F·-supermartingale and a, b a pair of real numbers with a < b. Set

N := sup{♯∆ : ∆ a family of disjoint intervals s.t. Xmin J < a, Xmax J > b ∀J ∈ ∆}.

Then there exists a random variable N∗ such that N ≤ N∗ and

E[N∗] ≤ sup_{k∈N} E[max{a − Xk, 0}]/(b − a).

Proof. We use the same notation as in the proof of Lemma 5.2. It follows that

∑_{n=1}^k (XT(n)∧k − XS(n−1)∧k) = ∑_{n=1}^{N(k)} (XT(n) − XS(n−1)) + (Xk − XS(N(k))) 1{S(N(k))≤k}
≥ (b − a)N(k) + (Xk − a) 1{S(N(k))≤k} for all k ∈ Z>0.

According to Theorem 2.6 the expectation of the left hand side is non-positive.

5.4 Theorem. Suppose that X· is an F·-submartingale, sup_{k∈N} E[max{Xk, 0}] < +∞ and σ and τ are a pair of F·-stopping times with σ ≤ τ a.s. Set X∞ := liminf_{k→∞} Xk.
(i) liminf_{k→∞} Xk = limsup_{k→∞} Xk a.s. and moreover E[|Xσ|] < +∞.
(ii) If X· is uniformly integrable then X· converges to X∞ in L¹-sense and Xσ ∈ E≤[Xτ |Fσ].

Proof. (i) We infer by Lemma 5.1 and Lemma 5.2 that

P (liminf_{k→∞} Xk < a, limsup_{k→∞} Xk > b) = 0 for every pair a, b ∈ Q with a < b.

This implies that P (Ω0) = 1 where Ω0 := {ω : liminf_{k→∞} Xk(ω) = limsup_{k→∞} Xk(ω)}. It follows by Lemma 2.18 and Theorem 2.11 that

E[|Xσ∧k|] = 2E[max{Xσ∧k, 0}] − E[Xσ∧k] ≤ 2E[max{Xk, 0}] − E[X0].

Since Xσ∧k converges to Xσ on Ω0, we see by Fatou's lemma that

E[|Xσ|] = E[liminf_{k→∞} |Xσ∧k|] ≤ 2 sup_{k∈N} E[max{Xk, 0}] − E[X0].

(ii) We infer by (i) and Lemma 3.7 that X· converges to liminf_{n→∞} Xn in L¹-sense. Let A ∈ Fσ. Since {σ ≤ n} ∈ Fσ by Lemma 2.7, it follows by Theorem 2.11 that

E[Xσ ;A ∩ {σ ≤ n}] = E[Xσ∧n ;A ∩ {σ ≤ n}] ≤ E[Xτ∧n ;A ∩ {σ ≤ n}] for all n ∈ N.

The left hand side converges to E[Xσ ;A ∩ {σ < +∞}]. The right hand side reads as follows:

E[Xn ;A ∩ {σ ≤ n < τ}] + E[Xτ ;A ∩ {τ ≤ n}].

We see that 1{σ≤n<τ} converges to 1{σ<+∞, τ=+∞}. Due to the L¹-convergence of Xn to X∞, the first term of the above converges to E[X∞ ;A ∩ {σ < +∞, τ = +∞}]. Consequently

E[Xσ ;A ∩ {σ < +∞}] ≤ E[X∞ ;A ∩ {σ < +∞, τ = +∞}] + E[Xτ ;A ∩ {τ < +∞}].

Since E[Xσ ;A ∩ {σ = +∞}] equals E[X∞ ;A ∩ {σ = +∞}] and E[Xτ ;A ∩ {τ = +∞}] equals E[X∞ ;A ∩ {τ = +∞}], adding the missing term E[X∞ ;A ∩ {σ = +∞}], we reach E[Xσ ;A] ≤ E[Xτ ;A] for all A ∈ Fσ.


5.5 Corollary. Suppose that X· is an F·-submartingale. If X· is uniformly integrable then the family {Xσ : σ an F·-stopping time} is uniformly integrable, where X∞ := liminf_{k→∞} Xk.

Proof. Let σ be an F·-stopping time. Choose Yn ∈ E[Xn+1|Fn] for each n ∈ N. We may assume that Xn(ω) ≤ Yn(ω) for all ω ∈ Ω, else replace Yn by max{Xn, Yn}. Let n ∈ N and A ∈ Fσ. Since E[Yk ;A ∩ {σ = n}] = E[Xk+1 ;A ∩ {σ = n}] for all k ∈ N≥n, it follows that

E[∑_{k=n}^∞ (Yk − Xk) ;A ∩ {σ = n}] = ∑_{k=n}^∞ E[Yk − Xk ;A ∩ {σ = n}]
= ∑_{k=n}^∞ (E[Xk+1 ;A ∩ {σ = n}] − E[Xk ;A ∩ {σ = n}]).

The summation in the right hand side equals E[X∞ ;A ∩ {σ = n}] − E[Xn ;A ∩ {σ = n}] according to Theorem 5.4(ii). We set M := X∞ − ∑_{k=0}^∞ (Yk − Xk), which is integrable. Then

E[−M ;A ∩ {σ = n}] = E[∑_{k=0}^{n−1} (Yk − Xk) ;A ∩ {σ = n}] − E[Xn ;A ∩ {σ = n}].

The right hand side equals E[∑_{k<σ} (Yk − Xk) ;A ∩ {σ = n}] − E[Xσ ;A ∩ {σ = n}]. Therefore

E[−M ;A ∩ {σ < +∞}] = E[∑_{k<σ} (Yk − Xk) ;A ∩ {σ < +∞}] − E[Xσ ;A ∩ {σ < +∞}].

Note that −M = ∑_{k<∞} (Yk − Xk) − X∞. Adding the missing term E[−M ;A ∩ {σ = +∞}], we get

E[−M ;A] = E[∑_{k<σ} (Yk − Xk) ;A] − E[Xσ ;A] for all A ∈ Fσ.

This reads Xσ ∈ E[M |Fσ] + ∑_{k<σ} (Yk − Xk). Since ∑_{k<σ} (Yk − Xk) is non-negative and dominated by the integrable random variable ∑_{k=0}^∞ (Yk − Xk), we infer by Theorem 1.23 that {Xσ : σ an F·-stopping time} is uniformly integrable.

5.6 Theorem. Suppose X is an integrable random variable and Xn ∈ E[X|Fn] for all n ∈ N.
(i) X∞ := liminf_{n→∞} Xn = limsup_{n→∞} Xn a.s. and X· converges to X∞ in L¹-sense.
(ii) X∞ ∈ E[X|σ(F·)]. If X is σ(F·) ∨ Null(P )-measurable then X∞ = X a.s. and moreover Xσ ∈ E[X|Fσ] for all F·-stopping times σ.

Proof. Applying Theorem 1.17 we see that X· is an F·-martingale. According to Theorem 1.23, ⋃_{n=1}^∞ E[X|Fn] is uniformly integrable. Hence Theorem 5.4 shows (i). Let A ∈ Fn. Then

E[X∞ ;A] = E[Xn ;A] = E[X ;A]

by Corollary 5.5. Therefore the following Dynkin system contains the pre-σ-field ⋃_{n=1}^∞ Fn:

{A ∈ F : E[X∞ ;A] = E[X ;A]}.

The Dynkin system theorem shows that E[X∞ ;A] = E[X ;A] for all A ∈ σ(F·). Since X∞ is σ(F·)-measurable, we get the first statement of (ii). The rest follows by Theorem 5.4(ii).
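Theorem 5.6 admits a transparent numerical illustration (a sketch added here, not part of the note): take X = f(U) with U uniform on (0, 1) and Fn generated by the dyadic intervals of length 2^{−n}. A version of E[X|Fn] averages f over the dyadic interval containing U, and these averages converge to X in L¹ as the partition refines.

import numpy as np

# X_n = E[f(U)|F_n] is constant on each dyadic interval of length 2^-n;
# we approximate the interval averages on a fine grid and watch the L1 error.
f = lambda u: np.sin(2 * np.pi * u) + u ** 2
rng = np.random.default_rng(7)
U = rng.uniform(size=100_000)
grid = np.linspace(0.0, 1.0, 2 ** 16 + 1)
fg = f((grid[:-1] + grid[1:]) / 2)               # f on midpoints of a fine grid
for n in (1, 3, 6, 10):
    block = fg.reshape(2 ** n, -1).mean(axis=1)  # average of f per dyadic cell
    Xn = block[np.floor(U * 2 ** n).astype(int)]
    print(n, np.abs(Xn - f(U)).mean())           # L1 error decreases to 0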


5.7 Definition. An F·-supermartingale X· such that Xk ≥ 0 a.s. for all k ∈ Z≥0 and limsup_{k→∞} Xk = 0 a.s. is called an F·-potential.

5.8 Lemma. Suppose that X· is an F·-supermartingale and Xk ≥ 0 a.s. for all k ∈ Z≥0.
(i) liminf_{k→∞} Xk = limsup_{k→∞} Xk a.s. and

E[limsup_{k→∞} Xk ;A] ≤ E[Xn ;A] for all A ∈ Fn and n ∈ Z≥0.

(ii) Let Yn ∈ E[limsup_{k→∞} Xk|Fn] for n ∈ Z≥0. Then 0 ≤ Yn ≤ Xn a.s. for all n ∈ Z≥0 and Xn − Yn is an F·-potential.
(iii) If E[limsup_{k→∞} Xk] = E[X0] then Xn ∈ E[limsup_{k→∞} Xk|Fn] for all n ∈ Z≥0.

Proof. (i) Applying Theorem 5.4 to the non-positive F·-submartingale −X·, we see that liminf_{k→∞} Xk = limsup_{k→∞} Xk a.s. Let n ∈ Z≥0 and A ∈ Fn. Since E[Xk ;A] ≤ E[Xn ;A] for all k ≥ n and Xk ≥ 0 a.s., Fatou's lemma shows that

E[limsup_{k→∞} Xk ;A] = E[liminf_{k→∞} Xk ;A] ≤ liminf_{k→∞} E[Xk ;A] ≤ E[Xn ;A].

(iii) Let Y ∈ E[limsup_{k→∞} Xk|Fn]. It follows that Y ≤ Xn a.s. On the other hand

E[Xn] ≤ E[X0] = E[limsup_{k→∞} Xk] = E[Y ].

Consequently Y = Xn a.s. Thus we get Xn ∈ E[limsup_{k→∞} Xk|Fn].

5.9 Theorem. Suppose that µ is a probability measure on (Ω,F) which is locally (F·, P )-absolutely continuous and Xn ∈ D(µ/P |Fn).
(i) limsup_{k→∞} Xk < +∞ P -a.s., liminf_{k→∞} Xk = limsup_{k→∞} Xk µ-a.e. and

µ(A ∩ {limsup_{k→∞} Xk < +∞}) = E[limsup_{k→∞} Xk ;A] for all A ∈ σ(F·).

(ii) ⋃_{n=1}^∞ D(µ/P |Fn) is uniformly integrable ⇔ µ(limsup_{k→∞} Xk < +∞) = 1 ⇔ E[limsup_{k→∞} Xk] = 1 ⇔ µ is P -absolutely continuous on σ(F·).

Proof. Since E[lim supn→∞Xn] ≤ 1 by Lemma 5.8, we get P (lim supk→∞Xk = +∞) = 0.We introduce the following probability measure on (Ω,F):

ν(A) := P (A)/2 + µ(A)/2 A ∈ F .

Let Y ∈ D(P/ν|σ(F·)). Then, P being ν-absolutely continuous, it follows that

(?) P (A) = Eν [Y ;A] and µ(A) = 2ν(A)− P (A) = Eν [(2− Y ) ;A] for all A ∈ σ(F·).

Let Yn ∈ Eν [Y |Fn]. Then we have that

P (A) = Eν [Yn ;A] and µ(A) = Eν [(2− Yn) ;A] for all A ∈ Fn.

On the other hand, since Xn ≥ 0 P -a.s., we see by Lemma 1.14 that XnYn ≥ 0 ν-a.s. and

Eν [XnYn ;A] = E[Xn ;A] = µ(A) = Eν [(2− Yn) ;A] for all A ∈ Fn.

The second equality holds because µ is P -absolutely continuous on Fn. We set

Ω∞ :=∞⋂n=1

ω ∈ Ω : 0 ≤ Yn(ω) ≤ 2, Xn(ω)Yn(ω) = 2− Yn(ω).

It follows that ν(Ω∞) = 1. Observe that

23

Page 24: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Yn > 0 on Ω∞ and Xn = 2/Yn − 1 on Ω∞.

Theorem 5.6 claims that lim inf_{n→∞} Yn = Y = lim sup_{n→∞} Yn ν-a.s. Consequently

(??) lim sup_{n→∞} Xn = 2/Y − 1 = lim inf_{n→∞} Xn ν-a.s.

This means that {lim sup_{n→∞} Xn < +∞} coincides with {Y > 0} up to a ν-null set. Since µ is ν-absolutely continuous, it follows that

µ(A ∩ {lim sup_{n→∞} Xn < +∞}) = µ(A ∩ {Y > 0}) for all A ∈ F.

If A ∈ σ(F·) then the right hand side reads Eν[(2 − Y) ; A ∩ {Y > 0}] by (?). We see by (??) that (2 − Y)1_{Y>0} = Y lim sup_{n→∞} Xn ν-a.s. Therefore

µ(A ∩ {lim sup_{n→∞} Xn < +∞}) = Eν[Y lim sup_{n→∞} Xn ; A] for all A ∈ σ(F·).

The right hand side coincides with E[lim sup_{n→∞} Xn ; A] according to Lemma 1.14.

5.10 Remark. Suppose that X· is an F·-martingale, Xk ≥ 0 a.s. for all k ∈ Z≥0 and the premeasure ⋃_{n=1}^∞ Fn → R, A ↦ lim inf_{n→∞} E[Xn ; A] is σ-additive. Then

E[lim sup_{k→∞} Xk] + lim_{a→+∞} lim_{n→∞} lim_{m→∞} E[Xm ; ⋃_{k=n}^m {Xk > a}] = 1,

E[lim inf_{k→∞} Xk] + lim_{a→+∞} lim_{n→∞} lim_{m→∞} E[Xm ; ⋂_{k=n}^m {Xk > a}] = 1.

5.11 Example. We choose the following product measure space as (Ω,F,P):

(∏(Z≥0, {0,1}), ⊗(Z≥0, Sbset({0,1})), ⊗(Z≥0, (δ0 + δ1)/2)).

We set Fn := σ(proj≤n) for n ∈ Z≥0 where proj≤n is the canonical mapping ∏(Z≥0, {0,1}) → ∏(Z[0,n], {0,1}). Then F· is a filtration on the present probability space. Given p ∈ R(0,1) the measure

µ := ⊗(Z≥0, (1−p)δ0 + pδ1)

is locally (F·, P)-absolutely continuous. Indeed the random variable

Xn : ω ↦ 2^{n+1} p^{#{k≤n : ω(k)=1}} (1−p)^{#{k≤n : ω(k)=0}}

belongs to D(µ/P|Fn) for each n ∈ Z≥0. Clearly if p = 1/2 then Xn(ω) = 1 for all ω and n. In what follows suppose that p ≠ 1/2. Observe that

log Xn(ω)/(n+1) = log(2 − 2p) + log(p/(1−p)) · #{k ≤ n : ω(k) = 1}/(n+1).

By virtue of the strong law of large numbers the above converges to log √(4p(1−p)) as n tends to ∞ P-a.s. Since 4p(1−p) < 1 unless p = 1/2, it follows that lim sup_{n→∞} Xn = 0 P-a.s.
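The collapse of the likelihood ratio under P is easy to watch numerically. The following Python sketch is not part of the original note; the fair-coin measure P is as above, while the choice p = 0.6 and the path counts are assumptions made for illustration.

# Simulation sketch for Example 5.11: under the fair-coin measure P the
# likelihood-ratio martingale X_n collapses to 0 whenever p != 1/2.
import numpy as np

rng = np.random.default_rng(0)
p = 0.6                                   # assumed value, any p != 1/2
n_flips, n_paths = 5000, 5
flips = rng.integers(0, 2, size=(n_paths, n_flips))   # coordinates under P
heads = np.cumsum(flips, axis=1)
m = np.arange(1, n_flips + 1)             # m = n + 1 flips used for X_n
# log X_n = m log 2 + (#heads) log p + (#tails) log(1-p); log scale avoids underflow
logX = m * np.log(2) + heads * np.log(p) + (m - heads) * np.log(1 - p)
print("log X_n / (n+1) on each path:", logX[:, -1] / n_flips)
print("predicted limit log sqrt(4p(1-p)):", 0.5 * np.log(4 * p * (1 - p)))

Each printed path average sits near the negative predicted limit, so Xn itself is exponentially small, in line with lim sup_{n→∞} Xn = 0 P-a.s.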


6 Submartingale inequalities

Let (Ω,F,P) be a complete probability space and F· be a filtration of sub σ-fields. We directly discuss the continuous time situation, so the index set is R≥0.

6.1 Lemma. Suppose X· is an F·-submartingale such that almost every sample path is right continuous and σ is an F· ∨ Null(P)-optional time such that σ ≤ T a.s. for some T ∈ R>0.
(i) P(sup{Xs ; s ∈ R[0,σ]} ≥ λ) ≤ E[max{Xσ, 0}]/λ for all λ ∈ R>0.
(ii) P(inf{Xs ; s ∈ R[0,σ]} ≤ λ) ≤ (E[X0] − E[max{Xσ, 0}])/λ for all λ ∈ R<0.

Proof. There exists a right continuous stochastic process Y· such that Xt = Yt for all t ∈ R≥0 a.s. Then Y· is an F· ∨ Null(P)-submartingale. Given λ ∈ R>0, we write

τ := inf{s ∈ R≥0 : Ys > λ},

which is an F· ∨ Null(P)-optional time. Indeed {τ < t} = ⋃_{s∈Q:0≤s<t} {Ys > λ} ∈ Ft ∨ Null(P) for all t ∈ R>0 due to the right continuity of sample path. We have that

sup{Ys ; s ∈ R[0,σ]} > λ ⇔ τ < σ or 'τ = σ, Yσ > λ'.

Clearly P(τ = σ, Yσ > λ) ≤ E[max{Yσ, 0} ; τ = σ, Yσ > λ]/λ. On the other hand, since Yτ ≥ λ on {τ < +∞} due to the right continuity of sample path, it follows that

P(τ < σ) ≤ E[Yτ∧σ ; τ < σ]/λ.

According to Lemma 3.15(ii) we have {τ < σ} ∈ (F· ∨ Null(P))_{(τ∧σ)+}. We see by Theorem 3.9 that

E[Yτ∧σ ; τ < σ] ≤ E[Yσ ; τ < σ] ≤ E[max{Yσ, 0} ; τ < σ].

Consequently P(τ < σ) ≤ E[max{Yσ, 0} ; τ < σ]/λ. Thus we get

P(sup{Ys ; s ∈ R[0,σ]} > λ) ≤ E[max{Yσ, 0} ; sup{Ys ; s ∈ R[0,σ]} > λ]/λ.

We turn to show (ii). This time, given λ ∈ R<0, introduce an F· ∨ Null(P)-optional time by

τ := inf{s ∈ R≥0 : Ys < λ}.

We see by Theorem 3.9 that

E[Y0] ≤ E[Yτ∧σ] = E[Yτ ; τ < σ] + E[Yσ ; τ = σ, Yσ < λ] + E[Yσ ; τ ≥ σ, Yσ ≥ λ].

The term E[Yσ ; τ ≥ σ, Yσ ≥ λ] is dominated by E[max{Yσ, 0}]. Consequently

(E[Yτ ; τ < σ] + E[Yσ ; τ = σ, Yσ < λ])/λ ≤ (E[Y0] − E[max{Yσ, 0}])/λ.

The left hand side dominates P(τ < σ) + P(τ = σ, Yσ < λ). Thus we get (ii).

6.2 Lemma. Suppose that X· is an F·-submartingale whose almost every sample path is right continuous and σ is an F· ∨ Null(P)-optional time such that σ ≤ T a.s. for some T ∈ R≥0. Then E[sup{max{Xs, 0} ; s ∈ R[0,σ]}^p] ≤ (p/(p−1))^p E[max{Xσ, 0}^p] for all p ∈ R>1.

Proof. Note that max{X·, 0} is an F·-submartingale. We write

Y := sup{max{Xs, 0} ; s ∈ R[0,σ]}.


As is shown in the proof of Lemma 6.1, we have that

P(Y > λ) ≤ E[max{Xσ, 0} ; Y > λ]/λ for all λ ∈ R>0.

Let b ∈ R>0. We observe that

E[(Y∧b)^p] = ∫_0^b pλ^{p−1} P(Y > λ) dλ ≤ ∫_0^b pλ^{p−1} E[max{Xσ, 0} ; Y > λ]/λ dλ.

Invoking Fubini's theorem we see that the right hand side reads

p E[max{Xσ, 0}(Y∧b)^{p−1}]/(p−1).

Using Hölder's inequality we infer that

E[(Y∧b)^p] ≤ p/(p−1) · E[max{Xσ, 0}^p]^{1/p} E[(Y∧b)^p]^{1−1/p}.

Dividing by E[(Y∧b)^p]^{1−1/p} (provided it is positive) we get

E[(Y∧b)^p] ≤ (p/(p−1))^p E[max{Xσ, 0}^p].

The inequality trivially holds if E[(Y∧b)^p] = 0. Tending b to +∞ we reach the claim.
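A Monte Carlo sanity check makes the constant (p/(p−1))^p concrete. The Python sketch below is an illustration, not the note's method: it assumes the martingale is a simple symmetric random walk observed up to a deterministic time σ = T, and compares both sides of the inequality in the martingale form recorded in Example 6.4(ii) below.

# Monte Carlo check of E[sup_{s<=T}|M_s|^p] <= (p/(p-1))^p E[|M_T|^p]
# for a simple random walk martingale (assumed example), cf. Example 6.4(ii).
import numpy as np

rng = np.random.default_rng(1)
T, n_paths, p = 400, 50_000, 2.0
M = np.cumsum(rng.choice([-1.0, 1.0], size=(n_paths, T)), axis=1)
lhs = np.mean(np.max(np.abs(M), axis=1) ** p)
rhs = (p / (p - 1)) ** p * np.mean(np.abs(M[:, -1]) ** p)
print(f"E[sup|M|^p] = {lhs:.1f}  <=  {rhs:.1f} = (p/(p-1))^p E[|M_T|^p]")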

6.3 Remark. The function R>1 → R, p ↦ p log p − p log(p−1) is convex. Indeed the derivative and the second derivative read log p − log(p−1) − 1/(p−1) and 1/(p(p−1)²) respectively. Since the derivative is increasing by convexity and converges to 0 as p tends to +∞, the derivative is negative, so the function is decreasing. Finally the function converges to 1 as p tends to +∞, hence it is everywhere at least 1. Thus (p/(p−1))^p ≥ e for all p ∈ R>1.
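A direct numeric evaluation (a quick check, not in the original note) confirms that the constant stays above e and decreases toward it:

import math
# (p/(p-1))^p decreases to e as p -> +infinity, as Remark 6.3 asserts
for p in [1.1, 1.5, 2.0, 5.0, 50.0, 1000.0]:
    print(f"p = {p:7.1f}   (p/(p-1))^p = {(p / (p - 1)) ** p:.6f}   e = {math.e:.6f}")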

6.4 Example. Let M· be an F·-martingale whose almost every sample path is right continuous and σ an F· ∨ Null(P)-optional time such that σ ≤ T a.s. for some T ∈ R≥0.
(i) P(sup{|Ms| ; s ∈ R[0,σ]} ≥ λ) ≤ E[|Mσ|^p]/λ^p for all λ ∈ R>0 and p ∈ R≥1.
(ii) E[sup{|Ms| ; s ∈ R[0,σ]}^p] ≤ (p/(p−1))^p E[|Mσ|^p] for all p ∈ R>1.

Proof. (i) Apply Lemma 6.1(i) to the submartingale t ↦ |Mt|^p.
(ii) Apply Lemma 6.2 to the submartingale t ↦ |Mt|.

6.5 Example. Let W· be a complex Wiener process and σ a bounded optional time. Then
(i) E[max{|Wt|^p ; t ∈ R[0,σ]}] ≤ p^{p/(p−1)} E[|Wσ|^p] for all p ∈ R>1.
(ii) E[max{|Wt| ; t ∈ R[0,σ]}] ≤ e E[|Wσ|].

Proof. Observe that |W·|^q is a submartingale for all q ∈ R>0. It follows by Lemma 6.1 that

P(Y > λ) ≤ E[|Wσ|^q ; Y > λ]/λ^q for all q ∈ R>0 and λ ∈ R>0

where Y := max{|Wt| ; t ∈ R[0,σ]}. Let b ∈ R>0. We observe that

E[(Y∧b)^p] = ∫_0^b pλ^{p−1} P(Y > λ) dλ ≤ ∫_0^b pλ^{p−1} E[|Wσ|^{p−1} ; Y > λ]/λ^{p−1} dλ.

The rest of the argument is the same as that in the proof of Lemma 6.2. Claim (ii) then follows by letting p tend to 1 in (i), since p^{p/(p−1)} converges to e.


7 Local martingales and uniform integrability

Let (Ω,F,P) be a complete probability space and F· be a filtration of sub σ-fields. We discuss local martingales with continuous parameter, so the index set is R≥0.

Time(F·) denotes the set of all F·-optional times.

7.1 Definition. A stochastic process X· is called an F·-local submartingale if it is F·-adapted and there exists a sequence S(·) of Time(F· ∨ Null(P)) such that

S(n) ≤ S(n+1) a.s., sup_{n∈N} S(n) = +∞ a.s., E[|X0| ; S(n) > 0] < +∞ and X·∧S(n) − X0 is an F· ∨ Null(P)-submartingale.

Such S(·) is called a reducing sequence.

7.2 Remark. Let t ∈ R≥0 and a ∈ R. Observe that the set {Xt < a} coincides with ⋃_{n=1}^∞ {Xt < a, t ≤ S(n)} = ⋃_{n=1}^∞ {Xt∧S(n) < a, t ≤ S(n)} up to Null(P).

7.3 Lemma. (i) All F·-submartingales are F·-local submartingales.
(ii) A stochastic process X· is an F·-local submartingale if and only if it is an F·-adapted F· ∨ Null(P)-local submartingale.

Proof. (i) We see by Lemma 1.16 that a stochastic process X· is an F·-submartingale if and only if it is an F·-adapted F· ∨ Null(P)-submartingale. Thus for F·-submartingales the constant sequence S(n) := +∞ serves as a reducing sequence.

7.4 Lemma. Suppose that X· is a stochastic process such that P(|X0| < +∞) = 1 and almost every sample path is right continuous, and σ ∈ Time(F·). Then X0 1_{σ>0} is F0+-measurable, E[|X0| ; σ > 0] < +∞ and X·∧σ − X0 is an F·-submartingale if and only if X·∧σ 1_{σ>0} is a G·-submartingale, where G0 := F0+ and Gt := Ft for t ∈ R>0.

Proof. Suppose that X0 1_{σ>0} is F0+-measurable, E[|X0| ; σ > 0] < +∞ and X·∧σ − X0 is an F·-submartingale. Since Xt∧σ − X0 + X0 1_{σ>0} = Xt∧σ 1_{σ>0}, it follows that

X·∧σ 1_{σ>0} is an F·-submartingale with index set R>0

and E[Xt∧σ ; σ > 0] ≥ E[X0 ; σ > 0] > −∞ for all t ∈ R>0. We see by Lemma 2.13 that the sequence X_{(1/n)∧σ} 1_{σ>0} is uniformly integrable. This sequence converges to X0 1_{σ>0} a.s. due to the right continuity of sample path. Let A ∈ F0+ and t ∈ R>0. Then

E[X_{(1/n)∧σ} ; σ > 0, A] ≤ E[Xt∧σ ; σ > 0, A] for all n ∈ N>1/t.

Thus, tending n to ∞, we conclude that E[X0 ; σ > 0, A] ≤ E[Xt∧σ ; σ > 0, A] and hence X·∧σ 1_{σ>0} is a G·-submartingale. The converse implication is easy.

7.5 Lemma. Suppose that X· and Y· are F·-local submartingales and τ ∈ Time(F· ∨ Null(P)).
(i) If f is nonnegative, bounded and F0-measurable then t ↦ fXt is an F·-local submartingale and t ↦ −fXt is an F·-local supermartingale.
(ii) If almost all sample paths of X and Y are right continuous then X· + Y· is an F·-local submartingale and X·∧τ is an F· ∨ Null(P)-local submartingale.
(iii) If every sample path of X is right continuous and τ ∈ Time(F·) then X·∧τ is an F·-local submartingale.

27

Page 28: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. (ii) We select and fix F·-reducing sequences S(·) and T(·) for X· and Y· respectively. Applying Corollary 3.17(ii) to the F· ∨ Null(P)-submartingales t ↦ Xt∧S(n) − X0, we see that S(·) also serves as an F· ∨ Null(P)-reducing sequence for the F· ∨ Null(P)-adapted process X·∧τ. Likewise the sequence n ↦ S(n) ∧ T(n) serves as an F·-reducing sequence for the F·-adapted process X· + Y·.

7.6 Lemma. Let X· be an F·-local supermartingale. If almost every sample path is right continuous, Xt ≥ 0 a.s. for all t ∈ R≥0 and E[X0] < +∞ then X· is an F·-supermartingale.

Proof. Let S(·) be an F·-reducing sequence for X·. Fix s, t ∈ R≥0 with s < t. Then, since X0 is integrable, it follows that

Xs∧S(n) ∈ E≥[Xt∧S(n)|Fs ∨ Null(P)] for all n ∈ N.

For each n ∈ N we choose Yn ∈ E[Xt∧S(n)|Fs] so that Xs∧S(n) ≥ Yn a.s. Let Y ∈ E[Xt|Fs], that is, Y ∈ E[lim inf_{n→∞} Xt∧S(n)|Fs]. The right continuity turns Xu ≥ 0 a.s. for all u ∈ R≥0 into inf_{u≥0} Xu ≥ 0 a.s. We see by Corollary 1.11 that

Xs = lim inf_{n→∞} Xs∧S(n) ≥ lim inf_{n→∞} Yn ≥ Y a.s.

Fatou's lemma shows that E[Xs] ≤ lim inf_{n→∞} E[Xs∧S(n)] ≤ E[X0], which means the integrability of Xs. Since it is Fs-measurable, we get Xs ∈ E≥[Xt|Fs].

7.7 Lemma. Suppose that X· is an F·-supermartingale with almost sure right continuous sample path, Xt ≥ 0 a.s. for all t ∈ R≥0, and σ, τ ∈ Time(F· ∨ Null(P)).
(i) Xτ ≥ 0 a.s. on {τ < +∞}, E[Xτ ; τ < +∞] ≤ E[X0] and

Xσ∧τ 1_{σ∧τ<+∞} ∈ E≥[Xτ 1_{τ<+∞}|(F· ∨ Null(P))_{σ+}].

(ii) If τ < +∞ a.s. and E[Xτ] = E[X0] then Xσ∧τ ∈ E[Xτ|(F· ∨ Null(P))_{σ+}].

Proof. (i) Let A ∈ (F· ∨ Null(P))_{σ+} and n ∈ N. Then, since σ∧τ is (F· ∨ Null(P))_{σ+}-measurable, we see by Corollary 3.17 that

E[Xσ∧τ ; A ∩ {σ∧τ < n}] = E[Xσ∧τ∧n ; A ∩ {σ∧τ < n}] ≥ E[Xτ∧n ; A ∩ {σ∧τ < n}].

The right continuity turns Xt ≥ 0 a.s. for all t ∈ R≥0 into inf_{t≥0} Xt ≥ 0 a.s. Therefore the left hand side is dominated by E[Xσ∧τ ; A ∩ {σ∧τ < +∞}]. On the other hand, since {τ < n} ⊂ {σ∧τ < n}, the right hand side dominates E[Xτ∧n ; A ∩ {τ < n}], which reads E[Xτ ; A ∩ {τ < n}]. It then follows that

E[Xσ∧τ ; A ∩ {σ∧τ < +∞}] ≥ E[Xτ ; A ∩ {τ < n}] for all n ∈ N.

Invoking the monotone convergence theorem, we infer that

E[Xσ∧τ ; A ∩ {σ∧τ < +∞}] ≥ E[Xτ ; A ∩ {τ < +∞}] for all A ∈ (F· ∨ Null(P))_{σ+}.

Since Xσ∧τ 1_{σ∧τ<+∞} is (F· ∨ Null(P))_{σ+}-measurable by Lemma 3.8 and Lemma 3.15, we reach the statement (i).

(ii) Suppose τ < +∞ a.s. and E[Xτ] = E[X0]. We select and fix Y ∈ E[Xτ|(F· ∨ Null(P))_{σ+}]. Then we have that Xσ∧τ ≥ Y a.s. by (i) and E[Xσ∧τ] is dominated by E[X0] = E[Xτ] = E[Y]. Thus we get Y = Xσ∧τ a.s.


7.8 Remark. If one takes Lemma 3.13 into account, one can also draw the following statement. Suppose that X· is a right continuous F·-supermartingale, Xt ≥ 0 a.s. for all t ∈ R≥0, σ ∈ Time(F·) (respectively σ is an F·-stopping time), τ ∈ Time(F·) and τ < +∞ a.s.
(i) 1_{σ∧τ<+∞} Xσ∧τ ∈ E≥[Xτ|F_{σ+}] (respectively 1_{σ<+∞} Xσ∧τ ∈ E≥[Xτ|Fσ]).
(ii) If E[Xτ] = E[X0] then 1_{σ∧τ<+∞} Xσ∧τ ∈ E[Xτ|F_{σ+}] (respectively 1_{σ<+∞} Xσ∧τ ∈ E[Xτ|Fσ]).

7.9 Theorem. Suppose that M· is an F·-local martingale such that almost every sample path is right continuous, Mt ≥ 0 a.s. for all t ∈ R≥0 and E[M0] < +∞.
(i) Let τ ∈ Time(F· ∨ Null(P)). If τ < +∞ a.s. and E[Mτ] = E[M0] then t ↦ Mt∧τ is an F· ∨ Null(P)-martingale and {Mσ∧τ ; σ ∈ Time(F·)} is uniformly integrable.
(ii) If there exists T ∈ Seq(R≥0) such that sup_{n∈N} T(n) = +∞ and E[M_{T(n)}] = E[M0] for all n ∈ N then M· is an F·-martingale.

Proof. Immediate from Lemma 7.6 and Lemma 7.7.

7.10 Definition. Let X· be an F·-adapted process with almost sure right continuous path. It is said to be of class DL if {Xt∧σ ; σ ∈ Time(F·)} is uniformly integrable for all t ∈ R>0.

7.11 Remark. Suppose X· and Y· are F·-adapted processes with almost sure right continuous path. If Xt = Yt a.s. for all t ∈ R≥0 and X· is of class DL then so is Y·. See Corollary 1.25.

7.12 Lemma. Let X· be an F·-submartingale with almost sure right continuous sample path. If Xt ≥ 0 a.s. for all t ∈ R≥0 or X· is an F·-martingale then X· is of class DL.

Proof. Due to the right continuity, Xt ≥ 0 for all t ∈ R≥0 a.s. Let σ be an F·-optional time and T ∈ R>0. It follows that Xσ∧T ≥ 0 a.s. Since Xσ∧T is (F· ∨ Null(P))_{(σ∧T)+}-measurable, we see by Theorem 3.9 that

λP(Xσ∧T > λ) ≤ E[Xσ∧T ; Xσ∧T > λ] ≤ E[XT ; Xσ∧T > λ] ≤ E[XT].

Thus we get the claim by Corollary 1.19. Indeed if δ ∈ R>0 and E[XT] ≤ δλ then

E[Xσ∧T ; Xσ∧T > λ] ≤ sup_{B∈F:P(B)≤δ} E[XT ; B] for all σ ∈ Time(F·).

The same reasoning also applies to the martingale case. Indeed |X·| is a submartingale.

7.13 Corollary. Let M· be an F·-local martingale with almost sure right continuous sample path. Then M· is an F·-martingale if and only if it is of class DL.

7.14 Lemma. Let X· be an F·-local submartingale with almost sure right continuous path. If Xt ≥ 0 a.s. for all t ∈ R≥0 and there exist an F·-reducing sequence S(·) and p ∈ R>1 such that sup_{n∈N} E[Xt∧S(n)^p] < +∞ for all t ∈ R>0 then X·^p is of class DL.

Proof. Apply Lemma 6.2 to the F·-submartingales X·∧S(n).

7.15 Lemma. Suppose that M· is an F·-local martingale such that almost every sample path is right continuous and τ ∈ Time(F· ∨ Null(P)).
(i) If τ < +∞ a.s. and the family {Mσ∧τ 1_{τ>0} ; σ ∈ Time(F·)} is uniformly integrable then

Mσ∧τ 1_{τ>0} ∈ E[Mτ 1_{τ>0}|(F· ∨ Null(P))_{σ+}] for all σ ∈ Time(F· ∨ Null(P)).

(ii) If the family {Mσ∧t∧τ 1_{τ>0} ; σ ∈ Time(F·)} is uniformly integrable for all t ∈ R≥0 then t ↦ Mt∧τ 1_{τ>0} is a G· ∨ Null(P)-martingale where G0 := F0+ and Gt := Ft for t ∈ R>0.


Proof. We select and fix an F·-reducing sequence S(·) for M·. Then, since (F· ∨ N)_{0+} = F_{0+} ∨ N by Lemma 4.8(i), where we write N := Null(P),

M·∧S(n) 1_{S(n)>0} is a G· ∨ N-martingale for each n ∈ N

according to Lemma 7.4. Observe that Time(F· ∨ N) = Time(G· ∨ N). Let σ ∈ Time(F· ∨ N). Then we have that

M_{(σ∧n∧τ)∧S(n)} 1_{S(n)>0} ∈ E[M_{(n∧τ)∧S(n)} 1_{S(n)>0}|(G· ∨ N)_{σ+}]

by invoking Corollary 3.17(i). Since {τ > 0} ∈ (F· ∨ N)_{0+} = (G· ∨ N)_0, we see that

M_{(σ∧n∧τ)∧S(n)} 1_{S(n)>0} 1_{τ>0} ∈ E[M_{(n∧τ)∧S(n)} 1_{S(n)>0} 1_{τ>0}|(G· ∨ N)_{σ+}].

Observe that n∧S(n) ∈ Time(F· ∨ N). It follows that M_{n∧S(n)∧τ} 1_{τ>0} is uniformly integrable and hence so is M_{n∧S(n)∧τ} 1_{S(n)>0} 1_{τ>0}. Moreover M_{σ∧n∧S(n)∧τ} 1_{S(n)>0} 1_{τ>0} is also uniformly integrable by Theorem 1.23. Therefore, tending n to ∞, we reach that

Mσ∧τ 1_{τ>0} ∈ E[Mτ 1_{τ>0}|(G· ∨ N)_{σ+}].

Note that (G· ∨ N)_{σ+} = (F· ∨ N)_{σ+}.

7.16 Definition. An F·-increasing process is an F·-adapted process whose almost everysample path is finite valued, right continuous and non-decreasing and takes the value 0 at 0.

7.17 Theorem. Suppose that M· is an F·-local martingale with almost sure right continuous sample path, A· is an F·-increasing process such that t ↦ |Mt|² − At is an F·-local martingale, and τ ∈ Time(F· ∨ Null(P)). Set G0 := F0+ and Gt := Ft for t ∈ R>0.
(i) E[sup{|Ms|² ; s ∈ R≥0, s ≤ τ} ; B] ≤ 4E[|M0|² ; B] + 4E[Aτ ; B] for all B ∈ F0+.
(ii) If E[At∧τ] < +∞ for all t ∈ R≥0 and E[|M0|² ; τ > 0] < +∞ then t ↦ Mt∧τ 1_{τ>0} and t ↦ |Mt∧τ|² 1_{τ>0} − At∧τ are G· ∨ Null(P)-martingales.
(iii) If τ < +∞ a.s., E[Aτ] < +∞ and E[|M0|² ; τ > 0] < +∞ then E[|Mτ| ; τ > 0] < +∞, E[Mτ ; τ > 0] = E[M0 ; τ > 0] and E[|Mτ|² ; τ > 0] = E[|M0|² ; τ > 0] + E[Aτ].

Proof. We select and fix F·-reducing sequences S(·) and T(·) for M· and for |M·|² − A· respectively. We shall write N := Null(P) as usual. Then, according to Lemma 7.4,

M·∧S(n) 1_{S(n)>0} as well as |M·∧T(n)|² 1_{T(n)>0} − A·∧T(n) are G· ∨ N-martingales.

Note that Time(F· ∨ N) = Time(G· ∨ N). Let σ be a bounded F· ∨ N-optional time and n ∈ N. Applying Lemma 6.2 to the G· ∨ N-submartingale |M·∧S(n) 1_{S(n)>0}|, we see that

E[sup_{s≤σ∧S(n)} |Ms|² ; S(n) > 0, B] ≤ 4E[|Mσ∧S(n)|² ; S(n) > 0, B] for all B ∈ F0+.

On the other hand |Mσ∧T(n)|² 1_{T(n)>0} − Aσ∧T(n) is integrable and

E[|Mσ∧T(n)|² − Aσ∧T(n) ; T(n) > 0, C] = E[|M0|² ; T(n) > 0, C] for all C ∈ F0+

by Theorem 3.9. The function Aσ∧T(n) is F-measurable by Lemma 3.8 and, almost every sample path being non-decreasing, Aσ∧T(n) ≥ 0 a.s. Consequently we have that

E[|Mσ∧T(n)|² ; T(n) > 0, C] = E[|Mσ∧T(n)|² − Aσ∧T(n) ; T(n) > 0, C] + E[Aσ∧T(n) ; C]
= E[|M0|² ; T(n) > 0, C] + E[Aσ∧T(n) ; C] for all C ∈ F0+.


Combining the two relations above, we infer that

E[sup{|Ms|² ; s ∈ R≥0, s ≤ τ∧n∧T(n)∧S(n)} ; S(n) > 0, T(n) > 0, B]
≤ 4E[|M0|² ; T(n) > 0, S(n) > 0, B] + 4E[A_{τ∧n∧S(n)∧T(n)} ; B].

Almost every sample path of A· is non-decreasing. Hence, tending n to ∞, we get (i) by the monotone convergence theorem. Now suppose that E[|M0|² ; τ > 0] < +∞. According to (i), if E[At∧τ] < +∞ for all t ∈ R≥0 then {|Mσ∧t∧τ|² 1_{τ>0} ; σ ∈ Time(F·)} is uniformly integrable for all t ∈ R≥0, while if E[Aτ] < +∞ then {|Mσ∧τ|² 1_{τ>0} ; σ ∈ Time(F·)} is uniformly integrable. Therefore we get (ii) and (iii) with the help of Lemma 7.15.


8 Quadratic variation

Let (Ω,F,P) be a complete probability space and F· be a filtration of sub σ-fields. The index set is R≥0. We slightly extend the notion of martingales.

8.1 Definition. A stochastic process M· is called a martingale after time t0 if Mt is integrable for every t ≥ t0 and Ms ∈ E[Mt|Fs] whenever the time parameters satisfy t0 ≤ s < t. (Note that M· is adapted after time t0, i.e., Mt is Ft-measurable for every t ≥ t0.)

8.2 Lemma. Suppose that M· is an F·-martingale after t(1) ∈ R≥0. If ψ is an F_{t(1)+}-measurable random variable and E[|Mtψ|] < +∞ for all t ∈ R≥t(1) then t ↦ (Mt − Mt∧t(1))ψ and t ↦ (Mt∧t(2) − Mt∧t(1))ψ are F·-martingales, where t(2) ∈ R>t(1).

Proof. Since both of the processes vanish for t ∈ R[0,t(1)], they are adapted. Let 0 ≤ s ≤ t. Then s∨t(1) ≤ t∨t(1) and ψ is F_{s∨t(1)}-measurable. Due to the martingale property of M· we see by Lemma 1.15 that M_{s∨t(1)}ψ ∈ E[M_{t∨t(1)}ψ|F_{s∨t(1)}]. It therefore follows that

E[M_{s∨t(1)}ψ ; A] = E[M_{t∨t(1)}ψ ; A] for all A ∈ Fs.

Subtracting E[M_{t(1)}ψ ; A] we get

E[(M_{s∨t(1)} − M_{t(1)})ψ ; A] = E[(M_{t∨t(1)} − M_{t(1)})ψ ; A] for all A ∈ Fs.

We see that M_{t∨t(1)} − M_{t(1)} coincides with Mt − Mt∧t(1) for all t ∈ R≥0. Consequently t ↦ (Mt − Mt∧t(1))ψ is a martingale. Finally we observe that (Mt∧t(2) − Mt∧t(1))ψ equals the difference (Mt − Mt∧t(1))ψ − (Mt − Mt∧t(2))ψ.

8.3 Remark. In particular, by choosing ψ = 1, we infer that t ↦ Mt − Mt∧t(1) is a martingale. Thus if t0 ≤ t(1) and M· is an F·-martingale after t0 then so is t ↦ Mt∧t(1).

8.4 Corollary. Suppose that M· is an F·-martingale after t(1) ∈ R≥0. If ψ· is a process F·-adapted after t(1), E[|ψsMt|] < +∞ for all s, t ∈ R≥t(1) and t(2) ∈ R>t(1) ∪ {∞}, then t ↦ Mt(ψt∧t(2) − ψt∧t(1)) − (Mt∧t(2)ψt∧t(2) − Mt∧t(1)ψt∧t(1)) is an F·-martingale.

Proof. Observe that (Mt − Mt∧t(1))ψt∧t(1) = (Mt − Mt∧t(1))ψ_{t(1)} and (Mt − Mt∧t(2))ψt∧t(2) = (Mt − Mt∧t(2))ψ_{t(2)} for all t ∈ R≥0. Therefore the process in question coincides with the difference of t ↦ (Mt − Mt∧t(2))ψ_{t(2)} and t ↦ (Mt − Mt∧t(1))ψ_{t(1)}, both of which are martingales according to Lemma 8.2.

8.5 Definition. An F·-martingale M· is said to be square integrable if E[|Mt|²] < +∞ for all t ∈ R≥0.

8.6 Example. Suppose that M· and N· are square integrable F·-martingales after s(1) ∈ R≥0 respectively t(1) ∈ R≥0. Given a bounded F_{s(1)∨t(1)}-measurable function ψ, s(2) ∈ R>s(1) and t(2) ∈ R>t(1). If s(1) < t(2) and t(1) < s(2) then the following is an F·-martingale:

t ↦ (Mt∧s(2) − Mt∧s(1))(Nt∧t(2) − Nt∧t(1))ψ − (Mt∧s(2)∧t(2) Nt∧s(2)∧t(2) − Mt∧(s(1)∨t(1)) Nt∧(s(1)∨t(1)))ψ.

Otherwise t ↦ (Mt∧s(2) − Mt∧s(1))(Nt∧t(2) − Nt∧t(1))ψ is an F·-martingale.


Proof. We first discuss the case t(2) ≤ s(1). We then have that

(Mt∧s(2) − Mt∧s(1))(Nt∧t(2) − Nt∧t(1))ψ = (Mt∧s(2) − Mt∧s(1))(N_{t(2)} − N_{t(1)})ψ for all t ∈ R≥0.

Since (N_{t(2)} − N_{t(1)})ψ is F_{s(1)}-measurable, it therefore follows by Lemma 8.2 that t ↦ (Mt∧s(2) − Mt∧s(1))(Nt∧t(2) − Nt∧t(1))ψ is an F·-martingale. The same reasoning works for the case s(2) ≤ t(1). It remains to discuss the case where both s(1) < t(2) and t(1) < s(2) hold. So we have that t(1)∨s(1) < t(2)∧s(2). We write as follows:

Mt∧s(2) − Mt∧s(1) = (Mt∧s(2) − Mt∧u(2)) + (Mt∧u(2) − Mt∧u(1)) + (Mt∧u(1) − Mt∧s(1)),
Nt∧t(2) − Nt∧t(1) = (Nt∧t(2) − Nt∧u(2)) + (Nt∧u(2) − Nt∧u(1)) + (Nt∧u(1) − Nt∧t(1)),

where u(1) := s(1)∨t(1) and u(2) := s(2)∧t(2). Since either s(2) = u(2) or t(2) = u(2), the cross term (Mt∧s(2) − Mt∧u(2))(Nt∧t(2) − Nt∧u(2)) never contributes. Neither does the cross term of Mt∧u(1) − Mt∧s(1) and Nt∧u(1) − Nt∧t(1). On the other hand all of the processes

t ↦ (Mt∧s(2) − Mt∧u(2))(Nt∧u(2) − Nt∧t(1))ψ,
t ↦ (Mt∧u(2) − Mt∧s(1))(Nt∧t(2) − Nt∧u(2))ψ,
t ↦ (Mt∧u(2) − Mt∧u(1))(Nt∧u(1) − Nt∧t(1))ψ,
t ↦ (Mt∧u(1) − Mt∧s(1))(Nt∧u(2) − Nt∧u(1))ψ

are F·-martingales by the already established part of the present claim. Consequently all we have to verify is the martingale property of

t ↦ (Mt∧u(2) − Mt∧u(1))(Nt∧u(2) − Nt∧u(1))ψ − (Mt∧u(2)Nt∧u(2) − Mt∧u(1)Nt∧u(1))ψ.

The process in question is the difference of

t ↦ Mt∧u(2)(Nt∧u(2)ψ − Nt∧u(1)ψ) − (M_{(t∧u(2))∧u(2)}Nt∧u(2)ψ − M_{(t∧u(1))∧u(2)}Nt∧u(1)ψ)

and t ↦ (Nt∧u(2) − Nt∧u(1))M_{u(1)}ψ. The former is an F·-martingale by Corollary 8.4 while the latter is an F·-martingale by Lemma 8.2.

8.7 Remark. Suppose that s(1) < s(2) and t(1) < t(2). Then we have that

(s(1), s(2)] ∩ (t(1), t(2)] = (s(1)∨t(1), s(2)∧t(2)] if s(1) < t(2) and t(1) < s(2), and = ∅ otherwise.

Suppose that M and N are square integrable F·-martingales after s(1) respectively t(1), and φ and ψ are bounded F_{s(1)}- respectively F_{t(1)}-measurable functions. If t(2) ≤ s(1) or s(2) ≤ t(1) then the random variables (M_{s(2)} − M_{s(1)})φ and (N_{t(2)} − N_{t(1)})ψ are orthogonal.

Suppose that M is a square integrable F·-martingale. Given φ : Z≥0 → R such that φ(0) = 0, φ(i) < φ(i+1) for all i ∈ Z≥0 and sup_{i∈Z≥0} φ(i) = +∞, we write mesh(φ) := sup_{i∈Z≥0}(φ(i+1) − φ(i)) and

Qφ_t := ∑_{i∈Z≥0} |Mt∧φ(i+1) − Mt∧φ(i)|²  (= ∑_{i:φ(i)<t} |Mt∧φ(i+1) − Mt∧φ(i)|²).
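Before developing the theory it may help to see Qφ_t numerically. The following Python sketch is not from the note; it assumes M is a simulated Brownian motion and takes dyadic partitions φ(i) = i/2^n. The sums stabilize as mesh(φ) → 0; the limit process identified in Theorem 8.15 below is the quadratic variation, here A_t = t.

# Q^phi_t for Brownian motion along dyadic partitions: as mesh(phi) -> 0 the
# sums of squared increments approach t (here t = 1); an assumed illustration.
import numpy as np

rng = np.random.default_rng(2)
n_fine = 16                                       # simulate on the 2^-16 grid
dW = rng.normal(0.0, 2.0 ** (-n_fine / 2), size=2 ** n_fine)
M = np.concatenate([[0.0], np.cumsum(dW)])        # M_{i/2^16}, 0 <= i <= 2^16
for n in [4, 8, 12, 16]:                          # coarser dyadic partitions
    incs = np.diff(M[:: 2 ** (n_fine - n)])       # increments over mesh 2^-n
    print(f"mesh = 2^-{n:<2d}  Q^phi_1 = {np.sum(incs ** 2):.4f}")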


8.8 Lemma. The process t ↦ |Mt|² − Qφ_t is an F·-martingale.

Proof. Since |Mt|² = |M0|² + ∑_{i∈Z≥0}(|Mt∧φ(i+1)|² − |Mt∧φ(i)|²) and, by Example 8.6 applied with N = M and ψ = 1, each t ↦ |Mt∧φ(i+1) − Mt∧φ(i)|² − (|Mt∧φ(i+1)|² − |Mt∧φ(i)|²) is an F·-martingale, summation over i gives the claim.

8.9 Lemma. If M is an F·-martingale and E[|Mt|⁴] < +∞ for all t ∈ R≥0 then

E[|Qφ_t|²] ≤ ∑_{i:φ(i)<t} 2E[(|Mt|² + |Mt∧φ(i)|²)|Mt∧φ(i+1) − Mt∧φ(i)|²].

If there exists K ∈ R>0 such that |Mt| ≤ K a.s. for all t ∈ R≥0 then E[|Qφ_t|²] ≤ 4K⁴.

Proof. Since ∑_{j:j≥i+1,φ(j)<t} |Mt∧φ(j+1) − Mt∧φ(j)|² = Qt − Qt∧φ(i+1), we have that

|Qt|² = ∑_{i:φ(i)<t} |Mt∧φ(i+1) − Mt∧φ(i)|⁴ + ∑_{i:φ(i)<t} 2|Mt∧φ(i+1) − Mt∧φ(i)|²(Qt − Qt∧φ(i+1)).

We infer by Lemma 8.8 and Lemma 1.15 that

E[|Mt∧φ(i+1) − Mt∧φ(i)|²(Qt − Qt∧φ(i+1))] = E[|Mt∧φ(i+1) − Mt∧φ(i)|²(|Mt|² − |Mt∧φ(i+1)|²)].

Observe that |Mt∧φ(i+1) − Mt∧φ(i)|² + 2(|Mt|² − |Mt∧φ(i+1)|²) ≤ 2|Mt|² + 2|Mt∧φ(i)|². This implies the desired estimate. If |Mt| ≤ K a.s. for all t ∈ R≥0 then

∑_{i:φ(i)<t} E[(|Mt|² + |Mt∧φ(i)|²)|Mt∧φ(i+1) − Mt∧φ(i)|²] ≤ ∑_{i:φ(i)<t} E[2K²|Mt∧φ(i+1) − Mt∧φ(i)|²].

The right hand side equals 2K²E[|Mt|² − |M0|²], which is dominated by 2K⁴.

Given δ ∈ R>0 and f : I → R where I is an interval of R, we write

amp(δ, f) := sup_{s,t∈I:|s−t|≤δ} |f(s) − f(t)|.

8.10 Lemma. If ∆ is a locally finite refinement of {(φ(i), φ(i+1)] ; i ∈ Z≥0} then

∑_{J∈∆:inf J<t} |Qφ_{t∧sup J} − Qφ_{t∧inf J}|² ≤ 4(amp(mesh(φ), M|[0,t]))² ∑_{J∈∆:inf J<t} |Mt∧sup J − Mt∧inf J|².

Proof. Suppose that i ∈ Z≥0, φ(i) < t, J ∈ ∆ and J ⊂ (φ(i), φ(i+1)]. Then

Qφ_{t∧sup J} − Qφ_{t∧inf J} = |Mt∧sup J − Mt∧φ(i)|² − |Mt∧inf J − Mt∧φ(i)|²
= (Mt∧sup J − Mt∧inf J)(Mt∧sup J + Mt∧inf J − 2Mt∧φ(i)).

It follows that ∑_{J∈∆:inf J<t} |Qφ_{t∧sup J} − Qφ_{t∧inf J}|² equals

∑_{i:φ(i)<t} ∑_{J∈∆:J⊂(φ(i),φ(i+1)]} |Mt∧sup J − Mt∧inf J|² |Mt∧sup J + Mt∧inf J − 2Mt∧φ(i)|².

The second factor in the summand is dominated by (2 amp(mesh(φ), M|[0,t]))².


8.11 Lemma. Suppose that M is an F·-martingale, its almost every sample path is right continuous and there exists K ∈ R>0 such that |Mt| ≤ K a.s. for all t ∈ R≥0, and that ψ is another locally finite partition of R≥0. If ε, δ ∈ R>0, mesh(φ) ≤ δ and mesh(ψ) ≤ δ then

P(sup_{s∈R:0≤s≤t} |Qφ_s − Qψ_s| ≥ ε) ≤ (32K²/ε²) √(E[amp(δ; M|[0,t])⁴]).

Proof. Being the difference of t ↦ |Mt|² − Qφ_t and t ↦ |Mt|² − Qψ_t, the process t ↦ Qφ_t − Qψ_t is an F·-martingale by Lemma 8.8. Moreover it is square integrable. It then follows by Example 6.4 that

P(sup_{s∈R:0≤s≤t} |Qφ_s − Qψ_s| ≥ ε) ≤ E[|Qφ_t − Qψ_t|²]/ε².

Choose a common refinement ∆ of {(φ(i), φ(i+1)] ; i ∈ Z≥0} and {(ψ(i), ψ(i+1)] ; i ∈ Z≥0}, for instance the minimal one. We see by Example 8.6 that

E[|Qφ_t − Qψ_t|²] = E[∑_{J∈∆:inf J<t} |Qφ_{t∧sup J} − Qψ_{t∧sup J} − Qφ_{t∧inf J} + Qψ_{t∧inf J}|²].

The summand is dominated by 2|Qφ_{t∧sup J} − Qφ_{t∧inf J}|² + 2|Qψ_{t∧sup J} − Qψ_{t∧inf J}|². We thus get

E[|Qφ_t − Qψ_t|²] ≤ E[(2+2) · 4 amp(δ; M|[0,t])² Q∆_t] ≤ 16√(E[amp(δ; M|[0,t])⁴] E[|Q∆_t|²])

by Lemma 8.10 and Schwarz' inequality. Finally we apply Lemma 8.9.

8.12 Lemma. Suppose that M· is an F·-martingale, its almost every sample path is continuous and there exists K ∈ R>0 such that |Mt| ≤ K a.s. for all t ∈ R≥0. Then there exists an F·-adapted process A· such that its almost every sample path is continuous and, as mesh(φ) tends to 0, sup_{s∈R:0≤s≤t} |Qφ_s − As| converges to 0 in probability for all t ∈ R≥0. A posteriori almost every sample path of A· is nondecreasing and t ↦ |Mt|² − At is an F·-martingale.

Proof. Let ε ∈ R>0 and t ∈ R≥0. We denote by µ the probability measure on C(R≥0,R) induced by M·. Due to the compact inner regularity of probability measures on Polish spaces, there exists a compact subset Λ of C(R≥0,R) such that µ(Λ) > 1 − ε, that is, P(M· ∉ Λ) < ε. Invoking the Arzelà-Ascoli theorem, we can find δ ∈ R>0 such that amp(δ; w|[0,t]) ≤ ε for all w ∈ Λ. On the other hand amp(δ; M|[0,t]) ≤ 2K a.s. It then follows that

E[amp(δ; M|[0,t])⁴] = E[··· ; M· ∈ Λ] + E[··· ; M· ∉ Λ] ≤ ε⁴P(M· ∈ Λ) + (2K)⁴P(M· ∉ Λ) ≤ ε⁴ + (2K)⁴ε.

Consequently we get lim sup_{δ↓0} E[amp(δ; M|[0,t])⁴] = 0 for all t ∈ R≥0 and hence

(?) lim sup_{δ↓0} sup_{φ,ψ:mesh(φ),mesh(ψ)<δ} P(sup_{s∈R:0≤s≤t} |Qφ_s − Qψ_s| ≥ ε) = 0 for all ε ∈ R>0, t ∈ R≥0

by Lemma 8.11. We consider the particular sequence of partitions ψ(n, i) := i/2ⁿ. We can find φ ∈ Seq(N, ↑) such that

P(sup_{s∈R:0≤s≤T} |Qψ(m,·)_s − Qψ(n,·)_s| ≥ 1/2^T) ≤ 1/2^T for all T ∈ N and m, n ∈ N≥φ(T).

The Borel-Cantelli lemma shows that there exists Ω0 ∈ F such that P(Ω0) = 1 and


t ↦ Mt(ω) is continuous and sup_{s:s≤n} |Qψ(φ(n+1),·)_s(ω) − Qψ(φ(n),·)_s(ω)| < 1/2ⁿ except for finitely many n ∈ N

for all ω ∈ Ω0. We set

At(ω) := lim inf_{n→∞} Qψ(φ(n),·)_t(ω) for t ∈ R≥0 and ω ∈ Ω.

The process t ↦ At is F·-adapted. Let ω ∈ Ω0 and T ∈ N. Then, since

∑_{k=n}^∞ sup_{s:s≤T} |Qψ(φ(k+1),·)_s(ω) − Qψ(φ(k),·)_s(ω)| ≤ ∑_{k=n}^∞ sup_{s:s≤k} |···| ≤ 1/2^{n−1}

for all n ∈ N≥T, it follows that

At(ω) − 1/2^{n−1} ≤ Qψ(φ(n),·)_t(ω) ≤ At(ω) + 1/2^{n−1} for t ∈ R[0,T] and n ∈ N≥T.

Therefore the continuity of t ↦ At(ω) derives from that of t ↦ Qψ(n,·)_t(ω). We verify that A·(ω) is nondecreasing. Let s, t ∈ Q(2) := ⋃_{n∈N} 2⁻ⁿZ≥0 and ω ∈ Ω0. Then s, t ∈ 2⁻ᵐZ≥0 for some m ∈ N. Since both s and t are partition points of ψ(n,·) for n ∈ N≥m, we see that Qψ(φ(n),·)_s(ω) ≤ Qψ(φ(n),·)_t(ω) provided s < t and φ(n) ≥ m. It therefore follows that

As(ω) ≤ At(ω) for all s, t ∈ Q(2) with s < t and ω ∈ Ω0.

The continuity of t ↦ At(ω) for ω ∈ Ω0 implies that t ↦ At(ω) is nondecreasing for ω ∈ Ω0. Let φ run through locally finite partitions of R≥0. We see by (?) that

lim sup_{δ↓0} sup_{φ:mesh(φ)<δ} P(sup_{s∈R:0≤s≤t} |Qφ_s − As| ≥ ε) = 0 for all ε ∈ R>0, t ∈ R≥0.

Finally we see by Lemma 8.9 that for each t ∈ R≥0 the sequence of random variables Qψ(n,·)_t is uniformly integrable. Consequently the martingale property of t ↦ |Mt|² − At derives from that of t ↦ |Mt|² − Qψ(n,·)_t with the help of Lemma 8.8.

8.13 Lemma. Suppose that w : R≥0 → R is continuous and

σ : R → R≥0 ∪ {+∞}, k ↦ inf{s ∈ R≥0 : w(s) > k}.

(i) Let t ∈ R>0 ∪ {+∞}. Then sup{w(s) ; s ∈ R≥0, s ≤ t} ≤ k if and only if t ≤ σ(k).
(ii) σ(·) is nondecreasing and sup_{k∈R} σ(k) = +∞.
(iii) Suppose that w is nondecreasing in addition and w(+∞) := sup{w(s) ; s ∈ R≥0}. Then w(t∧σ(k)) = w(t)∧k for t ∈ R≥0 ∪ {+∞} and k ∈ R with k ≥ w(0).

8.14 Corollary. (i) If M· is an F·-local martingale and its almost every sample path is continuous then τ(K) := inf{t ∈ R≥0 : |Mt| > K} forms a reducing sequence.
(ii) If M· is an F·-local martingale with almost sure right continuous sample path, A· is an F·-increasing process with almost sure continuous sample path and t ↦ |Mt|² − At is an F·-local martingale then τ(K) := inf{t ∈ R≥0 : |At| > K} 1_{|M0|≤K} forms a common reducing sequence for M· and |M·|² − A·.


Proof. (i) Let K ∈ R>0. Due to the sample path continuity, τ(K) is an F· ∨ Null(P)-optional time by Example 3.4. Since |Mt∧τ(K) 1_{τ(K)>0}| ≤ K for all t ∈ R≥0 a.s. by Lemma 8.13(i), it follows by Lemma 7.15 and Lemma 7.4 that t ↦ Mt∧τ(K) − M0 is an F· ∨ Null(P)-martingale.

(ii) By the same reasoning as above, we see that τ(K) is an F· ∨ Null(P)-optional time and At∧τ(K) ≤ K for all t ∈ R≥0 a.s. Since E[|M0|² ; τ(K) > 0] ≤ K², invoking Theorem 7.17(ii), we infer that both t ↦ Mt∧τ(K) − M0 and t ↦ |Mt∧τ(K)|² − At∧τ(K) − |M0|² are F· ∨ Null(P)-martingales.

8.15 Theorem. Suppose that M· is an F·-local martingale with almost sure continuous sample path. Then there exists an F·-increasing process A· such that its almost every sample path is continuous and sup_{s∈R:0≤s≤t} |Qφ_s − As| converges to 0 in probability as mesh(φ) tends to 0 for all t ∈ R≥0. A posteriori the process t ↦ |Mt|² − At is an F·-local martingale. If M· is a square integrable F·-martingale then t ↦ |Mt|² − At is an F·-martingale.

Proof. The following form a reducing sequence for the F·-local martingale M·:

τ(K) := inf{t ∈ R≥0 : |Mt| > K}, K ∈ R>0,

that is, τ(K) ≤ τ(K+1), sup_{K∈N} τ(K) = +∞ a.s. and

M·∧τ(K) 1_{τ(K)>0} is a G· ∨ Null(P)-martingale,

where G0 := F0+ and Gt := Ft for t ∈ R>0, by Corollary 8.14(i). Its proof shows that

|Mt∧τ(K) 1_{τ(K)>0}| ≤ K for all t ∈ R≥0 a.s.

Let φ and ψ run through locally finite partitions of R≥0. Observe that

Qφ_{t∧τ(K)} = ∑_i |Mt∧φ(i+1)∧τ(K) 1_{τ(K)>0} − Mt∧φ(i)∧τ(K) 1_{τ(K)>0}|² and

P(sup_{s≤t} |Qφ_s − Qψ_s| ≥ ε) ≤ P(sup_{s≤t} |Qφ_{s∧τ(K)} − Qψ_{s∧τ(K)}| ≥ ε) + P(τ(K) < t).

The proof of Lemma 8.12 shows that the first term on the right hand side of the second line converges to 0 as mesh(φ) and mesh(ψ) tend to 0. Therefore it follows that

lim sup_{δ↓0} sup_{φ,ψ:mesh(φ),mesh(ψ)<δ} P(sup_{s≤t} |Qφ_s − Qψ_s| ≥ ε) ≤ P(τ(K) < t) for all K ∈ R>0.

Since P(τ(K) < t) converges to 0 as K tends to +∞, we thus get

lim sup_{δ↓0} sup_{φ,ψ:mesh(φ),mesh(ψ)<δ} P(sup_{s≤t} |Qφ_s − Qψ_s| ≥ ε) = 0 for all ε ∈ R>0, t ∈ R≥0.

The rest of the argument for drawing the existence of the A· process is the same as in the proof of Lemma 8.12. It remains to show the local martingale property of |M·|² − A·. We see that

lim sup_{δ↓0} sup_{φ:mesh(φ)<δ} P(sup_{s≤t} |Qφ_{s∧τ(K)} − As∧τ(K)| ≥ ε) = 0 for all ε ∈ R>0, t ∈ R≥0.

It therefore follows by Lemma 8.12 that

t ↦ |Mt∧τ(K) 1_{τ(K)>0}|² − At∧τ(K) is a G· ∨ Null(P)-martingale for each K ∈ R>0.


Thus K ↦ τ(K) forms a reducing sequence for t ↦ |Mt|² − At in view of Lemma 7.4. Finally suppose that M· is a square integrable F·-martingale. By choosing a sequence ψ(n) of locally finite partitions, we can realize an almost sure convergence At = lim inf_{n→∞} Qψ(n)_t. Since t ↦ |Mt|² − Qψ(n)_t is an F·-martingale by Lemma 8.8, Fatou's lemma shows that

E[At] ≤ lim inf_{n→∞} E[Qψ(n)_t] = E[|Mt|²] − E[|M0|²] < +∞.

Thus t ↦ |Mt|² − At is an F·-martingale by Theorem 7.17(ii). Indeed it is F·-adapted.

Given f : R≥0 → R and a left open and right closed interval I of R≥0, we write

var[df ; I] := sup{∑_{J∈∆} |f(sup J) − f(inf J)| ; ∆ finite partition of I}.

8.16 Lemma. If f : R≥0 → R is of finite variation and w : R≥0 → R is continuous then t ↦ ∑_{i:φ(i)<t} {f(t∧φ(i+1)) − f(t∧φ(i))}{w(t∧φ(i+1)) − w(t∧φ(i))} converges to 0 locally uniformly as mesh(φ) tends to 0.

Proof. The sum ∑_{i:φ(i)<s} ··· is dominated by var(df, (0,s]) amp(mesh(φ), w|[0,s]).

8.17 Corollary. Let M· be an F·-local martingale. If its almost every sample path is continuous and of finite variation then Mt = M0 for all t ∈ R≥0 a.s.

Proof. We may assume that M0 = 0 a.s. Theorem 8.15 and Lemma 8.16 show that t ↦ |Mt|² is an F·-local martingale. Fix t ∈ R≥0. Let τ(K) be as in the proof of Theorem 8.15. It follows that E[|Mt∧τ(K)|²] = 0. Invoking Fatou's lemma, we see that

E[|Mt|²] = E[lim inf_{K→∞} |Mt∧τ(K)|²] ≤ lim inf_{K→∞} E[|Mt∧τ(K)|²] = 0,

which implies Mt = 0 a.s. We reach the conclusion due to the sample path continuity.

8.18 Corollary. Suppose that M· is an F·-local martingale whose almost every sample path is continuous, B· is a stochastic process whose almost every sample path is continuous and of finite variation, and Xt := Mt + Bt. Then there exists an F·-adapted process A· such that its almost every sample path is continuous and, as mesh(φ) tends to 0,

sup_{s∈R:0≤s≤t} |∑_{i:φ(i)<s} |Xs∧φ(i+1) − Xs∧φ(i)|² − As| converges to 0 in probability for all t ∈ R≥0.

A posteriori almost every sample path of A· is nondecreasing and the process t ↦ |Mt|² − At is an F·-local martingale.

Proof. The sum ∑_{i:φ(i)<s} |Xs∧φ(i+1) − Xs∧φ(i)|² equals

Qφ_s + 2∑_{i:φ(i)<s} (Ms∧φ(i+1) − Ms∧φ(i))(Bs∧φ(i+1) − Bs∧φ(i)) + ∑_{i:φ(i)<s} |Bs∧φ(i+1) − Bs∧φ(i)|².

The second term and the third term converge to 0 by Lemma 8.16. As for the first term we apply Theorem 8.15.
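Numerically, the finite variation part is invisible in the squared-increment sums, exactly as Corollary 8.18 predicts. A Python sketch under assumed data (X = M + B with M a simulated Brownian motion and B_t = sin t):

# Squared-increment sums of X = M + B (B of finite variation) converge to the
# quadratic variation of the martingale part alone; here t = 1.
import numpy as np

rng = np.random.default_rng(3)
n, t = 2 ** 14, 1.0
grid = np.linspace(0.0, t, n + 1)
M = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(t / n), n))])
X = M + np.sin(grid)                      # add a finite variation path
for step in [64, 16, 4, 1]:               # refine the partition
    incs = np.diff(X[::step])
    print(f"mesh = {step * t / n:.2e}   sum of squares = {np.sum(incs**2):.4f}")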


8.19 Definition. (i) An F·-finite variation process A· is an F·-adapted process such that almost every sample path is right continuous and of finite variation.
(ii) An F·-semimartingale is an F·-adapted stochastic process which is indistinguishable from a sum of an F·-local martingale whose almost every sample path is continuous and an F·-finite variation process whose almost every sample path is continuous.
(iii) Given an F·-semimartingale X·, a process which shares the same property as the A· process in Corollary 8.18 is called a quadratic variation of X·.

8.20 Remark. The usage of the term semimartingale deviates from the standard one: an F·-local martingale with almost sure right continuous path plus an F·-finite variation process.

Given an F·-semimartingale X· we write

Mart[X· ;F·] := {M· : F·-local martingale with a.s.-continuous sample path, M0 = 0 a.s., t ↦ Xt − Mt is of finite variation a.s.},
Qvar[X· ;F·] := {A· : F·-quadratic variation of X·}.

8.21 Theorem. Suppose that X·, Y· are F·-semimartingales.
(i) If M·, N· ∈ Mart[X· ;F·] then Mt = Nt for all t ∈ R≥0 a.s. If M· ∈ Mart[X· ;F·], N· is an F·-adapted process and Mt = Nt for all t ∈ R≥0 a.s. then N· ∈ Mart[X· ;F·].
(ii) If Mart[X· ;F· ∨ Null(P)] ∩ Mart[Y· ;F· ∨ Null(P)] ≠ ∅ then t ↦ Xt − Yt is of finite variation a.s. If t ↦ Xt − Yt is of finite variation a.s. then Mart[X· ;F·] = Mart[Y· ;F·].
(iii) Qvar[X· ;F·] ≠ ∅. If A·, B· ∈ Qvar[X· ;F·] then At = Bt for all t ∈ R≥0 a.s. If A· is an F·-adapted process, B· ∈ Qvar[X· ;F·] and At = Bt for all t ∈ R≥0 a.s. then A· ∈ Qvar[X· ;F·].

Proof. (i) The first part follows from Corollary 8.17. (iii) The first part is by Corollary 8.18.The second part is due to the uniqueness of limits in probability.

9 Semimartingale defined by Riemann-Stieltjes integration

Let (Ω,F,P) be a complete probability space and F· be a filtration of sub σ-fields. The index set is R≥0. We recall some facts from Riemann-Stieltjes integration.

9.1 Lemma. All continuous functions are Riemann-Stieltjes integrable with respect to a right (left) continuous integrator of finite variation. All functions of finite variation are Riemann-Stieltjes integrable with respect to a continuous integrator.

Proof. Suppose that a < b, f : R[a,b] → R is of finite variation and w : R[a,b] → R is continuous. Let ∆ be a tagged partition of R[a,b], by which we mean a set such that x ∈ R, J is a left open and right closed interval and inf J ≤ x ≤ sup J for all (x, J) ∈ ∆, and (x, J) ↦ J is a finite partition of R(a,b]. Then we have that

|w(t) − ∑_{(x,J)∈∆} w(x)1J(t)| ≤ sup_{(x,J)∈∆} sup_{s∈J} |w(s) − w(x)| ≤ amp(mesh(∆), w)

for all t ∈ R[a,b]. Suppose that f is right continuous. It follows that

|∫_{(a,b]} w df − ∑_{(x,J)∈∆} w(x){f(sup J) − f(inf J)}| ≤ amp(mesh(∆), w) var(df, R(a,b]).


Therefore the function w is Riemann-Stieltjes integrable with respect to the integrator f and the integral coincides with its counterpart ∫_{(a,b]} w df in measure theory. If f is left continuous then ∫_a^b w df = ∫_{[a,b)} w df. On the other hand

|∑_{(x,J)∈∆} f(inf J + o){w(sup J) − w(inf J)} − ∑_{(x,J)∈∆} f(x){w(sup J) − w(inf J)}|
≤ ∑_{(x,J)∈∆} var(df, J)|w(sup J) − w(inf J)| ≤ var(df, R(a,b]) amp(mesh(∆), w).

Observe that ∑_{(x,J)∈∆} f(inf J + o){w(sup J) − w(inf J)} coincides with

w(b)f_r(b) − w(a)f_r(a) − ∑_{(x,J)∈∆} w(sup J){f_r(sup J) − f_r(inf J)},

where we write f_r(x) := f(x + o) for x ∈ R[a,b]. Thus we infer that

|w(b)f_r(b) − w(a)f_r(a) − ∫_{(a,b]} w df_r − ∑_{(x,J)∈∆} f(x){w(sup J) − w(inf J)}|
≤ 2 var(df, R(a,b]) amp(mesh(∆), w).

Consequently f is Riemann-Stieltjes integrable with respect to the integrator w and

(9.2) ∫_a^b f dw = w(b)f_r(b) − w(a)f_r(a) − ∫_{(a,b]} w df_r = w(b)f_l(b) − w(a)f_l(a) − ∫_{[a,b)} w df_l.

Here f_l(x) := f(x − o) for x ∈ R[a,b] and the second relation is obtained by changing the role of the left and the right. Note that

∫_a^b f dw = ∫_a^b f_r dw = ∫_a^b f_l dw and w(b)f_r(b) − ∫_{(a,b]} w df_r = w(b)f_l(b) − ∫_{(a,b)} w df_r.

The latter implies that {(s,t) : s < t} → R, (s,t) ↦ ∫_s^t f dw is continuous.

Given t ∈ R≥0 and finite valued processes f· and X·, we set

R[fdX]t := lim inf_{n→∞} ∑_{i∈Z:0≤i<2ⁿt} (Xt∧(i+1)/2ⁿ − Xt∧i/2ⁿ) f_{i/2ⁿ}.

9.3 Corollary. Suppose that X· is an F·-adapted process whose almost every sample path is finite valued and continuous and f· is an F·-adapted process whose almost every sample path is of finite variation. Then t ↦ R[fdX]t is an F·-adapted process whose almost every sample path is finite valued and continuous.

Proof. The adaptedness is clear by definition. We select and fix Ω0 ∈ F such that P(Ω0) = 1, X·(ω) is finite valued and continuous and f·(ω) is right continuous and of finite variation for every ω ∈ Ω0. Then R[fdX]t(ω) = ∫_0^t f·(ω) dX·(ω) for all ω ∈ Ω0 and t ∈ R≥0. The function t ↦ ∫_0^t f·(ω) dX·(ω) is continuous for every ω ∈ Ω0 according to the proof of Lemma 9.1.


Martc(F·) stands for the space of all F·-local martingales with almost sure continuous sample path.

9.4 Lemma. Suppose M· ∈ Martc(F·), A· ∈ Qvar[M· ;F·] and f· is an F·-adapted process such that sup_{ω∈Ω} |ft(ω)| < +∞ for all t ∈ R≥0. If φ is a locally finite partition of R≥0 then

I· : t ↦ ∑_{i:φ(i)<t} (Mt∧φ(i+1) − Mt∧φ(i)) fφ(i) is an F·-local martingale

and t ↦ ∑_{i:φ(i)<t} (At∧φ(i+1) − At∧φ(i)) |fφ(i)|² ∈ Qvar[I· ;F·].

Proof. Let S(·) be a reducing sequence for the F·-local martingale M·. It follows that

t ↦ ∑_{i:φ(i)<T} (Mt∧φ(i+1)∧S(k) − Mt∧φ(i)∧S(k)) fφ(i) is an F· ∨ Null(P)-martingale

for each T ∈ R>0 and k ∈ N by Lemma 8.2. The above coincides with

t ↦ ∑_{i:φ(i)<t} (Mt∧S(k)∧φ(i+1) − Mt∧S(k)∧φ(i)) fφ(i)

on the time interval [0, T]. T being arbitrary, this means that the latter process is an F· ∨ Null(P)-martingale for each k ∈ N. On the other hand the process t ↦ It is F·-adapted. Thus we verified that S(·) also serves as a reducing sequence for the F·-local martingale I·. Let ∆ be a refinement of the partition {φ(i) ; i ∈ Z≥0}. Then

∑_{J∈∆:inf J<t} |It∧sup J − It∧inf J|² = ∑_{i:φ(i)<t} ∑_{J∈∆:J⊂(φ(i),φ(i+1)]} |Mt∧sup J − Mt∧inf J|² |fφ(i)|².

According to Theorem 8.15 the term ∑_{J∈∆:J⊂(φ(i),φ(i+1)]} |Mt∧sup J − Mt∧inf J|² converges to At∧φ(i+1) − At∧φ(i) locally uniformly in probability as mesh(∆) tends to 0. Therefore

sup_{s∈R:0≤s≤t} |∑_{J∈∆:inf J<s} |Is∧sup J − Is∧inf J|² − ∑_{i:φ(i)<s} (As∧φ(i+1) − As∧φ(i)) |fφ(i)|²|

converges to 0 in probability as mesh(∆) tends to 0 for all t ∈ R≥0.

9.5 Lemma. Suppose that Mⁿ· is a sequence of Martc(F·). If there exists a process M· such that its almost every sample path is right continuous and sup_{s≤t} |Mⁿ_s − Ms| converges to 0 in probability for all t ∈ R≥0 then t ↦ Mt ∈ Martc(F· ∨ Null(P)).

Proof. By choosing a subsequence we may assume that

lim sup_{n→∞} sup_{s≤t} |Mⁿ_s − Ms| = 0 for all t ∈ R≥0 a.s.

It follows that Mt = lim inf_{n→∞} Mⁿ_t a.s. for all t ∈ R≥0 and hence M· is F· ∨ Null(P)-adapted. Let K ∈ R>0. We introduce the following F· ∨ Null(P)-optional times:

σ(n,K) := inf{t ∈ R≥0 : |Mⁿ_t| > K}, n ∈ N; τ(K) := inf{t ∈ R≥0 : |Mt| > K}.

The former constitutes a reducing sequence for Mⁿ· as an F· ∨ Null(P)-local martingale for each n ∈ N by Corollary 8.14(i). In particular we have by Corollary 3.17(ii) that


t ↦ Mⁿ_{t∧τ(K)∧σ(n,K+1)} − Mⁿ_0 is an F· ∨ Null(P)-martingale.

Since sup_{s≤t∧τ(K)} |Ms| ≤ K on {τ(K) > 0} by Lemma 8.13(i),

sup_{s≤t∧τ(K)} |Mⁿ_s| ≤ sup_{s≤t∧τ(K)} |Ms| + sup_{s≤t∧τ(K)} |Mⁿ_s − Ms| ≤ K + sup_{s≤t} |Mⁿ_s − Ms| on {τ(K) > 0}.

Suppose that ε ∈ R(0,1]. Then, invoking Lemma 8.13(i) again, we infer that

τ(K) > 0 and sup_{s≤t} |Mⁿ_s − Ms| ≤ ε ⇒ t∧τ(K) ≤ σ(n, K+1).

If τ(K) = 0 then t∧τ(K) ≤ σ(n, K+1) trivially holds. Hence

sup_{s≤t} |Mⁿ_s − Ms| ≤ ε ⇒ sup_{s≤t} |Mⁿ_{s∧τ(K)∧σ(n,K+1)} − Ms∧τ(K)| ≤ ε.

Therefore we obtain that

P(sup_{s≤t} |Mⁿ_{s∧τ(K)∧σ(n,K+1)} − Ms∧τ(K)| > ε) ≤ P(sup_{s≤t} |Mⁿ_s − Ms| > ε).

This implies that

Mⁿ_{t∧τ(K)∧σ(n,K+1)} − Mⁿ_0 converges to Mt∧τ(K) − M0 in probability

as n tends to ∞. Taking into account that |Mⁿ_{t∧τ(K)∧σ(n,K+1)} − Mⁿ_0| ≤ 2(K+1) a.s., we conclude that t ↦ Mt∧τ(K) − M0 is an F· ∨ Null(P)-martingale.

9.6 Lemma. Suppose that f : R≥0 → R is locally bounded and admits left hand limits everywhere, and v : R≥0 → R is of finite variation and right continuous. Then

∑_{i:0≤i<2ⁿt} {v(t∧(i+1)/2ⁿ) − v(t∧i/2ⁿ)} f(i/2ⁿ) converges to ∫_{(0,t]} f(· − o) dv for all t ∈ R≥0,

where f(t − o) := lim inf_{s↑t:s∈Q} f(s).

Proof. The function s ↦ f(s − o) is left continuous by Lemma 14.3(ii).

9.7 Corollary. Suppose that f· is an F·+-adapted process whose almost every sample path is locally bounded and admits left hand limits everywhere, and A· is an F·-finite variation process. Then R[fdA]· is an F·-finite variation process. If A· admits almost sure continuous sample path then so does R[fdA]·.

9.8 Lemma. Suppose f : R[a,b] → R is of finite variation and w : R[a,b] → R is continuous.
(i) If g : R[a,b] → R is of finite variation then so are f + g and fg : R[a,b] → R, and

∫_a^b (f + g) dw = ∫_a^b f dw + ∫_a^b g dw and ∫_a^b fg dw = ∫_a^b f dv

where v is the continuous function t ↦ ∫_a^t g dw.
(ii) If v : R[a,b] → R is continuous then

∫_a^b f d(v + w) = ∫_a^b f dv + ∫_a^b f dw and ∫_a^b f d(vw) = ∫_a^b g dw + w(b) ∫_a^b f dv

where g is the function of finite variation t ↦ ∫_{[a,t)} v df(· − o) + f(a − o)v(a).
(iii) If ℓ : R → R is Lipschitz continuous then the composition ℓ ∘ f is of finite variation.


Proof. (i) Since var(d(fg), R(a,b]) ≤ sup_{[a,b]} |f| var(dg, R(a,b]) + sup_{[a,b]} |g| var(df, R(a,b]), the product fg is of finite variation. Suppose that a ≤ s < t ≤ b. We see by (9.2) that

g(s + o)(w(t) − w(s)) − (v(t) − v(s)) = ∫_{(s,t]} (w(t) − w) dg(· + o).

It therefore follows that |g(s + o)(w(t) − w(s)) − (v(t) − v(s))| ≤ amp(t − s, w) var(dg, R(s,t]).

(ii) We repeatedly apply the integration by parts formula (9.2). Taking into account that g(b) + ∫_a^b f dv = f(b − o)v(b) and ∫_{[a,b)} w dg = ∫_{[a,b)} wv df(· − o), we get

∫_a^b g dw + w(b) ∫_a^b f dv = g(b)w(b) − g(a)w(a) − ∫_{[a,b)} w dg + w(b) ∫_a^b f dv
= f(b − o)v(b)w(b) − f(a − o)v(a)w(a) − ∫_{[a,b)} vw df(· − o).

The right hand side equals ∫_a^b f d(vw).

9.9 Lemma. Suppose that M· ∈ Martc(F·) and A· is an F·-adapted process. Then A· ∈ Qvar[M· ;F·] if and only if almost every sample path of A· starts from 0 and is continuous and of finite variation, and t ↦ |Mt|² − At is an F·-local martingale.

Proof. Theorem 8.15 shows that Qvar[M· ;F·] is not void. Choose an element B· from it. Then it is F·-adapted, its almost every sample path starts from 0 and is continuous and of finite variation. Moreover |M·|² − B· is an F·-local martingale by Theorem 8.15. The implication ⇒ is thus established. To prove the converse suppose that A· shares the same property of B· described above. It follows that A· − B· is an F·-local martingale whose almost every sample path starts from 0, and is continuous and of finite variation. With the help of Corollary 8.17 and Theorem 8.21(iii) we infer that A· ∈ Qvar[M· ;F·].

9.10 Theorem. Suppose that M· ∈ Martc(F·), A· ∈ Qvar[M· ;F·] and f· is an F·-adapted process whose almost every sample path is of finite variation. Then the process t ↦ R[fdM]t belongs to Martc(F·) and R[|f|²dA]· ∈ Qvar[R[fdM]· ;F·].

Proof. The sample path continuity of the processes in question is mentioned in Corollary 9.3. We next show the local martingale property of t ↦ R[fdM]t, additionally assuming that sup_{ω∈Ω} |ft(ω)| < +∞ for all t ∈ R≥0. Let n ∈ N. It follows by Lemma 9.4 that

Iⁿ· : t ↦ ∑_{i∈Z:0≤i<2ⁿt} (Mt∧(i+1)/2ⁿ − Mt∧i/2ⁿ) f_{i/2ⁿ}

is an F·-local martingale and

Bⁿ· : t ↦ ∑_{i∈Z:0≤i<2ⁿt} (At∧(i+1)/2ⁿ − At∧i/2ⁿ) |f_{i/2ⁿ}|² ∈ Qvar[Iⁿ· ;F·].

According to Theorem 8.15,

t ↦ |Iⁿ_t|² − Bⁿ_t is an F·-local martingale.


We select and fix Ω0 ∈ F such that P(Ω0) = 1,

f·(ω) is of finite variation, M·(ω) is finite valued and continuous, and A·(ω) is finite valued, nondecreasing and continuous for every ω ∈ Ω0.

It then follows by Lemma 9.1 that

∑_{i∈Z:0≤i<2ⁿt} (Mt∧(i+1)/2ⁿ(ω) − Mt∧i/2ⁿ(ω)) f_{i/2ⁿ}(ω) converges to ∫_0^t f·(ω) dM·(ω),

∑_{i∈Z:0≤i<2ⁿt} (At∧(i+1)/2ⁿ(ω) − At∧i/2ⁿ(ω)) |f_{i/2ⁿ}(ω)|² converges to ∫_0^t |f·(ω)|² dA·(ω)

locally uniformly on R≥0 for each ω ∈ Ω0. Invoking Lemma 9.5, we infer that

R[fdM]· and |R[fdM]·|² − R[|f|²dA]· are F· ∨ Null(P)-local martingales.

Actually, being F·-adapted by definition, they are F·-local martingales. The local martingale property of the second process implies that R[|f|²dA]· ∈ Qvar[R[fdM]· ;F·] in view of Corollary 9.7 and Lemma 9.9.

We relax the boundedness of the integrand. Let n ∈ N. Since x ↦ max{min{x, n}, −n} is Lipschitz continuous, fⁿ· : t ↦ max{min{ft, n}, −n} is an F·-finite variation process by Lemma 9.8(iii) and |fⁿ_t| ≤ n for all t ∈ R≥0. We see by the preceding paragraph that

t ↦ R[fⁿdM]t and t ↦ |R[fⁿdM]t|² − R[|fⁿ|²dA]t are F·-local martingales.

Let Ω0 be as above. If ω ∈ Ω0 then sup_{s∈R:0≤s≤t} |fs(ω)| ≤ n and hence fⁿ·(ω) = f·(ω) on R[0,t] for sufficiently large n ∈ N. Thus, with the help of Lemma 9.5, we get the full claim.

9.11 Corollary. Suppose that X· is an F·-semimartingale and f· is an F·-adapted process whose almost every sample path is of finite variation. Then the process t ↦ R[fdX]t is an F·-semimartingale. Moreover if M· ∈ Mart[X· ;F·] and A· ∈ Qvar[X· ;F·] then

R[fdM]· ∈ Mart[R[fdX]· ;F·] and R[|f|²dA]· ∈ Qvar[R[fdX]· ;F·].

Proof. R[fdX]t = R[fdM]t + R[fd(X − M)]t for all t ∈ R≥0 a.s. by Lemma 9.8(ii).

10 Product of semimartingales is a semimartingale

Let (Ω,F,P) be a complete probability space and F· be a filtration of sub σ-fields. Martc(F·) denotes the space of all F·-local martingales with almost sure continuous sample path.

10.1 Lemma. Suppose that X·, Y· are F·-semimartingales. If f is a bounded F0-measurable function then fX· is an F·-semimartingale. X· + Y· is an F·-semimartingale.

Proof. Lemma 7.5(i) and Lemma 7.5(iii).

10.2 Lemma. Let X· be an F·+ ∨ Null(P)-adapted process. If its almost every sample path is left continuous then there exists an F·-adapted process X̃· such that X̃t = Xt for all t ∈ R≥0 a.s.


Proof. We select and fix an F·+-adapted modification Y· of X·. There exists Ω0 ∈ F such that P(Ω0) = 1 and Xt(ω) = Yt(ω) for all t ∈ Q≥0 and ω ∈ Ω0. Since X· is left continuous a.s., we may assume that t ↦ Xt(ω) is left continuous for all ω ∈ Ω0. It follows that

lim inf_{s↑t:s∈Q} Ys(ω) = lim inf_{s↑t:s∈Q} Xs(ω) = Xt(ω) for all t ∈ R>0 and ω ∈ Ω0.

Thus the process X̃· : t ↦ lim inf_{s↑t:s∈Q} Ys is a desired one.

10.3 Lemma. Suppose that X· and Y· are F·-semimartingales. Then there exists an F·-adapted process A· such that its almost every sample path is continuous and

sup_{s∈R:0≤s≤t} |∑_{i:φ(i)<s} (Xs∧φ(i+1) − Xs∧φ(i))(Ys∧φ(i+1) − Ys∧φ(i)) − As|,

where φ runs through locally finite partitions of R≥0, converges to 0 in probability as mesh(φ) tends to 0 for all t ∈ R≥0. A posteriori almost every sample path of A· is of finite variation and t ↦ (Mt + m)(Nt + n) − At ∈ Martc(F·) for M· ∈ Mart[X· ;F·] and N· ∈ Mart[Y· ;F·], where m, n are finite valued F0-measurable random variables.

Proof. We see that X· + Y· and X· − Y· are F·-semimartingales, and

M· + N· ∈ Mart[X· + Y· ;F·] and M· − N· ∈ Mart[X· − Y· ;F·].

Corollary 8.18 claims Qvar[X· + Y· ;F·] and Qvar[X· − Y· ;F·] are not void. We select elements of them, say B· and C· respectively. Due to the polarization identities

XY = {(X + Y)² − (X − Y)²}/4 and MN = {(M + N)² − (M − N)²}/4,

the process A· := (B· − C·)/4 shares the same property as the desired A· process except for the F·-adaptedness. Lemma 10.2 takes care of the F·-adaptedness.
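The polarization recipe of this proof is easy to run on simulated paths. A Python sketch under assumed data (X, Y correlated Brownian motions with correlation ρ, whose cross variation over [0, t] is ρt):

# Cross variation via polarization, as in the proof of Lemma 10.3:
# A = (Qvar(X+Y) - Qvar(X-Y))/4, computed on a simulation grid.
import numpy as np

rng = np.random.default_rng(4)
n, t, rho = 2 ** 16, 1.0, 0.7
dW1 = rng.normal(0.0, np.sqrt(t / n), n)
dW2 = rng.normal(0.0, np.sqrt(t / n), n)
dX, dY = dW1, rho * dW1 + np.sqrt(1 - rho ** 2) * dW2
qvar = lambda d: np.sum(d ** 2)           # squared-increment sum on the grid
crv = (qvar(dX + dY) - qvar(dX - dY)) / 4
print(f"cross variation approx {crv:.4f}   (rho * t = {rho * t:.4f})")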

10.4 Definition. Given two F·-semimartingales X· and Y·, a process which shares the same property as the A· process in Lemma 10.3 is called a cross variation of X· and Y·.

Given F·-semimartingales X·, Y· we write

Crv[X·, Y· ;F·] := {A· : F·-cross variation of X· and Y·}.

Clearly we have that Crv[X·, X· ;F·] = Qvar[X· ;F·] and Crv[X·, Y· ;F·] = Crv[Y·, X· ;F·].

10.5 Corollary. Suppose that X· and Y· are F·-semimartingales. Then t ↦ XtYt is an F·-semimartingale. If M· ∈ Mart[X· ;F·], N· ∈ Mart[Y· ;F·] and A· ∈ Crv[X·, Y· ;F·] then

M·N· − A· + R[(X − M)dN]· + R[(Y − N)dM]· ∈ Mart[X·Y· ;F·].

Proof. There is nothing new to mention about the sample path continuity. The process t ↦ MtNt − At is an F·-local martingale by Lemma 10.3. Both B· : t ↦ Xt − Mt and C· : t ↦ Yt − Nt are F·-finite variation processes. Their product t ↦ BtCt is also an F·-finite variation process by Lemma 9.8(i). We see in (9.2) that

BtNt = R[BdN]t + R[NdB]t and CtMt = R[CdM]t + R[MdC]t for all t ∈ R≥0 a.s.

According to Theorem 9.10, the terms R[BdN]t and R[CdM]t define F·-local martingales while R[NdB]t and R[MdC]t constitute F·-finite variation processes. Thus we verified that t ↦ MtNt + BtNt + CtMt + BtCt is an F·-semimartingale and its local martingale part is given by t ↦ MtNt − At + R[BdN]t + R[CdM]t.


We postpone the discussion on Qvar[X·Y· ;F·] until after the introduction of stochastic integration. See Example 12.3.

10.6 Lemma. Suppose that X·, Y· and Z· are F·-semimartingales.
(i) Crv[X·, Y· ;F·] ≠ ∅. If A·, B· ∈ Crv[X·, Y· ;F·] then At = Bt for all t ∈ R≥0 a.s. If A· is F·-adapted, B· ∈ Crv[X·, Y· ;F·] and At = Bt for all t ∈ R≥0 a.s. then A· ∈ Crv[X·, Y· ;F·].
(ii) Suppose that A· is an F·-adapted process, M· ∈ Mart[X· ;F·], N· ∈ Mart[Y· ;F·] and m, n are finite valued F0-measurable random variables. Then A· ∈ Crv[X·, Y· ;F·] if and only if almost every sample path of A· starts from 0 and is continuous and of finite variation, and t ↦ (Mt + m)(Nt + n) − At is an F·-local martingale.
(iii) If f is a bounded F0-measurable function then f Crv[X·, Z· ;F·] = Crv[fX·, Z· ;F·]. Moreover Crv[X·, Z· ;F·] + Crv[Y·, Z· ;F·] = Crv[X· + Y·, Z· ;F·].
(iv) If Crv[X·, Z· ;F·] ∩ Crv[Y·, Z· ;F·] ≠ ∅ then 0 ∈ Crv[X· − Y·, Z· ;F·]. Conversely if 0 ∈ Crv[X· − Y·, Z· ;F·] then Crv[X·, Z· ;F·] = Crv[Y·, Z· ;F·].

Proof. (i) Lemma 10.3 shows that Crv[X·, Y· ;F·] is not void. The second part is due to the uniqueness of limits in probability.

(ii) Lemma 10.3 shows the implication ⇒. To prove the converse suppose that A· is F·-adapted, its almost every sample path starts from 0 and is continuous and of finite variation, and M·N· − A· is an F·-local martingale. Choose an element B· from Crv[X·, Y· ;F·]. It follows that the difference A· − B· is an F·-local martingale whose almost every sample path starts from 0, and is continuous and of finite variation. With the help of Corollary 8.17 and (i) we infer that A· ∈ Crv[X·, Y· ;F·].

(iii) If A· ∈ Crv[X·, Z· ;F·], B· ∈ Crv[Y·, Z· ;F·] and c ∈ R then

A· + B· ∈ Crv[X· + Y·, Z· ;F·] and cA· ∈ Crv[cX·, Z· ;F·].

(iv) Combine (i) and (iii).

10.7 Lemma. Suppose that X· and Y· are F·-semimartingales and τ is an F· ∨ Null(P)-optional time. Then X·∧τ is an F· ∨ Null(P)-semimartingale, M·∧τ ∈ Mart[X·∧τ ;F· ∨ Null(P)] for all M· ∈ Mart[X· ;F·], and A·∧τ ∈ Crv[X·∧τ, Y· ;F· ∨ Null(P)] for all A· ∈ Crv[X·, Y· ;F·].

Proof. Lemma 7.5(iii). The result about the cross variation derives by definition.

10.8 Theorem. (i) If X·, Y· and Z· are F·-semimartingales and t ↦ Xt − Yt is of finite variation a.s. then Crv[X·, Z· ;F·] = Crv[Y·, Z· ;F·].
(ii) If X· is an F·-semimartingale and 0 ∈ Qvar[X· ;F·] then X· is of finite variation a.s.
(iii) If X· is an F· ∨ Null(P)-semimartingale and 0 ∈ Crv[X·, M· ;F· ∨ Null(P)] for all bounded M· ∈ Martc(F·) then X· is of finite variation a.s.
(iv) If X· and Y· are F· ∨ Null(P)-semimartingales and Crv[X·, M· ;F·] ∩ Crv[Y·, M· ;F·] ≠ ∅ for all bounded M· ∈ Martc(F·) then t ↦ Xt − Yt is of finite variation a.s.

Proof. (i) Lemma 8.16 shows that 0 ∈ Crv[X· − Y·, Z· ;F·]. Hence we see by Lemma 10.6(iv) that Crv[X·, Z· ;F·] = Crv[Y·, Z· ;F·].

(ii) Let M· ∈ Mart[X· ;F·]. Then t ↦ |Mt|² is an F·-local martingale by Lemma 10.6(ii). Such a process must satisfy Mt = 0 for all t ∈ R≥0 a.s. according to the proof of Corollary 8.17. Thus we reach 0 ∈ Mart[X· ;F·], i.e., X· is of finite variation a.s.


(iii) Let N· ∈ Mart[X· ;F· ∨ Null(P)]. According to Corollary 8.14(i)

τ(k) := inf{t ∈ R≥0 : |N_t| > k}, k ∈ N,

forms a reducing sequence for the F· ∨ Null(P)-local martingale N·. It follows by Lemma 10.2 that there exists an F·-adapted process M· such that

M_t = N_{t∧τ(k)} for all t ∈ R≥0 a.s.

Then M· is an F·-martingale with almost surely continuous sample path. Since |N_{t∧τ(k)}| ≤ k for all t ∈ R≥0 a.s. by Lemma 8.13(i), we may assume that |M_t(ω)| ≤ k for all t ∈ R≥0 and ω ∈ Ω. Invoking (i) we infer that

Crv[X·, M· ;F· ∨ Null(P)] = Crv[X·, N·∧τ(k) ;F· ∨ Null(P)] = Crv[N·, N·∧τ(k) ;F· ∨ Null(P)].

Observe that the left hand side contains 0. Let A· ∈ Qvar[N· ;F· ∨ Null(P)]. Lemma 10.7 shows that the right hand side contains A·∧τ(k). It follows by Lemma 10.6(i) that

A_{t∧τ(k)} = 0 for all t ∈ R≥0 a.s., for all k ∈ N.

Since τ(k) ≤ τ(k+1) and sup_{k∈N} τ(k) = +∞, we infer that A_t = 0 for all t ∈ R≥0 a.s. Applying (i) again we reach 0 ∈ Qvar[N· ;F· ∨ Null(P)] = Qvar[X· ;F· ∨ Null(P)], which implies that X· is of finite variation a.s. by (ii).

(iv) Combine (iii) in the present statement and Lemma 10.6(iv).

10.9 Corollary. (i) If X· ∈ Martc(F·) and 0 ∈ Qvar[X· ;F·] then X_t = X_0 ∀t ∈ R≥0 a.s.
(ii) If X· ∈ Martc(F· ∨ Null(P)) and 0 ∈ Crv[X·, M· ;F· ∨ Null(P)] for all bounded M· ∈ Martc(F·) then X_t = X_0 for all t ∈ R≥0 a.s.
(iii) If X·, Y· ∈ Martc(F· ∨ Null(P)) and Crv[X·, M· ;F· ∨ Null(P)] ∩ Crv[Y·, M· ;F· ∨ Null(P)] ≠ ∅ for all bounded M· ∈ Martc(F·) then X_t − X_0 = Y_t − Y_0 for all t ∈ R≥0 a.s.

Proof. Theorem 10.8 and Corollary 8.17.

10.10 Theorem. Suppose that X· and Y· are F·-semimartingales, and f· and g· are F·-adapted processes whose almost every sample path is of finite variation. If A· ∈ Crv[X·, Y· ;F·] then

R[fdA]· ∈ Crv[R[fdX]·, Y· ;F·] and R[fgdA]· ∈ Crv[R[fdX]·, R[gdY]· ;F·].

Proof. Let M· ∈ Mart[X· ;F·] and N· ∈ Mart[Y· ;F·]. Then Corollary 9.11 shows that

R[fdM]· ∈ Mart[R[fdX]· ;F·] and R[gdN]· ∈ Mart[R[gdY]· ;F·].

Moreover it follows by Theorem 10.8(i) that

Crv[X·, Y· ;F·] = Crv[M·, N· ;F·] and Crv[R[fdX]·, Y· ;F·] = Crv[R[fdM]·, N· ;F·].

Thus we may assume that X· and Y· are F·-local martingales. We select and fix Ω0 ∈ F such that P(Ω0) = 1, and f·(ω) is of finite variation, X_0(ω) = 0, and X·(ω) and Y·(ω) are finite valued and continuous for every ω ∈ Ω0. Then we have that

Y_t(ω) ∫_0^t f(ω) dX(ω) = ∫_0^t f(ω) d(X(ω)Y(ω)) − ∫_0^t h(ω) dY(ω) for all ω ∈ Ω0


by Lemma 9.8(ii), where we write

h_t(ω) := ∫_[0,t) X(ω) df_{·−o}(ω) if ω ∈ Ω0, and h_t(ω) := 0 if ω ∉ Ω0.

The process t ↦ h_t is F· ∨ Null(P)-adapted and its almost every sample path is left continuous and of finite variation. It therefore follows by Lemma 9.8(ii) that

R[fdX]_tY_t − R[fdA]_t = R[fd(XY − A)]_t − R[hdY]_t for all t ∈ R≥0 a.s.

Lemma 10.6(ii) shows that t ↦ X_tY_t − A_t ∈ Martc(F·). Thus, taking F·-adaptedness into account, we see by Theorem 9.10 that t ↦ R[fdX]_tY_t − R[fdA]_t is an F·-local martingale. Invoking Corollary 9.7 and Lemma 10.6(ii), we reach that R[fdA]· ∈ Crv[R[fdX]·, Y· ;F·]. Finally, since R[fdR[gdA]]_t = R[fgdA]_t for all t ∈ R≥0 a.s. by Lemma 9.8(i), we obtain that R[fgdA]· ∈ Crv[R[fdX]·, R[gdY]· ;F·].

11 Stochastic integration – beyond Riemann-Stieltjes

Let (Ω,F, P) be a complete probability space and F· be a filtration of sub σ-fields. Martc(F·) denotes the space of all F·-local martingales with almost sure continuous sample path.

11.1 Theorem. Suppose that M· is an F·-local martingale such that its almost every sample path is right continuous, and that A· is an F·-increasing process with almost sure continuous sample path such that t ↦ |M_t|^2 − A_t is an F·-local martingale. Then

P(sup{|M_t| ; t ∈ R≥0, t ≤ τ} > λ) ≤ E[A_τ ∧ ε]/λ^2 + P(A_τ > ε) + E[|M_0|^2 ∧ λ^2]/λ^2

for all F· ∨ Null(P)-optional times τ and all λ, ε ∈ R>0.

Proof. Given λ, ε ∈ R>0, we introduce the following F· ∨ Null(P )-optional times:

S := inf{t ∈ R≥0 : A_t > ε}, T := S if |M_0| ≤ λ, and T := 0 otherwise.

Since A_0 = 0 < ε and t ↦ A_t is continuous a.s., we have that S > 0 a.s. and hence

{|M_0| ≤ λ} = {T > 0} mod Null(P).

Observe that if T > 0 then T = S. Therefore we infer that

(?) P(sup{|M_t| ; t ≤ τ} > λ, τ ≤ S, |M_0| ≤ λ) ≤ P(sup{|M_t| ; t ≤ τ ∧ S} > λ, |M_0| ≤ λ)
= P(sup{|M_t 1_{T>0}| ; t ≤ τ ∧ T} > λ, T > 0) ≤ P(sup{|M_t 1_{T>0}| ; t ≤ τ ∧ T} > λ).

We write G_0 := F_{0+} and G_t := F_t for t ∈ R>0. According to Lemma 8.13(iii),

A_{t∧T} ≤ A_{t∧S} = A_t ∧ ε ≤ ε for all t ∈ R≥0 a.s.

Since E[|M0|2 ;T > 0] = E[|M0|2 ; |M0| ≤ λ] < +∞, it follows by Theorem 7.17(ii) that


t ↦ M_{t∧T} 1_{T>0} is a square integrable G· ∨ Null(P)-martingale.

Therefore Example 6.4 shows that

P(sup{|M_t 1_{T>0}| ; t ≤ τ ∧ n ∧ T} > λ) ≤ E[|M_{τ∧n∧T} 1_{T>0}|^2]/λ^2 for all n ∈ N.

On the other hand we see by Theorem 7.17(iii) that

E[|M_{τ∧n∧T}|^2 ; T > 0] = E[|M_0|^2 ; T > 0] + E[A_{τ∧n∧T}] ≤ E[|M_0|^2 ; |M_0| ≤ λ] + E[A_τ ∧ ε],

where we used that {|M_0| ≤ λ} = {T > 0} and A_{τ∧n∧T} ≤ A_τ ∧ ε a.s. Hence

P(sup{|M_t 1_{T>0}| ; t ≤ τ ∧ n ∧ T} > λ) ≤ (E[|M_0|^2 ; |M_0| ≤ λ] + E[A_τ ∧ ε])/λ^2.

Letting n tend to ∞, we reach

P(sup{|M_t 1_{T>0}| ; t ≤ τ ∧ T} > λ) ≤ (E[|M_0|^2 ; |M_0| ≤ λ] + E[A_τ ∧ ε])/λ^2.

The left hand side dominates P(sup{|M_t| ; t ≤ τ} > λ, τ ≤ S, |M_0| ≤ λ) by (?). Thus

P(sup{|M_t| ; t ≤ τ} > λ) ≤ E[|M_0|^2 ; |M_0| ≤ λ]/λ^2 + E[A_τ ∧ ε]/λ^2 + P(τ > S) + P(|M_0| > λ).

Finally, taking Lemma 8.13(i) into account, we get the claim.
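
The following Python sketch is not part of the original argument; it only illustrates Theorem 11.1 by Monte Carlo in the simplest case M = B a standard Brownian motion, where one may take A_t = t, M_0 = 0 and a deterministic τ = T, so that (choosing ε = T) the bound collapses to P(sup_{t≤T} |B_t| > λ) ≤ T/λ^2. All parameter values below are illustrative choices.

```python
# Monte Carlo check of Theorem 11.1 for M = Brownian motion, A_t = t,
# M_0 = 0, tau = T fixed and eps = T: P(sup_{t<=T}|B_t| > lam) <= T/lam^2.
import numpy as np

rng = np.random.default_rng(0)
T, n_steps, n_paths, lam = 1.0, 1000, 5000, 2.0
dt = T / n_steps
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
sup_abs = np.abs(np.cumsum(increments, axis=1)).max(axis=1)

empirical = (sup_abs > lam).mean()   # estimated P(sup |B_t| > lam)
bound = T / lam**2                   # right hand side of the inequality
print(f"empirical {empirical:.4f} <= bound {bound:.4f}")
```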

11.2 Corollary. Suppose that M^n_· is a sequence in Martc(F·), A^n_· ∈ Qvar[M^n_· ;F·] and τ is an F· ∨ Null(P)-optional time. If both M^n_0 and A^n_τ converge to 0 in probability then so does the sequence sup{|M^n_t| ; t ∈ R≥0, t ≤ τ}.

11.3 Lemma. Suppose that f is an F·+-adapted process whose almost every sample path is locally bounded and admits left hand limits everywhere, A· is an F·-finite variation process, and B· is an F·-increasing process. Then for each t ∈ R≥0

lim sup_{n→∞} sup_{s∈R:0≤s≤t} |R[f([n·]/n, ?)dA]_s − R[fdA]_s| = 0 a.s.,

lim sup_{n→∞} R[|f([n·]/n, ?) − f|^2 dB]_t = 0 a.s., and

lim sup_{m,n→∞} R[|f([m·]/m, ?) − f([n·]/n, ?)|^2 dB]_t = 0 a.s.,

where [x] = max({i ∈ Z : i < x} ∪ {0}) for x ∈ R.

Proof. Select and fix Ω0 ∈ F such that P(Ω0) = 1, and f(·, ω) is locally bounded and admits left hand limits everywhere, A·(ω) is of finite variation and right continuous, and B·(ω) is finite valued, nondecreasing and right continuous for every ω ∈ Ω0. Let n ∈ N. Then

R[f([n·]/n, ?)dA]_t(ω) − R[fdA]_t(ω) = ∫_(0,t] {f([n·]/n, ω) − f(· − o, ω)} dA·(ω)

for all t ∈ R≥0 and ω ∈ Ω0 by Lemma 9.6. The right hand side is dominated by

∫_(0,t] |f([n·]/n, ω) − f(· − o, ω)| var(dA(ω), ·).


Observe that f([nt]/n, ω) converges to f(t − o, ω) as n tends to ∞ for each t ∈ R≥0 and ω ∈ Ω0. Due to the local boundedness of f(·, ω), we see that

lim sup_{n→∞} ∫_(0,t] |f([n·]/n, ω) − f(· − o, ω)| var(dA(ω), ·) = 0 for all t ∈ R≥0 and ω ∈ Ω0

by the dominated convergence theorem. The argument for the other limits is similar.

11.4 Theorem. Suppose that M· ∈ Martc(F·) and f is an F·-adapted process whose almost every sample path is locally bounded and admits left hand limits everywhere. Then there exists I· ∈ Martc(F·) such that I_0 = 0 a.s. and

R[fdA]· ∈ Crv[I·, N· ;F·] for all N· ∈ Martc(F·) and A· ∈ Crv[M·, N· ;F·].

Proof. We introduce the following sequence of F·-finite variation processes:

f^n_· : t ↦ f([nt]/n, ?), n ∈ N, where [x] = max({i ∈ Z : i < x} ∪ {0}).

Let n ∈ N. The process t ↦ R[f^n dM]_t belongs to Martc(F·) by Theorem 9.10 and

R[f^n dA]· ∈ Crv[R[f^n dM]·, N· ;F·]

by Theorem 10.10. Let B· ∈ Qvar[M· ;F·]. We have by Lemma 9.8(i) that

R[f^m dM]_t − R[f^n dM]_t = R[(f^m − f^n)dM]_t for all t ∈ R≥0 a.s.

According to Theorem 9.10,

R[|f^m − f^n|^2 dB] ∈ Qvar[R[(f^m − f^n)dM] ;F·].

Since almost sure convergence implies convergence in probability, it follows that

lim sup_{m,n→∞} P(R[|f^m − f^n|^2 dB]_t > ε) = 0 for all ε ∈ R>0 and t ∈ R≥0

by Lemma 11.3. Thus, invoking Theorem 11.1, we infer that

lim sup_{m,n→∞} P(sup_{s∈R:0≤s≤t} |R[f^m dM]_s − R[f^n dM]_s| > λ) = 0 for all λ ∈ R>0 and t ∈ R≥0.

We can find φ ∈ Seq(N, ↑) such that

P(sup_{s∈R:0≤s≤T} |R[f^m dM]_s − R[f^n dM]_s| ≥ 1/2^T) ≤ 1/2^T for all T ∈ N and m, n ∈ N≥φ(T).

We write I^n_t := R[f^n dM]_t for simplicity. The Borel-Cantelli lemma shows that there exists Ω0 ∈ F such that P(Ω0) = 1 and

t ↦ R[f^n dM]_t(ω) is continuous for all n ∈ N, and
sup_{t:t≤n} |I^{φ(n+1)}_t(ω) − I^{φ(n)}_t(ω)| < 1/2^n except for finitely many n ∈ N


for all ω ∈ Ω0. We set

I_t(ω) := lim inf_{n→∞} R[f^{φ(n)} dM]_t(ω) for t ∈ R≥0 and ω ∈ Ω.

The process t ↦ I_t is F·-adapted. Let ω ∈ Ω0 and T ∈ N. Then, since

∑_{k=n}^∞ sup_{t:t≤T} |I^{φ(k+1)}_t(ω) − I^{φ(k)}_t(ω)| ≤ ∑_{k=n}^∞ sup_{t:t≤k} |I^{φ(k+1)}_t(ω) − I^{φ(k)}_t(ω)| ≤ 1/2^{n−1}

for all n ∈ N≥T, it follows that

I_t(ω) − 1/2^{n−1} ≤ I^{φ(n)}_t(ω) ≤ I_t(ω) + 1/2^{n−1} for t ∈ R[0,T] and n ∈ N≥T.

Therefore the continuity of t ↦ I_t(ω) derives from that of t ↦ I^n_t(ω). Moreover Lemma 9.5 shows that I· is an F·-local martingale. We see by Lemma 11.3 that

lim sup_{n→∞} P(sup_{s∈R:0≤s≤t} |R[f^n dA]_s − R[fdA]_s| > ε) = 0 for all ε ∈ R>0 and t ∈ R≥0.

We get by Lemma 9.5, Corollary 9.7 and Lemma 10.6(ii) that R[fdA] ∈ Crv[I·, N· ;F·].

Given M· ∈ Martc(F·) and an F·-adapted process f whose almost every sample path is locally bounded and admits left hand limits everywhere, we set

Ito[fdM ;F·] := {I· ∈ Martc(F·) : I_0 = 0 a.s., and R[fdA]· ∈ Crv[I·, N· ;F·] for any bounded N· ∈ Martc(F·) and A· ∈ Crv[M·, N· ;F·]}.

This is the stochastic integral of f with integrator M·.

11.5 Corollary. Suppose M· ∈ Martc(F·), and f is an F·-adapted process whose almost every sample path is locally bounded and admits left hand limits everywhere.
(i) Ito[fdM ;F·] ≠ ∅. If I·, L· ∈ Ito[fdM ;F·] then I_t = L_t ∀t ∈ R≥0 a.s. If I· is F·-adapted, L· ∈ Ito[fdM ;F·] and I_t = L_t ∀t ∈ R≥0 a.s. then I· ∈ Ito[fdM ;F·].
(ii) Let I· ∈ Ito[fdM ;F·]. If A· ∈ Qvar[M· ;F·], N· ∈ Martc(F·) and B· ∈ Crv[M·, N· ;F·] then R[|f|^2 dA]· ∈ Qvar[I· ;F·] and R[fdB]· ∈ Crv[I·, N· ;F·].
(iii) Ito[fdM ;F·] ⊂ Ito[fdM ;F· ∨ Null(P)].

Proof. (i) Let I·, L· ∈ Ito[fdM]. Then Crv[I·, N· ;F·] ∩ Crv[L·, N· ;F·] ≠ ∅ for any bounded N· ∈ Martc(F·). It follows that I_t = L_t for all t ∈ R≥0 a.s. by Corollary 10.9(iii). Conversely suppose that I· is F·-adapted, L· ∈ Ito[fdM] and I_t = L_t for all t ∈ R≥0 a.s. Clearly I· ∈ Martc(F·). Theorem 10.8(i) shows that Crv[I·, N· ;F·] coincides with Crv[L·, N· ;F·] for any bounded N· ∈ Martc(F·). Hence I· ∈ Ito[fdM].

(ii) Combining (i) and Theorem 11.4, we get R[fdB]· ∈ Crv[I·, N· ;F·]. In particular we have that R[fdA]· ∈ Crv[I·, M· ;F·] = Crv[M·, I· ;F·]. According to Lemma 9.6,

R[|f|^2 dA]_t = R[fdR[fdA]]_t for all t ∈ R≥0 a.s.

It therefore follows by Lemma 10.6(i) that R[|f|^2 dA]· ∈ Crv[I·, I· ;F·] = Qvar[I· ;F·].

(iii) We select J· ∈ Ito[fdM ;F̄·] ≠ ∅ where F̄· := F· ∨ Null(P). Let I· ∈ Ito[fdM ;F·]. Then we infer that


Crv[I·, N· ;F̄·] ∩ Crv[J·, N· ;F̄·] ≠ ∅ for any bounded N· ∈ Martc(F·).

Indeed, given A· ∈ Crv[M·, N· ;F·] ⊂ Crv[M·, N· ;F̄·], we have that

R[fdA] ∈ Crv[I·, N· ;F·] ⊂ Crv[I·, N· ;F̄·] and R[fdA] ∈ Crv[J·, N· ;F̄·].

It therefore follows that I_t = J_t for all t ∈ R≥0 a.s. by Corollary 10.9(iii). Hence we conclude that I· ∈ Ito[fdM ;F̄·] by using (i) in the present corollary.

11.6 Lemma. Suppose that M·, N· ∈ Martc(F·), and f, g are F·-adapted processes whose almost every sample paths are locally bounded and admit left hand limits everywhere.
(i) Let A ∈ Qvar[M· ;F·]. Then 0 ∈ Ito[fdM ;F·] if and only if R[|f|^2 dA]_t = 0 a.s. ∀t ∈ R≥0.
(ii) For h bounded and F0-measurable, hIto[fdM ;F·] = Ito[hfdM ;F·] = Ito[fd(hM) ;F·].
(iii) Ito[fdM ;F·] + Ito[gdM ;F·] = Ito[(f + g)dM ;F·] and Ito[fdM ;F·] + Ito[fdN ;F·] = Ito[fd(M + N) ;F·].
(iv) If J· ∈ Ito[gdM ;F·] then Ito[fgdM ;F·] = Ito[fdJ ;F·].
(v) If σ ∈ Time(F·) and I· ∈ Ito[fdM ;F·] then I·∧σ ∈ Ito[1_(0,σ] fdM ;F· ∨ Null(P)]; more precisely I_{t∧σ} = J_t for all t ∈ R≥0 a.s. for J· ∈ Ito[1_(0,σ] fdM ;F·].
(vi) If σ ∈ Time(F· ∨ Null(P)) then Ito[1_(0,σ] fdM ;F· ∨ Null(P)] = Ito[fdM·∧σ ;F· ∨ Null(P)].

Proof. Let I· ∈ Ito[fdM] and J· ∈ Ito[gdM]. (i) Since R[|f|^2 dA]· ∈ Qvar[I· ;F·] by Corollary 11.5(ii), we get the following series of equivalences:

0 ∈ Ito[fdM] ⇔ I_t = 0 ∀t ∈ R≥0 a.s. ⇔ 0 ∈ Qvar[I· ;F·] ⇔ R[|f|^2 dA]_t = 0 ∀t ∈ R≥0 a.s.

by Corollary 11.5(i), Corollary 10.9(i) and Theorem 8.21(iii) respectively.

(iii) and (iv) Let L· ∈ Martc(F·) and B· ∈ Crv[M·, L· ;F·] be given. Then we have that

R[(f + g)dB]_t = R[fdB]_t + R[gdB]_t for all t ∈ R≥0 a.s.,

R[fdB]· ∈ Crv[I·, L· ;F·] and R[gdB]· ∈ Crv[J·, L· ;F·].

It follows by Lemma 10.6(i), (iii) that

R[(f + g)dB]· ∈ Crv[I·, L· ;F·] + Crv[J·, L· ;F·] = Crv[I· + J·, L· ;F·].

This means I· + J· ∈ Ito[(f + g)dM] and hence

Ito[fdM] + Ito[gdM] ⊂ Ito[(f + g)dM].

Actually equality holds by Corollary 11.5(i). A similar discussion works for the second relation in (iii). To show (iv) we select and fix K· ∈ Ito[fdJ]. Since

R[fgdB]_t = R[fdR[gdB]]_t for all t ∈ R≥0 a.s.

by Lemma 9.6 and R[fdR[gdB]]· ∈ Crv[K·, L· ;F·], it follows by Lemma 10.6(i) that

R[fgdB]· ∈ Crv[K·, L· ;F·].

This means K· ∈ Ito[fgdM]. Clearly the argument above can be reversed.


(v) Let N· ∈ Martc(F·) and A· ∈ Crv[M·, N· ;F̄·] where F̄· := F· ∨ Null(P). Then, since I· ∈ Ito[fdM ;F̄·] according to Corollary 11.5(iii), we have that R[fdA]· ∈ Crv[I·, N· ;F̄·]. It follows by Lemma 10.7 that

R[fdA]·∧σ ∈ Crv[I·∧σ, N· ;F̄·].

On the other hand R[fdA]_{t∧σ} = R[1_(0,σ] fdA]_t for all t ∈ R≥0 a.s. Lemma 10.6(i) shows that

R[1_(0,σ] fdA] ∈ Crv[I·∧σ, N· ;F̄·].

Thus I·∧σ ∈ Ito[1_(0,σ] fdM ;F̄·]. Finally, since J· ∈ Ito[1_(0,σ] fdM ;F̄·] by Corollary 11.5(iii), it follows that I_{t∧σ} = J_t for all t ∈ R≥0 a.s. by Corollary 11.5(i).

11.7 Corollary. Suppose X· is an F·-semimartingale, and f is an F·-adapted process whose almost every sample path is locally bounded and admits left hand limits everywhere. Then

R[fd(X − M)]· + Ito[fdM ;F·] = R[fd(X − N)]· + Ito[fdN ;F·]

for all M·, N· ∈ Mart[X· ;F·].

Proof. R[fd(X −M)]t = R[fd(X −N)]t for all t ∈ R≥0 a.s. and Ito[fdM ] = Ito[fdN ].

Given an F·-semimartingale X·, and an F·-adapted process f whose almost every sample path is locally bounded and admits left hand limits everywhere, we set

Ito[fdX ;F·] := R[fd(X − M)]· + Ito[fdM ;F·]

where M· ∈ Mart[X· ;F·]. This is the stochastic integral of f with integrator X·.

12 Semimartingale and Itô's formula

Let (Ω,F, P) be a complete probability space and F· be a filtration of sub σ-fields. Martc(F·) denotes the space of all F·-local martingales with almost sure continuous sample path.

12.1 Lemma. Suppose that X·, Y· are F·-semimartingales, and f, g are F·-adapted processes whose almost every sample paths are locally bounded and admit left hand limits everywhere.
(i) For h bounded F0-measurable, hIto[fdX ;F·] = Ito[hfdX ;F·] = Ito[fd(hX) ;F·].
(ii) Ito[fdX ;F·] + Ito[gdX ;F·] = Ito[(f + g)dX ;F·] and Ito[fdX ;F·] + Ito[fdY ;F·] = Ito[fd(X + Y) ;F·].
(iii) If Z· ∈ Ito[gdX ;F·] then Ito[fgdX ;F·] = Ito[fdZ ;F·].
(iv) If I· ∈ Ito[fdX ;F·], J· ∈ Ito[gdY ;F·] and A· ∈ Crv[X, Y ;F·] then R[fgdA]· ∈ Crv[I, J ;F·].
(v) If σ is an F·-optional time and I· ∈ Ito[fdX ;F·] then I·∧σ ∈ Ito[1_(0,σ] fdX ;F· ∨ Null(P)]; more precisely I_{t∧σ} = J_t for all t ∈ R≥0 a.s. for J· ∈ Ito[1_(0,σ] fdX ;F·].

12.2 Lemma. Suppose that X· is an F·-semimartingale, and f is an F·-adapted process whose almost every sample path is locally bounded and admits left hand limits everywhere.
(i) If almost every sample path of X or of f is of finite variation then R[fdX]· ∈ Ito[fdX].
(ii) Let I· ∈ Ito[fdX]. Then sup_{s≤t} |R[f([n·]/n, ?)dX]_s − I_s| converges to 0 in probability, where [x] := max({i ∈ Z : i < x} ∪ {0}) for x ∈ R.


Proof. Let M· ∈ Mart[X· ;F·]. (i) We have that R[fdM]· ∈ Ito[fdM] by Theorem 10.10.

(ii) Let A· ∈ Qvar[M· ;F·]. Set J_t := I_t − R[fd(X − M)]_t. Then J· ∈ Ito[fdM]. We see by Theorem 10.10 and Corollary 11.5(ii) that

R[f([n·]/n, ?)f dA]· ∈ Crv[R[f([n·]/n, ?)dM]·, J· ;F· ∨ Null(P)].

This implies that R[|f([n·]/n, ?) − f|^2 dA]· is a quadratic variation of the local martingale R[f([n·]/n, ?)dM]· − J·. Invoking Lemma 11.3 and Corollary 11.2, we infer that

lim sup_{n→∞} P(sup_{s≤t} |R[f([n·]/n, ?)dM]_s − J_s| > ε) = 0 for all ε ∈ R>0 and t ∈ R≥0.

On the other hand Lemma 9.8(ii) shows that

R[f([n·]/n, ?)dX]_s = R[f([n·]/n, ?)dM]_s + R[f([n·]/n, ?)d(X − M)]_s for all s ∈ R≥0 a.s.

The second term on the right hand side converges to R[fd(X − M)]· locally uniformly a.s. by Lemma 11.3. Thus we get the claim.

12.3 Example. Suppose that X· and Y· are F·-semimartingales. Then

X·Y· ∈ X_0Y_0 + Ito[XdY] + Ito[YdX] + Crv[X·, Y· ;F·] and

Mart[X·Y· ;F·] = Ito[XdN] + Ito[YdM]

for M· ∈ Mart[X· ;F·] and N· ∈ Mart[Y· ;F·].

Proof. Let A· ∈ Crv[X·, Y· ;F·]. According to Lemma 10.3,

lim sup_{n→∞} P(sup_{s∈R:0≤s≤t} |∑_{i:i<ns} (X_{s∧(i+1)/n} − X_{s∧i/n})(Y_{s∧(i+1)/n} − Y_{s∧i/n}) − A_s| > ε) = 0

for all ε ∈ R>0 and t ∈ R≥0. Rearranging the summation we have that

X_sY_s − X_0Y_0 − ∑_{i:i<ns} (X_{s∧(i+1)/n} − X_{s∧i/n})(Y_{s∧(i+1)/n} − Y_{s∧i/n})
= ∑_{i:i<ns} (X_{s∧(i+1)/n} − X_{s∧i/n}) Y_{i/n} + ∑_{i:i<ns} (Y_{s∧(i+1)/n} − Y_{s∧i/n}) X_{i/n}.

The left hand side converges to X_sY_s − X_0Y_0 − A_s locally uniformly in probability while the right hand side equals R[Y_{[n·]/n} dX]_s + R[X_{[n·]/n} dY]_s for all s ∈ R≥0 a.s., where we write [x] := max({i ∈ Z : i < x} ∪ {0}) for x ∈ R. Thus we get the claim by invoking Lemma 12.2(ii).
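
A small numerical sketch of this rearrangement (an illustration added here, under the assumption that X and Y are Brownian motions with correlation ρ, so that Crv[X·, Y· ;F·] contains t ↦ ρt): the rearranged identity holds exactly for every partition, while the discrete cross variation converges to ρt.

```python
# Discrete product-rule check: X, Y correlated Brownian motions, so the
# discrete cross variation sum(dX*dY) approximates rho*t, while the
# telescoping identity X_t Y_t = sum Y dX + sum X dY + sum dX dY is exact.
import numpy as np

rng = np.random.default_rng(2)
t, n, rho = 1.0, 2**16, 0.6
dt = t / n
dX = rng.normal(0.0, np.sqrt(dt), n)
dY = rho * dX + np.sqrt(1.0 - rho**2) * rng.normal(0.0, np.sqrt(dt), n)
X = np.concatenate([[0.0], np.cumsum(dX)])
Y = np.concatenate([[0.0], np.cumsum(dY)])

cross = np.sum(dX * dY)                             # ~ rho * t
parts = np.sum(Y[:-1] * dX) + np.sum(X[:-1] * dY)   # R[YdX] + R[XdY]
print("cross variation", cross, "vs rho*t =", rho * t)
print("X_t*Y_t =", X[-1] * Y[-1], " parts + cross =", parts + cross)
```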

12.4 Lemma. Suppose that f^n, f are F·-adapted processes with almost surely left continuous and locally bounded sample paths, A· is an F·-finite variation process, and X· is an F·-semimartingale. If f^n(t, ·) converges to f(t, ·) for all t ∈ R≥0 a.s. and there exists an F·-increasing process g such that |f^n(t, ·)| ≤ g(t, ·) for all t ∈ R≥0 a.s., then for each t ∈ R≥0

lim sup_{n→∞} sup_{s∈R:0≤s≤t} |R[f^n dA]_s − R[fdA]_s| = 0 a.s., and

sup_{s∈R:0≤s≤t} |I^n_s − I_s| converges to 0 in probability,

where I^n_· ∈ Ito[f^n dX] and I· ∈ Ito[fdX].


Proof. Let B· ∈ Qvar[X· ;F·]. The argument in the proof of Lemma 11.3 shows that

lim sup_{n→∞} sup_{s∈R:0≤s≤t} |R[f^n dA]_s − R[fdA]_s| = 0 a.s. and lim sup_{n→∞} R[|f^n − f|^2 dB]_t = 0 a.s.

Let M· ∈ Mart[X· ;F·]. The above applies to A_t = X_t − M_t. Set

J^n_t := I^n_t − R[f^n d(X − M)]_t − I_t + R[fd(X − M)]_t.

Then we see by Lemma 12.1(ii) and Corollary 11.5(ii) that

J^n_· ∈ Ito[(f^n − f)dM] and R[|f^n − f|^2 dB]· ∈ Qvar[J^n_· ;F·].

Invoking Corollary 11.2, we reach the convergence of sup_{s≤t} |J^n_s| to 0 in probability.

12.5 Definition. Let d ∈ N. A d-dimensional F·-semimartingale is an R^d-valued process each component of which is an F·-semimartingale. A complex F·-semimartingale is a C-valued process whose real part as well as imaginary part is an F·-semimartingale.

12.6 Theorem. Suppose that X· is a d-dimensional F·-semimartingale with components X^i_·, i ∈ N≤d. Then, given f ∈ C^2(R^d), the composition f ∘ X· is an F·-semimartingale and

f(X·) ∈ f(X_0) + ∑_{i=1}^d Ito[∇_i f(X·)dX^i] + (1/2) ∑_{i,j=1}^d R[∇^2_{ij} f(X·)dA^{ij}]·

where A^{ij}_· ∈ Crv[X^i_·, X^j_· ;F·]. (Itô's formula)

Proof. The statement holds for affine functions. Denote by D the set of all C^2-functions for which the statement holds. We first show that D is an algebra over R. Since the differential operators ∇ and ∇^2 are linear, the linearity of D follows by Lemma 12.1(i), (ii). Now suppose that f, g ∈ D, that is,

f(X·) ∈ f(X_0) + ∑_{i=1}^d Ito[∇_i f(X·)dX^i] + (1/2) ∑_{i,j=1}^d R[∇^2_{ij} f(X·)dA^{ij}]· and

g(X·) ∈ g(X_0) + ∑_{i=1}^d Ito[∇_i g(X·)dX^i] + (1/2) ∑_{i,j=1}^d R[∇^2_{ij} g(X·)dA^{ij}]·.

It then follows by Lemma 12.1(ii), (iii) and Lemma 12.2(i) that

∑_{i=1}^d Ito[f(X·)∇_i g(X·)dX^i] + (1/2) ∑_{i,j=1}^d R[f(X·)∇^2_{ij} g(X·)dA^{ij}]· = Ito[f(X·)dg(X·)]·.

On the other hand Lemma 12.1(iv), Lemma 10.6(iii) and Lemma 8.16 tell us that

∑_{i,j=1}^d R[∇_i f(X·)∇_j g(X·)dA^{ij}]· ∈ Crv[f(X·), g(X·) ;F·].


Observe that ∇_i(fg) = f∇_i g + g∇_i f and ∇^2_{ij}(fg) = f∇^2_{ij} g + g∇^2_{ij} f + ∇_i f∇_j g + ∇_i g∇_j f. Taking into account that A^{ij}_t = A^{ji}_t for all t ∈ R≥0 a.s., we infer by Example 12.3 that

f(X·)g(X·) ∈ f(X_0)g(X_0) + ∑_{i=1}^d Ito[∇_i(fg)(X·)dX^i] + (1/2) ∑_{i,j=1}^d R[∇^2_{ij}(fg)(X·)dA^{ij}]·.

Thus we have verified that fg ∈ D. In particular it follows that D contains all polynomial functions. With the help of Lemma 12.4 one can verify that the space D is closed with respect to the C^2-topology. Consequently D = C^2(R^d) by the Weierstrass approximation theorem.
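
The following sketch (not in the original note) checks Itô's formula numerically for d = 1, X = B a Brownian path (so A^{11}_t = t) and f(x) = x^3, for which the formula reads B_t^3 = ∫_0^t 3B^2 dB + 3∫_0^t B_s ds, with the stochastic integral realised as a left-endpoint sum:

```python
# Numerical Ito-formula check for f(x) = x**3 along one Brownian path:
# B_t^3 should equal the left-endpoint sum of 3 B^2 dB plus 3 * int B ds.
import numpy as np

rng = np.random.default_rng(3)
t, n = 1.0, 2**16
dt = t / n
dB = rng.normal(0.0, np.sqrt(dt), n)
B = np.concatenate([[0.0], np.cumsum(dB)])

stochastic = np.sum(3.0 * B[:-1] ** 2 * dB)   # Ito[3 B^2 dB], left endpoints
drift = 3.0 * np.sum(B[:-1]) * dt             # (1/2) R[6 B dlambda] = 3 int B ds
print("B_t^3   =", B[-1] ** 3)
print("Ito RHS =", stochastic + drift)
```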

12.7 Example. Suppose that X· is an F·-semimartingale and A· ∈ Qvar[X· ;F·]. Then

exp{X· − A·/2} ∈ exp{X_0} + Ito[exp{X· − A·/2}dX].

In particular if X· ∈ Martc(F·) then exp{X· − A·/2} ∈ Martc(F·).

Proof. We apply Theorem 12.6 to the semimartingale t ↦ X_t − A_t/2 and the function f : x ↦ e^x. We see that Qvar[X· − A·/2 ;F·] = Qvar[X· ;F·] by Theorem 10.8(i). Since f′ = f and f′′ = f, it then follows that

exp{X· − A·/2} ∈ exp{X_0} + Ito[exp{X· − A·/2}d(X − A/2)] + (1/2) R[exp{X· − A·/2}dA].

The right hand side equals exp{X_0} + Ito[exp{X· − A·/2}dX] by Lemma 12.2(i).
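
A quick Monte Carlo sanity check (illustrative only): for X = B a standard Brownian motion one has A_t = t, so the example asserts that exp{B_t − t/2} is a martingale; its mean must stay at exp{B_0} = 1 for every t.

```python
# E[exp(B_t - t/2)] should remain close to 1 for every t by Example 12.7.
import numpy as np

rng = np.random.default_rng(4)
n_paths = 200000
for t in (0.5, 1.0, 2.0):
    B_t = rng.normal(0.0, np.sqrt(t), n_paths)   # B_t ~ N(0, t)
    print(t, np.exp(B_t - t / 2.0).mean())       # expect ~ 1.0
```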

12.8 Definition. A d-dimensional standard F·-Brownian motion X· is a d-dimensional process such that it is F·-adapted, its almost every sample path is continuous, X_0 = 0 a.s., and for each pair s, t ∈ R≥0 with s < t the increment X_t − X_s is independent of F_s and normally distributed with mean 0 and covariance (t − s)δ_{ij}.

We denote by λ· the non-random increasing process t ↦ t.

12.9 Theorem. Suppose that X· is a d-dimensional process whose almost every sample path is continuous. Then X· is a standard F·-Brownian motion if and only if it is an F·-local martingale, X_0 = 0 a.s. and δ_{ij}λ· ∈ Crv[X^i, X^j ;F·]. (Lévy's characterization)

Proof. Suppose that X· is a d-dimensional F·-local martingale with almost surely continuous sample path, X_0 = 0 a.s. and t ↦ δ_{ij}t ∈ Crv[X^i, X^j ;F·]. Given ξ ∈ C^d, we consider the following C-valued C^∞-function

f : R^d × R → C, (x, y) ↦ exp{√−1 ξ·x + q(ξ)y/2}.

Here q : C^d → C is the standard quadratic form ξ ↦ ∑_{i=1}^d (ξ^i)^2. We apply Theorem 12.6 to the semimartingale t ↦ (X_t, λ_t) and the function f. Observe that

∇_i f = √−1 ξ^i f for i ∈ N≤d, ∇_{d+1} f = q(ξ)f/2 and ∇_i∇_j f = −ξ^i ξ^j f for i, j ∈ N≤d.


Since 0 ∈ Crv[X^i, λ ;F·] and 0 ∈ Qvar[λ ;F·], we have that

f(X·, λ·) ∈ f(0, 0) + ∑_{i=1}^d √−1 ξ^i Ito[f(X·, λ·)dX^i] + (q(ξ)/2) Ito[f(X·, λ·)dλ] + (1/2) ∑_{i,j=1}^d (−ξ^i ξ^j) R[f(X·, λ·)d(δ^{ij}λ)]·.

The last two terms cancel each other by Lemma 12.2(i). It follows that f(X·, λ·) is a complex local martingale. If ξ ∈ R^d then f(X·, λ·) is bounded and hence it is a martingale. Suppose that s, t ∈ R≥0 and s ≤ t. Then we have that

exp{√−1 ξ·X_s + q(ξ)s/2} = f(X_s, λ_s) ∈ E[f(X_t, λ_t)|F_s] = E[exp{√−1 ξ·X_t + q(ξ)t/2}|F_s] for all ξ ∈ R^d.

Consequently we reach that

exp{−q(ξ)(t − s)/2} ∈ E[exp{√−1 ξ·(X_t − X_s)}|F_s] for all ξ ∈ R^d.

This means that X_t − X_s is independent of F_s and normally distributed with mean 0 and covariance (t − s)δ_{ij}.
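
A simulation sketch of Lévy's characterization (an illustration, not part of the proof): the process M_t = ∫_0^t sign(B_s) dB_s is a continuous local martingale with quadratic variation ∫_0^t sign(B_s)^2 ds = t, hence by Theorem 12.9 it is itself a standard Brownian motion, so M_1 should be distributed as N(0, 1).

```python
# M_t = int sign(B) dB has quadratic variation t, so by Levy's
# characterization it is a Brownian motion; M_1 should be ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(5)
t, n, n_paths = 1.0, 1000, 2000
dt = t / n
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
B = np.cumsum(dB, axis=1)
# predictable (left-endpoint) integrand sign(B_{s-}), with sign(0) := 1
sgn = np.sign(np.hstack([np.zeros((n_paths, 1)), B[:, :-1]]))
sgn[sgn == 0.0] = 1.0
M1 = np.sum(sgn * dB, axis=1)
print("mean", M1.mean(), " variance", M1.var(), " (expect ~0 and ~1)")
```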

12.10 Corollary. Suppose that X· is a d-dimensional stochastic process whose almost every sample path is continuous. Then X· − X_0 is a standard F·-Brownian motion if and only if X· is F·-adapted and f(X·) − R[∆f(X)dλ]·/2 is an F·-martingale for each f ∈ C^∞_0(R^d).

Proof. Let g ∈ C^∞(R^d). We show that t ↦ g(X_t) − R[∆g(X)dλ]_t/2 is an F·-local martingale. Once this is established, choosing g as the coordinate functions x ↦ x^i and their quadratic monomials x ↦ x^i x^j, we infer with the help of Lemma 10.6(ii) that

X· is an F·-local martingale and δ_{ij}λ· ∈ Crv[X^i, X^j ;F·].

Then the claim derives from Theorem 12.9. The following are F· ∨ Null(P)-optional times:

τ(k) := inf{t ∈ R≥0 : |X_t| > k}, k ∈ N.

Given k ∈ N, there exists f ∈ C^∞_0(R^d) such that g(x) = f(x) for all x ∈ R^d with |x| < k + 1.

We select and fix Ω0 ∈ F such that P(Ω0) = 1 and t ↦ X_t(ω) is continuous for all ω ∈ Ω0. If ω ∈ Ω0 and τ(k, ω) > 0 then |X_t(ω)| ≤ k for all t ∈ R≥0 with t ≤ τ(k, ω). It follows that

g(X_{t∧τ(k)}) − g(X_0) = f(X_{t∧τ(k)}) − f(X_0) and ∫_0^{t∧τ(k)} ∆g(X_s) ds = ∫_0^{t∧τ(k)} ∆f(X_s) ds

for all t ∈ R≥0 on Ω0. This means that

g(X_{t∧τ(k)}) − g(X_0) − (1/2) R[∆g(X)dλ]_{t∧τ(k)} = f(X_{t∧τ(k)}) − f(X_0) − (1/2) R[∆f(X)dλ]_{t∧τ(k)} for all t ∈ R≥0 a.s.

The right hand side is a stopped process of the F·-martingale f(X·) − R[∆f(X)dλ]·/2 − f(X_0). Therefore g(X·∧τ(k)) − (1/2) R[∆g(X)dλ]·∧τ(k) − g(X_0) is an F· ∨ Null(P)-martingale. On the other hand E[|g(X_0)| ; τ(k) > 0] ≤ sup_{x:|x|≤k} |g(x)| < +∞. It follows that k ↦ τ(k) is a reducing sequence for the F·-local martingale g(X·) − R[∆g(X)dλ]·/2.


13 Burkholder-Davis-Gundy inequality

Let (Ω,F, P) be a complete probability space and F· be a filtration of sub σ-fields. The index set is R≥0. The following is known under the keyword domination relation, and the inequality (i) is called the Lenglart inequality.

13.1 Lemma. Suppose that X· is an F·-adapted process such that almost every sample path is right continuous and X_t ≥ 0 a.s. for all t ∈ R≥0, A· is an F·-increasing process such that almost every sample path is continuous, and τ is an F· ∨ Null(P)-optional time. If

E[X_σ ; X_0 ≤ λ] ≤ E[X_0 ; X_0 ≤ λ] + E[A_σ]

for all bounded F·-optional times σ and λ ∈ R>0 then the following hold:

(i) P(sup_{t≤τ} X_t > λ) ≤ E[X_0 ∧ λ]/λ + E[A_τ ∧ ε]/λ + P(A_τ > ε) ∀ε ∈ R>0.

(ii) E[f(sup_{t≤τ} X_t)] ≤ E[X_0 ∫_{[X_0,+∞)} df/x + f(X_0) ; X_0 > 0] + E[A_τ ∫_{[A_τ,+∞)} df/x + 2f(A_τ) ; A_τ > 0],

where f is any left continuous, non-decreasing function R≥0 → R with f(0) = 0 = f(0+).

(iii) E[(sup_{t≤τ} X_t)^p] ≤ (1/(1−p)) E[X_0^p] + ((2−p)/(1−p)) E[A_τ^p] for all p ∈ R(0,1).

Proof. There exists Ω0 ∈ F such that P(Ω0) = 1 and the following holds for every ω ∈ Ω0:

X·(ω) is right continuous and X·(ω) ≥ 0 on R≥0, and A·(ω) is finite valued, continuous and non-decreasing with A_0(ω) = 0.

The process with sample path t ↦ 1_{Ω0}X_t (respectively t ↦ 1_{Ω0}A_t) is F· ∨ Null(P)-adapted. Given λ ∈ R>0 and ε ∈ R>0, we introduce the following F· ∨ Null(P)-optional times:

T := inf{t ∈ R≥0 : 1_{Ω0}X_t > λ}, S := inf{t ∈ R≥0 : 1_{Ω0}A_t > ε}.

According to Lemma 4.8, there exist F·-optional times which agree almost surely with T, S and τ; replacing the latter by these versions, we may assume that T, S and τ are themselves F·-optional times.

Observe that {T < +∞} ⊂ Ω0. Due to the right continuity of t ↦ 1_{Ω0}X_t,

X_T ≥ λ on {T < +∞}.

To save space we write B := {X_0 ≤ λ}. Let n ∈ N. We have that X_{T∧τ∧n} ≥ 0 on Ω0. It therefore follows that

P(T ≤ τ, T < n, B) ≤ E[X_{T∧τ∧n} ; T ≤ τ, T < n, B]/λ ≤ E[X_{T∧τ∧n} ; B ∩ Ω0]/λ.

On the other hand, T ∧ τ ∧ n being a bounded F·-optional time,

E[X_{T∧τ∧n} ; B] ≤ E[X_0 ; B] + E[A_{T∧τ∧n}] = E[X_0 ; B] + E[A_{T∧τ∧n} ; Ω0] ≤ E[X_0 ; B] + E[A_τ ; Ω0].


Here the last inequality is due to the monotonicity of A· on Ω0. Consequently

P(T ≤ τ, T < n, X_0 ≤ λ) ≤ E[X_0 ; X_0 ≤ λ]/λ + E[A_τ ; Ω0]/λ for all n ∈ N.

The monotone convergence theorem shows

P(T ≤ τ, T < +∞, X_0 ≤ λ) ≤ E[X_0 ; X_0 ≤ λ]/λ + E[A_τ ; Ω0]/λ.

Thus we reach

(?) P(sup_{t≤τ} X_t > λ, X_0 ≤ λ, Ω0) ≤ E[X_0 ; X_0 ≤ λ]/λ + E[A_τ ; Ω0]/λ,

since sup{1_{Ω0}X_t ; t ≤ τ} > λ implies T < τ or 'T = τ, T < +∞ and X_T > λ'. Applying (?) with τ replaced by τ ∧ S, we get

P(sup_{t≤τ∧S} X_t > λ, X_0 ≤ λ, Ω0) ≤ E[X_0 ; X_0 ≤ λ]/λ + E[A_{τ∧S} ; Ω0]/λ.

The left hand side dominates

P(sup_{t≤τ} X_t > λ, τ ≤ S, X_0 ≤ λ, Ω0).

Consequently it follows that

P(sup_{t≤τ} X_t > λ, Ω0) ≤ E[X_0 ; X_0 ≤ λ]/λ + E[A_{τ∧S} ; Ω0]/λ + P(τ > S) + P(X_0 > λ).

According to Lemma 8.13(iii), A_{τ∧S} = A_τ ∧ ε on Ω0 and {τ > S} = {A_τ > ε} ∩ Ω0.

(ii) Put ε = λ and apply Lemma 13.3.

(iii) If f : x ↦ x^p then f(a) + a ∫_{[a,+∞)} df/x = a^p/(1−p) for a ∈ R>0.

13.2 Remark. Let X· be an F·-adapted process whose every sample path is continuous and non-negative valued, A· an F·-adapted process whose every sample path is finite valued, continuous, non-decreasing and takes the value 0 at time 0, and τ an F·-stopping time. If E[X_σ ; X_0 ≤ λ] ≤ E[X_0 ; X_0 ≤ λ] + E[A_σ] for all bounded F·-stopping times σ then

P(sup_{t<τ} X_t > λ) ≤ E[X_0 ∧ λ]/λ + E[A_τ ∧ ε]/λ + P(A_τ ≥ ε).

Proof. We introduce the following stopping times:

T := inf{t ∈ R≥0 : X_t ≥ λ}, S := inf{t ∈ R≥0 : A_t ≥ ε}.

Note that sup{X_t ; t ∈ R[0,τ)} > λ implies T < τ and that {τ ≥ S} = {A_τ ≥ ε}.

13.3 Lemma. Let X be a non-negative random variable and f : R≥0 → R be a left continuous and non-decreasing function such that f(0+) = 0 (with f(+∞) := sup f). Then

E[f(X) ; X > 0] = ∫_(0,+∞) P(X > ·) df and E[X ∫_{[X,+∞)} df/x ; X > 0] = ∫_(0,+∞) (E[X ; X ≤ x]/x) df.

Proof. Fubini’s theorem.
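
A tiny quadrature check of the first identity (illustrative only, with the hypothetical choices f(x) = x^2 and X exponentially distributed): then E[f(X) ; X > 0] = 2, while the right hand side becomes ∫_0^∞ P(X > x) df = ∫_0^∞ e^{−x}·2x dx.

```python
# f(x) = x**2, X ~ Exp(1): E[f(X)] = 2, and the layer-cake side
# int_0^inf P(X > x) f'(x) dx is computed by a trapezoid rule.
import numpy as np

x = np.linspace(0.0, 50.0, 500001)
g = np.exp(-x) * 2.0 * x                            # P(X > x) * f'(x)
rhs = np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(x))   # trapezoid quadrature
print("E[f(X)] = 2,  quadrature of int P(X>x) df =", rhs)
```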


13.4 Lemma. Suppose that M· is an F·-adapted process with almost surely right continuous sample path and A· is an F·-increasing process such that |M·|^2 − A· is an F·-local martingale. If σ is an F· ∨ Null(P)-optional time and σ < +∞ a.s. then

E[|M_σ|^2 ; B] ≤ E[|M_0|^2 ; B] + E[A_σ ; B] ≤ E[sup{|M_t|^2 ; t ≤ σ} ; B] for all B ∈ F_{0+}.

Proof. See the proof of Theorem 7.17(iii). Invoke Fatou’s lemma instead.

13.5 Theorem. Suppose that M· is an F·-adapted process with almost surely right continuous sample path, A· is an F·-increasing process such that |M·|^2 − A· is an F·-local martingale, and τ is an F· ∨ Null(P)-optional time.
(i) If almost every sample path of A· is continuous then

P(sup_{t≤τ} |M_t| > λ) ≤ E[|M_0|^2 ∧ λ^2]/λ^2 + E[A_τ ∧ ε]/λ^2 + P(A_τ > ε) for all λ, ε ∈ R>0, and

E[sup_{t≤τ} |M_t|^p] ≤ (2/(2−p)) E[|M_0|^p] + ((4−p)/(2−p)) E[A_τ^{p/2}] for all p ∈ R(0,2).

(ii) If almost every sample path of M· is continuous and M_0 = 0 a.s. then

P(A_τ > λ) ≤ E[sup_{t≤τ} |M_t|^2 ∧ ε]/λ + P(sup_{t≤τ} |M_t|^2 > ε) for all λ, ε ∈ R>0, and

E[A_τ^{p/2}] ≤ ((4−p)/(2−p)) E[sup_{t≤τ} |M_t|^p] for all p ∈ R(0,2).

Proof. (i) We get the claim by Lemma 13.4 and Lemma 13.1.

(ii) Since t ↦ sup{|M_s|^2 ; s ≤ t} is an F·-increasing process with almost surely continuous sample path, we get the claim by Lemma 13.4 and Lemma 13.1.

Recall that Martc(F·) denotes the space of all F·-local martingales with almost surely continuous sample path.

13.6 Example. Suppose that M^n_· is a sequence in Martc(F·), A^n_· ∈ Qvar[M^n_· ;F·] and τ ∈ Time(F· ∨ Null(P)). Then sup{|M^n_t| ; t ∈ R≥0, t ≤ τ} converges to 0 in probability if and only if both M^n_0 and A^n_τ converge to 0 in probability.

So far the martingale property of M· itself has been redundant. The lack of the martingale property prevents us from extending the moment estimate to the case p = 2 in Theorem 13.5.

13.7 Example. Suppose that M· is an F·-local martingale such that almost every sample path is right continuous, A· is an F·-increasing process such that almost every sample path is continuous and |M·|^2 − A· is an F·-local martingale, and τ ∈ Time(F· ∨ Null(P)).
(i) If E[|M_0| ; τ > 0] < +∞ and E[√(A_{t∧τ})] < +∞ for all t ∈ R>0 then t ↦ M_{t∧τ} − M_0 is an F· ∨ Null(P)-martingale.
(ii) If τ < +∞ a.s., E[|M_0| ; τ > 0] < +∞ and E[√(A_τ)] < +∞ then E[|M_τ| ; τ > 0] < +∞, E[M_τ ; τ > 0] = E[M_0 ; τ > 0] and E[|M_τ|^2 ; τ > 0] = E[|M_0|^2 ; τ > 0] + E[A_τ].


Proof. We show (ii). According to Theorem 13.5(i), the family {M_{σ∧τ} 1_{τ>0} ; σ ∈ Time(F·)} is uniformly integrable. Let σ ∈ Time(F· ∨ Null(P)). We infer by Lemma 7.15 that

M_{σ∧τ} 1_{τ>0} ∈ E[M_τ 1_{τ>0} | (F· ∨ Null(P))_{σ+}].

In particular E[M_τ 1_{τ>0}] = E[M_0 1_{τ>0}]. Theorem 1.13 tells us that

E[|M_{σ∧τ} 1_{σ∧τ>0}|^2] ≤ E[|M_{σ∧τ} 1_{τ>0}|^2] ≤ E[|M_τ 1_{τ>0}|^2].

The left hand side equals E[|M_0 1_{σ∧τ>0}|^2] + E[A_{σ∧τ}] provided the latter is finite, according to Theorem 7.17(iii). For each n ∈ N the following is a bounded F· ∨ Null(P)-optional time:

σ(n) := inf{t ∈ R≥0 : A_t > n} if |M_0| ≤ n, and σ(n) := 0 if |M_0| > n.

Lemma 8.13(ii) shows that σ(·) is non-decreasing and sup σ(·) = +∞. Hence E[A_{σ(n)∧τ}] tends to E[A_τ] by the monotone convergence theorem. Thus

E[|M_0 1_{τ>0}|^2] + E[A_τ] ≤ E[|M_τ 1_{τ>0}|^2].

The converse inequality E[|M_τ 1_{τ>0}|^2] ≤ E[|M_0 1_{τ>0}|^2] + E[A_τ] holds by Lemma 13.4.

13.8 Remark. The difference between Theorem 7.17 and the present situation is that A· is merely right continuous in the former while A· is continuous in the latter.

13.9 Lemma. Suppose that M· ∈ Martc(F·), M_0 = 0 a.s., A ∈ Qvar[M· ;F·] and σ is an F· ∨ Null(P)-optional time with σ < +∞ a.s. Then the following hold:
(i) If E[√(A_σ)] < +∞ then E[A_σ^p] ≤ p^p E[|M_σ|^{2p}] for all p ∈ R≥1.
(ii) E[|M_σ|^{2p}] ≤ {p(2p−1)}^p {2p/(2p−1)}^{2p(p−1)} E[A_σ^p] for all p ∈ R≥1.
(iii) E[|M_σ|^{2p}] ≤ {p(2p−1)}^p E[A_σ^p] for all p ∈ R≥3/2.

Proof. (i) Let p ∈ R>1. According to Example 12.3, |M·|^2 ∈ Ito[2MdM] + A·. Since A· is continuous and nondecreasing a.s., we have that pR[A^{p−1}dA]_t = A^p_t for all t ∈ R≥0 a.s. We therefore see by Example 12.3 that

|M·|^2 A^{p−1}_· ∈ Ito[A^{p−1}2MdM] + (1/p) A^p_· + R[|M|^2 dA^{p−1}]·

with Lemma 12.1(ii), (iii), Lemma 12.2(i) and Lemma 8.16 taken into account. The second and the third terms are increasing processes. Let τ be a bounded F· ∨ Null(P)-optional time such that

sup{|M_t(ω)| ; t ∈ R[0,τ], ω ∈ Ω} < +∞ and sup{|A_t(ω)| ; t ∈ R[0,τ], ω ∈ Ω} < +∞.

It then follows that E[R[|A^{p−1}2M|^2 dA]_τ] < +∞. This implies that

(1/p) E[A^p_τ] ≤ E[R[|M|^2 dA^{p−1}]_τ + (1/p) A^p_τ] = E[|M_τ|^2 A^{p−1}_τ]


by Corollary 11.5(ii) and Example 13.7(ii). Invoking Hölder's inequality we get

E[A_τ^p] ≤ p E[|M_τ|^{2p}]^{1/p} E[A_τ^p]^{(p−1)/p}.

If E[A_τ^p] = 0 then E[|M_τ|^2] ≤ E[A_τ] = 0 by Lemma 13.4 and hence M_τ = 0 a.s. Otherwise, dividing by E[A_τ^p]^{(p−1)/p}, we get the claim (i) for optional times τ with the additional requirements. As for general optional times with E[√(A_σ)] < +∞, since M_0 = 0 a.s., it follows by the proof of Example 13.7 that M_σ is integrable and

M_{τ∧σ} ∈ E[M_σ | (F· ∨ Null(P))_{(τ∧σ)+}].

Invoking the claim (i) for the optional time τ ∧ σ and Theorem 1.13, we see that

E[A_{τ∧σ}^p] ≤ p^p E[|M_{τ∧σ}|^{2p}] ≤ p^p E[|M_σ|^{2p}].

The inequality above is valid for τ(k) in place of τ where

τ(k) := inf{t ≥ 0 : |M_t| > k} ∧ inf{t ≥ 0 : A_t > k} ∧ k for k ∈ N.

Letting k tend to ∞ we get the full statement by the monotone convergence theorem.

(ii) & (iii) Fix δ ∈ R>0 and set f : x ↦ (δ + x^2)^p where p > 1. It follows that

f′(x) = 2p(δ + x^2)^{p−1}x and f′′(x) = 2p(δ + x^2)^{p−2}(δ + (2p−1)x^2).

Invoking Itô's formula we get

(δ + |M·|^2)^p ∈ δ^p + Ito[2p(δ + |M|^2)^{p−1}MdM] + R[p(δ + |M|^2)^{p−2}(δ + (2p−1)|M|^2)dA]·.

Let τ be as above. It follows that

E[(δ + |M_τ|^2)^p] = δ^p + p E[R[(δ + |M|^2)^{p−2}(δ + (2p−1)|M|^2)dA]_τ].

Since 2p − 1 > 1, the integrand in the last term is dominated by (2p−1)(δ + |M_s|^2)^{p−1}. Therefore letting δ tend to 0 we get by the dominated convergence theorem that

E[|M_τ|^{2p}] = p(2p−1) E[R[|M|^{2p−2}dA]_τ].

Up to the factor p(2p−1), the right hand side is dominated by

E[sup{|M_s| ; s ∈ R[0,τ]}^{2p−2} A_τ] ≤ E[sup{|M_s| ; s ∈ R[0,τ]}^{2p}]^{(p−1)/p} E[A_τ^p]^{1/p},

where Hölder's inequality is applied. We reach that

E[|M_τ|^{2p}] ≤ p(2p−1) {2p/(2p−1)}^{2p−2} E[|M_τ|^{2p}]^{(p−1)/p} E[A_τ^p]^{1/p}

by applying Lemma 6.2 to the F·-submartingale |M·∧τ|. When p ∈ R≥3/2, since |M·∧τ|^{2p−2} is an F·-submartingale, another argument works. First observe that, as n tends to ∞,

∑_{i:i<nτ} |M_{τ∧(i+1)/n}|^{2p−2}(A_{τ∧(i+1)/n} − A_{τ∧i/n}) converges to R[|M|^{2p−2}dA]_τ a.s.


Since the integrand is non-negative, it follows by Fatou's lemma that

E[R[|M|^{2p−2}dA]_τ] ≤ lim inf_{n→∞} ∑_{i=0}^∞ E[|M_{τ∧(i+1)/n}|^{2p−2}(A_{τ∧(i+1)/n} − A_{τ∧i/n})].

The submartingale property of |M·∧τ|^{2p−2} and the monotonicity of A· yield that

E[|M_{τ∧(i+1)/n}|^{2p−2}(A_{τ∧(i+1)/n} − A_{τ∧i/n})] ≤ E[|M_τ|^{2p−2}(A_{τ∧(i+1)/n} − A_{τ∧i/n})].

Consequently we obtain that

E[R[|M|^{2p−2}dA]_τ] ≤ E[|M_τ|^{2p−2} A_τ] ≤ E[|M_τ|^{2p}]^{(p−1)/p} E[A_τ^p]^{1/p}.

If M_τ = 0 a.s. then E[A_τ] = E[|M_τ|^2] = 0 and hence A_τ = 0 a.s. Otherwise, dividing by E[|M_τ|^{2p}]^{(p−1)/p}, we get the claims (ii) and (iii). As for general optional times σ we start from

E[|M_{σ∧τ(k)}|^{2p}] ≤ {p(2p−1)}^p E[A_{σ∧τ(k)}^p] ≤ {p(2p−1)}^p E[A_σ^p].

We reach the full statement by Fatou's lemma.

13.10 Example. The Burkholder-Davis-Gundy inequality: Suppose that M· ∈ Martc(F·), M_0 = 0 a.s., A ∈ Qvar[M· ;F·], τ ∈ Time(F· ∨ Null(P)), q ∈ R≥3/2 and p ∈ R(0,2q). Then

E[(A_τ)^{p/2}] ≤ ((4q−p)/(2q−p)) q^{p/2} E[sup_{t<τ} |M_t|^p] and E[sup_{t<τ} |M_t|^p] ≤ ((4q−p)/(2q−p)) {q(2q−1)}^{p/2} E[(A_τ)^{p/2}].

Proof. The claims (i) and (iii) in Lemma 13.9 read

E[A_σ^q] ≤ q^q E[|M_σ|^{2q}] ≤ q^q E[sup_{t<σ} |M_t|^{2q}] and E[|M_σ|^{2q}] ≤ {q(2q−1)}^q E[A_σ^q]

for all F·-optional times σ with σ < +∞ a.s. and E[A_σ] < +∞. Thus we get the present claim by Lemma 13.1(ii).
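
A hedged Monte Carlo look at the two bounds (not part of the note) for M = B a standard Brownian motion on [0, T], so that A_τ = T for τ = T deterministic, with the illustrative choices p = 1 and q = 3/2 (whence (4q−p)/(2q−p) = 2.5):

```python
# BDG bounds of Example 13.10 for M = B on [0,T] (A_T = T), p = 1, q = 3/2:
#   E[sqrt(A_T)] <= 2.5 * q**0.5          * E[sup|B|]
#   E[sup|B|]    <= 2.5 * (q*(2q-1))**0.5 * E[sqrt(A_T)]
import numpy as np

rng = np.random.default_rng(6)
T, n, n_paths, p, q = 1.0, 1000, 5000, 1.0, 1.5
dt = T / n
sup_abs = np.abs(np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n)),
                           axis=1)).max(axis=1)

c = (4 * q - p) / (2 * q - p)   # = 2.5 here
print(f"E[sqrt(A_T)] = {np.sqrt(T):.3f} <= {c * q**(p/2) * sup_abs.mean():.3f}")
print(f"E[sup|B|]    = {sup_abs.mean():.3f} <= "
      f"{c * (q * (2*q - 1))**(p/2) * np.sqrt(T):.3f}")
```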


14 Sample path regularity of submartingale

Let (Ω,F, P) be a complete probability space and F· be a filtration of sub σ-fields. We discuss submartingales with continuous parameter, so the index space is R≥0.

14.1 Lemma. Suppose that Q is a subset of R, f ∈ Map(Q,R) and a and b is a pair of real numbers with a < b. If

lim inf_{s↑t} f(s) < a and lim sup_{s↑t} f(s) > b for some t ∈ Q_left, or
lim inf_{s↓t} f(s) < a and lim sup_{s↓t} f(s) > b for some t ∈ Q_right,

then

sup{♯∆ ; ∆ a family of disjoint closed intervals s.t. min J ∈ Q, max J ∈ Q, f(min J) < a, f(max J) > b ∀J ∈ ∆} = +∞.

14.2 Lemma. Suppose that Q is a subset of R, f ∈ Map(Q,R) and a and b is a pair of real numbers with a < b. If either ♯{t ∈ Q_left ∩ Q_right : lim inf_{s↑t} f(s) < a, lim sup_{s↓t} f(s) > b} = ∞ or ♯{t ∈ Q_left ∩ Q_right : lim inf_{s↓t} f(s) < a, lim sup_{s↑t} f(s) > b} = ∞ then

sup{♯∆ ; ∆ a family of disjoint closed intervals s.t. min J ∈ Q, max J ∈ Q, f(min J) < a, f(max J) > b ∀J ∈ ∆} = +∞.

14.3 Lemma. Let f ∈ Map(Q≥0,R). Define g : R≥0 → R and h : R>0 → R as follows:

g(t) := lim inf_{s↓t} f(s) for t ∈ R≥0 and h(t) := lim inf_{s↑t} f(s) for t ∈ R>0.

(i) lim inf_{s↓t} g(s) ≥ g(t) for all t ∈ R≥0 and g is Borel measurable. If g(t) = lim sup_{s↓t} f(s) for all t ∈ R≥0 then g is right continuous.
(ii) lim inf_{s↑t} h(s) ≥ h(t) for all t ∈ R>0 and h is Borel measurable. If h(t) = lim sup_{s↑t} f(s) for all t ∈ R>0 then h is left continuous.

Proof. The discussion being the same, it suffices to show (i). Let t ∈ R≥0 and n ∈ N. Then

inf_{u∈Q: t<u<t+1/n} f(u) ≤ inf_{u∈Q: s<u<t+1/n} f(u) ≤ lim inf_{u↓s} f(u) = g(s) for all s ∈ R(t,t+1/n).

This implies that inf_{u∈Q: t<u<t+1/n} f(u) ≤ inf_{s: t<s<t+1/n} g(s). Therefore, letting n tend to ∞, we get g(t) ≤ lim inf_{s↓t} g(s). To see the measurability we observe that

{t ∈ R≥0 : inf_{s∈Q: t<s≤t+1/n} f(s) < a} = ⋃_{s∈Q: s>0, f(s)<a} [max{s − 1/n, 0}, s) ∈ Borel(R≥0)

for all a ∈ R. It follows that g is Borel measurable. Changing the roles of inf and sup we get

lim sup_{s↓t} g′(s) ≤ g′(t) for all t ∈ R≥0,

where g′ : R≥0 → R, t ↦ lim sup_{s↓t} f(s). Thus if g = g′ then g is right continuous.

14.4 Lemma. Suppose that F· is a filtration with index space Q>0 and X· is an F·-adapted stochastic process with index space Q>0. Then (t, ω) ↦ lim inf_{s↓t} X_s(ω) is F·+-progressively measurable, where F_{t+} := ⋂_{s∈Q: s>t} F_s for t ∈ R≥0.


Proof. Let t ∈ R>0. If s ∈ R(0,t), n ∈ N and n ≥ 1/(t−s) then

{(u, ω) ∈ R[0,t] × Ω : u ≤ s, inf_{q∈Q: u<q≤u+1/n} X_q(ω) < a}
= ⋃_{q∈Q: 0<q≤t} {u ∈ R[0,t] : u ≤ s, q − 1/n ≤ u < q} × {ω ∈ Ω : X_q(ω) < a}

for all a ∈ R. The right hand side is a Borel(R[0,t]) ⊗ F̃_t-measurable set, where we write F̃_t := σ(⋃_{q∈Q: 0<q≤t} F_q). (The latter coincides with the old F_t if t ∈ Q.) It follows that

{(u, ω) ∈ R[0,t] × Ω : u ≤ s, lim inf_{q↓u} X_q(ω) > b} ∈ Borel(R[0,t]) ⊗ F̃_t

for all s ∈ R(0,t) and b ∈ R. Since {ω ∈ Ω : lim inf_{q↓t} X_q(ω) > b} ∈ F_{t+}, we get

{(u, ω) ∈ R[0,t] × Ω : lim inf_{q↓u} X_q(ω) > b} ∈ Borel(R[0,t]) ⊗ F_{t+}

for all b ∈ R. Consequently (t, ω) ↦ lim inf_{s↓t} X_s(ω) is F·+-progressively measurable.

14.5 Corollary. Suppose that X· is an F·+ ∨ Null(P)-adapted process whose almost every sample path is right continuous (respectively left continuous). Then there exists an F·+-progressively (respectively F·−-progressively) measurable process Y· such that X_t = Y_t for all t ∈ R≥0 a.s.

Proof. There exists Ω0 ∈ F such that P(Ω0) = 1 and

t ↦ X_t(ω) is right continuous for each ω ∈ Ω0.

Given t ∈ Q≥0, we select an F_{t+}-measurable random variable X̃_t so that X̃_t = X_t a.s. By choosing a proper subset of Ω0 if necessary, we have that X̃_t(ω) = X_t(ω) for all t ∈ Q≥0 and ω ∈ Ω0. The process t ↦ lim inf_{s↓t: s∈Q} X̃_s is a desired one by Lemma 14.4.

14.6 Lemma. Suppose that X· is an F·-submartingale, a and b is a pair of real numbers with a < b and Q is a countable subset of R≥0. Set

N(Q) := sup{♯∆ ; ∆ a family of disjoint closed intervals s.t. min J ∈ Q, max J ∈ Q, X_{min J} < a, X_{max J} > b ∀J ∈ ∆}.

Then N(Q) is F-measurable and

E[N(Q)] ≤ (sup_{t∈Q} E[max{X_t, a}] − E[max{X_0, a}])/(b − a).

Proof. If Q is a finite set then the claim holds according to Lemma 5.2. Given a countably infinite set Q, there exists a sequence F_n of finite sets such that

F_n ⊂ F_{n+1} for all n ∈ N and ⋃_{n=1}^∞ F_n = Q.

Then, since N(F_n) ≤ N(F_{n+1}) and N(Q) = sup_{n∈N} N(F_n), we get the claim by the monotone convergence theorem.
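
A small simulation sketch of the estimate (an illustration under the assumption that X_k = |S_k| for a simple random walk S, which is a submartingale, with Q = {0, 1, ..., T}): the interval count is compared with the right hand side, where E[max{X_0, a}] = a since X_0 = 0 and max{X_t, a} is again a submartingale, so the supremum over t ∈ Q is attained at t = T.

```python
# Upcrossing-type count for the submartingale X_k = |S_k| over Q = {0,...,T},
# compared with (E[max(X_T, a)] - a)/(b - a) from Lemma 14.6.
import numpy as np

rng = np.random.default_rng(7)
T, n_paths, a, b = 400, 3000, 1.0, 3.0
steps = rng.choice([-1, 1], size=(n_paths, T))
X = np.abs(np.hstack([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)]))

def crossings(path, a, b):
    # count completed moves from below a to above b along the path
    count, below = 0, False
    for x in path:
        if x < a:
            below = True
        elif x > b and below:
            count, below = count + 1, False
    return count

N = np.array([crossings(p, a, b) for p in X])
bound = (np.maximum(X[:, -1], a).mean() - a) / (b - a)
print("E[N] =", N.mean(), " <= bound =", bound)
```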


14.7 Corollary. Suppose that X· is an F·-submartingale with a.s. right continuous sample path, a and b is a pair of real numbers with a < b and T ∈ R>0. Then

N := sup{♯∆ ; ∆ a family of disjoint closed intervals s.t. max J ≤ T, X_{min J} < a, X_{max J} > b ∀J ∈ ∆}

is F-measurable and E[N] ≤ (E[max{X_T, a}] − E[max{X_0, a}])/(b − a).

Proof. There exists Ω0 ∈ F such that P(Ω0) = 1 and t ↦ X_t(ω) is right continuous for all ω ∈ Ω0. Due to the right continuity we have that N = N(Q[0,T] ∪ {T}) on Ω0. Thus, invoking Lemma 14.6, we get the claim. Note that Null(P) ⊂ F.

14.8 Theorem. Let X· be an F·-submartingale with a.s. right continuous sample path.
(i) Almost every sample path is locally bounded on R≥0, admits left-hand limits everywhere and has at most countably many discontinuity points on R>0.
(ii) If sup_{t≥0} E[max{X_t, 0}] < +∞ then X_∞ := lim inf_{t→+∞} X_t = lim sup_{t→+∞} X_t a.s. and E[|X_σ|] < +∞ for all F· ∨ Null(P)-optional times σ.
(iii) If X· is uniformly integrable then X· converges to X_∞ in the L^1-sense and

X_σ ∈ E≤[X_τ | (F· ∨ Null(P))_{σ+}]

for any pair σ and τ of F· ∨ Null(P)-optional times with σ ≤ τ a.s.

Proof. (i) derives from Lemma 6.1, Lemma 14.1, Lemma 14.2 and Corollary 14.7. The discussion for (ii) and (iii) is the same as in the proof of Theorem 5.4. A few technical points must be taken into account: since |X_σ| = lim inf_{n→∞} |X_{σ∧n}| a.s.,

E[|X_σ|] = E[lim inf_{n→∞} |X_{σ∧n}|] ≤ lim inf_{n→∞} E[|X_{σ∧n}|] ≤ 2 sup_{t≥0} E[max{X_t, 0}] − E[X_0].

We also have that E[lim inf_{t→+∞} X_t ; A] = E[lim inf_{n→∞} X_n ; A] for all A ∈ F, and

E[X_σ ; A ∩ {σ < n}] = E[X_{σ∧n} ; A ∩ {σ < n}] ≤ E[X_{τ∧n} ; A ∩ {σ < n}]

for A ∈ (F· ∨ Null(P))_{σ+} and n ∈ N, where we use Corollary 3.17(i) this time.

14.9 Remark. Let X· be an F·-submartingale with almost surely right continuous sample path. Unlike the discrete parameter case, even if the family {X_t ; t ∈ R≥0} is uniformly integrable, the family {X_σ ; σ an F·-stopping time} need not be uniformly integrable. This is the subtle point related to the Doob-Meyer decomposition theorem.

14.10 Lemma. Suppose that X· is an F·-submartingale.
(i) sup{|X_s| ; s ∈ Q[0,T]} < +∞ for all T ∈ R>0 a.s.
(ii) lim inf_{s↑t: s∈Q} X_s = lim sup_{s↑t: s∈Q} X_s for all t ∈ R>0 a.s.
(iii) E[|lim inf_{s↑t: s∈Q} X_s|] < +∞ and lim inf_{s↑t: s∈Q} X_s ∈ E≤[X_t | F_{t−}] for all t ∈ R>0.
(iv) lim inf_{s↓t: s∈Q} X_s = lim sup_{s↓t: s∈Q} X_s for all t ∈ R≥0 a.s.
(v) E[|lim inf_{s↓t: s∈Q} X_s|] < +∞ and X_t ∈ E≤[lim inf_{s↓t: s∈Q} X_s | F_t] for all t ∈ R≥0.
(vi) lim inf_{s↓t: s∈Q} X_s ∈ E≤[X_T | F_{t+}] for all t ∈ R≥0 and T ∈ R>t.

Proof. We get (i) by Lemma 6.1. (ii) and (iv) derive from Lemma 14.1 and Lemma 14.6. To show (iii), given t ∈ R>0, fix a sequence q(·) of positive rational numbers such that q(n) < q(n+1) and sup_{n∈N} q(n) = t. We have that


lim inf_{s↑t: s∈Q} X_s = lim inf_{n→∞} X_{q(n)} a.s. and |lim inf_{s↑t: s∈Q} X_s| = lim inf_{n→∞} |X_{q(n)}| a.s.

Using the latter relation we deduce that E[|lim inf_{s↑t: s∈Q} X_s|] < +∞ as in the proof of Theorem 5.6. Let Y_n ∈ E[X_t | F_{q(n)}] for n ∈ N. It then follows by Theorem 5.6 that lim inf_{n→∞} Y_n ∈ E[X_t | F_{t−}]. Since X_{q(n)} ≤ Y_n a.s. for all n ∈ N and lim inf_{s↑t: s∈Q} X_s is F_{t−}-measurable, we infer that lim inf_{s↑t: s∈Q} X_s ∈ E≤[X_t | F_{t−}].

To show (v) and (vi), given t ∈ R≥0 and T ∈ R>t, fix a sequence q(·) of rational numbers such that T ≥ q(n) > q(n+1) and inf_{n∈N} q(n) = t. According to (iv),

the sequence X_{q(·)} converges to lim inf_{s↓t: s∈Q} X_s a.s.

On the other hand, its uniform integrability derives as in the proof of Lemma 3.8. Hence lim inf_{s↓t: s∈Q} X_s is integrable. Let A ∈ F_t and B ∈ F_{t+}. Observe that

E[X_t ; A] ≤ E[X_{q(n)} ; A] and E[X_{q(n)} ; B] ≤ E[X_T ; B] for all n ∈ N.

We thus get E[X_t ; A] ≤ E[lim inf_{s↓t: s∈Q} X_s ; A] and E[lim inf_{s↓t: s∈Q} X_s ; B] ≤ E[X_T ; B].

14.11 Theorem. Suppose that X· is an F·-submartingale.
(i) The process t ↦ lim inf_{s↓t: s∈Q} X_s is an F·+-progressively measurable F·+-submartingale such that almost every sample path is locally bounded and right continuous on R≥0, admits left-hand limits on R>0 and has at most countably many discontinuity points.
(ii) If X· is right continuous in probability, or 't ↦ E[X_t] is right continuous and F_{t+} ⊂ F_t ∨ Null(P) for all t ∈ R≥0', then the process t ↦ lim inf_{s↓t: s∈Q} X_s is a modification of X·. (Being an F·+-adapted modification of an F·+-submartingale, X· is then an F·+-submartingale.)

Proof. (i) We get the progressive measurability by Lemma 14.4, the almost sure right continuity by Lemma 14.3 and Lemma 14.10(iv), and the submartingale property by Lemma 14.10(v) and (vi). Theorem 14.8 shows the other properties.

(ii) The stochastic right continuity implies X_t = lim inf_{s↓t: s∈Q} X_s a.s. for all t ∈ R≥0 by Lemma 14.10(iv). We discuss the other sufficient condition. Let t ∈ R≥0. Then, since F_{t+} ⊂ F_t ∨ Null(P), we see by Lemma 14.10(v) that

X_t ≤ lim inf_{s↓t: s∈Q} X_s a.s.

On the other hand Lemma 14.10(vi) shows

E[lim inf_{s↓t: s∈Q} X_s] ≤ E[X_T] for all T ∈ R>t.

Since t ↦ E[X_t] is right continuous, it follows that E[lim inf_{s↓t: s∈Q} X_s] ≤ E[X_t]. Consequently we obtain X_t = lim inf_{s↓t: s∈Q} X_s a.s.

Recall that our definition of F_{σ+} is not {A ∈ σ(F·) ∨ Null(P) : A ∩ {σ < t} ∈ F_t ∀t ∈ R>0} but {A ∈ F : A ∩ {σ < t} ∈ F_t ∀t ∈ R>0}, where there is no prescription for A ∩ {σ = +∞}.

14.12 Corollary. Let X be an integrable random variable. Then there exists an F·+-progressively measurable stochastic process X· such that almost every sample path is right continuous on R≥0, admits left-hand limits everywhere on R>0, and X_t ∈ E[X | F_{t+}] for all t ∈ R≥0. If X is σ(F·) ∨ Null(P)-measurable and σ is an F· ∨ Null(P)-optional time then

X_σ ∈ E[X | (F· ∨ Null(P))_{σ+}], where X_∞ := X.


Proof. Set X_∞ := lim inf_{t→+∞} X_t. Then we see by Theorem 14.8(iii) that

X_σ ∈ E[X_∞ | (F· ∨ Null(P))_{σ+}].

Let A ∈ ⋃_{t≥0} F_t. There exists n ∈ N such that A ∈ F_{n+}. It follows that

E[X_∞ ; A] = E[X_n ; A] = E[X ; A].

The monotone class theorem shows that E[X_∞ ; A] = E[X ; A] holds for all A ∈ σ(F·). Since X is σ(F·) ∨ Null(P)-measurable, this implies that X_∞ = X a.s.

15 Natural increasing process and martingale

Let (Ω,F, P) be a complete probability space and F· be a filtration of sub σ-fields. We discuss the relation between martingales and finite variation processes.

Given v : R≥0 → R and t ∈ R≥0 we set

var+(v)_t := lim_{s↓t: s∈Q} var+(dv|_Q ; R(0,s]) if lim_{s↓0: s∈Q} var+(dv|_Q ; R(0,s]) = 0, and var+(v)_t := 0 otherwise;

var−(v)_t := lim_{s↓t: s∈Q} var−(dv|_Q ; R(0,s]) if lim_{s↓0: s∈Q} var−(dv|_Q ; R(0,s]) = 0, and var−(v)_t := 0 otherwise.

Here var+(dv|_Q ; R(0,s]) = +∞ if v(a) ∈ {+∞,−∞} for some a ∈ Q(0,s].

15.1 Lemma. (i) The functions t ↦ var+(v)_t and t ↦ var−(v)_t are right continuous, non-decreasing and take the value 0 at time 0.
(ii) Suppose that v : R≥0 → R is right continuous and of finite variation. Then

var+(dv ; R(0,t]) = var+(v)_t and var−(dv ; R(0,t]) = var−(v)_t for all t ∈ R≥0.

If v is continuous in addition then t ↦ var+(v)_t and t ↦ var−(v)_t are continuous.

15.2 Corollary. Let A· be an F·-finite variation process.
(i) The processes var+(A·)· and var−(A·)· are F·+-progressively measurable as well as F· ∨ Null(P)-progressively measurable, and right continuous almost surely.
(ii) var+(dA· ; R(0,t]) = var+(A·)_t and var−(dA· ; R(0,t]) = var−(A·)_t for all t ∈ R≥0 a.s.
(iii) If A· is continuous a.s. then so are var+(A·)· and var−(A·)·.
(iv) If every sample path of A· is right continuous and of finite variation then var+(A·)· and var−(A·)· are F·-adapted.

Proof. There exists Ω0 ∈ F such that P(Ω0) = 1 and t ↦ A_t(ω) is right continuous and of finite variation for all ω ∈ Ω0. We then have that

var+(A·(ω))_t = var+(dA·(ω) ; R(0,t]) = var+(dA·(ω)|_{Q∪{t}} ; R(0,t])

for all t ∈ R≥0 and ω ∈ Ω0.


Given a metrizable topological space E we denote by D(R≥0, E) the set of all right continuous mappings R≥0 → E which admit left-hand limits everywhere.

15.3 Lemma. If w ∈ D(R≥0,R) then w is locally bounded and {t ∈ R>0 : |w(t) − w(t−)| > ε} is locally finite for all ε ∈ R>0.

Proof. Suppose that T ∈ N, g : R[0,T] → R is right continuous and sup_{t∈[0,T]} |g(t)| = +∞. Then {t ∈ Q[0,T] : |g(t)| > k} is a non-empty set for all k ∈ N. Exploiting an enumeration of Q (this is simply to avoid the axiom of choice) we can choose a sequence a· such that

a_k ∈ Q[0,T] and |g(a_k)| > k for all k ∈ N.

Then clust(a·) ≠ ∅ due to the sequential compactness of R[0,T]. Let t ∈ clust(a·). Since t ∈ R[0,T] and g is right continuous, {k ∈ N : t ≤ a_k < t + δ} is a finite set for some δ ∈ R>0. Hence {k ∈ N : t − ε ≤ a_k < t} is an infinite set for all ε ∈ R>0. Thus we infer that t > 0 and lim sup_{s↑t} |g(s)| = +∞, which denies the existence of the left-hand limit at t ∈ R(0,T].

Suppose that ε ∈ R>0 and a ∈ R>0. Due to the right continuity and the existence of left-hand limits there exists δ ∈ R(0,a) such that

|w(t) − w(a)| ≤ ε/2 for all t ∈ R[a,a+δ] and |w(t) − w(a−)| ≤ ε/2 for all t ∈ R[a−δ,a).

It follows that |w(t−) − w(a)| ≤ ε/2 for all t ∈ R(a,a+δ] and |w(t−) − w(a−)| ≤ ε/2 for all t ∈ R(a−δ,a). We thus infer that |w(t) − w(t−)| ≤ ε for all t ∈ R(a,a+δ] and t ∈ R(a−δ,a).

15.4 Corollary. Suppose that E is metrizable, ρ is a compatible metric on E, w ∈ D(R≥0, E) and F is a closed set. Then

inf{s ∈ R≥0 : w(s) ∈ F} ≤ t or inf{s ∈ R>0 : w(s − o) ∈ F} ≤ t ⇔ inf_{s≤t} ρ(w(s), F) = 0.

Proof. If f : R≥0 → R is right continuous and admits left-hand limits everywhere then

∃s ∈ R[0,t] s.t. m := inf_{u∈[0,t]} f(u) = f(s), or ∃s ∈ R(0,t] s.t. m = f(s − o).

Indeed we may assume that f(0) > m and f(t) > m, else the above is obvious. Since m is finite by Lemma 15.3, there exists a sequence b· such that b_k ∈ Q(0,t) and f(b_k) < m + 1/k for all k ∈ N. Let s ∈ clust(b·) ≠ ∅. It follows that either {k ∈ N : s ≤ b_k < s + ε} is an infinite set for all ε ∈ R>0 or {k ∈ N : s − ε ≤ b_k < s} is an infinite set for all ε ∈ R>0. If the former holds then f(s) = m and 0 < s < t. Otherwise f(s − o) = m and 0 < s ≤ t. We apply this result to t ↦ ρ(w(t), F). Then we get

inf_{s≤t} ρ(w(s), F) = 0 ⇔ ∃s ∈ R[0,t] s.t. w(s) ∈ F, or ∃s ∈ R(0,t] s.t. w(s − o) ∈ F.

Taking into account that lim_{s↓t} w(s − o) = w(t) for all t ∈ R≥0, we reach the statement.

15.5 Remark. If w : R≥0 → E is right continuous then

inf_{s≤t} ρ(w(s), F) = min{inf_{s∈Q: s≤t} ρ(w(s), F), ρ(w(t), F)}.

15.6 Lemma. Suppose w ∈ D(R≥0,R) and f : R≥0 → R is of finite variation.
(i) If f is left continuous then ∑_{i:i<2^n t} {w(t∧(i+1)/2^n) − w(t∧i/2^n)} f(i/2^n) converges to −∫_[0,t] w df + f(t+o)w(t) − f(0)w(0) as n tends to ∞, for all t ∈ R≥0.
(ii) If f is right continuous then ∑_{i:i<2^n t} {w(t∧(i+1)/2^n) − w(t∧i/2^n)} f(i/2^n) converges to −∫_(0,t] w df + f(t)w(t) − f(0)w(0) as n tends to ∞, for all t ∈ R≥0.


Proof. (i) Fix n ∈ N and write φ(i) = i/2^n for i ∈ Z≥0. We see that

∑_{i:φ(i)<t} {w(t∧φ(i+1)) − w(φ(i))} f(φ(i)) + ∑_{i:φ(i)<t} {f(t∧φ(i+1)) − f(φ(i))} w(t∧φ(i+1))

equals f(t)w(t) − f(0)w(0). Note that −∫_[0,t) w df + f(t)w(t) = −∫_[0,t] w df + f(t+o)w(t).

15.7 Corollary. Suppose that f is an F·+-adapted process whose almost every sample path is of finite variation and left continuous (respectively right continuous). If X· is an F·-adapted process with almost sure D(R≥0,R)-sample path then so is t ↦ R[fdX]_t.

We see that R[1_(a,b] dX]_t = X_{t∧b} − X_{t∧a} = R[1_[a,b) dX]_t and R[1_[0,b] dX]_t = X_{t∧b} − X_0.

Given v : R≥0 → R and t ∈ R≥0 we set var(v)_t := var+(v)_t + var−(v)_t.

15.8 Lemma. Suppose that M· is an F·-submartingale (respectively F·-local martingale) with almost surely right continuous sample path, f· is a non-negative F·-finite variation process (respectively an F·-finite variation process) and σ is an F· ∨ Null(P)-optional time. If

E[sup_{s∈Q: s≤t∧σ} |M_s| (var(f·)_{t∧σ} + |f_0|) ; σ > 0] < +∞ for all t ∈ R≥0

then t ↦ R[fdM]_{t∧σ} is an F· ∨ Null(P)-submartingale (respectively F· ∨ Null(P)-martingale).

Proof. We see by Corollary 15.7 and Corollary 3.14 that t ↦ R[fdM]_{t∧σ} is F· ∨ Null(P)-adapted. Set M̄· := M·∧σ and f̄· := f·∧σ. Then we see by Lemma 15.6(ii) that

(?) ∑_{i∈Z: 0≤i<2^n t} (M̄_{t∧(i+1)/2^n} − M̄_{t∧i/2^n}) f̄_{i/2^n} converges to R[fdM]_{t∧σ} a.s.

Since M̄· is an F· ∨ Null(P)-submartingale by Corollary 3.17, f̄_{i/2^n} is F_{i/2^n} ∨ Null(P)-measurable and non-negative a.s., and both M̄_{t∧(i+1)/2^n} f̄_{i/2^n} and M̄_{t∧i/2^n} f̄_{i/2^n} are integrable, it follows by the proof of Lemma 8.2 that

t ↦ (M̄_{t∧(i+1)/2^n} − M̄_{t∧i/2^n}) f̄_{i/2^n} is an F· ∨ Null(P)-submartingale.

Let n ∈ N. Invoking Abel's trick, we see that

∑_{i∈Z: 0≤i<2^n t} (M̄_{t∧(i+1)/2^n} − M̄_{t∧i/2^n}) f̄_{i/2^n} = M̄_t f̄_t − ∑_{i∈Z: 0≤i<2^n t} (f̄_{t∧(i+1)/2^n} − f̄_{t∧i/2^n}) M̄_{t∧(i+1)/2^n} − M̄_0 f̄_0.

Since 2 sup_{s≤t∧σ} |M_s| (var(f·)_{t∧σ} + |f_0|) 1_{σ>0} is a dominating function, the convergence (?) occurs in L^1. Being an L^1-limit of a sequence of submartingales, the process t ↦ R[fdM]_{t∧σ} is an F· ∨ Null(P)-submartingale.

Let S(·) be a reducing sequence for the local martingale M·. It follows that

t ↦ R[fdM]_{t∧σ∧S(k)} is an F· ∨ Null(P)-martingale.

The function 2 sup_{s≤t∧σ} |M_s| (var(f·)_{t∧σ} + |f_0|) 1_{σ>0} serves as a dominating function. Letting k tend to ∞, we infer that t ↦ R[fdM]_{t∧σ} is an F· ∨ Null(P)-martingale.


15.9 Lemma. Suppose f· is an F·-adapted process whose almost every sample path is locally bounded and admits left hand limits everywhere, M· is not only an F·-local martingale but also an F·-finite variation process, and σ is an F· ∨ Null(P)-optional time. If

E[sup_{s∈Q: s<t∧σ} |f_s| var(M·)_{t∧σ}] < +∞ for all t ∈ R>0

then R[fdM]·∧σ is an F· ∨ Null(P)-martingale.

Proof. Let S(·) be a reducing sequence for M·. Then t ↦ M_{t∧S(n)} − M_0 is an F· ∨ Null(P)-martingale. In what follows set M̄· := M·∧σ∧S(n) and f̄· := f·∧σ∧S(n). Then

∑_{i∈Z: 0≤i<2^m t} (M̄_{t∧(i+1)/2^m} − M̄_{t∧i/2^m}) f̄_{i/2^m} converges to R[fdM]_{t∧σ∧S(n)} a.s.

as m tends to ∞ by Lemma 9.6. Since sup_{s<t∧σ} |f_s| var(M·)_{t∧σ} is a dominating function, the above convergence occurs in L^1. Repeating the discussion for Lemma 15.8, we infer that R[fdM]·∧σ is an F· ∨ Null(P)-martingale.

15.10 Lemma. (i) If f : R≥0 → R is locally bounded and right continuous, and v : R≥0 → R is of finite variation and right continuous then

∑_{i: 0≤i<2^n t} {v(t ∧ (i+1)/2^n) − v(t ∧ i/2^n)} f(t ∧ (i+1)/2^n) converges to ∫_{(0,t]} f dv for all t ∈ R≥0.

(ii) If f : R≥0 → R is of finite variation and right continuous, and w ∈ D(R≥0,R) then

∑_{i: 0≤i<2^n t} {w(t ∧ (i+1)/2^n) − w(t ∧ i/2^n)} f(t ∧ (i+1)/2^n)

converges to f(t)w(t) − f(0)w(0) − ∫_{(0,t]} w(· − o) df for all t ∈ R≥0.

Given t ∈ R≥0 and finite valued processes f· and X·, we set

RfdXt := lim inf_{n→∞} ∑_{i∈Z: 0≤i<2^n t} (X_{t∧(i+1)/2^n} − X_{t∧i/2^n}) f_{t∧(i+1)/2^n}.

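As a numerical illustration of the two dyadic sums (this sketch is not part of the original note; the function and variable names are ad hoc), one can compare the left-endpoint sums behind R[fdX]t with the right-endpoint sums behind RfdXt on a concrete path:

    def dyadic_sum(f, X, t, n, endpoint):
        """n-th dyadic Riemann-Stieltjes sum of f against X on (0, t].

        endpoint == "left" evaluates f at i/2^n (the sums behind R[fdX]_t);
        endpoint == "right" evaluates f at t ∧ (i+1)/2^n (the sums behind RfdX_t).
        """
        total, i = 0.0, 0
        while i / 2**n < t:
            lo, hi = min(t, i / 2**n), min(t, (i + 1) / 2**n)
            s = i / 2**n if endpoint == "left" else hi
            total += (X(hi) - X(lo)) * f(s)
            i += 1
        return total

    # Example: X has a single unit jump at time 1 (right continuous), f(s) = s.
    X = lambda s: 1.0 if s >= 1.0 else 0.0
    f = lambda s: s
    for n in (4, 8, 12):
        print(n, dyadic_sum(f, X, 2.0, n, "left"), dyadic_sum(f, X, 2.0, n, "right"))

Here both sums converge to 1 = f(1) × (jump of X at 1); the two limits differ only when f and X jump at a common time.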
15.11 Corollary. Suppose that Y· is an F·-adapted process whose almost every sample path is right continuous and locally bounded, and G· is an F·-finite variation process.
(i) t 7→ RY dGt is an F·-finite variation process and

YtGt − Y0G0 − RY dGt = R[GdY ]t for all t ∈ R≥0 a.s.

(ii) If almost every sample path of Y· admits left-hand limits everywhere then

YtGt − Y0G0 − R[Y dG]t = RGdYt for all t ∈ R≥0 a.s.

Proof. Lemma 15.6(ii) and Lemma 15.10.


Given t ∈ R≥0 and locally bounded processes Y· and Z·, we set

jmpv(Y, Z)t := ∑_{s: 0<s≤t} max{(Ys − Ys−)(Zs − Zs−), 0} + ∑_{s: 0<s≤t} min{(Ys − Ys−)(Zs − Zs−), 0}

with the conventions Yt− := lim inf_{s↑t: s∈Q} Ys for t ∈ R>0 and ∞ − ∞ := −∞.

15.12 Lemma. Let Y· be an F·-adapted process with almost sure D(R≥0,R)-sample path and G· an F·-finite variation process.
(i) If Z· is an F·-adapted process such that t 7→ Zt − Gt is continuous almost surely then

RY dGt − jmpv(Y, Z)t = R[Y dG]t for all t ∈ R≥0 a.s.

In particular t 7→ jmpv(Y, Z)t is indistinguishable from an F·-finite variation process.
(ii) If f· is an F·-adapted process whose almost every sample path is locally bounded and admits left-hand limits everywhere then

jmpv(Y, R[fdG])t = R[fd jmpv(Y,G)]t for all t ∈ R≥0 a.s.

Proof. (i) There exists Ω0 ∈ F such that P (Ω0) = 1 and, for every ω ∈ Ω0, Y·(ω) is right continuous and admits left-hand limits everywhere, G·(ω) is right continuous and of finite variation, and Z·(ω) − G·(ω) is continuous. Let ω ∈ Ω0 and t ∈ R≥0. Then Lemma 9.6 and Lemma 15.10 show that

RY dGt(ω) − R[Y dG]t(ω) = ∫_{(0,t]} Ys(ω) dGs(ω) − ∫_{(0,t]} Ys−(ω) dGs(ω).

According to Lemma 15.3 the set {s ∈ R(0,t] : Ys(ω) − Ys−(ω) ≠ 0} is countable. Therefore the right hand side reads

∑_{s: 0<s≤t} (Ys(ω) − Ys−(ω))(Gs(ω) − Gs−(ω)) = jmpv(Y, Z)t(ω).

Martb(F·) stands for the space of all bounded F·-martingales with almost sure right continuous sample path.

15.13 Definition. An F·-finite variation process A· is said to be natural if

E[var(A·)t] < +∞ and E[RMdAt] = E[R[MdA]t] for all t ∈ R≥0 and M· ∈ Martb(F·+).

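In view of the computation carried out in the proof of Lemma 15.18(ii) below, on almost every sample path the difference RMdAt − R[MdA]t equals the jump integral of M against A, so the defining condition can be read informally as

\[
E\Bigl[\int_{(0,t]} (M_s - M_{s-})\, dA_s\Bigr] = 0
\quad\text{for all } t \in \mathbb{R}_{\geq 0} \text{ and } M_\cdot \in \mathrm{Mart}^{b}(\mathcal{F}_{\cdot+}),
\]

that is, the jumps of A· do not charge the jumps of bounded right continuous martingales in expectation.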
15.14 Lemma. Let A· be an F·-finite variation process. If its almost every sample path is continuous and E[var(A·)t] < +∞ for all t ∈ R≥0 then it is natural.

Proof. Let M· ∈ Martb(F·+). According to Theorem 14.8, almost every sample path of M· admits left-hand limits everywhere. Since t 7→ At is continuous a.s., jmpv(M,A)t = 0 a.s. On the other hand RMdAt − jmpv(M,A)t = R[MdA]t a.s. by Lemma 15.12(i). Hence E[RMdAt] = E[R[MdA]t] for all t ∈ R≥0, that is, A· is natural.

We prove the uniqueness part of the Doob-Meyer decomposition theorem.

15.15 Lemma. If A· and B· are natural F·-finite variation processes then so is A· −B·.

Proof. Let M· ∈ Martb(F·+). There exists Ω0 ∈ F such that P (Ω0) = 1 and for each ω ∈ Ω0,


t 7→ Mt(ω) is right continuous and admits left-hand limits everywhere, and t 7→ At(ω) and t 7→ Bt(ω) are right continuous and of finite variation.

Let t ∈ R≥0. If ω ∈ Ω0 then RMd(A−B)t(ω) = RMdAt(ω) − RMdBt(ω). Indeed

∫_{(0,t]} Ms(ω) d(As(ω) − Bs(ω)) = ∫_{(0,t]} Ms(ω) dAs(ω) − ∫_{(0,t]} Ms(ω) dBs(ω).

Similarly we get R[Md(A−B)]t(ω) = R[MdA]t(ω) − R[MdB]t(ω) for all ω ∈ Ω0. On the other hand we have that var(A· − B·)t(ω) ≤ var(A·)t(ω) + var(B·)t(ω) for all ω ∈ Ω0. Consequently the difference A· − B· is natural.

15.16 Remark. We also infer that linear combinations of natural F·-finite variation processes are natural. We shall later verify, in Corollary 18.9(iv), that a natural F·-finite variation process admits an F·+-natural positive variation part.

Given finite valued processes f· and X·, set

R[fdX]∞ := lim inf_{t→+∞} R[fdX]t and RfdX∞ := lim inf_{t→+∞} RfdXt.

15.17 Lemma. Suppose that M· is an F·+ ∨ Null(P )-martingale whose almost every sample path is right continuous and σ is an F· ∨ Null(P )-optional time. Then there exists an F·+-martingale N· such that Mt∧σ = Nt for all t ∈ R≥0 a.s.

Proof. Lemma 4.8(i) shows that (F· ∨ Null(P ))·+ = F·+ ∨ Null(P ). In particular the set of F· ∨ Null(P )-optional times coincides with the set of F·+ ∨ Null(P )-stopping times. We see by Corollary 3.17(ii) that t 7→ Mt∧σ is an F·+ ∨ Null(P )-martingale, whose almost every sample path is right continuous. Corollary 14.5 shows that there exists an F·+-progressively measurable process N· such that Mt∧σ = Nt for all t ∈ R≥0 a.s.

15.18 Lemma. Suppose that A· is an F·-finite variation process, M· is an F·+-local martingale with almost sure right continuous sample path, σ is an F· ∨ Null(P )-optional time, E[var(A)σ] < +∞ and sup_{s: s≤σ} |Ms| is bounded. Set M∞ := lim inf_{t→+∞} Mt.
(i) If E[|A0| ; σ > 0] < +∞ then E[MσAσ ; σ > 0] = E[M0A0 ; σ > 0] + E[RMdAσ].
(ii) If A· is natural then E[RMdAσ] = E[R[MdA]σ].

Proof. (i) We see that t 7→ Mt∧σAt∧σ − M0A0 − RMdAt∧σ is an F·+ ∨ Null(P )-martingale by Corollary 15.11(i) and Lemma 15.8. Clearly this is dominated by the integrable random variable 2 sup_{s: s≤σ} |Ms| (|A0| + var(A)σ) 1{σ>0}. Thus, taking Theorem 14.8(ii) into account, we reach the equality E[MσAσ − M0A0 − RMdAσ] = 0.

(ii) According to Lemma 15.17, t 7→ Mt∧σ is indistinguishable from a bounded F·+-martingale, say, t 7→ Nt. Since A· is a natural F·-finite variation process, we infer that

E[RM·∧σdAt] = E[RNdAt] = E[R[NdA]t] = E[R[M·∧σdA]t] for all t ∈ R≥0.

There exist K ∈ R>0 and Ω0 ∈ F such that P (Ω0) = 1 and, for each ω ∈ Ω0,

t 7→ At(ω) is right continuous and of finite variation with var(dA(ω); (0, σ(ω)]) < +∞, and t 7→ Mt(ω) is right continuous and admits left-hand limits with |Mt(ω)| ≤ K.


Here read (0,+∞] = R>0. Let ω ∈ Ω0. Then we see that

lim_{s↑t} Ms∧σ(ω) = Mt−(ω) if 0 < t ≤ σ(ω), and = Mσ(ω) if t > σ(ω).

Therefore for each t ∈ R≥0 the difference RM·∧σdAt(ω) − R[M·∧σdA]t(ω) equals

∫_{(0,t]} {Ms∧σ(ω) − (M·∧σ)s−(ω)} dAs(ω) = ∫_{(0,t∧σ]} {Ms(ω) − Ms−(ω)} dAs(ω).

The right hand side coincides with RMdAt∧σ(ω) − R[MdA]t∧σ(ω). Consequently

E[RMdAt∧σ] = E[R[MdA]t∧σ] for all t ∈ R≥0.

If ω ∈ Ω0 then RMdAt∧σ(ω) converges to ∫_{(0,σ(ω)]} Ms(ω) dAs(ω) while R[MdA]t∧σ(ω) converges to ∫_{(0,σ(ω)]} Ms−(ω) dAs(ω) as t tends to +∞. Both of them are dominated by the integrable random variable sup_{s: s≤σ} |Ms| var(A)σ. Tending t to +∞ we get the claim.

15.19 Theorem. Suppose that A· and B· are natural F·-finite variation processes. If A· − B· is an F·-martingale and A0 = B0 a.s. then At = Bt for all t ∈ R≥0 a.s.

Proof. Let Λ ∈ F . According to Corollary 14.12, there exists a stochastic process X· such that almost every sample path is right continuous and

Xt ∈ E[1Λ|Ft+] for all t ∈ R≥0.

We may regard that 0 ≤ Xt(ω) ≤ 1 for all ω ∈ Ω. Indeed (X· ∨ 0) ∧ 1 is a desired modification. Hence X· is a bounded F·+-martingale. Let t ∈ R≥0. Since A· − B· is a natural F·-finite variation process by Lemma 15.15 and A0 = B0 a.s., it follows by Lemma 15.18 that

E[At − Bt ; Λ] = E[Xt(At − Bt)] − E[X0(A0 − B0)] = E[R[Xd(A−B)]t].

The right hand side vanishes by Lemma 15.9. Indeed A· − B· is an F·+-martingale due to the F·-martingale property and the right continuity. Thus we conclude that

E[At − Bt ; Λ] = 0 for all Λ ∈ F and t ∈ R≥0.

This together with the right continuity implies that At − Bt = 0 for all t ∈ R≥0 a.s.

16 Local martingale with finite variation path

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. We discuss the relation between martingales with continuous sample path and martingales with finite variation sample path. So the index space is R≥0.

Mart(F·) denotes the set of all F·-local martingales with almost sure right continuous sample paths. Martc(F·) := {M· ∈ Mart(F·) : t 7→ Mt continuous a.s.}.


16.1 Lemma. Suppose that M·, N· ∈ Martc(F·), F· and G· are F·-finite variation processes, and C· ∈ Crv[M,N ;F·]. Set Y· := t 7→ Mt + Ft and Z· := t 7→ Nt + Gt.
(i) There exists an F·-finite variation process Q· such that

t 7→ YtZt − R[Y dG]t − R[ZdF ]t − Qt ∈ Y0Z0 + Ito[Y dN ] + Ito[ZdM ].

(ii) Let Q· be an F·-finite variation process with Q0 = 0 a.s. Then

t 7→ YtZt − R[Y dG]t − R[ZdF ]t − Qt ∈ Martc(F·)
⇔ Qt = jmpv(F,G)t + Ct for all t ∈ R≥0 a.s.
⇔ t 7→ YtZt − R[Y dG]t − R[ZdF ]t − Qt ∈ Mart(F·) and Qt − Qt− = (Ft − Ft−)(Gt − Gt−) for all t ∈ R≥0 a.s.

Proof. (i) Since t 7→ Mt and t 7→ Nt are continuous a.s., it follows that jmpv(M,G)t = 0 and jmpv(N,F )t = 0 for all t ∈ R≥0 a.s. We get

MtGt − M0G0 = R[MdG]t + R[GdM ]t and NtFt − N0F0 = R[NdF ]t + R[FdN ]t

for all t ∈ R≥0 a.s. by Corollary 15.11(i) and Lemma 15.12(i). On the other hand

FtGt − F0G0 − jmpv(F,G)t = R[FdG]t + R[GdF ]t for all t ∈ R≥0 a.s.

Since YtZt − MtNt = FtNt + MtGt + FtGt, the process t 7→ YtZt − R[Y dG]t − R[ZdF ]t is indistinguishable from

t 7→ Y0Z0 + MtNt − M0N0 + jmpv(F,G)t + R[GdM ]t + R[FdN ]t.

Finally M·N· ∈ M0N0 + Ito[MdN ] + Ito[NdM ] + Crv[M,N ;F·] by Example 12.3.
(ii) Write X· := t 7→ YtZt − R[Y dG]t − R[ZdF ]t. Then we see that

t 7→ Xt − Qt ∈ Mart(F·) ⇔ t 7→ Ct + jmpv(F,G)t − Qt ∈ Mart(F· ∨ Null(P )),
t 7→ Xt − Qt continuous a.s. ⇔ t 7→ jmpv(F,G)t − Qt continuous a.s.

On the other hand any local martingale whose almost every sample path is continuous and of finite variation, and which starts from 0, is evanescent according to Corollary 8.17.

16.2 Lemma. Suppose that f, g : R≥0 → R are right continuous and of finite variation and ∆ runs through locally finite partitions of R≥0. Then as mesh(∆) tends to 0

∑_{J∈∆: inf J<t} {f(t ∧ sup J) − f(t ∧ inf J)}{g(t ∧ sup J) − g(t ∧ inf J)}

converges to ∑_{s: s≤t} {f(s) − f(s−o)}{g(s) − g(s−o)} locally uniformly on R≥0.

Proof. We denote by D the union of the sets of discontinuity points of f and g, which is a countable set. Suppose that T ∈ R>0 and 0 ≤ a < b ≤ T . Then

f(b)g(b) − f(a)g(a) − ∑_{s∈D: a<s≤b} {f(s) − f(s−o)}{g(s) − g(s−o)} = ∫_{(a,b]} f(· − o) dg + ∫_{(a,b]} g(· − o) df.


Since f(b)g(a) + f(a)g(b) − 2f(a)g(a) = ∫_{(a,b]} f(a) dg + ∫_{(a,b]} g(a) df , it follows that

{f(b) − f(a)}{g(b) − g(a)} − ∑_{s∈D: a<s≤b} {f(s) − f(s−o)}{g(s) − g(s−o)}
= ∫_{(a,b]} {f(· − o) − f(a)} dg + ∫_{(a,b]} {g(· − o) − g(a)} df.

Let ∆ be a locally finite partition of R≥0 and t ∈ R[0,T ]. The absolute value of the difference

∑_{J∈∆: inf J<t} {f(t ∧ sup J) − f(t ∧ inf J)}{g(t ∧ sup J) − g(t ∧ inf J)} − ∑_{s∈D: s≤t} {f(s) − f(s−o)}{g(s) − g(s−o)}

is dominated by

∫_{(0,T ]} |f(· − o) − ∑_{J∈∆} f(inf J)1J | var⋆(dg; ·) + ∫_{(0,T ]} |g(· − o) − ∑_{J∈∆} g(inf J)1J | var⋆(df ; ·).

This converges to 0 as mesh(∆) tends to 0.

16.3 Example. Let N· be an F·-Poisson process. If ∆ runs through locally finite partitions of R≥0 and mesh(∆) tends to 0 then ∑_{J∈∆: inf J<t} (Nt∧sup J − Nt∧inf J)^2 converges to Nt locally uniformly on R≥0 a.s. Indeed ∑_{s: s≤t} (Ns − Ns−)^2 = Nt for all t ∈ R≥0 a.s.

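A quick numerical check of this example (illustrative only; the sampling scheme and names below are ad hoc, not from the note): simulate the jump times of a Poisson path and compare the squared increments over a fine dyadic partition with the terminal value.

    import random

    def poisson_jump_times(rate, horizon, rng):
        """Sorted jump times of a Poisson process of the given rate on [0, horizon]."""
        times, t = [], 0.0
        while True:
            t += rng.expovariate(rate)
            if t > horizon:
                return times
            times.append(t)

    rng = random.Random(0)
    jumps = poisson_jump_times(rate=3.0, horizon=10.0, rng=rng)
    N = lambda t: sum(1 for s in jumps if s <= t)

    # Squared increments over the dyadic partition of mesh 2**-12.
    n, t = 12, 10.0
    qv = sum((N(min(t, (i + 1) / 2**n)) - N(min(t, i / 2**n))) ** 2
             for i in range(int(t * 2**n)))
    print(qv, N(t))  # the two agree once the mesh separates all jump times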
16.4 Corollary. Suppose that M·, N· ∈ Martc(F·), F· and G· are F·-finite variation processes, and C· ∈ Crv[M,N ;F·]. Set Y· : t 7→ Mt + Ft and Z· : t 7→ Nt + Gt. If ∆ runs through locally finite partitions of R≥0 and mesh(∆) tends to 0 then

sup_{s: s≤t} | ∑_{J∈∆: inf J<s} (Ys∧sup J − Ys∧inf J)(Zs∧sup J − Zs∧inf J) − Cs − jmpv(F,G)s |

converges to 0 in probability for all t ∈ R≥0.

Proof. Lemma 10.3, Lemma 8.16 and Lemma 16.2.

16.5 Definition. An F·-locally bounded process is an F·-adapted process f· such that there exist a sequence S(·) of F· ∨ Null(P )-optional times and K· ∈ Seq(R) such that

S(n) ≤ S(n+1) a.s., sup_{n∈N} S(n) = +∞ a.s. and |ft∧S(n)1{S(n)>0}| ≤ Kn for all t ∈ R≥0 a.s.

Such a pair S(·), K· is called a reducing sequence.

16.6 Example. (i) Any left continuous function R≥0 → R which admits right hand limits everywhere is locally bounded on R≥0.
(ii) If f· is an F·-adapted process and its almost every sample path is left continuous and locally bounded then it is an F·-locally bounded process.


Proof. (i) Invoke Lemma 15.3 with the roles of the left and the right interchanged.
(ii) Set τ(k) := inf{t ∈ R≥0 : |ft| > k}. We select and fix Ω0 ∈ F such that P (Ω0) = 1 and f·(ω) is left continuous and locally bounded for all ω ∈ Ω0. Let ω ∈ Ω0. If k is sufficiently large then |ft(ω)| ≤ k for all t ∈ R[0,1], which means that τ(k, ω) > 0 and, due to the left continuity, |ft∧τ(k)(ω)| ≤ k for all t ∈ R≥0. (Unlike in the continuous case, it may happen that τ(k) < +∞ and |fτ(k)| < k.) The local boundedness also implies that sup_{k∈N} τ(k, ω) = +∞. Thus the pair k 7→ τ(k), k 7→ k is a reducing sequence.

Martfv(F·) := {F· ∈ Mart(F·) : t 7→ Ft of finite variation a.s.}.

16.7 Lemma. Suppose F· ∈ Martfv(F·) and τ(k) := inf{t ∈ R≥0 : var(F )t > k} for k ∈ N.
(i) If σ is an F· ∨ Null(P )-optional time and t 7→ Ft∧σ − F0 is an F· ∨ Null(P )-martingale then

E[var(F )t∧σ∧τ(k)^p]^{1/p} ≤ 2k + E[|Ft∧σ − F0|^p]^{1/p} for all t ∈ R≥0, k ∈ N and p ∈ R≥1.

(ii) There exists a reducing sequence S(·) such that E[var(F )t∧S(n)] < +∞ for all t ∈ R≥0 and n ∈ N.

Proof. We select and fix Ω0 ∈ F such that P (Ω0) = 1 and F·(ω) is right continuous and of finite variation for all ω ∈ Ω0. Let ω ∈ Ω0 and t ∈ R≥0. Then

var(dF·(ω); R(0,t]) = var(dF·(ω); R(0,t)) + |Ft(ω) − Ft−(ω)|
≤ var(dF·(ω); R(0,t)) + |Ft−(ω) − F0(ω)| + |Ft(ω) − F0(ω)|
≤ 2 var(dF·(ω); R(0,t)) + |Ft(ω) − F0(ω)|.

We see that var(F )(t∧τ(k))−(ω) ≤ k. It therefore follows that

var(F )t∧τ(k) ≤ 2k + |Ft∧τ(k) − F0| for all t ∈ R≥0 a.s.

Since t 7→ Ft∧σ − F0 is a martingale, we have by Corollary 3.17(i) that

Ft∧σ∧τ(k) − F0 ∈ E[Ft∧σ − F0 | (F· ∨ Null(P ))τ(k)+].

Lemma 1.13(i) shows that E[|Ft∧σ∧τ(k) − F0|^p] ≤ E[|Ft∧σ − F0|^p] for p ∈ R≥1.

16.8 Theorem. Suppose that F·, G· ∈ Martfv(F·).
(i) If `· is an F·-adapted process whose almost every sample path is locally bounded and admits left-hand limits then R[`dF ]· ∈ Martfv(F·).
(ii) t 7→ FtGt − jmpv(F,G)t ∈ Martfv(F· ∨ Null(P )).
(iii) If `· is as in (i) then t 7→ R[`dF ]tGt − R[`d jmpv(F,G)]t ∈ Martfv(F· ∨ Null(P )).

Proof. (i) With the help of Example 16.6 and Lemma 16.7, we apply Lemma 15.9.
(ii) t 7→ FtGt − jmpv(F,G)t is indistinguishable from t 7→ F0G0 + R[FdG]t + R[GdF ]t.
(iii) jmpv(R[`dF ], G)t = R[`d jmpv(F,G)]t for all t ∈ R≥0 a.s. by Lemma 15.12(ii).

Mart2(F·) stands for the space of all square integrable F·-martingales with almost sure right continuous sample path.


16.9 Theorem. Suppose M·, N· ∈ Martc(F·), C· ∈ Crv[M,N ;F·] and F·, G· ∈ Martfv(F·). Set Y· : t 7→ Mt + Ft and Z· : t 7→ Nt + Gt.
(i) There exists an F·-finite variation process indistinguishable from t 7→ Ct + jmpv(F,G)t.
(ii) An F·-finite variation process Q· is indistinguishable from t 7→ Ct + jmpv(F,G)t

⇔ t 7→ YtZt − Qt ∈ Mart(F·), Q0 = 0, and Qt − Qt− = (Ft − Ft−)(Gt − Gt−) for all t ∈ R≥0 a.s.

(iii) If Y· ∈ Mart2(F·) then M·, F· ∈ Mart2(F·) and E[jmpv(F, F )t] < +∞ for all t ∈ R≥0.
(iv) If Y·, Z· ∈ Mart2(F·) then t 7→ YtZt − Ct − jmpv(F,G)t is an F· ∨ Null(P )-martingale.

Proof. (i) By Lemma 16.1(i). (ii) By Lemma 16.1(ii) and Theorem 16.8(ii).
(iii) We select and fix a reducing sequence S(·) for t 7→ |Yt|^2 − At − jmpv(F, F )t where A· ∈ Qvar[M ;F·]. Since t 7→ |Yt|^2 is a submartingale, it follows that

E[|Y0|^2] + E[At∧S(n) + jmpv(F, F )t∧S(n)] = E[|Yt∧S(n)|^2] ≤ E[|Yt|^2] < +∞

for all t ∈ R≥0 and n ∈ N. Note that t 7→ At + jmpv(F, F )t is an increasing process. The monotone convergence theorem shows that

E[At + jmpv(F, F )t] ≤ E[|Yt|^2] − E[|Y0|^2] < +∞ for all t ∈ R≥0.

This implies the martingale property of t 7→ |Yt|^2 − At − jmpv(F, F )t by Theorem 7.17(ii). On the other hand we also have that E[At] < +∞ for all t ∈ R≥0. Invoking Theorem 7.17(ii) again we infer that M· ∈ Mart2(F·).

(iv) The polarization identity 4 jmpv(F,G)t = jmpv(F+G,F+G)t − jmpv(F−G,F−G)t for all t ∈ R≥0 a.s. establishes the statement with the help of Lemma 10.6(iii).

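The polarization identity is nothing but 4xy = (x + y)^2 − (x − y)^2 applied jump by jump:

\[
4\,(F_s - F_{s-})(G_s - G_{s-})
= \bigl\{(F_s + G_s) - (F_{s-} + G_{s-})\bigr\}^2
- \bigl\{(F_s - G_s) - (F_{s-} - G_{s-})\bigr\}^2 ,
\]

and summing over the jump times s ≤ t gives 4 jmpv(F,G)t = jmpv(F+G,F+G)t − jmpv(F−G,F−G)t.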
17 Predictable process and martingale with finite variation path

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

Assume that F· is an F·-local martingale whose every (not almost every) sample path is of finite variation and right continuous.

If ` is an F·-adapted process whose every sample path is locally bounded and left continuous then

R[`dF ]t(ω) = ∫_{(0,t]} `(·, ω) dF·(ω) for all (t, ω) ∈ R≥0 × Ω

and hence the process with finite variation sample path t 7→ ∫_{(0,t]} `(·, ω) dF·(ω) is an F·-local martingale by Theorem 16.8(i). We shall extend this result while keeping an eye on the measurability.

17.1 Definition. We denote by Pred(F·) the σ-field on R≥0 × Ω generated by the set of all left continuous F·-adapted processes. A stochastic process X· is said to be F·-predictable if the mapping (t, ω) 7→ Xt(ω) is Pred(F·)-measurable.

17.2 Lemma. (i) If τ is an F·-optional time then {(t, ω) : τ(ω) < t} ∈ Pred(F·).
(ii) Pred(F·) is generated by {R(u,v] × A ; u, v ∈ R, 0 ≤ u < v, A ∈ Fu} ∪ {{0} × A ; A ∈ F0}.
(iii) Pred(F·) is generated by {{(t, ω) : σ(ω) < t}, {0} × {σ = 0} ; σ an F·-stopping time}.
(iv) Pred(F·) is generated by the set of all continuous F·-adapted processes.


Proof. (i) We see that R≥0 → R, t 7→ t ∧ τ(ω) is continuous for each ω ∈ Ω and Ω → R, ω 7→ t ∧ τ(ω) is Ft-measurable for each t ∈ R≥0 (see the proof of Lemma 3.13). It then follows that (t, ω) 7→ t ∧ τ(ω) is Pred(F·)-measurable. Therefore we have that

{(t, ω) : τ(ω) < t} = {(t, ω) : t ∧ τ(ω) < t} ∈ Pred(F·).

(ii) We write [x] := max{i ∈ Z : i < x} for x ∈ R. If f : R≥0 → R is left continuous then f([2^n t]/2^n) converges to f(t) for all t ∈ R>0. Note that max{t − 1/2^n, 0} ≤ [2^n t]/2^n < t.

(iii) Suppose that u, v ∈ R, 0 ≤ u < v and A ∈ Fu+. The function σ such that σ = u on A and σ = v on Ω \ A is an F·-optional time (an F·-stopping time if A ∈ Fu). Indeed

{(t, ω) : σ(ω) < t} = (R(u,v] × A) ∪ (R>v × Ω).

We see that R(u,v] × A = {(t, ω) : σ(ω) < t} ∩ (R≤v × Ω).
(iv) Suppose that u, v ∈ R, 0 ≤ u < v and A ∈ Fu. For each n ∈ N the function fn : R≥0 → R, t 7→ max{min{n(t − u), 1, n(v − t) + 1}, 0} is continuous and the sequence fn converges to 1(u,v] pointwise. Since fn(t) = 0 for t ≤ u, the process with sample path t 7→ fn(t)1A(ω) is F·-adapted. Finally observe that for each n ∈ N the function gn : t 7→ max{1 − nt, 0} is continuous and the sequence gn converges to 1{0} pointwise.

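A tiny numerical illustration of the approximation used in (iv) (not part of the original note): the continuous functions fn converge pointwise to the indicator 1(u,v].

    def f_n(t, u, v, n):
        # max{min{n(t - u), 1, n(v - t) + 1}, 0} from the proof of Lemma 17.2(iv)
        return max(min(n * (t - u), 1.0, n * (v - t) + 1.0), 0.0)

    u, v = 1.0, 2.0
    for t in (0.5, 1.0, 1.5, 2.0, 2.5):
        print(t, [round(f_n(t, u, v, n), 3) for n in (1, 10, 1000)])
    # As n grows the values tend to 1 exactly for t in (1, 2] and to 0 otherwise.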
17.3 Lemma. Suppose that F· is a filtration with index space Q≥0 and X· is an F·-adapted stochastic process with index space Q≥0. Then (t, ω) 7→ lim inf_{s↑t} Xs(ω) is G·-predictable where lim inf_{s↑0} Xs(ω) := X0(ω), Gt := ⋂_{s∈Q: s<t} Fs for t ∈ R>0 and G0 := F0.

Proof. Let s ∈ R>0. If n ∈ N and n > 1/s then

{(t, ω) ∈ R≥0 × Ω : t > s, inf_{q∈Q: t−1/n≤q<t} Xq(ω) < a}
= ⋃_{q∈Q: q≥0} {t ∈ R≥0 : t > s, q < t ≤ q + 1/n} × {ω ∈ Ω : Xq(ω) < a} for all a ∈ R.

The right hand side belongs to Pred(G·) according to Lemma 17.2(ii). It follows that

{(t, ω) ∈ R≥0 × Ω : t > s, lim inf_{q↑t} Xq(ω) > b} ∈ Pred(G·)

for all s ∈ R>0 and b ∈ R. Since {X0 > b} ∈ F0, we get by Lemma 17.2(ii) that

{(t, ω) ∈ R≥0 × Ω : lim inf_{q↑t} Xq(ω) > b}
= {(t, ω) ∈ R≥0 × Ω : t > 0, lim inf_{q↑t} Xq(ω) > b} ∪ ({0} × {X0 > b}) ∈ Pred(G·)

for all b ∈ R. Consequently (t, ω) 7→ lim inf_{s↑t} Xs(ω) is G·-predictable.

17.4 Corollary. Suppose that X· is an F·+ ∨ Null(P )-adapted process whose almost every sample path is left continuous. If X0 is F0 ∨ Null(P )-measurable then there exists an F·-predictable process Y· such that Xt = Yt for all t ∈ R≥0 a.s.

Proof. There exists Ω0 ∈ F such that P (Ω0) = 1 and

t 7→ Xt(ω) is left continuous for each ω ∈ Ω0.


Given t ∈ Q>0, we select an Ft+-measurable random variable X̃t so that X̃t = Xt a.s. Set X̃0 := X0. By choosing a proper subset of Ω0 if necessary, we have that X̃t(ω) = Xt(ω) for all t ∈ Q≥0 and ω ∈ Ω0. The process t 7→ lim inf_{s↑t: s∈Q} X̃s is a desired one by Lemma 17.3.

We first establish the F·-adaptedness of the process with sample path

t 7→ ∫_{(0,t]} `(·, ω) dF·(ω)

for each F·-predictable process ` with locally bounded sample path. See Corollary 17.8.

17.5 Definition. Prog(F·) := {A ⊂ R≥0 × Ω : A ∩ (R[0,t] × Ω) ∈ Borel(R≥0) ⊗ Ft for all t ∈ R≥0}.

17.6 Lemma. (i) Prog(F·) is a σ-field on R≥0 × Ω. Pred(F·) ⊂ Prog(F·).
(ii) A stochastic process X· with state space E is F·-progressively measurable if and only if R≥0 × Ω → E, (t, ω) 7→ Xt(ω) is Prog(F·)-measurable.
(iii) τ : Ω → R≥0 ∪ {+∞} is an F·-stopping time if and only if {(t, ω) : τ(ω) ≤ t} ∈ Prog(F·). If τ is an F·-optional time then {(t, ω) : τ(ω) ≤ t} ∈ Prog(F·+).

Proof. (i) R[0,t] × Ω ∈ Borel(R≥0) ⊗ Ft. Lemma 3.12 and (ii) below.
(ii) Let B1 respectively B2 be σ-fields on T1 and T2. Given ϕ1 : S1 → T1 and ϕ2 : S2 → T2,

(ϕ1^*B1) ⊗ (ϕ2^*B2) = σ(ϕ1^*B1 ∗ ϕ2^*B2) = σ((ϕ1 × ϕ2)^*(B1 ∗ B2)) = (ϕ1 × ϕ2)^*(B1 ⊗ B2).

Let t ∈ R≥0 and ι be the canonical mapping R[0,t] → R≥0. Since ι^*Borel(R≥0) = Borel(R[0,t]) and R[0,t] × Ω ∈ Borel(R≥0) ⊗ Ft, it follows that

Borel(R[0,t]) ⊗ Ft = {A ∈ Borel(R≥0) ⊗ Ft : A ⊂ R[0,t] × Ω} for all t ∈ R≥0.

Consequently, given B ⊂ E, we have that {(s, ω) : s ≤ t, Xs(ω) ∈ B} ∈ Borel(R[0,t]) ⊗ Ft if and only if {(s, ω) : Xs(ω) ∈ B} ∩ (R[0,t] × Ω) ∈ Borel(R≥0) ⊗ Ft.
(iii) Let τ be an F·-stopping time. We see that R≥0 → R, t 7→ 1[τ(ω),+∞)(t) is right continuous for each ω ∈ Ω and that Ω → R, ω 7→ 1[0,t](τ(ω)) is Ft-measurable for each t ∈ R≥0. It follows by Lemma 3.12 that (t, ω) 7→ 1[τ(ω),+∞)(t) is F·-progressively measurable.

17.7 Lemma. Let t ∈ R≥0 and v : R[0,t] × Ω → R. Suppose v(·, ω) is finite valued, right continuous and non-decreasing for each ω ∈ Ω and v(s, ·) is Ft-measurable for each s ∈ R[0,t].
(i) (s, ω) 7→ dv(A ∩ R[0,s], ω) is Borel(R[0,t]) ⊗ Ft-measurable for every A ∈ Borel(R[0,t]).
(ii) The function R[0,t] × Ω → R, (s, ω) 7→ ∫_{(0,s]} 1B(·, ω) dv(·, ω) is Borel(R[0,t]) ⊗ Ft-measurable for every B ∈ Borel(R[0,t]) ⊗ Ft.

Proof. (i) We set J := {R(a,b] ; a, b ∈ R[0,t], a < b} ∪ {∅} and consider the following family:

D := {A ∈ Borel(R[0,t]) : (s, ω) 7→ dv(A ∩ R[0,s], ω) is Borel(R[0,t]) ⊗ Ft-measurable}.

Clearly J is closed under intersection and J ∪ {R[0,t]} ⊂ D. The monotone convergence theorem shows that D is a Dynkin system. Consequently Borel(R[0,t]) ⊂ D.
(ii) The body of the discussion is the same as that for (i). The Dynkin system

{B ∈ Borel(R[0,t]) ⊗ Ft : (s, ω) 7→ ∫_{(0,s]} 1B(·, ω) dv(·, ω) is Borel(R[0,t]) ⊗ Ft-measurable}

contains Borel(R[0,t]) ∗ Ft by (i).


17.8 Corollary. Let A· be an F·-adapted process whose every sample path is finite valued, right continuous and non-decreasing and X· a non-negative F·-progressively measurable process. Then t 7→ ∫_{(0,t]} Xs dAs is F·-progressively measurable and hence F·-adapted.

17.9 Remark. The sample path t 7→ ∫_{(0,t]} Xs dAs is not necessarily right continuous. It may happen that ∫_{(0,t]} Xs dAs < +∞ but ∫_{(0,u]} Xs dAs = +∞ for all u > t.

17.10 Definition. An F·-increasing process A· is said to be F·-locally integrable if there exists a sequence S(·) of F· ∨ Null(P )-optional times such that

S(n) ≤ S(n+1) a.s., sup_{n∈N} S(n) = +∞ a.s. and E[At∧S(n)] < +∞ for all t ∈ R≥0.

Such an S(·) is called a reducing sequence.

Mart(F·) denotes the set of all F·-local martingales with almost sure right continuous sample paths. Martfv(F·) := {M· ∈ Mart(F·) : t 7→ Mt of finite variation a.s.}.

17.11 Lemma. Suppose that A· is an F·-locally integrable increasing process and S(·) is a reducing sequence for A·. If M· ∈ Mart(F·), E[|M0|] < +∞ and |Mt − M0| ≤ At for all t ∈ R≥0 a.s. then S(·) is a reducing sequence for M· as well.

Proof. Let τ(·) be a reducing sequence for M·. Since M0 is integrable, it follows that t 7→Mt∧τ(k) is an F· ∨ Null(P )-martingale for each k ∈ N. Fix n ∈ N. We see that

t 7→Mt∧S(n)∧τ(k) is an F· ∨ Null(P )-martingale for each k ∈ N

by Corollary 3.17(ii). On the other hand

|Mt∧S(n)∧τ(k)| ≤ |M0|+ At∧S(n)∧τ(k) ≤ |M0|+ At∧S(n) a.s. for all t ∈ R≥0 and k ∈ N.

Thus t 7→Mt∧S(n) is an F·∨Null(P )-martingale by the dominated convergence theorem.

17.12 Corollary. Suppose M^n_· is a sequence in Mart(F·), M· is a process with almost sure right continuous path, and sup_{s≤t} |M^n_s − Ms| converges to 0 in probability for all t ∈ R≥0. Then t 7→ Mt ∈ Mart(F· ∨ Null(P )) provided there exist a sequence A^k_· of F·-locally integrable increasing processes and a sequence S(·) of F· ∨ Null(P )-optional times such that

S(k) ≤ S(k+1) a.s., sup_{k∈N} S(k) = +∞ a.s. and |M^n_{t∧S(k)}| ≤ A^k_t for all t ∈ R≥0 a.s.

Proof. By choosing a subsequence we may assume that

lim sup_{n→∞} sup_{s≤t} |M^n_s − Ms| = 0 for all t ∈ R≥0 a.s.

It follows that Mt = lim inf_{n→∞} M^n_t a.s. for all t ∈ R≥0 and hence M· is F· ∨ Null(P )-adapted. Fix k ∈ N. It suffices to show that

N· : t 7→ Mt∧S(k) ∈ Mart(F· ∨ Null(P )).

We write N^n_· : t 7→ M^n_{t∧S(k)} for each n ∈ N, which is an F· ∨ Null(P )-local martingale by Lemma 7.5(ii). Let τ(·) be a reducing sequence for A^k_·. We have that |N^n_t| ≤ A^k_t for all t ∈ R≥0 a.s. In particular N^n_0 = 0 a.s. Therefore, according to Lemma 17.11,


τ(·) also serves as a reducing sequence of t 7→ N^n_t for each n ∈ N.

On the other hand we see that

|N^n_{t∧τ(i)} − N_{t∧τ(i)}| ≤ sup_{s≤t} |N^n_s − Ns| ≤ sup_{s≤t} |M^n_s − Ms| and |N^n_{t∧τ(i)}| ≤ A^k_{t∧τ(i)}.

Tending n to ∞, we infer that t 7→ Nt∧τ(i) is an F· ∨ Null(P )-martingale for each i ∈ N by the dominated convergence theorem. Thus τ(·) also serves as a reducing sequence of the F· ∨ Null(P )-local martingale t 7→ Nt.

17.13 Definition. A family of F·-locally bounded processes is said to be equi-locally bounded if there exists a common reducing sequence.

17.14 Lemma. Suppose that {`n} is an equi-locally bounded sequence of F·-predictable processes with locally bounded sample paths and the sequence converges to a function ` pointwise. If the process with sample path t 7→ ∫_{(0,t]} `n(·, ω) dF·(ω) is an F·-local martingale for each n ∈ N then so is the process with sample path t 7→ ∫_{(0,t]} `(·, ω) dF·(ω).

Proof. Corollary 17.8 shows that the process in question is F·-adapted. There exist Ω0 ∈ F , a sequence S(·) of Time(F· ∨ Null(P )) and a sequence K· of R such that P (Ω0) = 1,

S(i, ω) ≤ S(i+1, ω), sup_{i∈N} S(i, ω) = +∞ for all ω ∈ Ω0 and |`n| ≤ Ki on {(t, ω) : ω ∈ Ω0, 0 < t ≤ S(i, ω)}.

By Lemma 16.7 the F·-increasing process t 7→ var(F )t is locally integrable. Moreover

| ∫_{(0,t∧S(i,ω)]} `n(·, ω) dF·(ω) | ≤ Ki var(F )t(ω) for all t ∈ R≥0 and ω ∈ Ω0.

If t ∈ R>0 and ω ∈ Ω0 then t ≤ S(i, ω) for sufficiently large i and hence

∫_{(0,t]} `n(·, ω) dF·(ω) converges to ∫_{(0,t]} `(·, ω) dF·(ω) as n tends to ∞

by the dominated convergence theorem. According to Corollary 17.12, the limit process inherits the F· ∨ Null(P )-local martingale property.

17.15 Theorem. If ` is an F·-locally bounded F·-predictable process with locally bounded sample path then the process with finite variation sample path I· : t 7→ ∫_{(0,t]} `(·, ω) dF·(ω) is an F·-local martingale. Moreover jmpv(I, I)t(ω) = ∫_{(0,t]} |`(·, ω)|^2 d jmpv(F, F )·(ω).

Proof. We write C := {R(u,v] × A ; u, v ∈ R, 0 ≤ u < v, A ∈ Fu} ∪ {{0} × A ; A ∈ F0}. Since

(u, v] ∩ (u′, v′] = (u ∨ u′, v ∧ v′] if u < v′ and u′ < v, and = ∅ otherwise,

the family C is closed under intersections. If A ∈ C then the process with sample path t 7→ 1A(t, ω) is a bounded F·-adapted process with left continuous sample path. Hence C is included in the following family D by Theorem 16.8(i):

D := {A ∈ Pred(F·) : t 7→ ∫_{(0,t]} 1A(·, ω) dF·(ω) is an F·-local martingale}.


We see by Lemma 17.14 that D is a Dynkin system. On the other hand the σ-field Pred(F·) is generated by C as Lemma 17.2(ii) shows. Thus D = Pred(F·). Invoking the dominated convergence theorem and Lemma 17.14, we get the statement.

17.16 Lemma. Suppose that a process X· is indistinguishable from an F·-predictable process. Then there exists Ω0 ∈ F such that P (Ω0) = 1 and t 7→ Xt is an F· ∩ Ω0-predictable process on the probability space (Ω0,F ∩ Ω0, P |Ω0).

Proof. Let Y· be an F·-predictable process indistinguishable from X·.

There exists Ω0 ∈ F such that P (Ω0) = 1 and Xt(ω) = Yt(ω) for all t ∈ R≥0 and ω ∈ Ω0.

We work on the probability space (Ω0,F ∩ Ω0, P |Ω0) equipped with the filtration F· ∩ Ω0. Accordingly introduce the system

C := {R(u,v] × B ; u, v ∈ R, 0 ≤ u < v, B ∈ Fu} ∪ {{0} × B ; B ∈ F0}

so that we have Pred(F·) = σ(C) by Lemma 17.2(ii). On the other hand let ι be the canonical mapping R≥0 × Ω0 → R≥0 × Ω. Since

ι^*C = {R(u,v] × B ; u, v ∈ R, 0 ≤ u < v, B ∈ Fu ∩ Ω0} ∪ {{0} × B ; B ∈ F0 ∩ Ω0},

it follows that Pred(F· ∩ Ω0) = σ(ι^*C) = ι^*σ(C) = Pred(F·) ∩ (R≥0 × Ω0). Therefore

t 7→ Yt(ω) is an F· ∩ Ω0-predictable process on (Ω0,F ∩ Ω0, P |Ω0).

The statement is now trivial.

17.17 Corollary. Suppose M· ∈ Martfv(F·) and f· is an F·-locally bounded F·-adapted process with almost sure right continuous sample path. If f· is indistinguishable from an F· ∨ Null(P )-predictable process then t 7→ RfdMt ∈ Martfv(F·) and jmpv(RfdM, RfdM)t = R|f|^2 d jmpv(M,M)t for all t ∈ R≥0 a.s.

Proof. We see by Lemma 17.16 that there exists Ω0 ∈ F such that P (Ω0) = 1 and

t 7→ Mt(ω) is of finite variation and right continuous for each ω ∈ Ω0, t 7→ ft(ω) is locally bounded and right continuous for each ω ∈ Ω0, and t 7→ ft(ω) is an (F· ∨ Null(P )) ∩ Ω0-predictable process on (Ω0,F ∩ Ω0, P |Ω0).

Moreover f· is (F· ∨ Null(P )) ∩ Ω0-locally bounded. Since

∫_{(0,t]} f·(ω) dM·(ω) = RfdMt(ω) for all t ∈ R≥0 and ω ∈ Ω0,

we see by Theorem 17.15 that

t 7→ RfdMt is an (F· ∨ Null(P )) ∩ Ω0-local martingale on (Ω0,F ∩ Ω0, P |Ω0).

As a process on (Ω,F , P ), being F·-adapted, it is an F·-local martingale.


17.18 Theorem. Suppose that M· ∈ Martfv(F·), f· is an F·-adapted process with almost sure locally bounded sample path, and σ ∈ Time(F· ∨ Null(P )).
(i) If almost every sample path of f· admits left-hand limits everywhere, and

E[R[|f |d var(M)]t∧σ] < +∞ for all t ∈ R≥0

then t 7→ R[fdM ]t∧σ is an F· ∨ Null(P )-martingale.
(ii) If almost every sample path of f· is right continuous, it is indistinguishable from an F· ∨ Null(P )-predictable process and

E[R|f |d var(M)t∧σ] < +∞ for all t ∈ R≥0

then t 7→ RfdMt∧σ is an F· ∨ Null(P )-martingale.

Proof. The argument being parallel, we concentrate on the claim (ii). We introduce a sequence of functions φn : R → R, x 7→ max{min{x, n}, −n}. The composition process t 7→ φn(ft) is right continuous almost surely and |φn(ft)| ≤ n for all t ∈ R≥0. Moreover it is also indistinguishable from an F· ∨ Null(P )-predictable process. We see by Corollary 17.17 (as for (i) use Theorem 16.8 instead) that

t 7→ Rφn(f)dMt ∈ Martfv(F·).

The stopped process t 7→ Rφn(f)dMt∧σ is an F· ∨ Null(P )-martingale. Indeed

|Rφn(f)dMt∧σ| ≤ R|f |d var(M)t∧σ a.s.

and Lemma 17.11 shows the martingale property. On the other hand

|Rφn(f)dMt∧σ − RfdMt∧σ| ≤ R|φn(f) − f |d var(M)t∧σ ≤ R|f |d var(M)t∧σ a.s.

Therefore we infer that Rφn(f)dMt∧σ converges to RfdMt∧σ in L^1. Thus we get the martingale property of the stopped process t 7→ RfdMt∧σ.

18 Doob-Meyer decomposition theorem

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

Let X· be an F·-submartingale with stochastic right continuity.

According to Theorem 14.11(ii), X· is an F·+-submartingale as well. It follows that if t, s ∈ R≥0, s < t and Y ∈ E[Xt − Xs|Fs+] then Y ≥ 0 a.s.

18.1 Lemma. Let n ∈ Z≤0. If Yq ∈ E[Xq − Xq−2^n | F(q−2^n)+] for q ∈ 2^n N then

E[∑_{q≤T} Yq ; ∑_{q≤T} Yq > a] ≤ 6 sup_{σ∈Time(F·,Q)} sup_{B∈F : P (B)≤δ} E[|XT∧σ| ; B]

for all T ∈ N, a ∈ R>0 and δ ∈ R>0 with 2E[XT − X0] ≤ δa, where

Time(F·,Q) := {σ ∈ Time(F·) : σ(ω) ∈ Q ∪ {∞} for all ω ∈ Ω}.


Proof. Since max{Yq, 0} is a modification of Yq, we may regard that Yq(ω) ≥ 0 for all ω ∈ Ω. We have that E[Yq] < +∞ for all q ∈ 2^n N.

The process t 7→ Xt − ∑_{q≤t} Yq with index set 2^n Z≥0 is a G·-martingale.

Here G· denotes the filtration q 7→ Fq+ with index set 2^n Z≥0. For each a ∈ R>0 we set

σ(a) := inf{t ∈ 2^n Z≥0 : ∑_{q≤t+2^n} Yq > a}.

Then σ(a) is a G·-stopping time since σ(a) ∈ 2^n Z≥0 ∪ {+∞} and

{σ(a) ≤ t} = {∑_{q≤t+2^n} Yq > a} ∈ Ft+ for all t ∈ 2^n Z≥0.

We fix T ∈ N and a ∈ R>0. The G·-martingale property of t 7→ Xt − ∑_{q≤t} Yq shows

E[∑_{q≤T} Yq ; ∑_{q≤T} Yq > a] = E[∑_{q≤T} Yq ; σ(a) < T ]
= E[∑_{q≤T∧σ(a)} Yq ; σ(a) < T ] + E[XT ; σ(a) < T ] − E[XT∧σ(a) ; σ(a) < T ].

Clearly ∑_{q≤T∧σ(a)} Yq ≤ ∑_{q≤T} Yq. On the other hand

∑_{q≤σ(a/2)} Yq ≤ a/2 on {σ(a/2) < +∞} and ∑_{q≤T} Yq > a on {σ(a) < T }.

Since σ(a/2) ≤ σ(a), it follows that

∑_{q≤T∧σ(a)} Yq ≤ 2 ∑_{q≤T} Yq − 2 ∑_{q≤T∧σ(a/2)} Yq on {σ(a) < T }.

Taking into account that ∑_{q≤T∧σ(a/2)} Yq ≤ ∑_{q≤T} Yq, we get

E[∑_{q≤T∧σ(a)} Yq ; σ(a) < T ] ≤ 2E[∑_{σ(a/2)<q≤T} Yq] = 2E[XT − XT∧σ(a/2)].

The right hand side reads 2E[XT ; σ(a/2) < T ] − 2E[XT∧σ(a/2) ; σ(a/2) < T ]. Consequently

E[∑_{q≤T} Yq ; ∑_{q≤T} Yq > a] ≤ E[|XT | ; σ(a) < T ] + E[|XT∧σ(a)| ; σ(a) < T ] + 2E[|XT | ; σ(a/2) < T ] + 2E[|XT∧σ(a/2)| ; σ(a/2) < T ].

Invoking Markov's inequality we see that

P (∑_{q≤T} Yq > a/2) ≤ (2/a) E[∑_{q≤T} Yq] = (2/a) E[XT − X0].

Suppose that δ ∈ R>0 and 2E[XT − X0] ≤ δa. Then we have that

P (σ(a) < T ) ≤ P (σ(a/2) < T ) = P (∑_{q≤T} Yq > a/2) ≤ δ.

Finally σ(a) is an F·-optional time. Indeed {σ(a) < t} = ⋃_{q<t} {σ(a) ≤ q} for all t ∈ R≥0.


18.2 Definition. Let X· be an F·-adapted process. It is said to be of class DL over Q if the system {XT∧σ ; σ ∈ Time(F·,Q)} is uniformly integrable for each T ∈ N.

For each n ∈ Z≤0 choose a stochastic process Y^n_· with index space 2^n N such that

Y^n_q ≥ 0, E[Y^n_q] < +∞ and Y^n_q ∈ E[Xq − Xq−2^n | F(q−2^n)+] for all q ∈ 2^n N.

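In discrete time these Y^n_q are just the conditional increments of the Doob decomposition, and their partial sums form the discrete compensator. A minimal simulation sketch (the walk and all parameters are ad hoc, not from the note): for a simple random walk X·, the submartingale S_k = X_k^2 has conditional increments E[S_k − S_{k−1} | F_{k−1}] = 1, so the compensator is A_k = k and S_k − A_k should be a martingale.

    import random

    rng = random.Random(1)

    def walk_endpoint(n):
        """Terminal value of a simple ±1 random walk after n steps."""
        return sum(rng.choice((-1, 1)) for _ in range(n))

    # E[X_n**2 - n] should equal E[X_0**2 - 0] = 0 if X**2 minus its
    # compensator is a martingale; check the mean over many trials.
    n, trials = 20, 100000
    mean = sum(walk_endpoint(n) ** 2 - n for _ in range(trials)) / trials
    print(mean)  # close to 0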
18.3 Corollary. If X· is of class DL over Q then there exist φ ∈ Map(N,Z≤0) and a sequence of integrable random variables A^T such that φ(k) > φ(k+1) and ∑_{q∈2^{φ(k)}N: q≤T} Y^{φ(k)}_q converges to A^T L∞-weakly as k → ∞ for all T ∈ N.

Proof. According to Lemma 18.1, the sequence of random variables ∑_{q∈2^n N: q≤T} Y^n_q is uniformly integrable for each T ∈ N. Invoking the Dunford-Pettis compactness criterion and the diagonal trick we get the claim.

Let φ and A^T be as in Corollary 18.3. For each T ∈ N choose a stochastic process Z^T_· with index space R[0,T ) such that almost every sample path is right continuous and

Z^T_t ∈ E[XT − A^T | Ft+] for all t ∈ R[0,T ),

which exists by Corollary 14.12. Set A^T_t := (lim inf_{s↓t: s∈Q} Xs) − Z^T_t and Q(2) := ⋃_{n∈Z: n≤0} 2^n Z.

18.4 Lemma. If X· is of class DL over Q then ∑_{q∈2^{φ(k)}N: q≤t} Y^{φ(k)}_q converges to A^T_t L∞-weakly as k → ∞ for all T ∈ N and t ∈ Q(2)[0,T ).

Proof. If n ∈ Z≤0 is sufficiently negative so that t ∈ 2^n Z≥0 then

Xt − ∑_{q∈2^n N: q≤t} Y^n_q ∈ E[XT − ∑_{q∈2^n N: q≤T} Y^n_q | Ft+].

Let f be a bounded F-measurable function and g ∈ E[f |Ft+]. We may suppose that g is bounded. If k is sufficiently large so that t ∈ 2^{φ(k)} Z≥0 then

E[(Xt − ∑_{q∈2^{φ(k)}N: q≤t} Y^{φ(k)}_q) f ] = E[(XT − ∑_{q∈2^{φ(k)}N: q≤T} Y^{φ(k)}_q) g].

The right hand side converges to E[(XT − A^T) g] as k tends to ∞. Since g ∈ E[f |Ft+], Z^T_t ∈ E[XT − A^T |Ft+] and A^T_t = Xt − Z^T_t a.s. by Theorem 14.11(ii), we have that

E[(XT − A^T) g] = E[Z^T_t f ] = E[(Xt − A^T_t) f ].

The bounded F-measurable function f being arbitrary, we infer that ∑_{q∈2^{φ(k)}N: q≤t} Y^{φ(k)}_q converges to A^T_t L∞-weakly as k → ∞.

Martb(F·) stands for the space of all bounded F·-martingales with almost sure right continuous sample path.

We prove the existence part of the Doob-Meyer decomposition theorem. The uniqueness part is discussed in Theorem 15.19. Before we start we note the following:


18.5 Remark. We have that (F·+)t+ = Ft+ for all t ∈ R≥0. Thus an F·+-increasing process A· is natural if E[RMdAt] = E[R[MdA]t] for all t ∈ R≥0 whenever M· ∈ Martb(F·+).

18.6 Theorem. Suppose that X· is an F·-submartingale whose almost every sample path is right continuous. If X· is of class DL then there exists a natural F·+-increasing process A· such that t 7→ Xt − At is an F·+-martingale.

Proof. For each T ∈ N, according to Lemma 18.4, A^T_0 = 0 a.s. and if s, t ∈ Q(2)[0,T ) and s ≤ t then A^T_s ≤ A^T_t a.s. Moreover almost every sample path of A^T_· is right continuous by Theorem 14.11(i). Hence there exists Ω0 ∈ F such that P (Ω0) = 1 and

t 7→ A^T_t(ω) is finite valued, right continuous and non-decreasing and A^T_0(ω) = 0

for each ω ∈ Ω0 and T ∈ N. Suppose that S, T ∈ N and S < T . If t ∈ Q(2)[0,S) then, since both A^S_t and A^T_t are L∞-weak limits of ∑_{q∈2^{φ(k)}N: q≤t} Y^{φ(k)}_q, they must coincide a.s. We set

Ω1 := ⋂_{t∈Q(2): t≥0} ⋂_{S,T∈N: S>t, T>t} {ω ∈ Ω0 : A^S_t(ω) = A^T_t(ω)} and At := inf{A^T_t ; T ∈ N, T > t}.

Then Ω1 ∈ F , P (Ω1) = 1 and, each A^T_t being Ft+-measurable, A· is F·+-adapted. Moreover

At(ω) = A^T_t(ω) for all ω ∈ Ω1, T ∈ N and t ∈ R[0,T ).

Consequently t 7→ At(ω) is finite valued, right continuous and non-decreasing and A0(ω) = 0 for all ω ∈ Ω1. Thus A· is an F·+-increasing process. Since Xt − A^T_t = Z^T_t a.s. for all t ∈ R[0,T ) by Theorem 14.11(ii) and Xt is Ft+-measurable, we see that

Xt − At ∈ E[XT − A^T | Ft+] for all T ∈ N and t ∈ R[0,T ),

which means that t 7→ Xt − At is an F·+-martingale. Let M· ∈ Martb(F·+) and t ∈ Q(2)>0. Suppose that n ∈ Z≤0 and t ∈ 2^n Z≥0. Then

E[Mt Y^n_q] = E[M_{q−2^n} (Xq − Xq−2^n)] = E[M_{q−2^n} (Aq − Aq−2^n)] for all q ∈ 2^n N with q ≤ t.

Indeed if q ∈ 2^n N and q ≤ t then M_{q−2^n} ∈ E[Mt | F(q−2^n)+], Y^n_q ∈ E[Xq − Xq−2^n | F(q−2^n)+] and Xq−2^n − Aq−2^n ∈ E[Xq − Aq | F(q−2^n)+]. On the other hand by Lemma 18.4

∑_{q∈2^{φ(k)}N: q≤t} Y^{φ(k)}_q converges to At L∞-weakly as k → ∞ for all t ∈ Q(2)≥0.

We thus get E[MtAt] = E[R[MdA]t], which holds for all t ∈ R≥0 due to the right continuity. Since E[MtAt] = E[RMdAt] by Lemma 15.18(i), it follows that A· is natural.

The conclusion in Theorem 18.6 remains valid for F·-submartingales of class DL over Q with stochastic right continuity.

Given a stochastic process X·, we set

DM[X·,F·] := {A· : natural F·-increasing process such that t 7→ Xt − At is an F·-martingale with stochastic right continuity}.


18.7 Theorem. (i) If A·, B· ∈ DM[X·,F·] then At = Bt for all t ∈ R≥0 a.s.
(ii) If A· ∈ DM[X·,F·], B· is an F·-adapted process with almost sure right continuous path and At = Bt a.s. for all t ∈ R≥0 then B· ∈ DM[X·,F·].
(iii) If DM[X·,F·] ∩ DM[Y·,F·] ≠ ∅ then DM[X·,F·] = DM[Y·,F·].
(iv) DM[X·,F·] ⊂ DM[X·,F·+]. X· is F·-adapted and DM[X·,F·+] ≠ ∅ if and only if X· is an F·-submartingale of class DL over Q with stochastic right continuity.
(v) If X· and Y· are F·-adapted and DM[X·,F·+] ∩ DM[Y·,F·+] ≠ ∅ then t 7→ Xt − Yt is an F·-martingale with stochastic right continuity. If t 7→ Xt − Yt is an F·-martingale with stochastic right continuity then DM[X·,F·] = DM[Y·,F·].
(vi) If X0 = 0 a.s., Xt ≥ 0 a.s. for all t ∈ R≥0 and 0 ∈ DM[X·,F·] then Xt = 0 a.s. for all t ∈ R≥0.

Proof. (iv) An F·-martingale with stochastic right continuity is an F·+-martingale by Theorem 14.11(ii). This shows the first implication.

Let X· be an F·-submartingale (with almost sure right continuous path). According to Lemma 7.12, if Xt ≥ 0 a.s. for all t ∈ R≥0 then X· is of class DL over Q (of class DL).

18.8 Definition. (i) An F·-increasing process A· is said to be integrable if E[At] < +∞ for all t ∈ R≥0. (Recall that A0 = 0 a.s. by definition.)
(ii) Let X· be an integrable F·-increasing process. An F·-compensator of X· is a natural F·-increasing process A· such that t 7→ Xt − At is an F·-martingale.
(iii) Let X· be an F·-finite variation process such that E[var(X·)t] < +∞ for all t ∈ R≥0. An F·-natural projection of X· is a natural F·-finite variation process Y· such that Y0 = X0 a.s. and t 7→ Xt − Yt is an F·-martingale.

18.9 Corollary. Let X· be an F·-finite variation process with E[var(X·)t] < +∞ for all t ∈ R≥0.
(i) An F·+-natural projection of X· exists. F·+-natural projections are indistinguishable.
(ii) If t 7→ Xt is non-decreasing a.s. then so are F·+-natural projections.
(iii) If Y· is an F·+-natural projection of X· then t 7→ var+(X·)t − var+(Y·)t as well as t 7→ var−(X·)t − var−(Y·)t are F·+-submartingales.
(iv) If X· is natural then both var+(X·) and var−(X·) are natural F·+-increasing processes.

Proof. (i) An integrable F·+-increasing process is an F·+-submartingale and, by Lemma 7.12, it is of class DL. We apply Theorem 18.6 to the F·+-increasing processes t 7→ var+(X·)t and t 7→ var−(X·)t. There exist natural F·+-increasing processes A· and B· such that t 7→ var+(X·)t − At and t 7→ var−(X·)t − Bt are F·+-martingales. According to Lemma 15.15, Y· : t 7→ X0 + At − Bt is a natural F·+-finite variation process. The uniqueness is due to Theorem 15.19.
(iii) We have that var+(Y·)t − var+(Y·)s ≤ At − As for s ≤ t a.s. Hence

E[var+(Y·)t − var+(Y·)s ; C] ≤ E[At − As ; C] = E[var+(X·)t − var+(X·)s ; C]

for s ≤ t and C ∈ Fs+. The equality is due to the fact that t 7→ var+(X·)t − At is an F·+-martingale. Consequently t 7→ var+(X·)t − var+(Y·)t is an F·+-submartingale.
(iv) Suppose that X· is natural. It is a natural F·+-finite variation process as well. Due to the uniqueness, we have that Xt = Yt for all t ∈ R≥0 a.s. It follows that var+(X·)t ≤ At a.s. and E[var+(X·)t] = E[At] for all t ∈ R≥0. We thus infer that var+(X·)t = At for all t ∈ R≥0 a.s. Consequently t 7→ var+(X·)t is a natural F·+-increasing process.


18.10 Definition. An F·-Poisson process N· is an F·-adapted process such that its almost every sample path is right continuous, N0 = 0 a.s., and for each pair s, t ∈ R≥0 with s < t the increment Nt − Ns is independent of Fs and Poisson distributed with mean t − s.

18.11 Example. (i) Let N· be an F·-Poisson process. Then t 7→ Nt(Nt − 1) · · · (Nt − n) is an F·-increasing process with F·-compensator t 7→ (n+1) R[N·(N· − 1) · · · (N· − n + 1) dλ]t for each n ∈ Z≥0 where λ· : (t, ω) 7→ t. Moreover λ· ∈ DM[(N· − λ·)^2,F·].
(ii) Let N· and M· be two mutually independent Poisson processes. Then X· : t 7→ Nt − Mt is a martingale, and var+(X·)t = Nt and var−(X·)t = Mt for all t ∈ R≥0 a.s.

Proof. Since k(k−1) · · · (k−n) = (n+1) ∑_{i=1}^{k} (i−1) · · · (i−n) for all k ∈ N, we have that

Nt(Nt − 1) · · · (Nt − n) = (n+1) ∫_{(0,t]} N·−(N·− − 1) · · · (N·− − n + 1) dN·.

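The combinatorial identity is a telescoping sum: since

\[
(n+1)\,(i-1)(i-2)\cdots(i-n) = i(i-1)\cdots(i-n) - (i-1)(i-2)\cdots(i-n-1),
\]

summing over i = 1, …, k collapses to k(k−1)⋯(k−n) (the boundary term at i = 1 vanishes because of the factor i − 1). With i = Ns = Ns− + 1 this identity is exactly the jump of Nt(Nt − 1)⋯(Nt − n) at a jump time s, which explains the integrand N·−(N·− − 1)⋯(N·− − n + 1) above.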
Taking into account that λ· is an F·-compensator of N·, we get the first statement by Lemma 15.9. Indeed the right hand side below is integrable:

sup_{s≤t} |Ns−(Ns− − 1) · · · (Ns− − n + 1)| var(N· − λ·)t ≤ Nt(Nt − 1) · · · (Nt − n + 1)(Nt + t).

The F·-martingale property of t 7→ (Nt − t)^2 − t derives from

(Nt − t)^2 = Nt(Nt − 1) − 2 ∫_0^t N· dλ· − 2 ∫_{(0,t]} λ· d(N· − λ·) + Nt.

The second term of the right hand side is a compensator of the first term.
(ii) {t : Nt(ω) ≠ Nt−(ω)} ∩ {t : Mt(ω) ≠ Mt−(ω)} = ∅ P -a.s.

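The membership λ· ∈ DM[(N· − λ·)^2,F·] can also be checked numerically (an illustrative sketch with ad hoc names, not part of the note): E[(Nt − t)^2 − t] should vanish.

    import random

    def poisson_count(horizon, rng):
        """Sample N_t for a unit-rate Poisson process by summing interarrival times."""
        t, k = 0.0, 0
        while True:
            t += rng.expovariate(1.0)
            if t > horizon:
                return k
            k += 1

    rng = random.Random(3)
    t, trials = 5.0, 200000
    mean = sum((poisson_count(t, rng) - t) ** 2 - t for _ in range(trials)) / trials
    print(mean)  # close to 0, consistent with λ compensating (N - λ)**2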
19 Compensator of bounded increasing process

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. Given a submartingale M· we write M∞ := lim inf_{t→+∞} Mt provided sup_{t: t≥0} E[max{Mt, 0}] < +∞.

19.1 Lemma. Suppose that X· is an F·-finite variation process such that E[var(X)t] < +∞ for all t ∈ R≥0, A· is an F·+-natural projection of X·, M· is an F·+-martingale whose almost every sample path is right continuous, and σ is an F· ∨ Null(P )-optional time.
(i) If M· is bounded and E[|X0| ; σ > 0] + E[var(X)σ] < +∞ then E[var(A)σ] < +∞ and

E[MσAσ ; σ > 0] = E[M0X0 ; σ > 0] + E[R[MdX]σ].

(ii) If t 7→ Xt is non-decreasing a.s., X0 ≥ 0 a.s., Mt ≥ 0 for all t ∈ R≥0 a.s. and σ ≤ T a.s. for some T ∈ R>0 then

E[MσAσ ; σ > 0] = E[M0X0 ; σ > 0] + E[R[MdX]σ].

Proof. (i) Lemma 4.8(i) shows that (F· ∨ Null(P ))·+ = F·+ ∨ Null(P ). In particular the set of F· ∨ Null(P )-optional times coincides with that of F·+ ∨ Null(P )-stopping times. According to Corollary 18.9(iii), t 7→ var(X)t − var(A)t is an F·+-submartingale. It follows that

E[var(A)t∧σ] ≤ E[var(A)t∧σ] + E[var(X)t∧σ − var(A)t∧σ] = E[var(X)t∧σ]


by Corollary 3.17(ii). Since E[var(X)σ] dominates the right hand side, we get

E[var(A)σ] ≤ E[var(X)σ] < +∞

by invoking the monotone convergence theorem. Therefore Lemma 15.18 shows that

E[MσAσ ; σ > 0] = E[M0X0 ; σ > 0] + E[R[MdA]σ].

On the other hand t 7→ R[Md(X − A)]t∧σ is an F·+ ∨ Null(P )-martingale by Lemma 15.9 and this is dominated by the integrable random variable sup_{s: s≥0} |Ms| (var(A)σ + var(X)σ). This implies that E[R[MdX]σ] − E[R[MdA]σ] = E[R[Md(X − A)]σ] = 0.

(ii) Observe that E[var(X)σ] ≤ E[var(X)T ] < +∞. Let n ∈ N. There exists a stochastic process M^n_· such that almost every sample path is right continuous and

M^n_t ∈ E[min{MT , n} | Ft+] for all t ∈ R≥0

according to Corollary 14.12. We may regard that 0 ≤ M^n_t(ω) ≤ n for all ω ∈ Ω. Indeed (M^n_· ∨ 0) ∧ n is a desired modification. Hence M^n_· is a bounded F·+-martingale. We get

E[M^n_σ (Aσ − X0) ; σ > 0] = E[R[M^n dX]σ] for all n ∈ N

by applying (i) to the pair X· − X0 and A· − A0. Since E[M^n_σ ; Λ] = E[M^n_0 ; Λ] for all Λ ∈ F0+ and the integrands are non-negative, we see by the monotone convergence theorem that

E[M^n_σ X0 ; σ > 0] = E[M^n_0 X0 ; σ > 0].

Adding side by side we infer that

E[M^n_σ Aσ ; σ > 0] = E[M^n_0 X0 ; σ > 0] + E[R[M^n dX]σ] for all n ∈ N.

There exists Ω0 ∈ F such that P (Ω0) = 1 and

t 7→ M^n_t(ω) is right continuous and admits left-hand limits everywhere for each ω ∈ Ω0 and n ∈ N.

We infer by Lemma 1.8 that

M^n_t(ω) ≤ M^{n+1}_t(ω) ≤ Mt(ω) for all t ∈ R[0,T ], n ∈ N and ω ∈ Ω0,

where we replace Ω0 by a smaller set with full measure from F if necessary. It follows that

M^n_{t−}(ω) ≤ M^{n+1}_{t−}(ω) and 0 ≤ Mt−(ω) − sup_{k∈N} M^k_{t−}(ω) ≤ Mt−(ω) − M^n_{t−}(ω)

for all t ∈ R(0,T ], n ∈ N and ω ∈ Ω0. Let ε ∈ R>0. Example 6.4 shows that

P (sup_{t∈[0,T ]} |M^n_t − Mt| ≥ ε) ≤ E[|M^n_T − MT |]/ε ≤ E[MT ; MT > n]/ε for all n ∈ N.

The right hand side vanishes as n tends to ∞. Consequently

P (sup_{t∈[0,T ]} |sup_{k∈N} M^k_{t−} − Mt−| ≥ ε, Ω0) = 0 for all ε ∈ R>0,

which means that


sup_{k∈N} M^k_{t−}(ω) = Mt−(ω) for all t ∈ R(0,T ] and ω ∈ Ω0.

Here we choose a smaller set with full measure from F if necessary. Similarly we have that

σ(ω) ≤ T , M^n_σ(ω) ≤ M^{n+1}_σ(ω) and sup_{k∈N} M^k_σ(ω) = Mσ(ω) for all ω ∈ Ω0

with a necessary modification of Ω0. Thus the monotone convergence theorem shows that

sup_{n∈N} E[R[M^n dX]σ] = sup_{n∈N} E[∫_{(0,σ]} M^n_{s−} dXs ; Ω0] = E[∫_{(0,σ]} Ms− dXs ; Ω0] = E[R[MdX]σ].

Similarly sup_{n∈N} E[M^n_σ Aσ ; σ > 0] and sup_{n∈N} E[M^n_0 X0 ; σ > 0] equal E[MσAσ ; σ > 0] and E[M0X0 ; σ > 0] respectively.

19.2 Lemma. Suppose that X· and Y· are integrable F·-increasing processes, A· ∈ DM[X,F·+], B· ∈ DM[Y,F·+], σ is an F· ∨ Null(P )-optional time, and Y∞ ≤ K a.s. for some K ∈ R≥0. Then E[AσBσ] + E[(Y∞ − Yσ)Aσ] ≤ E[XσBσ−] + KE[Xσ] and E[(Bσ)^2] ≤ 2KE[Yσ] < +∞ where B0− := B0 and Bt− := lim inf_{s↑t: s∈Q} Bs for t ∈ R>0.

Proof. Let n ∈ N. According to Corollary 14.12, there exists a stochastic process M· such that almost every sample path is right continuous and

Mt ∈ E[Bn | Ft+] for all t ∈ R≥0.

Let t ∈ R≥0. Then E[Bn|Ft+] = E[Yn|Ft+] +Bt∧n − Yt∧n implies that

E[Mt ; Λ] = E[Bn ; Λ] = E[Yn ; Λ] + E[Bt∧n − Yt∧n ; Λ] for all Λ ∈ Ft+.

The right hand side is dominated by E[K +Bt∧n − Yt∧n ; Λ]. It follows that

Mt ≤ K +Bt − Yt for all t ∈ R[0,n] a.s.

due to the right continuity of sample paths and hence

Mt− ≤ K +Bt− − Yt− ≤ K +B(σ∧n)− for all t ∈ R(0,σ∧n] a.s.

due to the left hand regularity of sample paths. We thus get

R[MdX]σ∧n ≤ (K +B(σ∧n)−)Xσ∧n ≤ (K +Bσ−)Xσ a.s.

Consequently we infer that

E[Bσ∧nAσ∧n] + E[(Yn − Yσ∧n)Aσ∧n] = E[BnAσ∧n] ≤ KE[Xσ] + E[XσBσ−] for all n ∈ N.

Tending n to ∞ we reach the statement by the monotone convergence theorem. Finally, since E[Bσ∧n] = E[Yσ∧n], we have that E[Bσ] = E[Yσ]. On the other hand E[Bσ−] ≤ E[Bσ].


Mart2(F·) stands for the space of all square integrable F·-martingales with almost sure right continuous sample path.

19.3 Lemma. Let X· be an F·-increasing process such that X∞ ≤ K a.s. for some K ∈ R≥0, A· ∈ DM[X·,F·+], and σ a bounded F· ∨ Null(P )-optional time. Then E[(Aσ)^2] ≤ 2K^2 and

E[MσXσ] = E[RMdXσ] and E[MσAσ] = E[R[MdX]σ] for all M· ∈ Mart2(F·+).

Proof. Let M· ∈ Mart2(F·+) and n ∈ N, and choose T ∈ R>0 such that σ ≤ T a.s. According to Corollary 14.12 there exists a stochastic process M^n_· such that almost every sample path is right continuous and

M^n_t ∈ E[max{min{MT , n}, −n} | Ft+] for all t ∈ R≥0.

We may regard that |M^n_t(ω)| ≤ n for all ω ∈ Ω. Indeed max{min{M^n_·, n}, −n} is a desired modification. Hence M^n_· is a bounded F·+-martingale. We get that

E[M^n_σ Xσ] = E[RM^n dXσ] and E[M^n_σ Aσ] = E[R[M^n dX]σ] for all n ∈ N

by Lemma 15.18(i) and Lemma 19.1. Moreover

|M^n_σ − Mσ| ≤ sup_{t∈[0,T ]} |M^n_t − Mt| a.s. and |R[M^n dX]σ − R[MdX]σ| ≤ sup_{t∈[0,T ]} |M^n_t − Mt| XT .

According to Lemma 19.2, E[(Aσ)^2] ≤ 2KE[Xσ] < +∞ while Example 6.4 shows that

E[sup_{t∈[0,T ]} |M^n_t − Mt|^2] ≤ 4E[|M^n_T − MT |^2] ≤ 4E[|MT |^2 ; |MT | > n] for all n ∈ N.

The right hand side vanishes as n tends to ∞. Consequently we reach the claim.

19.4 Lemma. If A ∈ Prog(F·) then {ω ∈ Ω : ∃s ∈ R[0,t) s.t. (s, ω) ∈ A} ∈ Ft ∨ Null(P ).

19.5 Corollary. Suppose that E is a metrizable topological space and X· is an E-valued process indistinguishable from an F· ∨ Null(P )-progressively measurable process. Then for each F ∈ Borel(E) both the entry time inf{t ∈ R≥0 : Xt ∈ F} to F and the hitting time inf{t ∈ R>0 : Xt ∈ F} to F are F· ∨ Null(P )-optional times.

Proof. We may assume that X· itself is F·-progressively measurable. We have that

A := {(s, ω) ∈ R≥0 × Ω : Xs ∈ F} ∈ Prog(F·).

On the other hand if τ := inf{s ≥ 0 : Xs ∈ F} and t ∈ R>0 then

{τ < t} = {ω ∈ Ω : ∃s ∈ R[0,t) s.t. (s, ω) ∈ A}.

Thus we get {τ < t} ∈ Ft ∨ Null(P ) for all t ∈ R>0.

Martc2(F·) := {M· ∈ Mart2(F·) : t 7→ Mt continuous a.s.}. Martfv2(F·) := {M· ∈ Mart2(F·) : t 7→ Mt of finite variation a.s.}.

19.6 Theorem. Suppose that M· ∈ Mart2(F·+). Then M· is indistinguishable from a process in Martc2(F·) if and only if t 7→ MtFt is an F·+-martingale whenever F· ∈ Martfv2(F·+).


Proof. We have that Mart2(F·) ⊂ Mart2(F·+) by Theorem 14.11(ii). Thus the implication ⇒ is stated in Theorem 16.9(iv) with the filtration F· replaced by F·+.

Conversely suppose that t 7→ MtFt is an F·+-martingale for all F· ∈ Martfv2(F·+). Given an F· ∨ Null(P )-optional time σ there exists an F·-optional time τ such that σ = τ a.s. by Lemma 4.8(iii). We see that

X· : t 7→ 1[τ,∞)(t) − 1{τ=0} is a bounded F·+-increasing process.

Theorem 18.6 shows that DM[X,F·+] ≠ ∅. We select and fix A· ∈ DM[X,F·+]. Then

F· : t 7→ Xt − At ∈ Mart2(F·+).

The square integrability derives from Lemma 19.2. Clearly it is of finite variation almost surely. It then follows that t 7→ MtFt is an F·+-martingale and hence

E[MTXT ] = E[MTAT ] + E[MTFT ] = E[MTAT ] + E[M0F0] = E[MTAT ] for all T ∈ R>0.

The left hand side equals E[RMdXT ] while the right hand side equals E[R[MdX]T ] according to Lemma 19.3. Consequently we get that

(?) E[RMd(1[τ,∞) − 1{τ=0})T ] = E[R[Md(1[τ,∞) − 1{τ=0})]T ] for all T ∈ R>0.

We select and fix Ω0 ∈ F such that P (Ω0) = 1 and t 7→ Mt(ω) is right continuous and admits left-hand limits for each ω ∈ Ω0. Let ω ∈ Ω0. We see that

∫_{(0,t]} Ms−(ω) d1[τ,∞)(s) = 0 if τ(ω) = 0 or τ(ω) > t, and = Mτ−(ω) if 0 < τ(ω) ≤ t.

Let T ∈ R>0. Since R[Md(1[τ,∞) − 1{τ=0})]T (ω) = ∫_{(0,T ]} Ms−(ω) d1[τ,∞)(s), we get by (?)

E[Mσ ; 0 < σ ≤ T ] = E[Mτ ; 0 < τ ≤ T ] = E[Mτ− ; 0 < τ ≤ T ] = E[Mσ− ; 0 < σ ≤ T ].

Now for each ε ∈ R>0 we introduce the following:

σ(ε) := inf{t ∈ R>0 : Mt − Mt− > ε} and τ(ε) := inf{t ∈ R>0 : Mt − Mt− < −ε},

which are F· ∨ Null(P )-optional times by Corollary 19.5. We see that σ(ε, ω) > 0 and τ(ε, ω) > 0 for all ω ∈ Ω0 by Lemma 15.3. It also shows that if ω ∈ {σ(ε) < +∞} ∩ Ω0 then Mσ(ε)(ω) − Mσ(ε)−(ω) > ε. Therefore

0 ≤ εP (σ(ε) ≤ T ) = εP (0 < σ(ε) ≤ T ) ≤ E[Mσ(ε) − Mσ(ε)− ; 0 < σ(ε) ≤ T ] = 0,

which implies that P (σ(ε) ≤ T ) = 0 for all T ∈ R>0, that is, P (σ(ε) < +∞) = 0. Similarly we obtain that P (τ(ε) < +∞) = 0. The number ε ∈ R>0 being arbitrary, we conclude that Mt = Mt− for all t ∈ R>0 a.s. Consequently M· is indistinguishable from t 7→ lim inf_{s↑t: s∈Q} Ms, which is F·-adapted and continuous almost surely.

19.7 Lemma. Suppose that τ is an F·-optional time, Y· is an integrable F·-increasing process, A· ∈ DM[1[τ,∞) − 1{τ=0},F·+], B· ∈ DM[Y,F·+], and σ is an F· ∨ Null(P )-optional time. If there exists K ∈ R≥0 such that Y∞ := sup_{t: t≥0} Yt ≤ K a.s. then

E[AσBσ] + E[Aσ(Y∞ − Yσ)] ≤ E[Bτ− + K − Yτ− ; 0 < τ ≤ σ, τ < +∞].


Proof. Since E[B∞] = sup_{t: t≥0} E[Bt] = sup_{t: t≥0} E[Yt] ≤ K, the F·+-martingale t 7→ Yt − Bt is uniformly integrable. It follows by Theorem 14.8 that

Yt − Bt ∈ E[Y∞ − B∞ | Ft+] for all t ∈ R≥0.

According to Corollary 14.12, there exist a stochastic process M· and Ω0 ∈ F such that

Ms ∈ E[B∞ | Fs+] for all s ∈ R≥0, P (Ω0) = 1 and t 7→ Mt(ω) is right continuous and admits left-hand limits for each ω ∈ Ω0.

Observe that M· is a non-negative F·+-martingale. Let ω ∈ Ω0. We see that

∫_{(0,t]} Ms−(ω) d1[τ,∞)(s) = 0 if τ(ω) = 0 or τ(ω) > t, and = Mτ−(ω) if 0 < τ(ω) ≤ t.

Since R[Md(1[τ,∞) − 1{τ=0})]σ∧T (ω) = ∫_{(0,σ∧T ]} Ms−(ω) d1[τ,∞)(s), Lemma 19.1(ii) shows that

E[B∞Aσ∧T ] = E[Mσ∧T Aσ∧T ] = E[Mτ− ; 0 < τ ≤ σ ∧ T ].

The first equality is due to the fact that E[B∞ ; Λ] = E[Mσ∧T ; Λ] for all Λ ∈ F(σ∧T )+ and Aσ∧T is non-negative. Let s ∈ R≥0. Then E[B∞|Fs+] = E[Y∞|Fs+] + Bs − Ys implies that

Ms− ≤ K + Bs− − Ys− for all s ∈ R>0 a.s.

according to the proof of Lemma 19.2. Thus we get

E[B∞Aσ∧T ] ≤ E[K + Bτ− − Yτ− ; 0 < τ ≤ σ ∧ T ].

The left hand side equals E[(Bσ∧T + Y∞ − Yσ∧T )Aσ∧T ]. Tending T to ∞ we obtain the claim by the monotone convergence theorem and Fatou's lemma.

19.8 Corollary. Let σ, τ ∈ Time(F·), A· ∈ DM[1[σ,∞),F·+] and B· ∈ DM[1[τ,∞),F·+].(i) E[At] = P (0 < σ ≤ t) = E[At∧σ] for all t ∈ R≥0. At∧σ = At for all t ∈ R≥0 a.s.(ii) E[AtBt] + E[At ; t < τ < +∞] ≤ E[Bσ− ; 0 < σ ≤ t] + P (0 < σ ≤ t ∧ τ) for all t ∈ R≥0.(iii) If σ ≤ τ a.s. then E[AtBt] ≤ E[At ; τ = ∞] + E[At ; τ ≤ t] + E[Bσ− ; 0 < σ ≤ t] andE[(1σ≤t − At)(1τ≤t −Bt)] ≤ P (τ = 0) + E[At ; τ = ∞] + E[Bσ− ; 0 < σ ≤ t] + E[Bt ;σ > t].(iv) If τ < σ a.s. then E[AtBt] + E[At ; t < τ < +∞] ≤ E[Bτ ; 0 < σ ≤ t].

Proof. (i) We see that E[At] = P (σ ≤ t)−P (σ = 0) = P (0 < σ ≤ t) ≤ 1. The compensatorof t 7→ At∧σ is given by t 7→ 1[σ,∞)(t ∧ σ) = 1[σ,∞)(t).

(ii) Lemma 19.7.(iii) P (τ ≤ t) = P (τ = 0) + E[Bt]. Bσ− = (B·∧τ )σ− = Bτ on σ > τ.

94

Page 95: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

20 Stable spaces of square integrable martingales

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

Mart2(F·) stands for the space of all square integrable F·-martingaleswith almost sure right continuous sample path.

20.1 Lemma. Suppose that M· is an F·-adapted process whose almost every sample path isright continuous. If T ∈ R>0, E[|Mσ|] < +∞ and E[Mσ] = E[M0] for any F·-stopping timeσ with σ ≤ T then M· is an F·-martingale up to T .

Proof. Suppose that s, t ∈ R, 0 ≤ s < t ≤ T and A ∈ Fs. We see that

σ ≤ u =

∅ if u < s

A if s ≤ u < t

Ω if t ≤ u

where σ :=

s on A

t off A.

It follows that σ is an F·-stopping time with σ ≤ T . The constant function ω 7→ t is also anF·-stopping time. The relation E[Mσ] = E[M0] = E[Mt] reads E[Ms ;A] = E[Mt ;A].

20.2 Corollary. Let M·, N· ∈ Mart2(F·) and T ∈ R>0. If E[MT∧σNT ] = E[M0N0] for anyF·-stopping time σ then t 7→Mt∧TNt∧T is an F·-martingale.

Proof. Suppose that T ∈ R>0 and σ is an F·-stopping time. We see that

MT∧σ is (F· ∨ Null(P ))(T∧σ)+-measurable and NT∧σ ∈ E[NT |(F· ∨ Null(P ))(T∧σ)+]

by Corollary 3.17. It follows that

E[MT∧σNT∧σ] = E[MT∧σNT ] = E[M0N0] for all F·-stopping times σ.

Lemma 20.1 shows that t 7→MtNt is an F·-martingale up to T .

20.3 Definition. A set M of F·-martingales whose almost all sample paths are right con-tinuous is said to be F·-stable if for each M· ∈ M and an F·-stopping time σ there existsM· ∈M such that Mt∧σ = Mt for all t ∈ R≥0 a.s.

20.4 Example. All of the followings are F·+ ∨ Null(P )-stable.(i) Mart2(F·+) as well as the set Martb(F·+) of all bounded F·+-martingales whose almostall sample paths are right continuous.(ii) Martfv2 (F·+) := M· ∈ Mart2(F·+) : t 7→Mt of finite variation a.s..(iii) Martc2(F·) := M· ∈ Mart2(F·) : t 7→Mt continuous a.s..

Proof. Let M· be an F·+-martingale whose almost every sample path is right continuous andσ be an F·+ ∨Null(P )-stopping time. Since F·+ ∨Null(P )-stopping times are F· ∨Null(P )-optional times, according to Lemma 15.17, there exists an F·+-martingale M· such thatMt∧σ = Mt for all t ∈ R≥0 a.s. Since Mt∧σ ∈ E[Mt|(F· ∨ Null(P ))σ+] by Corollary 3.17(ii),Theorem 1.13(i) shows that

E[|Mt|2] = E[|Mt∧σ|2] ≤ E[|Mt|2] < +∞.

On the other hand Martc2(F·) ⊂ Mart2(F·) ⊂ Mart2(F·+). If almost every sample path ofM· is continuous then Lemma 10.2 shows the existence of an F·-adapted process M· suchthat Mt∧σ = Mt for all t ∈ R≥0 a.s.

95

Page 96: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

20.5 Example. Let M· ∈ Martc(F·) and A· ∈ Qvar[M· ;F·]. Then⋃Ito[fdM ;F·] ; f ∈ L, E[R[|f |2dA]t] < +∞∀t ∈ R≥0 is F·+ ∨ Null(P )-stable

where L denotes the set of all F·-adapted process whose almost all sample paths are locallybounded and admit left hand limits everywhere.

Proof. Suppose that f ∈ L, E[R[|f |2dA]t] < +∞ for all t ∈ R≥0 and I· ∈ Ito[fdM ;F·].Given an F·+ ∨Null(P )-stopping time σ, being an F· ∨Null(P )-optional time as well, thereexists σ ∈ Time(F·) such that σ = σ a.s. by Lemma 4.8(iii). Then we have that 1(0,σ]f ∈ L,E[R[|1(0,σ]f |2dA]t] ≤ E[R[|f |2dA]t] < +∞ and

if J· ∈ Ito[1(0,σ]fdM ;F·] then Jt = It∧σ = It∧σ for all t ∈ R≥0 a.s.

by Lemma 11.6(v). Thus the set in question is F·+ ∨ Null(P )-stable.

20.6 Lemma. Let M be an F·-stable subset of Mart2(F·), T ∈ R>0 and N· ∈ Mart2(F·).If E[MTNT ] = 0 for all M· ∈M then t 7→Mt∧TNt∧T is an F·-martingale for all M· ∈M.

Proof. Let M· ∈ M. Given an F·-stopping time σ, the stopped process t 7→ Mt∧σ is indis-tinguishable from a process M· in M. It follows that E[MT∧σNT ] = E[MTNT ] = 0. This isvalid for the constant stopping time 0. Since N0 ∈ E[NT |F0], we have that

E[M0N0] = E[MT∧0NT ] = 0 = E[MT∧σNT ] for all F·-stopping times σ.

According to Corollary 20.2, this implies that t 7→ Nt∧TMt∧T is an F·-martingale.

We introduce the following family of semi-norms to the space Mart2(F·):

M· 7→ E[|MT |2] where the parameter T runs through R>0.

20.7 Lemma. (i) Let M·, N· ∈ Mart2(F·). Then they are indistinguishable from each otherif and only if E[|MT −NT |2] = 0 for all T ∈ N.(ii) If Mn

· is a sequence of Mart2(F·) and lim supm,n→∞E[|MmT −Mn

T |2] = 0 for all T ∈ Nthen there exists M· ∈ Mart2(F·) such that E[|Mt −Mn

t |2] converges to 0 for all t ∈ R≥0.

Proof. (ii) We can find φ ∈ Seq(N, ↑) such that E[|MmT −Mn

T |2] ≤ 1/23T for all T ∈ N andm,n ∈ N≥φ(T ). Let T ∈ N. It then follows by Example 6.4 that

P ( supt:t≤T

|Mmt −Mn

t | ≥ 1/2T ) ≤ 22TE[|MmT −Mn

T |2] ≤ 1/2T for all m,n ∈ N≥φ(T ).

The Borel-Cantelli lemma shows that there exists Ω0 ∈ F such that P (Ω0) = 1 and

t 7→Mnt (ω) is right continuous for all n ∈ N and

supt:t≤n |Mφ(n+1)t (ω)−M

φ(n)t (ω)| < 1/2n except for finitely many n ∈ N

96

Page 97: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

for all ω ∈ Ω0. We set

Mt(ω) := lim infn→∞

Mφ(n)t (ω) for t ∈ R≥0 and ω ∈ Ω.

The process t 7→Mt is F·-adapted. Let ω ∈ Ω0 and T ∈ N. Then, since

∞∑k=n

supt:t≤T

|Mφ(k+1)t (ω)−M

φ(k)t (ω)| ≤

∞∑k=n

supt:t≤k

|Mφ(k+1)t (ω)−M

φ(k)t (ω)| ≤ 1/2n−1

for all n ∈ N≥T , it follows that

Mt(ω)− 1/2n−1 ≤Mφ(n)t (ω) ≤Mt(ω) + 1/2n−1 for t ∈ R[0,T ] and n ∈ N≥T .

Therefore the right continuity of t 7→ Mt(ω) derives from that of t 7→ Mnt (ω). Moreover

Mφ(n)t (ω) converges to Mt(ω) uniformly on R[0,T ] as n tends to ∞. Exploiting the sub-

martingale property of t 7→ |Mφ(k)t −Mn

t |2, we see that

E[|Mφ(k)t −Mn

t |2] ≤ E[|Mφ(k)T −Mn

T |2] ≤ 1/23T for T ∈ N≥t, k ∈ N≥T and n ∈ N≥φ(T ).

Let t ∈ R≥0. It then follows by Fatou’s lemma that

E[|Mt −Mnt |2] ≤ lim inf

k→∞E[|Mφ(k)

t −Mnt |2] ≤ 1/23T for all T ∈ N≥t and n ∈ N≥φ(T ).

This implies that E[|Mt −Mnt |2] converges to 0 for all t ∈ R≥0. Taking the F·-adaptedness

into account we infer that t 7→Mt is an F·-martingale.

20.8 Definition. A subset M of Mart2(F·) is said to be L2-closed if the statement (ii) ofLemma 20.7 holds for M in place of Mart2(F·).

20.9 Lemma. Suppose that M is an F·+-stable subset of Mart2(F·+). Set

M⊥ := N ∈ Mart2(F·+) : E[MTNT ] = 0 for all M ∈M, T ∈ N.

(i) If M ∈M and N ∈M⊥ then t 7→MtNt is an F·+-martingale with E[M0N0] = 0.(ii) M⊥ is an F·+-stable and L2-closed linear subspace.

Proof. (i) If M ∈ M and N ∈ M⊥ then, t 7→ Mt∧TNt∧T being an F·+-martingale for allT ∈ N by Lemma 20.6, t 7→MtNt is an F·+-martingale.

(ii) Given N ∈ M⊥ and an F·+-stopping time σ, there exists N ∈ Mart2(F·+) suchthat Nt∧σ = Nt for all t ∈ R≥0 a.s. by Lemma 15.17. Since MT∧σ ∈ E[MT |G] and, by (i),MT∧σNT∧σ ∈ E[MTNT |G] where G := (F· ∨ Null(P ))(T∧σ)+, it follows that

E[MT NT ] = E[MTNT∧σ] = E[MT∧σNT∧σ] = E[MTNT ] = 0 for all M ∈M and T ∈ N.

This means that N ∈M⊥. Consequently M⊥ is F·+-stable.

20.10 Lemma. Suppose that M is an F·+-stable subset of Mart2(F·+). Then the closureM of M in Mart2(F·+) remains to be F·+-stable.

97

Page 98: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. Suppose that M· ∈ M, that is, M· ∈ Mart2(F·+) and there exists a sequence Mn· of

M such that E[|MnT −MT |2] converges to 0 as n tends to ∞ for all T ∈ N. Let σ be an

F·+-stopping time. Since M is stable, there exists a sequence Mn· of M such that Mn

· isindistinguishable from Mn

·∧σ for all n ∈ N. We see by the submartingale property that

E[|MnT −MT∧σ|2] = E[|Mn

T∧σ −MT∧σ|2] ≤ E[|MnT −MT |2] for all n, T ∈ N.

On the other hand there exists M ∈ Mart2(F·+) such that Mt∧σ = Mt for all t ∈ R≥0 a.s.by Lemma 15.17. It follows that M· ∈M and hence M is F·+-stable.

20.11 Lemma. Suppose that M is an F·-stable and L2-closed subset of Mart2(F·). Then[Mσ] ;M· ∈M is a closed subset of L2(F , P ) for all bounded F·-stopping times σ.

Proof. Suppose that Y is an square integrable random variable, Mn· is a sequence of M, σ is

an F·+-stopping time and E[|Y −Mnσ |2] converges to 0 as n tends to ∞. Since M is stable,

there exists a sequence Mn· of M such that Mn

· is indistinguishable from Mn·∧σ for all n ∈ N.

We see by the submartingale property that

E[|Mmt − Mn

t |2] = E[|Mmt∧σ −Mn

t∧σ|2] ≤ E[|Mmσ −Mn

σ |2] for all m,n ∈ N and t ∈ R≥0

Since M is L2-closed, there exists M· ∈ M such that E[|Mt − Mnt |2] converges to 0 for all

t ∈ R≥0. We have that Mnσ = Mn

σ a.s. for all n ∈ N. It follows by Example 6.4 that

E[|Mσ −Mnσ |2] = E[|Mσ − Mn

σ |2] ≤ E[supt≤T

|Mt − Mnt |2] ≤ 4E[|MT − Mn

T |2]

where T ∈ R>0 satisfies σ ≤ T a.s. Thus we get Y ∈ [Mσ].

20.12 Theorem. Let M be an F·+-stable linear subspace of Mart2(F·+). Then

[MT ] ;M· ∈M+ [NT ] ;N· ∈M⊥ = L2(FT+, P ) for all T ∈ R≥0.

where M the closure of M, M⊥ := N ∈ Mart2(F·+) : E[MTNT ] = 0∀M ∈M∀T ∈ N.

Proof. We see that [MT ] ;M· ∈M is an closed linear subspace of L2(FT+, P ) by invokingLemma 20.11 and Lemma 20.10. On the other hand E[MTY ] = 0∀M· ∈ M implies thestronger condition E[MTY ] = 0∀M· ∈M. Therefore it suffices to show that

Y ∈ L2(FT+, P ) : E[MTY ] = 0∀M· ∈M = NT ;N· ∈M⊥.

Suppose that Y ∈ L2(FT+, P ) and E[MTY ] = 0 for all M· ∈ M. Corollary 14.12 showsthe existence of N· ∈ Mart2(F·+) such that Nt ∈ E[Y |Ft+] for all t ∈ R≥0. Since Y isFT+-measurable, we may assume that Nt = Y for all t ∈ R≥T . In particular NT = Y . Weshall verify that N· ∈M⊥. Indeed

E[MtNt] = E[MtY ] = E[MTY ] = 0 for all M· ∈M and t ∈ R≥T .

According to Lemma 20.6 this implies that t 7→Mt∧TNt∧T is an F·+-martingale. Hence

E[MtNt] = E[MTNT ] = 0 for all M· ∈M and t ∈ R[0,T ].

Consequently N· ∈ M⊥ and Y ∈ [NT ]. We thus get the relation ⊂. The converse relationderives by Lemma 20.9(i).

98

Page 99: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

20.13 Theorem. Suppose that M is an F·+-stable linear subspace of Mart2(F·+). Then

M+M⊥ = Mart2(F·+) and t 7→MtNt is an F·+-martingale for all M· ∈M, N· ∈M⊥

where M the closure of M, M⊥ := N ∈ Mart2(F·+) : E[MTNT ] = 0∀M ∈M∀T ∈ N.

Proof. Let L· ∈ Mart2(F·+). Since

M⊥ = N ∈ Mart2(F·+) : E[MTNT ] = 0∀M ∈M∀T ∈ N,

there exist a sequence N →M, T 7→MT· and a sequence N →M⊥, T 7→ NT

· such that

LT = MTT +NT

T a.s. for all T ∈ N

by Theorem 20.12. We see by Lemma 20.9(i) that t 7→ (Mmt −Mn

t )(Nnt − Nm

t ) is an F·+-martingale with vanishing expectation with the help of Lemma 20.10. In particular

(?) E[(Mmt∧T −Mn

t∧T )(Nnt∧T −Nm

t∧T )] = 0 for all m,n, T ∈ N and t ∈ R≥0.

Since Lt ∈ E[LT |Ft+], MTt ∈ E[MT

T |Ft+] and NTt ∈ E[NT

T |Ft+] for all t ∈ R[0,T ],

Lt = MTt +NT

t for all t ∈ R[0,T ] a.s. for all T ∈ N.

Let T ∈ N. It then follows that

Mmt∧T +Nm

t∧T = Mnt∧T +Nn

t∧T for all t ∈ R≥0 a.s. for all m,n ∈ N≥T .

Two processes t 7→Mmt∧T −Mn

t∧T and t 7→ Nnt∧T −Nm

t∧T being indistinguishable, we get

E[|Mmt −Mn

t |2] = 0 and E[|Nnt −Nm

t |2] = 0 for all t ∈ R[0,T ] and m,n ∈ N≥T

by (?). Since M is L2-closed, this implies that there exists M· ∈M such that

E[|Mt −MTt |2] = 0 for all t ∈ R[0,T ] and T ∈ N.

We have that Mt = MTt for all t ∈ R[0,T ] a.s. for all T ∈ N. Similarly we deduce the existence

of N· ∈ M⊥ such that Nt = NTt for all t ∈ R[0,T ] a.s. for all T ∈ N. Thus we reach that

Lt = Mt +Nt for all t ∈ R≥0 a.s.

20.14 Example. (i) Martc2(F·) + Martfv2 (F·+) = Mart2(F·+) where denotes the closure.

If M· ∈ Martc2(F·) and N· ∈ Martfv2 (F·+) then t 7→MtNt is an F·+-martingale.

(ii) MT ;M· ∈ Martc2(F·)+ NT ;N· ∈ Martfv2 (F·+) = L2(FT+) for all T ∈ R≥0.

Proof. According to Lemma 20.9(i) and Theorem 19.6, we have that

N ∈ Mart2(F·+) : E[MTNT ] = 0 for all M ∈ Martfv2 (F·+) and T ∈ N= N ∈ Mart2(F·+) : N0 = 0 a.s., ∃N· ∈ Martc2(F·) s.t. Nt = Nt for all t ∈ R≥0 a.s..

On the other hand Martfv2 (F·+) is an F·+-stable linear subspace of Mart2(F·+) by Exam-ple 20.4(ii). Thus we get (i) by Theorem 20.13 and (ii) by Theorem 20.12. Note that

MT ;M· ∈ Martc2(F·) ∩ NT ;N· ∈ Martfv2 (F·+) = L2(F0+) for all T ∈ R≥0

and Martc2(F·) ∩Martfv2 (F·+) ∩ M0 = 0 a.s. consists of evanescent processes.

99

Page 100: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

21 General square integrable martingale

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

Mart2(F·) stands for the space of all square integrable F·-martingales withalmost sure right continuous sample path.

Martfv2 (F·) := M· ∈ Mart2(F·) : t 7→Mt of finite variation a.s.Martc2(F·) := M· ∈ Mart2(F·) : t 7→Mt continuous a.s.

21.1 Lemma. Let fn : T → R be a sequence of functions such that fn(t) ≥ 0 for all t ∈ Tand

∑t∈T fn(t) := sup

∑t∈F fn(t) ;F finite subsets of T < +∞ for all n ∈ N. Then∑

t∈T

lim infn→∞

fn(t) ≤ lim infn→∞

∑t∈T

fn(t).

Proof. We see that D :=⋃∞n=1t ∈ T : fn(t) > 0 is an countable set. If t 6∈ D then fn(t) = 0

for all n ∈ N and hence lim infn→∞ fn(t) = 0. It follows that∑t∈T

lim infn→∞

fn(t) =∑t∈D

lim infn→∞

fn(t) ≤ lim infn→∞

∑t∈D

fn(t) = lim infn→∞

∑t∈T

fn(t)

where we applied Fatou’s lemma to deduce the inequality.

21.2 Theorem. Let M· ∈ Martfv2 (F·). Then there exists an F·-increasing process Q· suchthat t 7→ |Mt|2 −Qt is an F·-martingale and it is indistinguishable from t 7→ jmpv(M,M)t.

Proof. We select and fix a sequence Mn· ∈ Martfv2 (F·) such that E[|Mn

T −MT |2] convergesto 0 for all T ∈ N. The argument of Lemma 20.7 shows that there exist φ ∈ Seq(N, ↑) andΩ0 ∈ F such that P (Ω0) = 1 and for each ω ∈ Ω0 the following holds:

t 7→Mnt (ω) is right continuous and of finite variation for all n ∈ N,

t 7→Mt(ω) is right continuous and admits left-hand limits everywhere, and

supt:t≤T |Mφ(n)t (ω)−Mt(ω)| converges to 0 for all T ∈ N.

In what follows we use the notation Mn in place of Mφ(n). For each n ∈ N introduce

Qn· := t 7→ RMndMnt −R[MndMn]t,

which is an F·-finite variation process. Observe that

Qnt (ω) =

∑s:s≤t

|Mns (ω)−Mn

s−(ω)|2 for all ω ∈ Ω0 and t ∈ R≥0.

Since t 7→ |Mnt |2 −Qn

t is an F·-martingale by Corollary 16.9(iv), it follows that

(?) E[lim infn→∞

Qnt ] ≤ lim inf

n→∞E[Qn

t ] = lim infn→∞

E[|Mnt |2] = E[|Mt|2]

by Fatou’s lemma. Let ω ∈ Ω0 and T ∈ N. Then we have that

|Mns−(ω)−Ms−(ω)| ≤ sup

t:t<T|Mn

t (ω)−Mt(ω)| for all s ∈ R(0,T ] and n ∈ N,

100

Page 101: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

and hence |Ms(ω)−Ms−(ω)|2 ≤ lim infn→∞ |Mns (ω)−Mn

s−(ω)|2. Therefore

(??)∑s:s≤t

|Ms(ω)−Ms−(ω)|2 ≤ lim infn→∞

∑s:s≤t

|Mns (ω)−Mn

s−(ω)|2 = lim infn→∞

Qnt (ω)

for all t ∈ R≥0 by Lemma 21.1. Since lim infn→∞Qnt is integrable, we may assume that

jmpv(M,M)t(ω) =∑s:s≤t

|Ms(ω)−Ms−(ω)|2 < +∞ for all t ∈ R≥0 and ω ∈ Ω0

with suitable shrink for Ω0. Let n ∈ N and write fs := Ms −Mns − (Ms− −Mn

s−). Then∑s:s≤t

|Ms(ω)−Ms−(ω)|2 −Qnt (ω) =

∑s:s≤t

fs(ω)(Ms(ω)−Ms−(ω) +Mns (ω)−Mn

s−(ω))

for all t ∈ R≥0 and ω ∈ Ω0. The absolute value of the right hand side is dominated by√jmpv(M −Mn,M −Mn)t(ω)(

√jmpv(M,M)t(ω) +

√Qnt (ω))

as Schwarz inequality shows. According to (??), the above is further dominated by

gnt (ω) :=√

lim infm→∞Qnmt (ω)(

√lim infm→∞Qm

t (ω) +√Qnt (ω))

where Qnm· := t 7→ RNdNt −R[NdN ]t with N· : t 7→Mm

t −Mnt . Consequently

|jmpv(M,M)t(ω)−Qnt (ω)| ≤ gnt (ω) for all t ∈ R≥0, ω ∈ Ω0 and n ∈ N.

Let t ∈ R≥0. Invoking Schwarz inequality we get

E[gnt ] ≤√E[lim infm→∞Qnm

t ](√E[lim infm→∞Qm

t ] +√E[Qn

t ]) for all n ∈ N.

Due to (?), the right hand side is dominated by√E[|Mt −Mn

t |2](√E[|Mt|2] +

√E[|Mn

t |2]),

which converges to 0. Note that t 7→ gnt is an F·-increasing process. Thus, choosing a suitablesubsequence and a suitable subset, we may realize an almost sure convergence

lim supn→∞

gnt (ω) = 0 for all t ∈ R≥0 and ω ∈ Ω0.

Then t 7→ lim infn→∞Qnt is an F·-increasing process. Indeed it is F·-adapted and

Qt(ω) := lim infn→∞

Qnt (ω) = jmpv(M,M)t(ω) for all t ∈ R≥0 and ω ∈ Ω0.

This also implies that |Qt(ω) − Qnt (ω)| ≤ gnt (ω) for all t ∈ R≥0, ω ∈ Ω0 and n ∈ N. Let

t ∈ R≥0. As we have seen gnt converges to 0 in L1 and hence Qnt converges to Qt in L1. Since

t 7→ |Mnt |2−Qn

t is are F·-martingales, it follows that t 7→ |Mt|2−Qt is an F·-martingale.

21.3 Corollary. Let M·, N· ∈ Martfv2 (F·).(i) var(jmpv(M,N))t ≤

√jmpv(M,M)t jmpv(N,N)t for all t ∈ R≥0 a.s.

(ii) There exists an F·-finite variation process Q· such that t 7→MtNt−Qt is an F·-martingaleand it is indistinguishable from t 7→ jmpv(M,N)t.

101

Page 102: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. The existence derives by Theorem 21.2 with the help of polarization trick.

Given X· ∈ Mart2(F·) we denote the following set by Martc[X· ;F·]:

M· ∈ Martc2(F·) : M0 = 0 a.s., t 7→ Xt −Mt ∈ Martfv2 (F·+).

The requirement reads not ∈ Martfv2 (F·) but ∈ Martfv2 (F·+). Since Mart2(F·) ⊂ Mart2(F·+),Martc[X· ;F·] 6= ∅ for all X· ∈ Mart2(F·) according to Example 20.14.

Given X·, Y· ∈ Mart2(F·), we introduce the following spaces:

[X;F·] := A· : F·-increasing, t 7→ |Xt|2 − At F·-martingale,

At − At− = |Xt −Xt−|2 ∀t ∈ R≥0 a.s.[X, Y ;F·] := A· : F·-finite variation, A0 = 0 a.s., t 7→ XtYt − At F·-martingale,

At − At− = (Xt −Xt−)(Yt − Yt−)∀t ∈ R≥0 a.s..

21.4 Theorem. Suppose that X·, Y· ∈ Mart2(F·). Then [X, Y ;F·+] 6= ∅.(i) If M· ∈ Martc[X· ;F·], N· ∈ Martc[Y· ;F·] and C· ∈ Crv[M,N ;F·] then Q· ∈ [X, Y ;F·+] isindistinguishable from t 7→ Ct+jmpv(X −M,Y −N)t. In particular [X,X;F·+] = [X;F·+].(ii) If Q· ∈ [X,Y ;F·+], A· ∈ [X;F·+] and B· ∈ [Y ;F·+] then

(var(Q·)t − var(Q·)s)2 ≤ (At − As)(Bt −Bs) for all t, s ∈ R≥0 a.s.

Proof. (i) We see that MtNt − Ct is an F·+-martingale. According to Example 20.14 botht 7→ Mt(Yt − Nt) and t 7→ Nt(Xt − Mt) are F·+-martingales. Corollary 21.3 shows theexistence of F·+-finite variation process D· such that

t 7→ (Xt −Mt)(Yt −Nt)−Dt is an F·+-martingale andDt = jmpv(X −M,Y −N)t for all t ∈ R≥0 a.s.

Since sample paths of M and N are continuous a.s., Dt − Dt− = (Xt − Xt−)(Yt − Yt−) forall t ∈ R≥0 a.s. Define Qt := Ct +Dt. Then it follows that

t 7→ XtYt −Qt is an F·+-martingale andQt −Qt− = Dt −Dt− = (Xt −Xt−)(Yt − Yt−) for all t ∈ R≥0 a.s.

Suppose that R· shares the same property as Q·. Their difference is continuous a.s. because

Qt −Qt− = (Xt −Xt−)(Yt − Yt−) = Rt −Rt− for all t ∈ R≥0 a.s.

The difference being an F·+-martingale with almost sure continuous and of finite variationsample path, Qt −Rt = Q0 −R0 for all t ∈ R≥0 a.s. by Corollary 8.17.

(ii) Let c ∈ R. We see by the bilinearity of the condition that

c(cA+Q) + cQ+B ∈ [cX + Y, cX + Y ;F·+], i.e., c2A+ 2cQ+B ∈ [cM +N ;F·+].

There exists Ω0 ∈ F such that P (Ω0) = and

t 7→ c2At(ω) + 2cQt(ω) +Bt(ω) is finite valued, right continuous and non-decreasing

102

Page 103: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

for all c ∈ Q and ω ∈ Ω0. Since c ∈ Q is arbitrary, we infer that

(Qt(ω)−Qs(ω))2 ≤ (At(ω)− As(ω))(Bt(ω)−Bs(ω)) for all t, s ∈ R≥0 and ω ∈ Ω0.

Let t, s ∈ R≥0 with s < t and ω ∈ Ω0. The Schwarz inequality shows( ∑J∈∆

|dQ(J ;ω)|)2

≤( ∑J∈∆

√dA(J ;ω)

√dB(J ;ω)

)2

≤( ∑J∈∆

dA(J ;ω))( ∑

J∈∆

dB(J ;ω)).

for any finite partition ∆ of R(s,t]. The right hand side equals (At(ω)−As(ω))(Bt(ω)−Bs(ω))and this dominates (var(Q·)t(ω)− var(Q·)s(ω))2.

21.5 Lemma. Suppose that C is a pre σ-field on a set S and m, µ, ν are finite measureson σ(C). If m(J)2 ≤ µ(J)ν(J) for all J ∈ C then( ∫

S

fg m)2

≤∫S

f 2 µ

∫S

g2 ν

for all non-negative σ(C)-measurable functions f, g : S → R. Here we interpret 0∞ = 0.

Proof. Let A ∈ σ(C)6=∅. Given ε ∈ R>0, there exist countable disjoint C-coverings Φ(1) andΦ(2) of A such that

∑J∈Φ(1) µ(J) < µ(A) + ε and

∑J∈Φ(2) ν(J) < ν(A) + ε. We set

∆ := J ∩K ; (J,K) ∈ Φ(1)× Φ(2), J ∩K 6= ∅.

Then ∆ is a countable disjoint C-covering of A and∑I∈∆

µ(I) =∑J∈Φ(1)

∑K∈Φ(2)

µ(J ∩K) ≤∑J∈Φ(1)

µ(J) ≤ µ(A) + ε.

We also have that∑

I∈∆ ν(I) ≤ ν(A) + ε. Invoking the Schwarz inequality we see that

m(A)2 ≤( ∑I∈∆

m(I))2

≤( ∑I∈∆

õ(I)

√ν(I)

)2

≤( ∑I∈∆

µ(I))( ∑

I∈∆

ν(I))

The right hand side is dominated by (µ(A) + ε)(ν(A) + ε). Consequently

m(A)2 ≤ µ(A)ν(A) for all A ∈ σ(C).

We next prove the inequality for simple functions. Suppose that f, g are non-negative σ(C)-simple functions. Write F := Image f , G := Image g and E(y, z) := f−1(y)∩ g−1(z) for(y, z) ∈ F ×G to save the space. It follows that∫

S

fg m =∑

y∈F,z∈G

yz m(E(y, z)) ≤∑

y∈F,z∈G

√y2µ(E(y, z))

√z2ν(E(y, z)).

The Schwarz inequality shows that the right hand side is dominated by the square root of∑y∈F,z∈G

y2µ(E(y, z))∑

y∈F,z∈G

z2ν(E(y, z)) =

∫S

f 2 µ

∫X

g2 ν.

Thus the inequality in question is shown for non-negative σ(C)-simple functions. Finally gen-eral non-negative σ(C)-measurable functions are monotone increasing limits of non-negativeσ(C)-simple functions.

103

Page 104: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

21.6 Theorem. Suppose that X·, Y· ∈ Mart2(F·), Q· ∈ [X, Y ;F·+], A· ∈ [X;F·+] andB· ∈ [X;F·+]. Then there exists Ω0 ∈ F such that P (Ω0) = 1 and the following holds on Ω0:∫

(0,+∞)

f(s)g(s) dvar(Q·)s ≤( ∫

(0,+∞)

f(s)2 dAs

)1/2( ∫(0,+∞)

g(s)2 dBs

)1/2

for all non-negative Borel measurable functions f, g : R>0 → R.

Proof. Combine Corollary 21.4 and Lemma 21.5.

21.7 Lemma. Suppose that M· and N· are square integrable F·-martingales with stochasticright continuity. Then there exists a natural F·+-finite variation process A· such that A0 = 0a.s. and t 7→MtNt − At is an F·+-martingale.

Proof. Clearly t 7→ Mt + Nt as well as t 7→ Mt − Nt are right continuous in probabilityand square integrable F·-martingales. So their squares t 7→ |Mt + Nt|2 and t 7→ |Mt − Nt|2are F·-submartingales, which are of class DL over Q by Lemma 7.12. We select and fixB ∈ DM[t 7→ |Mt +Nt|2,F·+] and C ∈ DM[t 7→ |Mt −Nt|2,F·+]. Since

4MtNt −Bt + Ct = (Mt +Nt)2 −Bt − (Mt −Nt)

2 + Ct,

we see that t 7→MtNt− (Bt−Ct)/4 is an F·+-martingale. Moreover t 7→ Bt−Ct is a naturalF·+-finite variation process by Lemma 15.15.

Given square integrable F·-martingales M· and N· with stochastic right continuity,

〈M,N ;F·〉 := A· : natural F·-finite variation process, A0 = 0 a.s.,

t 7→MtNt − At F·-martingale with stochastic right continuity.

We also introduce 〈M ;F·〉 := DM[t 7→ |Mt|2,F·].

21.8 Remark. Suppose that X·, Y· ∈ Mart2(F·+). If Q· ∈ [X, Y ;F·+] and A· ∈ 〈X,Y ;F·+〉then A· is an F·+-natural projection of Q·.

21.9 Lemma. Suppose that M· and N· are square integrable F·-martingales with stochasticright continuity. Here 〈. . . 〉 := 〈. . . ;F·〉 ⊂ 〈. . . ;F·+〉.(i) 〈M,N ;F·+〉 6= ∅. 〈M,N〉 = 〈N,M〉. 〈M,M〉 = 〈M〉 = 〈M· −X〉 where X ∈ L2(F0). IfMt = M0 a.s. ∀t ∈ R≥0 then 0 ∈ 〈M,N〉. If 0 ∈ 〈M〉 then Mt = M0 a.s. ∀t ∈ R≥0.(ii) If A· ∈ 〈M,N〉, B· is an F·-adapted process with a.s.-right continuous path and At = Bt

a.s. ∀t ∈ R≥0 then B· ∈ 〈M,N〉. If A·, B· ∈ 〈M,N〉 then At = Bt ∀t ∈ R≥0 a.s.

Proof. (i) Lemma 21.7 shows 〈M,N ;F·+〉 6= ∅. Clearly 〈M,N〉 = 〈N,M〉. The rela-tion 〈M〉 ⊂ 〈M,M〉 is obvious. We note that 〈M ;F·+〉 ∪ 〈M,M〉 ⊂ 〈M,M ;F·+〉. Since〈M ;F·+〉 6= ∅, each A· is indistinguishable from an element in 〈M ;F·+〉 by Theorem 15.19,that is, A· is an increasing process and hence A· ∈ 〈M〉. Since t 7→ 2MtX − X2 is an F·-martingale, Theorem 18.7(v) shows that DM[(M· −X)2,F·] = DM[M2,F·]. The last claimis by Theorem 18.7(vi). (ii) The second claim is by Theorem 15.19.

104

Page 105: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

21.10 Lemma. Suppose M·, N· and L· are square integrable F·-martingales with stochasticright continuity. Here 〈. . . 〉 := 〈. . . ;F·〉 ⊂ 〈. . . ;F·+〉.(iii) If A· ∈ 〈M,L〉, B· ∈ 〈N,L〉 and c ∈ R then A· +B· ∈ 〈M +N,L〉 and cA· ∈ 〈cM,N〉.(iv) If 〈M,L〉 ∩ 〈N,L′〉 6= ∅ then 〈M,L〉 = 〈N,L′〉. If 0 ∈ 〈M −N,L〉 then 〈M,L〉 = 〈N,L〉.If 〈M,L ;F·+〉 ∩ 〈N,L ;F·+〉 6= ∅ then 0 ∈ 〈M −N,L〉.(v) If 〈M,L ;F·+〉 ∩ 〈N,L ;F·+〉 6= ∅ for all L· then Mt−M0 = Nt−N0 a.s. for all t ∈ R≥0.(vi) If A· ∈ 〈M,N〉, B· ∈ 〈M〉 and C· ∈ 〈N〉 then

(var(A·)t − var(A·)s)2 ≤ (Bt −Bs)(Ct − Cs) for all t, s ∈ R≥0 a.s.

E[var(A·)t − var(A·)s] ≤√E[|Mt|2 − |Ms|2]

√E[|Nt|2 − |Ns|2] for all t, s ∈ R≥0 with s < t.

Proof. (iv) If 0 ∈ 〈M − N,L〉 then A· + 0 ∈ 〈N,L〉 for all A· ∈ 〈M,L〉. Conversely if〈M,L ;F·+〉 ∩ 〈N,L ;F·+〉 6= ∅ then 0 ∈ 〈M − N,L ;F·+〉, that is, t 7→ (Mt − Nt)Lt is anF·+-martingale and, being F·-adapted, it is an F·-martingale.

(v) Combine Lemma 21.9(i) and (iv).

22 Predictable process and square integrable martingale

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

Mart2(F·) (resp. Martb(F·)) stands for the space of all square integrable (bounded)F·-martingales with almost sure right continuous sample path.

Martfv2 (F·) := M· ∈ Mart2(F·) : t 7→Mt of finite variation a.s.Martc2(F·) := M· ∈ Mart2(F·) : t 7→Mt continuous a.s.

22.1 Lemma. Let M· ∈ Martfv2 (F·) and A· a bounded F·-finite variation process.

(i) I· : t 7→ R[AdM ]t ∈ Martfv2 (F·), and t 7→ jmpv(I,N)t is indistinguishable from t 7→R[A djmpv(M,N)]t for all N ∈ Martfv2 (F·).

(ii) If A· is F· ∨Null(P )-predictable modulo evanescence then J· := RAdM· ∈ Martfv2 (F·),and jmpv(J,N)· is indistinguishable from t 7→ RA djmpv(M,N)t for all N ∈ Martfv2 (F·).

Proof. We select and fix a sequence Mn· ∈ Martfv2 (F·) such that E[|Mn

T −MT |2] convergesto 0 for all T ∈ N. We may assume that Mn

0 = M0. Corollary 15.11(i) shows that

Int := R[AdMn]t = Mnt At −Mn

0 A0 −RMndAt for all t ∈ R≥0 a.s.

Jnt := RAdMnt = Mnt At −Mn

0 A0 −R[MndA]t for all t ∈ R≥0 a.s.

The argument of Lemma 20.7 shows that there exist φ ∈ Seq(N, ↑) and Ω0 ∈ F such thatP (Ω0) = 1 and for each ω ∈ Ω0 the following holds:

t 7→Mnt (ω) is of finite variation and right continuous for all n ∈ N,

t 7→Mt(ω) is right continuous and admits left-hand limits everywhere

supt:t≤T |Mφ(n)t (ω)−Mt(ω)| converges to 0 for all T ∈ N, and

t 7→ At(ω) is of finite variation and right continuous

105

Page 106: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

In what follows we use the notation Mn in place of Mφ(n). We see that

|R[MndA]t(ω)−R[MdA]t(ω)| ≤∫

(0,t]

|Mn·−(ω)−M·−(ω)| dvar(A)·(ω)

for all t ∈ R≥0 and ω ∈ Ω0. Therefore

Int converges to MtAt −M0A0 −RMdAt in probability for all t ∈ R≥0.Jnt converges to MtAt −M0A0 −R[MdA]t in probability for all t ∈ R≥0.

The argument being parallel, we concentrate on the statement (ii). So we further assume thatA· is indistinguishable from an F·∨Null(P )-predictable process. According to Corollary 17.17

Jn· : t 7→ RAdMnt ∈ Martfv(F·),

jmpv(Jn, Jn)t = R|A|2 djmpv(Mn,Mn)t for all t ∈ R≥0 a.s. and

jmpv(Jn − Jm, · · · )t = R|A|2 djmpv(Mn −Mm, · · · )t for all t ∈ R≥0 a.s.

There exists K ∈ R≥0 such that |At| ≤ K for all t ∈ R≥0 a.s. Since

E[jmpv(Mn,Mn)t] = E[|Mnt |2]− E[|Mn

0 |2] < +∞

by Theorem 21.2, it follows that E[jmpv(Jn, Jn)t] < +∞, which implies Jn ∈ Mart2(F·) byTheorem 16.8(ii) and Theorem 7.17(ii). Invoking Theorem 21.2 again

E[|JnT − JmT |2] = E[jmpv(In − Im, · · · )T ]

≤ K2E[jmpv(Mn −Mm, · · · )T ] = K2E[|MnT −Mm

T |2] for all T ∈ N.

The right hand side converges to 0 as m,n tends to ∞. Thus we infer that Jnt converges toMtAt −M0A0 − R[MdA]t in L2 for all t ∈ R≥0. Lemma 20.7(ii) shows that the limit is asquare integrable martingale. Finally we observe that

Jt − Jt− = MtAt −Mt−At− −Mt−(At − At−) = (Mt −Mt−)At.

This shows that jmpv(J,N)t = RA djmpv(M,N)t.

22.2 Lemma. Suppose that A· is a natural F·-finite variation process.(i) If f· is an F·+-adapted process whose almost every sample path is locally bounded andadmits left hand limits everywhere and E[R[|f |dvar(A·)]t] < +∞ for all t ∈ R≥0 then t 7→R[fdA]t is a natural F·-finite variation process.(ii) If f· is an F·-adapted process with almost sure locally bounded and right continuous samplepath, indistinguishable from an F· ∨ Null(P )-predictable process and E[R|f |dvar(A·)t] <+∞ for all t ∈ R≥0 then t 7→ RfdAt is a natural F·-finite variation process.

Proof. Let M· ∈ Martb(F·+). By Lemma 15.18(ii)

E[RMdAσ −R[MdA]σ] = 0 for all bounded F· ∨ Null(P )-optional times σ.

Since all F·+-stopping times are F·-optional times, according to Lemma 20.1, this impliesthat t 7→ RMdAt −R[MdA]t is an F·+-martingale. In view of Lemma 15.12(i)

t 7→ jmpv(M,A)t is an F·+ ∨ Null(P )-martingale.

106

Page 107: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Clearly almost every sample is of finite variation. Actually we have that

|jmpv(M,A)t − jmpv(M,A)s| ≤ 2 supu:u≥0

|Mu|(var(A)t − var(A)s)∀s < ∀t a.s.

(i) Suppose that f· is F·+-adapted, its almost every sample path is locally bounded andadmits left hand limits everywhere. According to Corollary 9.7, t 7→ R[fdA]t is an F·-finitevariation process. We see by (i) and (ii) of Lemma 15.12 that

RMdR[fdA]t −R[MdR[fdA]]t = jmpv(M,R[fdA])t = R[fd jmpv(M,A)]t

for all t ∈ R≥0 a.s. On the other hand, since

R[|f |d var(jmpv(M,A))]t ≤ 2 supu:u≥0

|Mu|R[|f |d var(A)]t for all t ∈ R≥0 a.s.,

it follows that t 7→ R[fd jmpv(M,A)]t is an F·+ ∨ Null(P )-martingale by Theorem 17.18(i).Thus we get E[RMdR[fdA]t] = E[R[MdR[fdA]]t].

(ii) Suppose that f· is indistinguishable from an F· ∨ Null(P )-predictable process andits almost every sample path is locally bounded and right continuous. According to Corol-lary 15.11(i), t 7→ RfdAt is an F·-finite variation process. We see by (i) and (ii) ofLemma 15.12 that

RMdRfdAt −R[MdRfdA]t = jmpv(M,RfdA)t = Rfd jmpv(M,A)t

for all t ∈ R≥0 a.s. Of course we have to modify Lemma 15.12(ii). Theorem 17.18(ii) showsthat t 7→ Rfd jmpv(M,A)t is an F·+ ∨ Null(P )-martingale.

22.3 Theorem. Suppose that X· ∈ Mart2(F·), Y ∈ Mart2(F·+), Q ∈ [X, Y ;F·+], C ∈〈X, Y ;F·+〉 and f· is a bounded F·-finite variation process.(i) I· : t 7→ R[fdX]t ∈ Mart2(F·), R[f dQ]· ∈ [I, Y ;F·+] and R[fdC]· ∈ 〈I, Y ;F·+〉.(ii) If f· is F·∨Null(P )-predictable modulo evanescence then J· : t 7→ RfdXt ∈ Mart2(F·),Rf dQ· ∈ [J, Y ;F·+] and Rf dC· ∈ 〈J, Y ;F·+〉. t 7→ RXdft −R[Xdf ]t ∈ Mart2(F·)

Proof. Let M· ∈ Martfv2 (F·+) such that t 7→ Xt −Mt ∈ Martc[X ;F·]. A posteriori M· isF·-adapted. The argument being parallel for (i) and (ii), we concentrate on the latter. Sowe further assume that f· is indistinguishable from an F· ∨Null(P )-predictable process. Wesee by Lemma 22.1(ii) that

L· : t 7→ RfdMt ∈ Martfv2 (F·+).

According to Corollary 15.11(ii) this is indistinguishable from

t 7→Mtft −X0f0 −R[Mdf ]t.

Being F·-adapted, the above belongs to Mart2(F·). Corollary 15.11(i) shows that

(Xt −Mt)ft −R(X −M)dft = R[fd(X −M)]t for all t ∈ R≥0 a.s.

Due to the sample path continuity of X −M the process in the left hand side is indistin-guishable from t 7→ (Xt −Mt)ft −R[(X −M)df ]t. Therefore

Jt − Lt = (Xt −Mt)ft −R[(X −M)df ]t = R[fd(X −M)]t for all t ∈ R≥0 a.s.

107

Page 108: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

The right hand side belongs to Martc2(F·). Indeed let B· ∈ Qvar[X· −M· ;F·]. Then

E[Bt] = E[|Xt −Mt|2] < +∞

by Theorem 8.15. We see by Theorem 9.10 that

t 7→ R[fd(X −M)]t ∈ Martc(F·) and R[|f |2 dB]· ∈ Qvar[L· ;F·].

Since there exists K ∈ R≥0 such that |ft| ≤ K for all t ∈ R≥0 a.s., we have that

E[R[|f |2 dB]t] ≤ K2E[Bt] < +∞.

Therefore Theorem 8.15 and Theorem 7.17(ii) shows that

t 7→ Jt − Lt ∈ Martc2(F·).

We next select and fix N· ∈ Martfv2 (F·+) such that t 7→ Yt − Nt ∈ Martc[Y ;F·]. LetA· ∈ Crv[X −M,Y −N ;F·]. It follows by Theorem 21.4 that

Q· is indistinguishable from t 7→ At + jmpv(M,N)t.

We see that R[f dA]· ∈ Crv[J − L, Y − N ;F·] by Theorem 10.10. Since A has continuoussample path and the discontinuous points of f· is countable almost surely, it follows that

Rf dA· ∈ Crv[J − L, Y −N ;F·].

On the other hand according to Lemma 22.1(ii)

Rf d(Q− A)t = Rf djmpv(M,N)t = jmpv(L,N)t for all t ∈ R≥0 a.s.

Thus we conclude by invoking Theorem 21.4 that

Rf dQ· ∈ [J, Y ;F·+].

Observe that t 7→ Qt − Ct is an F·+-martingale. Since f· is bounded and indistinguishablefrom an F· ∨ Null(P )-predictable process, it follows by Theorem 17.18(ii)that

t 7→ Rf d(Q− C)t is an F·+-martingale.

This implies that t 7→ JtYt − Rf dCt is an F·+-martingale. Finally, since f· is boundedand indistinguishable from an F· ∨ Null(P )-predictable process, the F·+-increasing processt 7→ Rf dCt is natural by Lemma 22.2(ii).

22.4 Theorem. Suppose that A· is an F·-finite variation process indistinguishable from anF· ∨ Null(P )-predictable process.(i) Let X· ∈ Mart2(F·) and σ ∈ Time(F· ∨ Null(P )). If

E[sups:s≤t∧σ |Xs|(|A0|+ var(A)t∧σ) ; σ > 0] < +∞ for all t ∈ R≥0

then t 7→ Xt∧σAt∧σ −X0A0 −R[XdA]t∧σ is an F· ∨ Null(P )-martingale.(ii) If E[var(A)t] < +∞ for all t ∈ R≥0 then t 7→ RXdAt−R[XdA]t is an F·+-martingalefor all X· ∈ Martb(F·+), and hence A· is F·-natural.

108

Page 109: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. (i) We introduce a sequence of functions φn : R → R, x 7→ maxminx, n,−n. Thecomposition process t 7→ maxminAt, n,−n is of finite variation. Indeed we have that

var(φn(A·))t − var(φn(A·))s ≤ var(A·)t − var(A·)s for all s, t ∈ R≥0 with s < t a.s.

Clearly |φn(At)| ≤ n for all t ∈ R≥0 and the F· ∨Null(P )-predictability modulo evanescenceis preserved. We see that

In· : t 7→ Xtφn(At)−X0φn(A0)−R[Xdφn(A·)]t ∈ Mart2(F·)

by Corollary 22.3(ii). In view of Example 20.4 we have that that

t 7→ Int∧σ ∈ Mart2(F· ∨ Null(P )).

If T ∈ N, n ∈ N and |A0|+ var(A)T ≤ n then φn(At) = At for all t ∈ R[0,T ]. Therefore

supt:t≤T |Int − It| converges to 0 almost surely for all T ∈ N,

where we write I· : t 7→ XtAt −X0A0 −R[XdA]t. On the other hand

|Xt∧σφn(At∧σ)−X0φn(A0)−R[Xdφn(A·)]t∧σ| ≤ 2 sups:s≤t∧σ

|Xs|(|A0|+ var(A)t∧σ) a.s.

Thus we deduce the F· ∨ Null(P )-martingale property of the stopped process t 7→ It∧σ byapplying the dominated convergence theorem.

(ii) We may assume that A0 = 0 a.s. Observe that A· is an F·+-finite variation processF·+ ∨ Null(P )-predictable modulo evanescence as well. Let X· ∈ Martb(F·+). Thus (i)shows that t 7→ XtAt − R[XdA]t is an F·+-martingale. The F·+-martingale property oft 7→ XtAt −RXdAt is discussed in Lemma 15.8.

23 Square integrable martingale as integrator

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

Mart2(F·) stands for the space of all square integrable F·-martingaleswith almost sure right continuous sample path.

23.1 Lemma. Let X· ∈ Mart2(F·), Q· ∈ [X;F·+], Y· ∈ Mart2(F·+) and B· ∈ [X,Y ;F·+].(i) If almost every sample path of f· is locally bounded and admits left-hand limits everywherethen

E[R[|f |dvar(B·)]t] ≤√E[R[|f |2dQ]t]

√E[|Yt|2]− E[|Y0|2] for all t ∈ R≥0.

(ii) If almost every sample path of f· is locally bounded and right continuous then

E[R|f |dvar(B·)t] ≤√E[R|f |2dQt]

√E[|Yt|2]− E[|Y0|2] for all t ∈ R≥0.

Proof. Let C· ∈ [Y ;F·+]. Note that E[Yt] < +∞ for all t ∈ R≥0. We have that

R|f |dvar(B·)t ≤√R|f |2dQtCt for all t ∈ R≥0 a.s.

by Theorem 21.6. Thus we get the claim by invoking Schwarz’ inequality.

109

Page 110: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

23.2 Theorem. Suppose that X· ∈ Mart2(F·), Q· ∈ [X;F·+], f is an F·-finite variationprocess, Y· ∈ Mart2(F·+) and B· ∈ [X,Y ;F·+].(i) E[R[|f |2dQ]t] < +∞ ∀t ∈ R≥0 ⇒ R[fdX]· ∈ Mart2(F·) and R[fdB]· ∈ [R[fdX], Y ;F·+].(ii) If f is F· ∨ Null(P )-predictable modulo evanescence and E[R|f |2dQt] < +∞ for allt ∈ R≥0 then J· : t 7→ RfdXt ∈ Mart2(F·) and RfdB· ∈ [J, Y ;F·+].(iii) If f is F·∨Null(P )-predictable modulo evanescence, E[R[|f |2dQ]t] < +∞ for all t ∈ R≥0

and E[R|f |2dQt] < +∞ for all t ∈ R≥0 then t 7→ RXdft −R[Xdf ]t ∈ Mart2(F·).

Proof. (ii) We partly repeat the proof of Lemma 22.1 and Theorem 22.4. As before weintroduce a sequence of functions φn : R → R, x 7→ maxminx, n,−n. Then

Jn· : t 7→ Xtφn(ft)−X0φn(f0)−R[Xdφn(f·)]t ∈ Mart2(F·),

supt:t≤T |Jnt − Jt| converges to 0 almost surely for all T ∈ N and

t 7→ R|φn(f)− φm(f)|2dQt ∈ [Jn − Jm ;F·+].

The third line drives by Theorem 22.3(ii). Invoking Theorem 21.4(i), we get√E[|JnT − JmT |2] =

√E[R|φn(f)− φm(f)|2dQT ]

≤√E[R|φn(f)− f |2dQT ] +

√E[R|f − φm(f)|2dQT ] for all T ∈ N.

The right hand side converges to 0 as m,n tends to ∞. Indeed

R|φn(f)− f |2dQT ≤ R|f |2dQT and φn(f) converges to f point wise.

Thus we infer that Jnt converges to Xtft − X0f0 − R[Xdf ]t in L2 for all t ∈ R≥0. FinallyLemma 20.7(ii) shows that the limit is a square integrable martingale. By Theorem 22.3(ii),

Jnt Yt −Rφn(f)dBt is an F·+-martingale.

The first term converges to JtYt in L1. On the other hand Lemma 23.1 shows that

E[|Rφn(f)dBt]−RfdBt|] ≤√E[R|φn(f)− f |2dAt]

√E[|Yt|2]− E[|Y0|2]

Consequently Rφn(f)dBt converges to RfdBt in L1. It follows that

JtYt −RfdBt is an F·+-martingale.

Finally we see that

(Jt − Jt−)(Yt − Yt−) = ft(Xt −Xt−)(Yt − Yt−) = ft(Bt −Bt−)

for all t ∈ R≥0 a.s. Consequently t 7→ RfdBt ∈ [J, Y ;F·+].

23.3 Lemma. Let X· ∈ Mart2(F·), A· ∈ 〈X;F·+〉, Y· ∈ Mart2(F·+) and B· ∈ 〈X, Y ;F·+〉.(i) If almost every sample path of f· is locally bounded and admits left-hand limits everywherethen

E[R[|f |dvar(B·)]t] ≤√E[R[|f |2dA]t]

√E[|Yt|2]− E[|Y0|2] for all t ∈ R≥0.

(ii) If almost every sample path of f· is locally bounded and right continuous then

E[R|f |dvar(B·)t] ≤√E[R|f |2dAt]

√E[|Yt|2]− E[|Y0|2] for all t ∈ R≥0.

110

Page 111: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. The proof is a repetition of that of Lemma 23.1 with square brackets replaced bycorresponding angle brackets.

23.4 Theorem. Suppose that X· ∈ Mart2(F·), A· ∈ 〈X;F·+〉, f is an F·-finite variationprocess, Y· ∈ Mart2(F·+) and B· ∈ 〈X, Y ;F·+〉.(i) E[R[|f |2dA]t] < +∞ ∀t ∈ R≥0 ⇒ R[fdX]· ∈ Mart2(F·) and R[fdB]· ∈ 〈R[fdX], Y ;F·+〉.(ii) If f is F· ∨ Null(P )-predictable modulo evanescence and E[R|f |2dAt] < +∞ for allt ∈ R≥0 then J· : t 7→ RfdXt ∈ Mart2(F·) and RfdB· ∈ 〈J, Y ;F·+〉.(iii) If f is F·∨Null(P )-predictable modulo evanescence, E[R[|f |2dA]t] < +∞ for all t ∈ R≥0

and E[R|f |2dAt] < +∞ for all t ∈ R≥0 then t 7→ RXdft −R[Xdf ]t ∈ Mart2(F·).

Proof. The proof is a repetition of that of Theorem 23.2 with square brackets replaced bycorresponding angle brackets.

23.5 Lemma. Suppose that Y· and Z· are F·-adapted processes whose almost all samplepaths are right continuous and locally bounded, F· and G· are F·-finite variation processes,F0 = 0 a.s., G0 = 0 a.s., M· : t 7→ Yt − Ft, N· : t 7→ Zt − Gt ∈ Mart2(F·), A ∈ 〈M ;F·+〉,B ∈ 〈N ;F·+〉, E[R[|G|2dA]t] < ∞ ∀t ∈ R≥0, E[R[|F |2dB]t] < ∞ ∀t ∈ R≥0, C· is a naturalF·+-finite variation process and E[|C0|] < +∞. Then

t 7→ YtZt −MtNt −RY dGt −RZdFt + jmpv(F,G)t ∈ Mart2(F·),

moreover C· − C0 ∈ 〈M,N ;F·+〉 if and only if is the following is an F·+-martingale:

t 7→ YtZt −RY dGt −RZdFt + jmpv(F,G)t − Ct

Proof. RMdGt + RNdFt = MtGt − R[GdM ]t + FtNt − R[FdN ]t for all t ∈ R≥0 a.s.by Corollary 15.11(i). RFdGt + RGdFt − jmpv(F,G)t = FtGt for all t ∈ R≥0 a.s.by (i) and (ii) of Lemma 15.12. Thus the process in question is indistinguishable fromt 7→ R[GdM ]t +R[FdN ]t, which belongs to Mart2(F·) by Theorem 23.4.

We will get the next statement with the help of Lemma 20.7.

23.6 Theorem. Suppose that M· ∈ Mart2(F·), Q· ∈ [M ;F·+], A· ∈ 〈M ;F·+〉 and f· is anF·-adapted process with almost sure locally bounded and right continuous sample path andindistinguishable from an F· ∨ Null(P )-predictable process.(i) If E[R|f |2dQt] < +∞ for all t ∈ R≥0 then there exists I· ∈ Mart2(F·) such that I0 = 0a.s. and if N· ∈ Mart2(F·+) and B· ∈ [M,N ;F·+] then RfdB· ∈ [I·, N ;F·+].(ii) If E[R|f |2dAt] < +∞ for all t ∈ R≥0 then there exists I· ∈ Mart2(F·) such that I0 = 0a.s. and if N· ∈ Mart2(F·+) and B· ∈ 〈M,N ;F·+〉 then RfdB· ∈ 〈I·, N ;F·+〉.

Given M· ∈ Mart2(F·), A· ∈ [M ;F·+] and an F·-predictable process f·with almost sure locally bounded and right continuous sample path, andE[R|f |2dAt] < +∞ for all t ∈ R≥0, we set

Ito[fdM ;F·] := I· ∈ Mart2(F·) : I0 = 0 a.s., and RfdB· ∈ [I·, N ;F·+]

for any N· ∈ Mart2(F·) and B· ∈ [M,N ;F·+].

111

Page 112: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

24 Predictable processes

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. Theindex set is R≥0.

24.1 Lemma. Suppose that τ is an F·-optional time. If there exists a sequence σ(·) of F·-optional times such that σ(n) ≤ σ(n+ 1), supn∈N σ(n) = τ and τ > 0 ⊂ σ(n) < τ then(t, ω) : t > 0, τ(ω) ≤ t ∈ Pred(F·).

Proof. We observe that (t, ω) : t > 0, τ(ω) ≤ t =⋂∞n=1(t, ω) : σ(n, ω) < t, which belongs

to Pred(F·) by Lemma 17.2(iii). Incidentally ω : τ(ω) ≤ t =⋂∞n=1ω : σ(n, ω) < t for all

t ∈ R>0, whence τ is an F·-stopping time provided τ = 0 ∈ F0.

24.2 Definition. An F·-optional time τ is said to be predictable if τ = 0 ∈ F0 andthere exist a sequence σ(·) of F·-optional times such that σ(n) ≤ σ(n+ 1), supn∈N σ(n) = τand τ > 0 ⊂ σ(n) < τ (a posteriori τ is a stopping time). Such sequence is called anannouncing sequence.

24.3 Example. Let s, t ∈ R≥0, s < t, A ∈ F0, B ∈ Fs and A ∩B = ∅. We set

τ :=

0 on A

t on B

+∞ off A ∪Bσ(n) :=

0 on A

s/(n+ 1) + nt/(n+ 1) on B

t+ n off A ∪B

Then τ is a predictable F·-optional time, which is announced by the sequence σ(·). Indeedτ = 0 = A ∈ F0, each σ(n) is an F·-stopping time and σ(n) τ on τ > 0 = Ac.

24.4 Lemma. If F·-optional times σ and τ are predictable then so are σ ∧ τ and σ ∨ τ .Proof. Let σ(·) respectively τ(·) be announcing sequences for σ and τ . We see that

σ(k) ∧ τ(k) ≤ σ(k + 1) ∧ τ(k) ≤ σ(k + 1) ∧ τ(k + 1) ≤ σ ∧ τσ ∧ τ(k) = sup

n≥kσ(n) ∧ τ(k) ≤ sup

n≥kσ(n) ∧ τ(n) = sup

n∈Nσ(n) ∧ τ(n) ≤ σ ∧ τ

because (x, y) 7→ x ∧ y is continuous and separately non-decreasing. Moreover

σ ∧ τ > 0 = σ > 0 ∩ τ > 0 ⊂ σ(k) < σ ∩ τ(k) < τ ⊂ σ(k) ∧ τ(k) < σ ∧ τ.

On the other hand if σ = 0 and τ > 0 then σ(k) = 0 and τ(k) < τ ≤ σ ∨ τ .24.5 Lemma. Suppose that σ is an F·-optional time, Q ⊂ R>0 and for t ∈ R≥0 withQ>t := q ∈ Q : q > t 6= ∅ there exists r ∈ Q>t such that r ≤ q for all q ∈ Q>t.(i) τ := minq ∈ Q : q > σ is an F·-stopping time.(ii) If #q ∈ Q : q ≤ t <∞ for all t ∈ R≥0 then τ is predictable.

Proof. (i) τ ≤ t = σ < supq ∈ Q : q ≤ t ∈ Ft for all t ∈ R≥0.(ii) For each n ∈ N the following set satisfies that whenever q ∈ Qn : q > t is not void

it has a minimum element:

Qn := 2−k maxi ∈ Q ∪ 0 : i < q+ (1− 2−k)q ; k ∈ Z≥n, q ∈ Q ∪Q.

It follows by (i) that τ(n) := minq ∈ Qn : q > σ ∧ n is an F·-stopping time. ClearlyQn+1 ⊂ Qn implies that τ(n) ≤ τ(n+ 1). Observe that if q ∈ Q and t < q then there existsr ∈ Qn such that t < r < q, which means τ(n) < τ . Moreover supn∈N τ(n) = τ .

112

Page 113: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

24.6 Corollary. (i) For each F·-optional time σ there exists a sequence τ(·) of predictableF·-optional times such that τ(n) ≥ τ(n+ 1), infn∈N τ(n) = σ and σ < +∞ ⊂ σ < τ(n).(ii) Pred(F·) is generated by (t, ω) : τ(ω) ≤ t; τ predictable F·-optional time.

Proof. (i) τ(n) := minq ∈ 2−nZ : q > σ provides us with a desired sequence.(ii) We have that C := (t, ω) : τ(ω) ≤ t; τ predictable F·-optional time ⊂ Pred(F·).

Indeed, according to Lemma 24.1, if τ is a predictable F·-optional time then

(t, ω) : τ(ω) ≤ t = (t, ω) : t > 0, τ(ω) ≤ t ∪ (0 × τ = 0) ∈ Pred(F·).

Lemma 17.2 shows that Pred(F·) is generated by

(t, ω) : σ(ω) < t;σ ∈ Time(F·) ∪ 0 × A ;A ∈ F0.

Let σ ∈ Time(F·). There exists a sequence τ(·) of predictable F·-optional times such thatτ(n) ≥ τ(n+ 1), infn∈N τ(n) = σ and σ < +∞ ⊂ σ < τ(n) by (i). It follows that

(t, ω) : σ(ω) < t =∞⋃n=1

(t, ω) : τ(n, ω) ≤ t ∈ σ(C).

Let A ∈ F0. Example 24.3 shows the existence of an predictable F·-optional time τ suchthat A = τ = 0 = τ < +∞. Since

R>0 × A = (t, ω) : τ(ω) < t ∈ σ(C) and R≥0 × A = (t, ω) : τ(ω) ≤ t,

we see that 0 × A ∈ σ(C). Thus we get the claim.

24.7 Question. Suppose that τ is an F·-optional time and (t, ω) : τ(ω) ≤ t ∈ Pred(F·)(t 7→ 1[τ,+∞)(t) is a predictable and right continuous increasing process). Is τ predictable?

24.8 Lemma. Suppose that N is a σ-consistent exceptional family on Ω with N ⊂ F . Ifτ is an F·-optional time, τ = 0 ∈ F0 ∨ N and there exist a sequence σ(n) of F·-optionaltimes such that supn∈N σ(n) 6= τ ∈ N and τ > 0, σ(n) ≥ τ ∈ N then there exists apredictable F·-optional time τ such that τ 6= τ ∈ N .

Proof. We write S(n) := maxk∈N:k≤n σ(k) and S := supn∈N σ(n). All of them are F·-optionaltimes. Clearly S(n) ≤ S(n+ 1) and supn∈N S(n) = S. We see that

S > 0, σ(k) = S ⊂ S 6= τ ∪ S = τ, τ > 0, σ(k) = τ ⊂ S 6= τ ∪ τ > 0, σ(k) ≥ τ.

Since the right hand side belongs to N , it follows that S > 0, σ(k) = S ∈ N . Therefore

∞⋃n=1

S(n) = S > 0 =∞⋃k=1

S > 0, σ(k) = S ∈ N .

Observe that S(n) < S ∈ FS+ by Lemma 3.15(iii). Hence S < t ∩⋂∞n=1S(n) < S

belongs to Ft for all t ∈ R>0. Select and fix A ∈ F0 with τ = 0 on A ∈ N . We introducethe following F·-optional time

τ :=

0 on A

S on⋂∞n=1S(n) < S ∩ Ac

+∞ on⋃∞n=1S(n) = S ∩ Ac

113

Page 114: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

so that we have τ = 0 = A ∈ F0 and τ 6= τ ∈ N . Indeed

τ 6= τ ⊂ S 6= τ ∪ S = τ, S 6= τ ⊂ S 6= τ ∪∞⋃n=1

S(n) = S > 0 ∪ (S = 0 on A).

To show the predictability of τ we further introduce for each n ∈ N

σ(n) :=

0 on A

S(n) ∧ n on S(n) < S ∩ Ac

n on S(n) = S ∩ Ac

This is an F·-optional time because S(n) < S ∈ FS(n)+ and hence

σ(n) < t =

(S(n) < t ∩ S(n) < S) ∪ A ∈ Ft 0 < t ≤ n

Ω t > n

On⋂∞n=1S(n) < S ∩ Ac we have that

σ(k) = S(k) ∧ k ≤ S(k + 1) ∧ k ≤ S(k + 1) ∧ (k + 1) ≤ S(k + 1) < S = τ

τ ∧ k = S ∧ k = supn≥k

S(n) ∧ k ≤ supn≥k

σ(n) = supn∈N

σ(n) ≤ τ

While on S(n) = S ∩ Ac we have that

σ(k) = k < +∞ = τ for all k ∈ N≥n.

Thus σ(k) ≤ σ(k + 1) and supn∈N σ(n) = τ . Moreover σ(k) < τ off τ = 0 = A.

24.9 Definition. LetX· be an F·-adapted process with almost sure right continuous path. Itis said to be F·-quasi left continuous if almost every path admits left hand limits everywhereand for any sequence τ(n) of F·-optional times such that τ(n) ≤ τ(n+ 1) a.s. and τ(n) ≤ Ta.s. for some T ∈ R>0 the sequence Xτ(n) converges to Xσ a.s. where σ := supn∈N τ(n).

24.10 Lemma. Suppose that E is a metrizable topological space and X· is an F·-adaptedprocess with almost sure D(R≥0, E)-sample path. Then X· is F·-quasi left continuous if andonly if P (Xτ 6= Xτ−, 0 < τ < +∞) = 0 for all predictable F·-optional times τ .

Proof. Suppose that X· is F·-quasi left continuous and τ is a predictable F·-optional timewith an announcing sequence σ(·). Fix T ∈ N. Then Xσ(n)∧T converges to Xτ∧T a.s. Since0 < τ ≤ T ⊂ σ(n) ∧ T = σ(n), τ ∧ T = τ, σ(n) < τ, it follows that

P (Xτ 6= Xτ−, 0 < τ ≤ T ) = 0 for all T ∈ N

Conversely suppose that P (Xτ 6= Xτ−, 0 < τ < +∞) = 0 for all predictable F·-optionaltimes τ . Given a sequence τ(·) of F·-optional times such that τ(n) ≤ τ(n + 1) a.s. andσ := supn∈N τ(n) < +∞ a.s., we introduce

τ :=

σ on

⋂∞n=1τ(n) < σ

+∞ on⋃∞n=1τ(n) = σ

114

Page 115: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

This is a predictable F·-optional time according to the proof of Lemma 24.8. Taking

0 < τ < +∞ = σ < +∞, τ(n) < σ ∀n ⊂ τ = σ

into account we infer that

P (Xσ 6= Xσ−, σ < +∞, τ(n) < σ ∀n) = 0.

Since P (σ < +∞) = 1, the following set has full measure

Xσ = Xσ−, σ < +∞, τ(n) < σ ∀n ∪ σ < +∞, ∃n s.t. τ(n) = σ.

Moreover on this set Xσ(n) converges to Xσ.

24.11 Lemma. Suppose that τ is an F·-optional time and f : R≥0 → R is bounded, strictlyincreasing and continuous. If M· ∈ Martb(F·+) and Mt ∈ E[f(τ)|Ft+] for all t ∈ R≥0 withf(∞) = supt≥0 f(t) then supn∈N inft ∈ R≥0 : Mt − f(τ ∧ t) < 1/n = τ a.s.

Proof. According to Corollary 14.12, we have that M∞ = f(τ) a.s. and

Mσ ∈ E[f(τ)|(F· ∨ Null(P ))σ+] for all F· ∨ Null(P )-optional times σ.

Since f is continuous, t 7→ f(τ ∧ t) is a continuous F·-adapted process. Set

ζ := inft ∈ R≥0 : Mt − f(τ ∧ t) ≤ 0 ∧ inft ∈ R>0 : Mt− − f(τ ∧ t) ≤ 0.

Observe that Mτ = f(τ) a.s. and Mt > f(τ ∧ t) if t < ζ. This implies that

ζ ≤ τ a.s.

For each n ∈ N denote the F· ∨ Null(P )-optional time inft ∈ R≥0 : Mt − f(τ ∧ t) < 1/nby τ(n). Then supn∈N τ(n) = ζ a.s. by Lemma 4.5. Recall that M∞ = f(τ) a.s. Therefore

E[f(τ)] = E[Mτ(n)] = E[f(τ) ; τ(n) = +∞] + E[Mτ(n) ; τ(n) < +∞].

Since Mτ(n) − f(τ ∧ τ(n)) ≤ 1/n on τ(n) < +∞, the right hand side is dominated by

E[f(τ ∧ τ(n)) ; τ(n) = +∞] + E[f(τ ∧ τ(n)) + 1/n ; τ(n) < +∞].

This reads E[f(τ ∧ τ(n))] + P (τ(n) < +∞)/n. It follows that

E[f(τ)] ≤ E[f(τ ∧ τ(n))] + 1/n ≤ E[f(ζ)] + 1/n for all n ∈ N

where we used that τ(n) ≤ ζ a.s. and f is non-decreasing. Tending n to ∞, we get

E[f(τ)] ≤ E[f(ζ)].

Since f is strictly increasing and ζ ≤ τ a.s., we reach that supn∈N τ(n) = ζ = τ a.s.

24.12 Lemma. Suppose that τ is an F·-optional time, τ = 0 ∈ F0 ∨ Null(P ) and

E[Mτ ; τ > 0] = E[Mτ− ; τ > 0] for all M· ∈ Martb(F·+).

Then there exists a predictable F·-optional time τ such that τ = τ a.s.

115

Page 116: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. Set f : R≥0 → R, t 7→ t/(1+ t), which is bounded, strictly increasing and continuous.We select and fix M· ∈ Martb(F·+) such that Mt ∈ E[f(τ)|Ft+] for all t ∈ R≥0. Then

supn∈N τ(n) = τ a.s. where τ(n) := inft ∈ R≥0 : Mt − f(τ ∧ t) < 1/n

by Lemma 24.11. Since Mτ ∈ E[f(τ)|(F· ∨ Null(P ))τ+] by Corollary 14.12,

Mτ = f(τ) a.s.

On the other hand t 7→ τ ∧ t is F·-adapted and f(τ ∧ t) ≤ f(τ), which implies that

f(τ ∧ t) ≤Mt a.s. for all t ∈ R≥0.

With the help of the sample path right continuity, we see that

f(τ ∧ t) ≤Mt for all t ∈ R≥0 a.s. and hence f(τ) ≤Mτ− a.s. on τ > 0.

Consequently, due to E[Mτ ; τ > 0] = E[Mτ− ; τ > 0], we get

f(τ) = Mτ− a.s. on τ > 0.

Let n ∈ N. Since Mτ(n)− − f(τ ∧ τ(n)) ≥ 1/n a.s. on τ(n) > 0, it follows that

1

nP (0 < τ(n) = τ) ≤ E[Mτ(n)− − f(τ ∧ τ(n)) ; 0 < τ(n) = τ ]

= E[Mτ− − f(τ) ; 0 < τ(n) = τ ] = 0.

Thus we conclude that P (0 < τ(n) = τ) = 0 for each n ∈ N. Now apply Lemma 24.8together with Lemma 4.8(iii).

24.13 Theorem. Let σ ∈ Time(F·). Then the followings are equivalent to each other:(i) There exists a predictable F·-optional time σ such that σ = σ a.s.(ii) t 7→ 1[σ,+∞)(t) is indistinguishable from an F· ∨ Null(P )-predictable process.(iii) The F·-finite variation process t 7→ 1[σ,+∞) is natural and σ = 0 ∈ F0 ∨ Null(P ).

Proof. (i) ⇒ (ii) by Corollary 24.6(ii). (ii) ⇒ (iii) by Theorem 22.4(ii). (iii) ⇒ (i) byLemma 24.12.

24.14 Lemma. Suppose that E is a metrizable topological space and X· is an E-valuedprocess with almost sure right continuous sample path and it is indistinguishable from anF· ∨ Null(P )-predictable process. Then for each closed subset F there exists a predictableF·-optional time τ such that inft ∈ R≥0 : Xt ∈ F = τ a.s.

Proof. We see by Lemma 17.16 that there exists Ω0 ∈ F such that P (Ω0) = 1 and

t 7→ Xt(ω) is an (F· ∨ Null(P )) ∩ Ω0-predictable process with right continuoussample path on (Ω0,F ∩ Ω0, P |Ω0).

116

Page 117: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

We work on the probability space (Ω0,F∩Ω0, P |Ω0) with filtration F· := (F·∨Null(P ))∩Ω0.Then σ := inft ∈ R≥0 : Xt ∈ F is an F·-optional time by Corollary 19.5. It follows byLemma 17.2 that

(t, ω) : σ(ω) < t ∈ Pred(F·).

Due to the sample path right continuity of X·

(t, ω) : t = σ(ω) ⊂ (t, ω) : Xt(ω) ∈ F ⊂ (t, ω) : σ(ω) ≤ t.

The set in the middle belongs to Pred(F·) due to the predictability. Therefore

(t, ω) : σ(ω) ≤ t = (t, ω) : Xt(ω) ∈ F ∪ (t, ω) : σ(ω) < t ∈ Pred(F·).

This means that t 7→ 1[σ,+∞)(t) is an F·-predictable process. Thus, invoking Theorem 24.13together with Lemma 4.8(iii), we get the statement.

24.15 Corollary. Suppose that X· is an F·-adapted process with almost sure right continu-ous and locally bounded sample path and indistinguishable from an F· ∨ Null(P )-predictableprocess. Then X· is F·-locally bounded.

Proof. There exists a sequence τ(·) of predictable F·-optional times such that

inft ∈ R≥0 : |Xt| ≥ k = τ(k) a.s. for all k ∈ N

by Lemma 24.14. We have that τ(k) ≤ τ(k + 1) a.s. and, since almost every sample path islocally bounded, supk∈N τ(k) = +∞ a.s. We select and fix Ω0 ∈ F such that P (Ω0) = 1 and

inft ∈ R≥0 : |Xt| ≥ k = τ(k), τ(k) ≤ τ(k + 1) and supk∈N τ(k) = +∞ on Ω0.

In particular we that

|Xt(ω)| < k for all t ∈ R[0,τ(k,ω)) for all ω ∈ τ(k) > 0 ∩ Ω0.

For each k ∈ N there exists a sequence σ(k, ·) of F·-optional times such that

σ(k, l) ≤ σ(k, l + 1), supl∈N σ(k, l) = τ(k) and τ(k) > 0 ⊂ σ(k, l) < τ(k).

We introduce the following F·-optional times:

S(n) := maxσ(k, n) ; k ∈ N≤n.

Since σ(k, n) ≤ σ(k, n+ 1) for all k ∈ N, we have that S(n) ≤ S(n+ 1). On the other handτ(k) = supn∈N σ(k, n) = supn∈N:n≥k σ(k, n) ≤ supn∈N:n≥k S(n) = supn∈N S(n). Therefore

S(n) ≤ S(n+ 1) and supn∈N

S(n) = +∞ on Ω0.

Let ω ∈ S(n) > 0 ∩ Ω0. There exists k ∈ N≤n with S(n, ω) = σ(k, n, ω). It follows that

0 < S(n, ω) = σ(k, n, ω) ≤ τ(k, ω) and hence σ(k, n, ω) < τ(k, ω) ≤ τ(n, ω).

We thus get S(n) < τ(n) on S(n) > 0 ∩ Ω0. This implies that

|Xt(ω)| < n for all t ∈ R[0,S(n,ω)] for all ω ∈ S(n) > 0 ∩ Ω0.

Consequently the pair S(·) and n 7→ n is a reducing sequence.

117

Page 118: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Given an F·-stopping time τ we set Fτ− := σ(τ > t∩A ; t ∈ R>0, A ∈ Ft)∨F0.

24.16 Lemma. Suppose that τ is an F·-stopping time.(i) Fτ− ⊂ Fτ . If t ∈ R>0 and τ(Ω) = t then Fτ− =

∨s:s<tFs.

(ii) If σ is an F·-stopping time and σ ≤ τ then Fσ− ⊂ Fτ−.(iii) Suppose that σ is an F·-optional time, σ ≤ τ and σ < +∞, τ > 0 ⊂ σ < τ. IfA ∈ Fσ+ (and A∩σ = 0 ∈ F0) then A∩σ < +∞, τ > 0 ∈ Fτ− (A∩σ < +∞ ∈ Fτ−).(iv) Suppose that there exist a sequence σ(n) of F·-optional times (F·-stopping times) suchthat σ(n) ≤ σ(n+1), supn∈N σ(n) = τ and τ > 0 ⊂ σ(n) < τ. Then Fτ− ⊂

∨∞n=1Fσ(n)+

and A ∩ τ > 0 ∈ Fτ− for A ∈∨∞n=1Fσ(n)+ (Fτ− =

∨∞n=1Fσ(n)).

Proof. (i) If t ∈ R>0 and A ∈ Ft then τ > t ∩ A ∈ Fτ by Lemma 3.15(iv).(ii) If t ∈ R>0 and A ∈ Ft then σ > t ∩ A = τ > t ∩ (σ > t ∩ A) ∈ Fτ−.(iii) We have that Ω = σ < τ ∪ σ = τ = 0 ∪ σ = τ = +∞. Let A ∈ Fσ+. Then

A ∩ σ < t ∈ Ft for all t ∈ R>0.

A ∩ σ < +∞, τ > 0 = A ∩ σ < τ =⋃

r∈Q:r>0

τ > r ∩ (A ∩ σ < r)

On the other hand A ∩ σ < +∞, τ = 0 = (A ∩ σ = 0) ∩ τ = 0.(iv) Since B ∩ σ(n) ≥ t ∈ (F·+)σ(n) = Fσ(n)+ for all B ∈ Ft+ by Lemma 3.15(iv),

τ > t ∩ A =∞⋃n=1

((τ > t ∩ A) ∩ σ(n) ≥ t

)∈

∞∨n=1

Fσ(n)+ for all t ∈ R>0 and A ∈ Ft.

It follows that Fτ− ⊂∨∞n=1Fσ(n)+. This time we have that σ(n) = +∞ = ∅. Therefore if

A ∈ Fσ(n)+ then A ∩ τ > 0 = A ∩ σ(n) < +∞, τ > 0 ∈ Fτ− by (iii).

24.17 Lemma. Suppose that τ is a predictable F·-optional time and f : Ω → R is Fτ−-measurable. Then the process with sample path t 7→ f(ω)1[τ(ω),+∞)(t) is F·-predictable.

Proof. The optional time τ being predictable, there exist a sequence σ(n) of F·-optionaltimes such that σ(n) ≤ σ(n + 1), supn∈N σ(n) = τ and τ > 0 ⊂ σ(n) < τ. GivenB ∈

⋃∞k=1Fσ(k)+, there exists n ∈ N such that B ∈ Fσ(n)+. The process with sample path

t 7→ 1B(ω)1(σ(k,ω),+∞)(t) is left continuous and F·-adapted provided k ≥ n. As k tends to ∞this sequence of processes point wise converges to the process with sample path

t 7→ 1B(ω)1[τ(ω),+∞)∩(0,+∞)(t),

which is F·-predictable. The F·-predictability is preserved for B ∈∨∞n=1Fσ(n)+ as the

monotone class theorem shows. Let A ∈ Fτ−. We infer by Lemma 24.16(iv) that

the process with sample path t 7→ 1A(ω)1[τ(ω),+∞)∩(0,+∞)(t) is F·-predictable.

On the other hand, since τ is an F·-stopping time and, by Lemma 24.16(i), Fτ− ⊂ Fτ , wehave that A ∩ τ = 0 ∈ F0. This means that

the process with sample path t 7→ 1A∩τ=0(ω)10(t) is F·-predictable.

Being the sum of two predictable processes, t 7→ 1A(ω)1[τ(ω),+∞)(t) is F·-predictable.

118

Page 119: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

24.18 Lemma. Suppose that M· is an F·+-martingale with almost sure right continuouspath and τ is a predictable F·-optional time. Then

E[Mτ− ;A ∩ 0 < τ ≤ t] = E[Mτ ;A ∩ 0 < τ ≤ t]

for all A ∈ Fτ− and t ∈ R>0.

Proof. Since t 7→ 1A1[τ,+∞)(t) is an F·-predictable finite variation process by Lemma 24.17,we see by Theorem 22.4(ii) that t 7→ 1A1[τ,+∞)(t) is natural. This is exactly what we want.However we give a more straightforward discussion. The optional time τ being predictable,there exist a sequence σ(n) of F·-optional times such that σ(n) ≤ σ(n+ 1), supn∈N σ(n) = τand τ > 0 ⊂ σ(n) < τ. Recall that Time(F·) = Time(F·+). Let t ∈ R>0. According toCorollary 3.17(i), we have that

Mσ(k)∧t ∈ E[Mτ∧t|(F·+ ∨ Null(P ))σ(k)+] for all k ∈ N.

Let B ∈⋃∞k=1Fσ(k)+, that is, B ∈ Fσ(n)+ for some n ∈ N . Since σ(k) < t ∈ Fσ(k)+ for all

k ∈ N and Fσ(n)+ ⊂ Fσ(k)+ ⊂ (F·+ ∨ Null(P ))σ(k)+ for all k ∈ N≥n, it follows that

E[Mσ(k)∧t ;B ∩ σ(k) < t] = E[Mτ∧t ;B ∩ σ(k) < t] for all k ∈ N≥n.

We see that σ(k) < t = τ = 0, σ(k) = 0 ∪ τ > 0, σ(k) < t. Therefore

E[Mσ(k) ;B ∩ τ > 0, σ(k) < t] = E[Mτ∧t ;B ∩ τ > 0, σ(k) < t] for all k ∈ N≥n.

Since σ(k + 1) < t ⊂ σ(k) < t and, by Lemma 4.3,⋂∞k=1σ(k) < t = τ ≤ t, the

right hand side converges to E[Mτ∧t ;B ∩ 0 < τ ≤ t] as k tends to ∞. On the other handthere exists Ω0 ∈ F such that P (Ω0) = 1 and M·(ω) admits left-hand limits everywherefor all ω ∈ Ω0. Since σ(k) < τ on τ > 0, the sequence Mσ(k)1τ>0,σ(k)<t converges toMτ−10<τ≤t on Ω0. Taking the uniform integrability of Mσ(k)∧t into account, we infer that

E[Mτ− ;B ∩ 0 < τ ≤ t] = E[Mτ∧t ;B ∩ 0 < τ ≤ t] = E[Mτ ;B ∩ 0 < τ ≤ t].

The above extends to∨∞n=1Fσ(n)+ automatically. Since Fτ− ⊂

∨∞n=1Fσ(n)+ by Lemma 24.16,

we reach the statement.

119

Page 120: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

25 Semigroup of measure kernel and Markov process

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

S is a locally compact Hausdorff space with countable base and κt(·; ?) is a oneparameter family of probability measure kernels on S × Borel(S) such that∫

S

κt(·;A)κs(x; ·) = κt+s(x;A) for all t, s ∈ R≥0, x ∈ S,A ∈ Borel(S).

We additionally assume that κ0(x; ·) = δx for all x ∈ S and

t 7→∫S

f κt(x; ·) is right continuous at 0 for all x ∈ S, f ∈ C∞(S).

Then, given t(1), . . . , t(n) ∈ R≥0 with t(1) < · · · < t(n) and A1, . . . , An ∈ Borel(S), wecan inductively define a Borel(S)-measurable function by κ(x; t;A) = κt(x;A) and

κ(x; t(1), . . . , t(n);A1, . . . , An) :=

∫A1

κ(·; t(2)− t(1), . . . , t(n)− t(1);A2, . . . , An)κt(1)(x; ·).

25.1 Lemma. Suppose that 0 ≤ s(1) < · · · < s(m−1) < s(m) and A1, . . . , Am−1 ∈ Borel(S).Then κ(·; s(1), . . . , s(m);A1, . . . , Am−1, ·) is a probability measure kernel on S×Borel(S) and∫

A

κ(·; t(1), . . . , t(n);B1, . . . , Bn)κ(x; s(1), . . . , s(m);A1, . . . , Am−1, ·)

= κ(x; s(1), . . . , s(m), t(1) + s(m), . . . , t(n) + s(m);A1, . . . , Am−1, A,B1, . . . , Bn)

for all t(1), . . . , t(n) ∈ R≥0 with 0 < t(1) < · · · < t(n) and A,B1, . . . , Bn ∈ Borel(S).

Given t1, . . . , tn ∈ R≥0, A1, . . . , An ∈ Sbset(S), and f1, . . . , fn ∈ Map(S,R), we set

cyldr(t1, . . . , tn;A1, . . . , An) := w ∈ Map(R≥0, S) : w(t1) ∈ A1, . . . , w(tn) ∈ An,cyldr[t1, . . . , tn; f1, . . . , fn] : Map(R≥0, S) → R, w 7→ f1(w(t1)) · · · fn(w(tn)).

Given a subinterval I of R≥0, denote by Cyldr(I, S) the collection of all the setscyldr(t1, . . . , tn;A1, . . . , An) with n ∈ N, t1, . . . , tn ∈ I and A1, . . . , An ∈ Borel(S).

25.2 Theorem. There exists a unique measure kernel µ(·; ?) on S × σ(Cyldr(R≥0, S)) suchthat

µ(x; cyldr(t(1), . . . , t(n);A1, . . . , An)) = κ(x; t(1), . . . , t(n);A1, . . . , An)

for all t(1), . . . , t(n) ∈ R≥0 with t(1) < · · · < t(n) and A1, . . . , An ∈ Borel(S).

Proof. We can apply the Kolmogorov extension theorem to show the existence and theuniqueness by virtue of Lemma 25.1. One deduces the Borel(S)-measurability of x 7→ µ(x;A)for each A ∈ σ(Cyldr(R≥0, S)) from that of the functions x 7→ κ(x; t(1), . . . , t(n);A1, . . . , An)by invoking the monotone class theorem.

120

Page 121: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

We denote by κ the probability measure kernel described in Theorem 25.2.

25.3 Corollary. For all x ∈ S,r ∈ R≥0, A ∈ σ(Cyldr(R[0,r], S)) and B ∈ σ(Cyldr(R≥0, S))∫A

κ(evalr;B)κ(x; ·) = κ(x; eval·+r ∈ B ∩ A)

where evalt : Map(R≥0, S) → S is the evaluation w → w(t) for each t ∈ R≥0. In particular

Ttf(evalr) ∈ E[f(evalt+r)|σ(Cyldr(R[0,r], S))] for all t ∈ R≥0 and f ∈ C∞(S)

where the conditional expectation refers to the probability measure κ(x; ·).

Proof. Theorem 25.2 and Lemma 25.1.

Given t ∈ R≥0, f ∈ bBorel(S,R) denote by Ttf the function x 7→∫Sf κt(x; ·) on S.

25.4 Lemma. Suppose that s ∈ R≥0 and X· is an F·-adapted S-valued process, and

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥s and f ∈ C∞(S).

(i) κ(Xr;B) ∈ P (X·+r ∈ B|Fr) for all r ∈ R≥s and B ∈ σ(Cyldr(R≥0, S)).(ii) t 7→ Xt and t 7→ f(Xt) with f ∈ C(S) are right continuous in probability on R≥s.

Proof. (i) It suffices to show the following claim: if 0 ≤ t(1) < t(2) < · · · < t(n), r ∈ R≥s,B1, B2, . . . , Bn ∈ Borel(S) and A ∈ Fr then

E[κ(Xr; cyldr(t(1), . . . , t(n);B1, . . . , Bn)) ;A]

= P (X·+r ∈ cyldr(t(1), . . . , t(n);B1, . . . , Bn) ∩ A).

This reads by Theorem 25.2 that

E[κ(Xr; t(1), . . . , t(n);B1, . . . , Bn) ;A]

= P (X·+r ∈ cyldr(t(1), . . . , t(n);B1, . . . , Bn) ∩ A).

We carry out the proof by using the induction. The assumption implies that

E[

∫S

f κt(Xr ·) ;A] = E[f(Xt+r) ;A] for all f ∈ C∞(S).

Since the indicator function of any open subset of S is a monotone increasing limit of func-tions in C∞(S), we get the equality in question for n = 1 by the Dynkin class theorem:

E[κt(1)(Xr;B1) ;A] = P (Xt(1)+r ∈ B1 ∩ A),

which is a special case of E[∫Sf κt(Xr ·) ;A] = E[f(Xt+r) ;A] where f ∈ bBorel(S,R).

Suppose that k ∈ N and the equality in question holds for all n ∈ N≤k. We set g :=κ(·; t(2)− t(1), . . . , t(n+ 1)− t(1);B2, . . . , Bn+1). Then it follows that

E[κ(Xr; t(1), . . . , t(n+ 1);B1, . . . , Bn+1) ;A]

= E[

∫B1

g κt(1)(Xr; ·) ;A] = E[(1B1g)(Xt(1)+r) ;A].

121

Page 122: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

The right hand side equals

E[κ(Xt(1)+r; t(2)− t(1), . . . , t(n+ 1)− t(1);B2, . . . , Bn+1) ; Xt(1)+r ∈ B1 ∩ A].

The induction assumption with r replaced by t(1) + r shows that the above coincides with

P (X·+t(1)+r ∈ cyldr(t(2)− t(1), . . . , t(n+ 1)− t(1);B2, . . . , Bn+1), Xt(1)+r ∈ B1 ∩ A)

= P (X·+r ∈ cyldr(t(1), . . . , t(n+ 1);B1, . . . , Bn+1) ∩ A).

Thus the equality in question also holds for n = k + 1.(ii) We exploit the additional property of the measure kernels: κt(x; ·) converges to δx in

Cb(S)-weak topology as t ↓ 0, which so far played no role. Suppose that r, t ∈ R≥0, s ≤ r < tand g : S × S → R is bounded and continuous. Then we have that

E[g(Xr, Xt)] = E[

∫S

g(Xr, ·)κt−r(Xr; ·)]

according to (i). Since∫Sg(x, ·)κt−r(x; ·) converges to g(x, x) as t ↓ r for all x ∈ S, the

right hand side above converges to E[g(Xr, Xr)] by the dominated convergence theorem.Specifying g as mindist(·, ?), 1 respectively min|f(·)− f(?)|, 1, we get the claim.

25.5 Lemma. (i) If f ∈ Cb(S) then t 7→ Ttf(x) is right continuous for all x ∈ S.(ii) If f ∈ bBorel(S,R) then (t, x) 7→ Ttf(x) is Borel(R≥0)⊗ Borel(S)-measurable.

Proof. Let f ∈ Cb(S). Since Ttf(x) =∫f(evalt)κ

(x; ·), it follows by Corollary 25.3 andLemma 25.4 that t 7→ Ttf(x) is right continuous for all x ∈ S. Taking into account thatx 7→ Ttf(x) is Borel(S)-measurable for all t ∈ R≥0, we see that (t, x) 7→ Ttf(x) is measurable.The indicator function of any open subset of S is a non-decreasing limit of functions in C∞(S).We therefore get (ii) by the usual monotone class argument.

Given α ∈ R>0, f ∈ bBorel(S,R), set Uαf : S → R, x 7→∫

(0,+∞)e−αtTtf(x)λ(dt).

25.6 Lemma. (i) If f ∈ bBorel(S,R) then Uαf is Borel(S)-measurable for all α ∈ R>0. Iff ∈ Cb(S) then αUαf(x) converges to f(x) as α tends to +∞ for all x ∈ S.(ii) If Tt sends C∞(S) into itself for all t ∈ R>0 then so does Uα for all α ∈ R>0.(iii) If Uα sends C∞(S) into itself for all α ∈ R>0 then αUαf converges to f uniformly as αtends to +∞ for all f ∈ C∞(S).

Proof. (i) We get the measurability of Uαf by Lemma 25.5(i) and Fubini’s theorem. Letf ∈ Cb(S). Since Ttf(x) converges to f(x) for all x ∈ S as t tends to 0 by Lemma 25.5(i),it therefore follows by the dominated convergence theorem that

αUαf(x) =

∫ ∞

0

Tt/αf(x)e−t dt converges to f(x) for all x ∈ S as α tends to +∞.

(ii) Suppose that Ttf ∈ C∞(S) for all t ∈ R>0 and f ∈ C∞(S). Since |Ttf(x)| ≤ ‖f‖ forall t ∈ R>0 and x ∈ S where ‖f‖ := supx∈S |f(x)|, we infer by the dominated convergencetheorem that Uαf ∈ C∞(S).

122

Page 123: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

(iii) Fubini’s theorem shows that∫S

Uαf κt(x; ·) =

∫(0,∞)×S

Tsf(y)e−αs λ(ds)κt(x; dy) =

∫(0,∞)

Ts+tf(x)e−αs λ(ds)

for α ∈ R>0, f ∈ bBorel(S,R), t ∈ R≥0 and x ∈ S, which reads

TtUαf(x) =

∫(t,∞)

Tsf(x)e−α(s−t) λ(ds).

We first deduce the resolvent equation. We see that∫(0,∞)

TtUαf(x)e−βt λ(dt) =

∫(s,t):0<t<s

Tsf(x)e−α(s−t)e−βt λ(ds)λ(dt)

for α, β ∈ R>0, f ∈ bBorel(S,R) and x ∈ S. If α 6= β then the right hand equals∫(0,∞)

Tsf(x)e−αse(α−β)s − 1

α− βλ(ds) =

Uβf(x)− Uαf(x)

α− β.

Thus we get the resolvent equation (α − β)UβUαf = Uβf − Uαf . Since Uα sends C∞(S)into itself for all α ∈ R>0, this implies that

Uαf = Uβ(f − (α− β)Uαf) ∈ UβC∞(S) for all f ∈ C∞(S).

The role of parameters α, β being symmetric, we see that UαC∞(S) = UβC∞(S). Denotethis common linear subspace of C∞(S) byR. Let g ∈ R. Then g = U1f for some f ∈ C∞(S).Since βUβU1f − U1f = Uβ(U1f − f) by the resolvent equation, it follows that

|βUβg(x)− g(x)| ≤∫ ∞

0

|Tt(U1f − f)(x)|e−βt dt ≤ ‖U1f − f‖/β for all x ∈ S.

The right hand side converges to 0 as β tends to +∞. Hence

‖βUβg − g‖ converges to 0 as β tends to +∞ for all g ∈ R.

We identify the norm closure of R. Being a convex subset, its norm closure coincides withthe weak closure. Let f ∈ C∞(S). According to (i),

βUβf(x) converges to f(x) for all x ∈ S as β tends to +∞.

The dual space of C∞(S) is canonically identified with the set of all finite signed Borelmeasures on S. Invoking the dominated convergence theorem, we infer that

βUβf weakly converges to f as β tends to +∞.

Consequently the weak closure of R coincides with C∞(S) and so does the norm closure.Finally, given ε ∈ R>0, choose g ∈ R such that ‖f − g‖ < ε. Taking into account that‖Uβ(f − g)‖ ≤ ‖f − g‖/β we reach that

‖βUβf − f‖ ≤ ‖βUβg − g‖+ ‖βUβ(f − g)‖+ ‖f − g‖ ≤ ‖βUβg − g‖+ 2ε.

The first term of the right hand side converges to 0 as β tends to +∞.

123

Page 124: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

We characterize the Markov property in terms of Laplace transform.

25.7 Lemma. Suppose that r ∈ R≥0, f ∈ Cb(S), X· is an F·-adapted S-valued process, andt 7→ f(Xt) is right continuous in probability. Then

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0 if and only ifE[Uαf(Xr) ;A]e−αr =

∫ ∞rE[f(Xt) ;A]e−αt dt for all α ∈ R>0 and A ∈ Fr.

Let s ∈ R≥0. If f(x) ≥ 0 for all x ∈ S and the above condition holds for all r ∈ R≥s thent 7→ Uαf(Xt)e

−αt is an F·-supermartingale after time s.

Proof. Let α ∈ R>0 and A ∈ Fr. Since (t, x) 7→ Ttf(x) is measurable by Lemma 25.5(ii),invoking Fubini’s theorem, we see that

E[Uαf(Xr) ;A] =

∫ +∞

0

E[Ttf(Xr) ;A]e−αtdt.

Note that t 7→ E[Ttf(Xr) ;A] is right continuous by Lemma 25.5(i). Since t 7→ f(Xt) is rightcontinuous in probability, t 7→ E[f(Xt+r) ;A] is also right continuous. We see that∫ ∞

0

E[f(Xt+r) ;A]e−αt dt = eαr∫ ∞

r

E[f(Xt) ;A]e−αt dt

by the translation invariance of the Lebesgue measure. It follows that∫ +∞

0

E[Ttf(Xr) ;A]e−αtdt =

∫ +∞

0

E[f(Xt+r) ;A]e−αtdt

if and only if E[Uαf(Xr) ;A]e−αr =

∫ ∞

r

E[f(X·) ;A]e−αt dt.

Thus we get the equivalence in question by applying the uniqueness of Laplace transform.

25.8 Definition. Suppose that X· is an S-valued process. A subset Θ of Ω is called anX·-cylinder set based at time parameters up to t ∈ R≥0 if there exists B ∈ Cyldr(R[0,t], S)such that Θ = ω ∈ Ω : X·(ω) ∈ B.

Let t ∈ R≥0. CyldrX·≤t stands for the family of all X·-cylinder sets based at timeparameters up to t and σX·≤t the σ-field over Ω generated by CyldrX·≤t.

25.9 Theorem. Suppose that X· is an F·-adapted S-valued process, t 7→ Uαf(Xt) is rightcontinuous in probability for all α ∈ R>0 and f ∈ C∞(S), and

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥0 and f ∈ C∞(S).

(i) For each lower semi-integrable and σX· ∨ Null(P )-measurable random variable Y theclass E[Y |Ft+ ∨ Null(P )] contains a σX·≤t-measurable representative for all t ∈ R≥0.(ii) (Ft+ ∨ Null(P )) ∩ (σX· ∨ Null(P )) = σX·≤t ∨ Null(P ) for all t ∈ R≥0.(iii) A process M· is a σX·≤· ∨ Null(P )-martingale if and only if it is an F·+ ∨ Null(P )-martingale and Mt is σX· ∨ Null(P )-measurable for each t ∈ R≥0.

124

Page 125: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. (i) Let f ∈ C∞(S), s ∈ R≥0 and A ∈ Fs+. According to Lemma 25.4(ii), t 7→ f(Xt)is right continuous in probability. Since A ∈ Fr for all r ∈ R>s, we see by Lemma 25.7 that

E[Uαf(Xr) ;A]e−αr =

∫ ∞

r

E[f(Xt) ;A]e−αt dt for all r ∈ R>s and α ∈ R>0.

Since t 7→ Uαf(Xt) is right continuous in probability, tending r to s, we get

E[Uαf(Xs) ;A]e−αs =

∫ ∞

s

E[f(Xt) ;A]e−αt dt for all α ∈ R>0 and A ∈ Fs+.

Applying Lemma 25.7 with Lemma 25.4(ii) taking into account, we infer that

Ttf(Xs) ∈ E[f(Xt+s)|Fs+ ∨N ] for all t ∈ R≥0, s ∈ R≥0 and f ∈ C∞(S).

Here N := Null(P ). Let t ∈ R≥0. Given Θ ∈ σX·, there exists A ∈ Cyldr(R[0,t], S) andB ∈ Cyldr(R≥0, S) such that Θ = X· ∈ A,X·+t ∈ B. It follows by Lemma 25.4(i) that

κ(Xt;B) ∈ E[1B(X·+t)|Ft+ ∨N ].

Since X· ∈ A ∈ σX·≤t ⊂ Ft+, we see that

1A(X·)κ(Xt;B) ∈ E[1A(X·)1B(X·+t)|Ft+ ∨N ] = E[1Θ|Ft+ ∨N ].

Observe that 1A(X·)κ(Xt;B) is σX·≤t-measurable. We introduce the following family:

D := Θ ∈ σX· : ∃Y σX·≤t-measurable, non-negative and Y ∈ E[1Θ|Ft+ ∨N ],

which is a Dynkin system by Theorem 1.10. Since CyldrX· ⊂ D, we reach the statementby invoking the monotone class theorem.

(ii) Let A ∈ (Ft+ ∨ N ) ∩ (σX· ∨ N ). Since 1A ∈ E[1A|Ft+ ∨ N ], there exists aσX·≤t-measurable modification Y of 1A by (i), which means A ∈ σX·≤t ∨N .

(iii) Suppose that M· is a σX·≤· ∨ N -martingale, s, t ∈ R≥0 and s < t. Then thereexists a σX·≤s-measurable random variable Y such that Y ∈ E[Mt|Fs+ ∨N ] by (i). SinceσX·≤s ⊂ Fs ⊂ Fs+, we have that Y ∈ E[Mt|σX·≤s ∨ N ]. The right hand contains Ms

as well. It follows that Ms = Y a.s. and hence Ms ∈ E[Mt|Fs+ ∨N ].

25.10 Theorem. Suppose that X· is an F·-adapted S-valued process such that

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥0 and f ∈ C∞(S).

(i) If αUαf converges to f uniformly as α tends to +∞ for all f ∈ C∞(S) then X· has amodification with D(R≥0, S) sample path where S is a one point compactification of S.(ii) If Uα sends C∞(S) into itself for all α ∈ R>0 then X· has a modification with D(R≥0, S)sample path.

Proof. (i) Suppose that f ∈ C∞(S), f ≥ 0 and α ∈ R>0. Then it follows by Lemma 25.7that t 7→ Uαf(Xt)e

−αt is an F·-supermartingale. Invoking Lemma 14.10, we see that

lim infst:s∈Q Uαf(Xs) = lim supst:s∈Q U

αf(Xs) for all t ∈ R≥0 a.s.lim infst:s∈Q U

αf(Xs) = lim supst:s∈Q Uαf(Xs) for all t ∈ R>0 a.s.

125

Page 126: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Due to the linearity of Uα the above holds for any f ∈ C∞(S). Let R be a countable densesubset of C∞(S). We select and fix Ω0 ∈ F such that P (Ω0) = 1 and

lim infst:s∈Q Unf(Xs(ω)) = lim supst:s∈Q U

nf(Xs(ω)) for all t ∈ R≥0 andlim infst:s∈Q U

nf(Xs(ω)) = lim supst:s∈Q Unf(Xs(ω)) for all t ∈ R>0

for all ω ∈ Ω0, n ∈ N and f ∈ R. Let ω ∈ Ω0, t ∈ R≥0 and f ∈ R. Given ε ∈ R>0, thereexists n ∈ N such that |nUnf(x)− f(x)| < ε for all x ∈ S. It follows that

lim supst:s∈Q

f(Xs(ω)) ≤ lim supst:s∈Q

nUnf(Xs(ω)) + ε

= lim infst:s∈Q

nUnf(Xs(ω)) + ε ≤ lim infst:s∈Q

f(Xs(ω)) + 2ε.

The same argument works for the left hand limits. Thus the following holds for all ω ∈ Ω0:

lim infst:s∈Q f(Xs(ω)) = lim supst:s∈Q f(Xs(ω)) for all t ∈ R≥0, f ∈ R andlim infst:s∈Q f(Xs(ω)) = lim supst:s∈Q f(Xs(ω)) for all t ∈ R>0, f ∈ R.

Since R is dense in C∞(S), this implies that if ω ∈ Ω0 then

limst:s∈QXs(ω) 6= ∅ for all t ∈ R≥0 and limst:s∈QXs(ω) 6= ∅ for all t ∈ R>0.

Here the limit is considered in the one point compactification S. Consequently there existsan S-valued process Y· such that its every sample path is right continuous and admits lefthand limits everywhere, and limst:s∈QXs(ω) = Yt(ω) for all ω ∈ Ω0 and t ∈ R≥0. On theother hand, according to Lemma 25.4(ii), t 7→ Xt is right continuous in probability. Thisimplies that Yt = Xt a.s. for all t ∈ R≥0.

(ii) Suppose that Uαf ∈ C∞(S) for all α ∈ R>0 and f ∈ C∞(S). We regard func-tions in C∞(S) take the value 0 at the infinity of the compactification S. According toLemma 25.6(iii), the premise in (i) is fulfilled. Let Y· be as in the discussion for (i). Its everysample path is right continuous and admits left hand limits everywhere. We select and fixf ∈ C∞(S) such that f(x) > 0 for all x ∈ S. Note that U1f(x) > 0 for all x ∈ S. Set

ζ := inft ∈ R≥0 : U1f(Yt)e−t ≤ 0 ∧ inft ∈ R>0 : U1f(Yt−o)e

−t ≤ 0.

For each n ∈ N introduce the following F· ∨ Null(P )-optional time:

τ(n) := inft ∈ R≥0 : Zt < 1/n where Z· : t 7→ U1f(Yt)e−t.

Since supn∈N τ(n) = ζ by Lemma 4.5, ζ is an F·∨Null(P )-optional time. On the other handZ· : t 7→ U1f(Yt)e

−t is an F·∨Null(P )-supermartingale by Lemma 25.7. Moreover its samplepath is right continuous and Zt ≥ 0 for all t ∈ R≥0. Invoking Lemma 7.7(i), we see that

E[Zζ ; ζ < +∞] ≤ E[Zτ(n) ; τ(n) < +∞] ≤ P (τ(n) < +∞)/n ≤ 1/n for all n ∈ N.

The second inequality derives by Zτ(n) ≤ 1/n on τ(n) < +∞. Tending n to ∞, we reachthat E[Zζ ; ζ < +∞] = 0 and, due to the non-negativity, Zζ = 0 a.s. on ζ < +∞. Thisargument works for any optional time σ with ζ ≤ σ. Taking the right continuity into accountwe infer that there exists Ω1 ∈ F such that Ω1 ⊂ Ω0, P (Ω1) = 1 and Zζ+t = 0 for all t ∈ R≥0

on ζ < +∞ ∩ Ω1. Since U1f(x) > 0 for all x ∈ S, this reads as follows:

126

Page 127: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

if ω ∈ Ω1 and ζ(ω) < +∞ then Yt(ω) 6∈ S for all t ∈ R≥ζ(ω).

On the other hand if ω ∈ Ω1 and T < ζ(ω) then T < τ(n, ω) for some n ∈ N, which meansU1f(Yt(ω)) ≥ Zt(ω) ≥ 1/n for all t ∈ R[0,T ]. Since x ∈ S : U1f(x) ≥ 1/n is compact,

if ω ∈ Ω1 and T < ζ(ω) then Yt(ω) ; t ∈ R[0,T ] is a relatively compact subset of S.

Let t ∈ R≥0. Then we see that t < ζ ∩ Ω1 = Yt ∈ S ∩ Ω1 and hence

P (t < ζ) = P (Yt ∈ S) = P (Xt ∈ S) = E[κt(X0;S)].

The right hand side equals 1. Therefore, tending t to +∞, we conclude that ζ = +∞ a.s.

26 Strong Markov property and quasi left continuity

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

S is a locally compact Hausdorff space with countable base and κt(·; ?) is asemigroup of probability measure kernels on S×Borel(S) such that κ0(x; ·) = δxand t 7→ κt(x; ·) is C∞(S)-weak right continuous at 0 for all x ∈ S.

We still keep the same scheme of notations as in the previous section.

26.1 Lemma. Suppose that s ∈ R≥0, f ∈ bBorel(S,R), X· is an F·-adapted S-valuedprocess, t 7→ f(Xt) is right continuous a.s., and

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥s.

(i) t 7→ Uαf(Xt)e−αt + Re−α·f(X)dλt is an F·-martingale after time s for all α ∈ R>0.

(ii) Let α ∈ R>0. If t 7→ Uαf(Xt) is right continuous a.s. then

E[Uαf(Xσ)e−ασ ;A ∩ σ < +∞] =

∫ ∞

0

E[f(Xt) ;A ∩ σ ≤ t]e−αtdt

for all F· ∨ Null(P )-optional times σ with σ ≥ s and A ∈ (F· ∨ Null(P ))σ+.

Proof. (i) There exists Ω0 ∈ F such that P (Ω0) = 1 and t 7→ f(Xt(ω)) is right continuousfor every ω ∈ Ω0. Let α ∈ R>0, r ∈ R≥s and A ∈ Fr. According to Fubini’s theorem,∫ ∞

r

E[f(Xt) ;A]e−αt dt = E[

∫(0,+∞)

f(X·)e−α·λ−

∫(0,r)

f(X·)e−α·λ ;A ∩ Ω0]

If ω ∈ Ω0 then∫

(0,t)e−α·f(X·(ω))λ = Re−α·f(X)dλt(ω) for all t ∈ R≥0∪+∞. Therefore

it follows by Lemma 25.7 that

E[Uαf(Xr)e−αr ;A] = E[Re−α·f(X)dλ∞ ;A]− E[Re−α·f(X)dλr ;A] for all A ∈ Fr.

This implies that Uαf(Xr)e−αr +Re−α·f(X)dλr ∈ E[Re−α·f(X)dλ∞|Fr].

127

Page 128: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

(ii) Suppose that t 7→ Uαf(Xt) is right continuous a.s. Then so is the martingale discussedin (i). We also verified its uniform integrability. Let σ be an F·∨Null(P )-optional time withσ ≥ s. It follows by Theorem 14.8(iv) that

Uαf(Xσ)e−ασ1σ<+∞ +Re−α·f(X)dλσ ∈ E[Re−α·f(X)dλ∞|(F· ∨ Null(P ))σ+].

Let A ∈ (F· ∨ Null(P ))σ+. The notation Ω0 in (i) is still in force. We infer that

E[Uαf(Xσ)e−ασ ;A ∩ σ < +∞] = E[Re−α·f(X)dλ∞ −R· · · σ ;A ∩ σ < +∞]

= E[

∫[σ,+∞)

f(X·)e−α·λ ;A ∩ σ < +∞ ∩ Ω0] =

∫ ∞

0

E[f(Xt) ;A ∩ σ ≤ t]e−αtdt.

The last equality derives by Fubini’s theorem.

X· is an F·-adapted S-valued process whose almost every sample path is rightcontinuous, and σ is an F· ∨ Null(P )-optional time unless otherwise stated.

26.2 Lemma. Suppose that f ∈ Cb(S), and for all α ∈ R>0 and A ∈ (F· ∨ Null(P ))σ+

E[Uαf(Xσ)e−ασ ;A ∩ σ < +∞] =

∫ ∞

0

E[f(Xt) ;A ∩ σ ≤ t]e−αtdt.

(i) Tt−σf(Xσ)1σ≤t ∈ E[f(Xt)1σ≤t|(F· ∨ Null(P ))σ+] for all t ∈ R≥0.(ii) If f ≥ 0 then Ttf(Xσ)1σ<+∞ ∈ E[f(Xt+σ)1σ<+∞|(F· ∨ Null(P ))σ+] for all t ∈ R≥0.

Proof. (i) Let α ∈ R>0. Due to the translation invariance of the Lebesgue measure,

Uαf(x)e−αs =

∫[0,+∞)

Ttf(x)e−α(t+s) λ(dt) =

∫[s,+∞)

Tt−sf(x)e−αt λ(dt) for all s ∈ R≥0.

Let A ∈ (F· ∨ Null(P ))σ+. Since (t, x) 7→ Ttf(x) is measurable by Lemma 25.5, we see that

E[Uαf(Xσ)e−ασ ;A ∩ σ < +∞] =

∫ +∞

0

E[Tt−σf(Xσ) ;A ∩ σ ≤ t]e−αtdt

by Fubini’s theorem. The right continuity of t 7→ Ttf(x) for each x ∈ S implies the rightcontinuity of t 7→ E[Tt−σf(Xσ) ;A∩σ ≤ t]. While, due to the sample path right continuity,t 7→ E[f(Xt) ;A ∩ σ ≤ t] is also right continuous. Thus we conclude that

E[Tt−σf(Xσ) ;A ∩ σ ≤ t] = E[f(Xt) ;A ∩ σ ≤ t] for all t ∈ R≥0

by applying the uniqueness of Laplace transform.(ii) Suppose that α ∈ R>0 and A ∈ (F· ∨ Null(P ))σ+. Observe that eασ is non-negative

and (F· ∨ Null(P ))σ+-measurable. The integrands being non-negative, we infer that

E[Uαf(Xσ) ;A ∩ σ < +∞] =

∫ +∞

0

E[f(Xt)e−α(t−σ) ;A ∩ σ ≤ t]dt

128

Page 129: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

by the monotone convergence theorem. There exists Ω0 ∈ F such that P (Ω0) = 1 andt 7→ f(Xt(ω)) is right continuous for every ω ∈ Ω0. The right hand side equals

E[

∫[σ,+∞)

f(X·)e−α(·−σ)λ ; Ω0 ∩ A ∩ σ < +∞] =

∫ +∞

0

E[f(Xt+σ)e−αt ;A ∩ σ < +∞]dt

by Fubini’s theorem. While, (t, x) 7→ Ttf(x) being measurable, the left hand side is theLaplace transform of t 7→ E[Ttf(Xσ) ;A ∩ σ < +∞]. Therefore it follows that

E[Ttf(Xσ) ;A ∩ σ < +∞] = E[f(Xt+σ) ;A ∩ σ < +∞] for all t ∈ R≥0

by applying the uniqueness of Laplace transforms.

26.3 Lemma. (i) Suppose that s ∈ R≥0 and the followings hold for all f ∈ C∞(S):

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R>s,

Ttf(Xs) ∈ E[f(Xt+s)|Fs+] for all t ∈ R≥0, and

Ts−σf(Xσ)1σ≤s ∈ E[f(Xs)1σ≤s|(F· ∨ Null(P ))σ+].

Then, given B ∈ σ(Cyldr(R≥s, S)), there exists g ∈ bBorel(S,R) such that

Ts−σg(Xσ)1σ≤s ∈ P (X· ∈ B, σ ≤ s|(F· ∨ Null(P ))σ+).

(ii) If Ttf(Xσ+r)1σ<+∞ ∈ E[f(Xt+σ+r)1σ<+∞|(F· ∨ Null(P ))(σ+r)+] for all t, r ∈ R≥0, f ∈C∞(S) then κ(Xσ;B) ∈ P (X·+σ ∈ B|(F· ∨ Null(P ))σ+) for all B ∈ σ(Cyldr(R≥0, S)).

Proof. (i) Let A ∈ (F· ∨ Null(P ))σ+ and C ∈ σ(Cyldr(R≥0, S)). Set Gt := Ft for t ∈ R>s

and Gs := Fs+. Applying Lemma 25.4(i) with the filtration G· we get

κ∗(Xs;C) ∈ P (X·+s ∈ C|Fs+).

Since A ∩ σ ≤ s ∈ (F· ∨ Null(P ))s+ = Fs+ ∨ Null(P ) by Lemma 4.8(i), it follows that

P (X·+s ∈ C, σ ≤ s, A) = E[κ∗(Xs;C) ;A ∩ σ ≤ s] for all A ∈ (F· ∨ Null(P ))σ+.

Thus the statement holds for B := θ−1s (C) and g := κ∗(·;C).

(ii) The discussion is the same as that for Lemma 25.4(i). Consider the process t 7→ Xt+σ

with the filtration t 7→ (F· ∨ Null(P ))(σ+t)+ under the measure P (·|σ < +∞).

26.4 Theorem. Suppose that s ∈ R≥0, σ ≥ s,

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥s and f ∈ C∞(S),

t 7→ Uαf(Xt) is right continuous a.s. for all α ∈ R>0, f ∈ C∞(S).

If t ∈ R≥s and A ∈ σ(Cyldr(R≥t, S)) then there exists g ∈ bBorel(S,R) such that

Tt−σg(Xσ)1σ≤t ∈ P (X· ∈ A, σ ≤ t|(F· ∨ Null(P ))σ+).

If B ∈ σ(Cyldr(R≥0, S)) then κ(Xσ;B) ∈ P (X·+σ ∈ B|(F· ∨ Null(P ))σ+).

Proof. Lemma 26.1, Lemma 26.2 and Lemma 26.3.

129

Page 130: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

26.5 Lemma. Suppose that limstXs 6= ∅ for all t ∈ R>0 a.s., f ∈ Cb(S), f ≥ 0,

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥0,

Uαf(lims↓t

Xs) = lims↓t

Uαf(Xs) for all t ∈ R≥0 a.s. for all α ∈ R>0,

Uαf(limst

Xs) = limst

Uαf(Xs) for all t ∈ R>0 a.s. for all α ∈ R>0.

If τ(·) is a non-decreasing sequence of F· ∨ Null(P )-optional times then

lim infn→∞

f(Xτ(n))1τ<+∞ ∈ E[f(Xτ )1τ<+∞|∞∨n=1

(F· ∨ Null(P ))τ(n)+] where τ := supn∈N

τ(n).

Proof. Suppose that α ∈ R>0 and σ is an F·∨Null(P )-optional time. Observe that Uαf ≥ 0and eασ is a non-negative (F·∨Null(P ))σ+-measurable function. Since t 7→ Uαf(Xt) is rightcontinuous a.s., it follows by Lemma 26.1(ii) and the proof of Lemma 26.2 that

E[Uαf(Xσ) ;A ∩ σ < +∞] =

∫ ∞

0

E[f(Xt+σ) ;A ∩ σ < +∞]e−αtdt

for all A ∈ (F· ∨Null(P ))σ+. Suppose that τ(·) is a sequence of F· ∨Null(P )-optional times,τ(n) ≤ τ(n+ 1) for all n ∈ N. Clearly

lim infn→∞ f(Xτ(n))1τ<+∞ is∨∞n=1(F· ∨ Null(P ))τ(n)+-measurable.

Let k,m ∈ N and A ∈ (F· ∨ Null(P ))τ(m)+. Then

E[Uαf(Xτ(n)) ;A, τ(n) ≤ k] =

∫ ∞

0

E[f(Xt+τ(n)) ;A, τ(n) ≤ k]e−αtdt

for α ∈ R>0 and n ∈ N≥m. Observe that τ(n + 1) ≤ k ⊂ τ(n) ≤ k for all n ∈ N and⋂∞n=1τ(n) ≤ k = τ ≤ k. There exists Ω0 ∈ F such that P (Ω0) = 1 and for every ω ∈ Ω0

the sample path t 7→ Xt(ω) is right continuous and admits left hand limits everywhere,and Uαf(Xt−o(ω)) ∈ limst U

αf(Xs(ω)) for all t ∈ R>0. We tend n to ∞. The dominatedconvergence theorem shows that the left hand side converges to

E[Uαf(Xτ−o) ;A ∩ Ω0, τ(n) < τ ∀n, τ ≤ k] + E[Uαf(Xτ ) ;A,∃n : τ(n) = τ, τ ≤ k]

while the right hand side converges to∫ ∞

0

E[f(X(t+τ)−o) ;A ∩ Ω0, τ(n) < τ ∀n, τ ≤ k]e−αtdt

+

∫ ∞

0

E[f(Xt+τ ) ;A,∃n : τ(n) = τ, τ ≤ k]e−αtdt.

We see that τ is an F· ∨Null(P )-optional time and the event ∃n : τ(n) = τ, τ ≤ k belongsto (F· ∨ Null(P ))τ+. Therefore the second terms of the two limits coincide. Thus we get

E[αUαf(Xτ−o) ;A ∩ Ω0, τ(n) < τ ∀n, τ ≤ k]

=

∫ ∞

0

αe−αtE[f(X(t+τ)−o) ;A ∩ Ω0, τ(n) < τ ∀n, τ ≤ k]dt

130

Page 131: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

for all α ∈ R>0 and k ∈ N, where we multiplied both sides by α. Since αUαf(x) convergesto f(x) as α tends to +∞ for all x ∈ S by lemma 25.6(i), we infer that

E[f(Xτ−o) ;A ∩ Ω0, τ(n) < τ ∀n, τ ≤ k] = E[f(Xτ ) ;A ∩ Ω0, τ(n) < τ ∀n, τ ≤ k]

by the dominated convergence theorem. We add E[f(Xτ ) ;A,∃n : τ(n) = τ, τ ≤ k] to theboth sides. Since f(Xτ−o(ω)) ∈ limn→∞ f(Xτ(n)(ω)) for ω ∈ Ω0 ∩ τ(n) < τ ∀n, τ < +∞,tending k to ∞, we reach that

E[lim infn→∞

f(Xτ(n)) ;A ∩ τ < +∞] = E[f(Xτ ) ;A ∩ τ < +∞].

This holds for all A ∈⋃∞m=1(F· ∨ Null(P ))τ(m)+.

26.6 Theorem. Suppose that limstXs 6= ∅ for all t ∈ R>0 a.s.,

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥0, f ∈ C∞(S),

Uαf(lims↓t

Xs) = lims↓t

Uαf(Xs) for all t ∈ R≥0 a.s. for all α ∈ R>0, f ∈ C∞(S),

Uαf(limst

Xs) = limst

Uαf(Xs) for all t ∈ R>0 a.s. for all α ∈ R>0, f ∈ C∞(S),

τ(·) is a non-decreasing sequence of F· ∨ Null(P )-optional times and τ := supn∈N τ(n).(i) P (Xτ ∈ limn→∞Xτ(n), τ < +∞) = P (τ < +∞).(ii) For each lower semi-integrable and σX· ∨ Null(P )-measurable random variable Y theclass E[Y |(F· ∨ Null(P ))τ+] contains a

∨∞n=1(F· ∨ Null(P ))τ(n)+-measurable representative.

(iii) If F ⊂ σX· ∨ Null(P ) then (F· ∨ Null(P ))τ+ =∨∞n=1(F· ∨ Null(P ))τ(n)+.

Proof. (i) We may assume that P (τ(n) < τ ∀n, τ < +∞) > 0. Otherwise the statementbecomes trivial. There exists Ω0 ∈ F such that P (Ω0) = 1 and t 7→ Xt(ω) admits left handlimits everywhere for every ω ∈ Ω0. We make use of the following fact: if µ, ν is a finitemeasure on S × S and

∫S×S f ⊗ g µ =

∫S×S f ⊗ g ν for all f, g ∈ C∞(S) then µ = ν. It

then follows by Lemma 26.5 that (Xτ , Xτ−o) induces the same law as (Xτ−o, Xτ−o) underP (·|Ω0, τ(n) < τ ∀n, τ < +∞). Thus P (Xτ−o = Xτ |Ω0, τ(n) < τ ∀n, τ < +∞) = 1.

(ii) Suppose that t ∈ R≥0 and A ∈ σ(Cyldr(R≥t, S)). According to Corollary 26.4, thereexists g ∈ bBorel(S,R) such that

Tt−τg(Xτ )1τ≤t ∈ P (X· ∈ A, τ ≤ t|(F· ∨ Null(P ))τ+).

Clearly τ is G :=∨∞n=1(F· ∨ Null(P ))τ(n)+-measurable. We verified in (i) that the same

measurability holds for Xτ1τ<+∞. Thus, due to the joint measurability (t, x) 7→ Ttg(x), weinfer that Tt−τg(Xτ )1τ≤t is G-measurable. Consequently

P (X· ∈ A, τ ≤ t|(F· ∨ Null(P ))τ+) contains a G-measurable representative.

We next suppose that 0 ≤ s(1) < · · · < s(m) and A1, . . . , Am ∈ Borel(S). Set

A := cyldr(s(1), . . . , s(m);A1, . . . , Am),B(i) := cyldr(s(1), . . . , s(i− 1);A1, . . . , Ai−1) for i ∈ N[2,m+1] andC(i) := cyldr(s(i), . . . , s(m);Ai, . . . , Am) for i ∈ N[1,m]

131

Page 132: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

so that we have A = C(1), A = B(i) ∩C(i) for i ∈ N[2,m], and A = B(m+ 1). We make useof the following decomposition:

1A(X·) = 1C(1)(X·)1τ≤s(1) +m∑i=2

1B(i)(X·)1s(i−1)<τ≤s(i)1C(i)(X·) + 1B(m+1)(X·)1s(m)<τ

Observe that X· ∈ B(i) ∈ F≤s(i−1) for i ∈ N[2,m+1] and C(i) ∈ σ(Cyldr(R≥s(i), S)) fori ∈ N[1,m]. We see by Lemma 3.15(iv) that

X· ∈ B(i) ∩ τ(n) > s(i− 1) ∈ (F· ∨ Null(P ))τ(n)+ for all n ∈ N

Since⋃∞n=1τ(n) > s(i− 1) = supn∈N τ(n) > s(i− 1), we get

X· ∈ B(i) ∩ s(i− 1) < τ ∈∞∨n=1

(F· ∨ Null(P ))τ(n)+ = G.

On the other hand there exist G-measurable functions gi such that

gi ∈ P (X· ∈ C(i), τ ≤ s(i)|(F· ∨ Null(P ))τ+)

by Corollary 26.6. It follows that

g1 +m∑i=2

1B(i)(X·)1s(i−1)<τgi + 1B(m+1)(X·)1s(m)<τ ∈ P (X· ∈ A|(F· ∨ Null(P ))τ+).

Observe that g1 +∑m

i=2 1B(i)(X·)1s(i−1)<τgi + 1B(m+1)(X·)1s(m)<τ is G-measurable. We intro-duce the following family:

D := Θ ∈ σX· : ∃Y G-measurable, non-negative and Y ∈ E[1Θ|(F· ∨ Null(P ))τ+].

We see that D is a Dynkin system by Theorem 1.10 and CyldrX· ⊂ D. Thus, invokingthe monotone class theorem, we get the statement.

27 orthogonality of martingales

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields.

S is a locally compact Hausdorff space with countable base, X· is anF·-adapted S-valued process whose almost every sample path is right con-tinuous, and L is a linear mapping from a subspace of C∞(S) into C∞(S).

27.1 Definition. Let X· be an S-valued process. We say a triplet (X·,F·, P ) solves the L-martingale problem if X· is F·-adapted and its almost every sample path is right continuous,and t 7→ f(Xt)−RLf(X·)dλt is an (F·, P )-martingale for every f ∈ Dom(L).

Mart2(F·) stands for the space of all square integrable F·-martingaleswith almost sure right continuous sample path.

132

Page 133: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

27.2 Lemma. (i) Let f ∈ Dom(L), α ∈ R>0 and s ∈ R≥0. Then t 7→ f(Xt)−RLf(X·)dλtis an F·-martingale after time s if and only if∫ ∞

r

E[(α− L)f(Xt) ;A]e−αtdt = E[f(Xr)e−αr ;A] for all r ∈ R≥s and A ∈ Fr.

(ii) Suppose that (X·,F·, P ) solves the L-martingale problem after time s ∈ R≥0 and σ is anF·-optional time with σ ≥ s. Then for α ∈ R>0, f ∈ Dom(L) and A ∈ Fσ+∫ ∞

0

E[(α− L)f(Xt) ;A ∩ σ ≤ t]e−αtdt = E[f(Xσ)e−ασ ;A ∩ σ < +∞].

Proof. (i) To save the space we assume that s = 0, which cause no loss of generality. Thelatter condition reads that t 7→ f(Xt)e

−αt + Re−α·(α − L)f(X)dλt is an F·-martingale.Suppose that Y· and h· are F·-adapted processes whose almost sure sample paths are rightcontinuous and locally bounded. Let β ∈ R. We set F· : t 7→ Rhdλt, Z· : t 7→ eβt andG· : t 7→ eβt − 1. Since RZdRhdλt = RZhdλt for all t ∈ R≥0 a.s., it follows that

YtZt − (Yt − Ft)(Zt −Gt)−RY dGt −RZdFt + Yt − Ft

= Yteβt −RY (βeβ·)dλt −Reβ·hdλt = Yte

βt −Reβ·(βY + h)dλt for all t ∈ R≥0 a.s.

Clearly Z· − G· ∈ Mart2(F·). Therefore if t 7→ Yt − Rhdλt belongs to Mart2(F·) then sodoes t 7→ Yte

βt − Reβ·(βY + h)dλt by Lemma 23.5. The correspondence of process pairs(Y·, h·) 7→ (Y·e

β·, eβ·(βY +h)) is inverted by changing the parameter β to its negative. Indeede−βt(−β)Yte

βt + eβt(βYt + ht) = ht. Consequently we infer that

t 7→ Yt −Rhdλt ∈ Mart2(F·) if and only if t 7→ Yteβt −Reβ·(βY + h)dλt ∈ Mart2(F·).

(ii) The discussion is the same as that for Lemma 26.1.

W := D(R≥0, S) and evalt : W → S is the evaluation w → w(t) for each t ∈ R≥0.

27.3 Lemma. Borel(W ) = σeval·. (eval· : W → Map(R≥0, S))

T· is a conservative Feller semigroup on C∞(S).

Given t(1), . . . , t(n) ∈ R≥0 with t(1) < · · · < t(n) and f1, . . . , fn ∈ C∞(S), we caninductively define a function belonging to C∞(S) by T [t; f ] := Ttf and

T [t(1), . . . , t(n); f1, . . . , fn] := Tt(1)[f1T [t(2)− t(1), . . . , t(n)− t(1); f2, . . . , fn]).

27.4 Theorem. There exists a unique measure kernel µ(·; ?) on S × Borel(W ) such that∫W

cyldr[t(1), . . . , t(n); f1, . . . , fn](eval·)µ(x; ·) = T [t(1), . . . , t(n); f1, . . . , fn](x)

for all x ∈ S, t(1), . . . , t(n) ∈ R≥0 with t(1) < · · · < t(n) and f1, . . . , fn ∈ C∞(S).

133

Page 134: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

Proof. We write κ0(x; ·) := δx for x ∈ S. According to the Riesz-Markov theorem, for eacht ∈ R>0 and x ∈ S there exists a unique probability measure κt(x ; ·) such that

Ttf(x) =

∫S

f κt(x; ·) for all f ∈ C∞(S).

Let t ∈ R>0. Being continuous, x 7→∫Sf κt(x; ·) is Borel(S)-measurable for each f ∈ C∞(S).

The indicator function of any open subset of S is a non-decreasing limit of functions in C∞(S).Therefore, if A ∈ Borel(S) then x 7→ κt(x;A) is Borel(S)-measurable by the Dynkin classtheorem. In other words

κt(·; ?) is a probability measure kernel on S × Borel(S).

Let t, s ∈ R≥0 and x ∈ S. The semigroup property of T· reads∫S

∫S

f κt(·; ?)κs(x; ·) =

∫S

f κt+s(x; ·) for all f ∈ C∞(S).

This implies that∫Sκt(·;A)κs(x; ·) = κt+s(x;A) for all A ∈ Borel(S). We denote by κ the

probability measure kernel described in Theorem 25.2. We have that∫Ω

f(evalt)κ(x, ·) = Ttf(x) for all x ∈ S, t ∈ R≥0 and f ∈ C∞(S)

where Ω := Map(R≥0, S). Suppose that x ∈ S, t(1), . . . , t(n) ∈ R≥0 with t(1) < · · · < t(n)and f1, . . . , fn ∈ C∞(S). It follows by Corollary 25.3 that∫

Ω

cyldr[t(1), . . . , t(n); f1, . . . , fn]κ(x, ·)

=

∫Ω

Tt(n)−t(n−1)fn(evalt(n−1))cyldr[t(1), . . . , t(n− 1); f1, . . . , fn−1]κ(x, ·).

The integrand int the right hand side equals cyldr[t(1), . . . , t(n− 1); f1, . . . , fn−1g] with g :=Tt(n)−t(n−1)fn. Therefore we obtain by invoking the induction that∫

Ω

cyldr[t(1), . . . , t(n); f1, . . . , fn]κ(x, ·) = T [t(1), . . . , t(n); f1, . . . , fn](x).

We see that t 7→∫Sf κt(x; ·) is continuous for each x ∈ S and f ∈ C∞(S). On the other

hand the resolvent Uα sends C∞(S) into itself for each α ∈ R>0 according to Lemma 25.6(ii).We fix x ∈ S for the time being. Invoking Theorem 25.10(ii), we infer that the canonicalprocess eval· on (Ω,F , P ), where (F , P ) is the completion of the pair σ(Cyldr(R≥0, S)) andκ(x, ·), has a modification, say X·, with D(R≥0, S)-sample path. If A ∈ σ(Cyldr(R≥0, S))then X· ∈ A = A mod Null(P ). Since X·(w) ∈ W for all w ∈ Ω, it follows that

X· ∈ A ∩W = A mod Null(P ) for all A ∈ σ(Cyldr(R≥0, S)).

Lemma 27.3 means that Borel(W ) = A∩W ;A ∈ σ(Cyldr(R≥0, S)). Therefore the mappingX· : Ω → W is measurable with respect to the pair F and Borel(W ). Denote by µ(x; ·) theimage measure (X·)∗P . Then

κ(x,A) = P (A) = P (X· ∈ A ∩W) = µ(x;A ∩W ) for all A ∈ σ(Cyldr(R≥0, S)).

This also implies the measurability of x 7→ µ(x;B) for each B ∈ Borel(W ).The uniqueness derives by Lemma 27.3 and the monotone class theorem.

134

Page 135: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

We denote by (T·)(·; ?) the probability measure kernel described in Theorem 27.4

Feller process measure kernel associate with T·.

27.5 Lemma. (i) There exists a linear mapping L from a subspace of C∞(S) into C∞(S)such that α−L is the inverse mapping of Uα for all α ∈ R>0, that is, Dom(L) = UαC∞(S),(α− L)Uαf = f for all f ∈ C∞(S), and Uα(α− L)f = f for all f ∈ Dom(L). Moreover

f ∈ C∞(S) : limα→+∞

α(αUαf − f) 6= ∅ = Dom(L) = f ∈ C∞(S) : limt→0

(Ttf − f)/t 6= ∅

If f ∈ Dom(L) then Lf ∈ limα→+∞

α(αUαf − f) and Lf ∈ limt→0

(Ttf − f)/t.

(ii) Let L be as in (i), α ∈ R>0 and D is a subspace of Dom(L). Then (α−L)D is dense inC∞(S) if and only if D is dense in Dom(L) with respect to the graph norm.

Proof. (i) The resolvent Uα sends C∞(S) into itself for each α ∈ R>0 by Lemma 25.6(ii)and, according to the proof of Lemma 25.6(iii), UαC∞(S) = U1C∞(S) for all α ∈ R>0 and

Uαf − Uβf = (β − α)UβUαf for all α, β ∈ R>0 and f ∈ C∞(S).

Suppose that α ∈ R>0, f ∈ C∞(S) and Uαf = 0. Then it follows by the resolvent equationthat Uβf = 0 for all β ∈ R>0. Since βUβf converges to f uniformly as β tends to +∞ byLemma 25.6(iii), we get f = 0. Thus we verified that

Uα induces a linear bijection from C∞(S) → U1C∞(S) for all α ∈ R>0.

We denote by L the linear mapping U1C∞(S) → C∞(S), f 7→ f − (U1)−1f so that we have(1− L)U1f = f for all f ∈ C∞(S). Then the resolvent equation shows that

(α− L)Uαf = (α− 1)Uαf + (1− L)U1((1− α)Uαf + f) = f for all f ∈ C∞(S).

Due to the bijectivity, this also implies that Uα(α−L)f = f for all f ∈ U1C∞(S). Thus thefirst part is proved. We next suppose that f ∈ U1C∞(S). Then

α(αUαf − f) = α(αUαf − Uα(α− L)f) = αUαLf.

The right hand side converges to Lf uniformly as α tends to +∞ by Lemma 25.6(iii).Conversely suppose that f, g ∈ C∞(S) and g ∈ limα→+∞ α(αUαf − f). We have that

U1α(αUαf − f) = α(U1Uαf − Uαf)

by the resolvent equation. The left hand side converges to U1g uniformly as α tends to +∞while the right hand side converges to U1f − f by Lemma 25.6(iii). It therefore follows thatf = U1f − U1g ∈ U1C∞(S). Finally we directly relate L to the semigroup T·. In the proofof Lemma 25.6(iii) we obtained the following:

TtU1f = U1Ttf = et

∫ ∞

t

Tsfe−s ds for all t ∈ R>0, f ∈ C∞(S).

Therefore we see that

1

t(TtU

1f − U1f) = U1 1

t(Ttf − f) =

et − 1

t

∫ ∞

t

Tsfe−s ds− 1

t

∫ t

0

Tsfe−s ds.

135

Page 136: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

The right hand side converges to U1f−f = LU1f uniformly as t tends to 0, which means thatif g ∈ U1C∞(S) then Lg ∈ limt→0(Ttg−g)/t while if f, g ∈ C∞(S) and g ∈ limt→0(Ttf−f)/tthen U1g = U1f − f and hence f = U1f − U1g ∈ U1C∞(S).

(ii) Let f ∈ Dom(L). Since ‖f‖ = ‖Uα(α− L)f‖ ≤ ‖(α− L)f‖/α, we see that

‖f‖+ ‖Lf‖ ≤ (1 + α)‖f‖+ ‖ − (α− L)f‖ ≤ (2 + 1/α)‖(α− L)f‖.

On the other hand ‖(α− L)f‖ is dominated by maxα, 1(‖f‖+ ‖Lf‖).

27.6 Definition. The linear mapping described in Lemma 27.5(i) is called the generator ofthe Feller semigroup T·. If the condition in (ii) holds then D is called a core of the generator.

In what follows L is a restriction of the generator of the Feller semigroup T·.

We will refer to the canonical filtration σeval·≤· on (W,Borel(W )).

27.7 Theorem. Suppose that Dom(L) is a core of the generator of the Feller semigroup T·.(i) If (X·,F·, P ) solves the L-martingale problem after time s ∈ R≥0 then (T·)

(Xr;B∩W ) ∈P (X·+r ∈ B|Fr) for all B ∈ σ(Cyldr(R≥0, S)) and r ∈ R≥s.(ii) Let x ∈ S. Then (T·)

(x, ·) is the unique probability measure on W under which the pair(eval·, σeval·≤·) solves the L-martingale problem starting from x.

Proof. (i) Let α ∈ R>0 and f ∈ (α−L)Dom(L). Then Uαf ∈ Dom(L) and (α−L)Uαf = fby Lemma 27.5(i). Applying Lemma 27.2(i), we see that

E[Uαf(Xr)e−αr ;A] =

∫ ∞

r

E[f(Xt) ;A]e−αtdt for all r ∈ R≥s and A ∈ Fr.

Since (α−L)Dom(L) is dense in C∞(S) by Lemma 27.5(ii), the above holds for all f ∈ C∞(S)provided α ∈ R>0. It therefore follows by Lemma 25.7 that

Ttf(Xr) ∈ E[f(Xt+r)|Fr] for all t ∈ R≥0, r ∈ R≥s and f ∈ C∞(S).

Denote by κt(·; ?) the family of measure kernels associate with the semigroup T· and by κ

the measure kernel described in Theorem 25.2. Then, since Ttf(x) =∫Sf κt(x; ·) for all

t ∈ R≥0, x ∈ S and f ∈ C∞(S), it follows by Lemma 25.4 that

κ(Xr;B) ∈ P (X·+r ∈ B|Fr) for all B ∈ σ(Cyldr(R≥0, S)) and r ∈ R≥s.

Since (T·)(·; ? ∩W ) = κ(·; ?) according to the proof of Theorem 27.4, we get the claim.

(ii) Let x ∈ S and t, r ∈ R≥0. It follows by Corollary 25.3 that∫B∩W

Ttf(evalr) (T·)(x; ·) =

∫B

Ttf(evalr)κ(x; ·) =

∫B

f(evalt+r)κ(x; ·)

=

∫B∩W

f(evalt+r) (T·)(x; ·) for all f ∈ C∞(S) and B ∈ σ(Cyldr(R[0,r], S)).

Since B ∩W ;B ∈ σ(Cyldr(R[0,r], S)) coincides with σeval·≤r, we get

Ttf(evalr) ∈ E[f(evalt+r)|σeval·≤r] for all f ∈ C∞(S)

where the conditional expectation refers to the probability measure (T·)(x; ·). Reverting the

argument in (i) we infer that (eval·, σeval·≤·, (T·)(x; ·)) solves the L-martingale problem.The uniqueness follows from (i).

136

Page 137: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

27.8 Theorem. Suppose that for each x ∈ S there exists at most one probability measureon W under which the pair (eval·, σeval·≤·) solves the L-martingale problem starting fromx. If (X·,F·, P ) solves the L-martingale problem then

P (X· ∈ B) = E[(T·)(X0;B ∩W )] for all B ∈ σ(Cyldr(R≥0, S)).

27.9 Theorem. Suppose that Dom(L) is a core of the generator of the Feller semigroup T·and (X·,F·, P ) solves the L-martingale problem. If M· ∈ Mart2(F·), Mt is σX·∨Null(P )-measurable for all t ∈ R≥0 and t 7→Mt(f(Xt)−RLf(X)dλt) is an (F·, P )-martingale forevery f ∈ Dom(L) then Mt = M0 for all t ∈ R≥0 a.s.

Proof. Let s ∈ R≥0, A ∈ Fs, α ∈ R>0 and g ∈ (α− L)Dom(L). Then

E[Uαg(Xr)e−αrMr ;A] =

∫ +∞

r

E[g(Xt)Mr ;A]e−αtdt for all r ∈ R≥s

according to the proof of Theorem 27.7(i). Therefore the following holds for all r ∈ R≥s:

(?)

∫ +∞

s

E[Mr∧tg(Xt) ;A]e−αt dt = E[MrUαg(Xr)e

−αr ;A] +

∫ r

s

E[Mtg(Xt) ;A]e−αt dt.

We set f := Uαg ∈ Dom(L). Since RMdRLf(X)dλt = RMLf(X)dλt for all t ∈ R≥0

a.s. and M· ∈ Mart2(F·), it follows by Lemma 23.5 that

t 7→MtRLf(X)dλt −RMLf(X)dλt ∈ Mart2(F·).

Adding the martingale t 7→Mt(f(Xt)−RLf(X)dλt), we see that

t 7→Mtf(Xt)−RMLf(X)dλt ∈ Mart2(F·).

Consequently we infer by the proof of Lemma 27.2(i) that

t 7→Mtf(Xt)e−αt +Re−α·M(α− L)f(X)dλt is an F·-martingale.

Recall that f = Uαg and (α − L)f = g. Therefore this implies together with (?) thatr 7→

∫ +∞s

E[Mr∧tg(Xt) ;A]e−αt dt is constant on R≥s. In particular the following holds:∫ +∞

s

E[Mr∧tg(Xt) ;A]e−αtdt =

∫ +∞

s

E[Msg(Xt) ;A]e−αtdt for all r ∈ R≥s.

Since (α−L)Dom(L) is dense in C∞(S) by Lemma 27.5(ii), the above holds for all g ∈ C∞(S)provided α ∈ R>0. Due to the sample path right continuity we conclude that

E[Mr∧tg(Xt) ;A] = E[Msg(Xt) ;A] for all r, t ∈ R≥s

by the uniqueness of Laplace transforms. We may take r = t. Then, provided t ∈ R≥s, thefollowing holds for all A ∈ Fs and g ∈ C∞(S):

(ℵ) E[Mtg(Xt) ;A] = E[Msg(Xt) ;A] = E[MsTt−sg(Xs) ;A]

137

Page 138: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

where we invoked Theorem 27.7(i) to get the second equality. Given 0 ≤ t(1) < · · · < t(n)and f1, . . . , fn ∈ C∞(S), with the help of Theorem 27.4 and (ℵ), we can inductively verifythe following by mimicking the argument of Lemma 25.4:

E[Mt(n)cyldr[t(1), . . . , t(n); f1, . . . , fn](X·)]

= E[M0

∫W

cyldr[t(1), . . . , t(n); f1, . . . , fn](T·)(X0; ·)].

Let t ∈ R≥0. Then, applying the monotone class argument, we deduce that

E[Mt ;X· ∈ B] = E[M0 (T·)(X0;B ∩W )] for all B ∈ σ(Cyldr(R[0,t], S)).

The right hand side coincides with E[M0 ;X· ∈ B] by Theorem 27.7(i). It follows that

E[Mt ;X· ∈ B] = E[M0 ;X· ∈ B] for all B ∈ σ(Cyldr(R[0,t], S)).

Since (X·)∗σ(Cyldr(R[0,t], S)) = σX·≤t and, according to Theorem 27.7(i) and Theo-

rem 25.9, Ft ∩ (σX·∨Null(P )) ⊂ σX·≤t ∨Null(P ), we infer that Mt = M0 a.s. Thus wereach the statement due to the sample path right continuity.

27.10 Theorem. Suppose that Dom(L) is a core of the generator of the Feller semigroup,T ∈ R>0 and (X·,F·, P ) solves the L-martingale problem. If Y is a square integrable and(FT+ ∨ Null(P )) ∩ (σX· ∨ Null(P ))-measurable random variable, and

E[Y (f(XT∧τ )−RLf(X)dλT∧τ )] = E[Y f(X0)]

for every f ∈ Dom(L) and σX·≤·-optional time τ then Y is σX0 ∨Null(P )-measurable.

Proof. We write N := Null(P ). Since σX·≤t+ ⊂ σX·≤t ∨ N for all t ∈ R≥0 and Y isσX·≤T ∨N -measurable by Theorem 27.7(i) and Theorem 25.9, it follows that

there exists M· ∈ Mart2(σX·≤· ∨N ) such that MT = Y

by Theorem 14.11. Let f ∈ Dom(L). Since σX·≤t ⊂ Ft for all t ∈ R≥0, we see that

N· : t 7→ f(Xt)−RLf(X)dλt is a σX·≤· ∨N -martingale.

Hence (X·, σX·≤·∨N , P ) solves the L-martingale problem. Let σ be a σX·≤·∨N -optionaltime. Since there exists a σX·≤·-optional time τ such that τ = σ a.s. by Lemma 4.8(iii),

E[MTNT∧σ] = E[Y (f(XT∧τ )−RLf(X)dλT∧τ )] = E[Y f(X0)] = E[M0N0].

Corollary 20.2 shows the σX·≤·∨N -martingale property of t 7→Mt∧TNt∧T . Adding anothermartingale t 7→Mt∧T (Nt −Nt∧T ), we obtain that

Mt∧T (f(Xt)−RLf(X)dλt) is a σX·≤· ∨N -martingale for all f ∈ Dom(L).

Consequently, applying Theorem 27.9, we deduce that Mt∧T = M0 for all t ∈ R≥0 a.s.

138

Page 139: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

27.11 Corollary. Suppose that Dom(L) is a core of the generator of the Feller semigroupand (X·,F·, P ) solves the L-martingale problem. Then for each T ∈ R>0 the linear span ofthe random variables below is dense in L2((FT+ ∨N ) ∩ (σX· ∨ N )) where N := Null(P ):

f(XT∧τ )−RLf(X)dλT∧τ + Y ; f ∈ Dom(L), τ ∈ Opt(σX·≤·), Y ∈ L2(σX0).

The linear span of the martingales below is dense in Mart2(σX·≤· ∨N ):

f(X·∧τ )−RLf(X)dλ·∧τ + Y ; f ∈ Dom(L), τ ∈ Opt(σX·≤·), Y ∈ L2(σX0).

27.12 Lemma. Suppose that (X·,F·, P ) solves the L-martingale problem and Dom(L) is asubalgebra of C∞(S). Then the following is an F·-martingale for each f, g ∈ Dom(L):

t 7→ (f(Xt)−RLf(X)dλt)(g(Xt)−RLg(X)dλt)−R(L(fg)− fLg − gLf)(X)dλt.

Proof. Due to the bilinearity and the symmetricity it suffices to prove the statement forg = f . Set Y· : t 7→ f(Xt), F· : t 7→ RLf(X)dλt and C· : t 7→ R(L(f 2)− 2fLf)(X)dλt.Since RY dRLf(X)dλt = RY Lf(X)dλt for all t ∈ R≥0 a.s., it follows that

Y 2t − 2RY dFt − Ct

= f(Xt)2 − 2Rf(X)Lf(X)dλt −R(L(f 2)− 2fLf)(X)dλt for all t ∈ R≥0 a.s.

The right hand side is indistinguishable from t 7→ f(Xt)2 − RL(f 2)(X)dλt, which is an

F·-martingale. Thus we obtain the statement by Lemma 23.5.

Denote by κ•(x, ·) the outer measure induced by κ(x, ·) Mble(κ(x, ·)).

28 Integrands for stochastic integration

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. Theindex set is R≥0.

28.1 Lemma. Suppose that p ∈ R≥1, T ∈ R>0, f : R[0,T ] × Ω → R is Borel(R[0,T ]) ⊗ F-measurable and E[

∫[0,T ]

|f |pλ] < +∞ where λ is the Lebesgue measure. Then

lim supn→∞

∫[0,1)

E[

∫[0,T ]

|f([n(· − s)]/n+ s, ?)− f(·, ?)|pλ]ds = 0

where f(t, ω) = 0 for t < 0 and [x] := maxi ∈ Z : i < x for x ∈ R.

Proof. Let n ∈ N, t ∈ R[0,T ] and ω ∈ Ω. Since −1/n < t− ([nt] + 1)/n ≤ 0, we have that

[0, 1) ⊂n−[nt]⋃i=−[nt]

[t+ (i− 1)/n, t+ i/n) (disjoint union).

Observe that [n(t− s)]/n+ s = s− i/n for s ∈ [t+ (i− 1)/n, t+ i/n). It follows that∫[0,1)

|f([n(t− s)]/n+ s, ω)− f(t, ω)|p ds

≤n−[nt]∑i=−[nt]

∫[t+(i−1)/n,t+i/n)

|f(s− i/n, ω)− f(t, ω)|p ds.

139

Page 140: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

The summand in the right hand side reads∫

[−1/n,0)|f(t+ s, ω)− f(t, ω)|p ds. Consequently∫

[0,1)

|f([n(t− s)]/n+ s, ω)− f(t, ω)|p ds ≤ (n+ 1)

∫[−1/n,0)

|f(t+ s, ω)− f(t, ω)|p ds.

The translation of functions g 7→ g(· + s, ?) for s ∈ R induces a strongly continuous oneparameter isometry group on Lp(R × Ω, λ ⊗ P ). Therefore, given ε ∈ R>0, there existsδ ∈ R>0 such that if −δ < s < 0 then

E[

∫[0,T ]

|f(·+ s, ?)− f(·, ?)|pλ] < ε.

Suppose that nδ > 1. Then, invoking Fubini’s theorem, we infer that∫[0,1)

E[

∫[0,T ]

|f([n(· − s)]/n+ s, ?)− f(·, ?)|pλ]ds

≤ (n+ 1)

∫[−1/n,0)

E[

∫[0,T ]

|f(·+ s, ?)− f(·, ?)|pλ]ds ≤ (n+ 1)ε/n < 2ε.

Thus we reach the statement.

28.2 Corollary. Suppose p ∈ R≥1, f : R≥0 × Ω → R is Borel(R≥0) ⊗ F-measurable andE[

∫[0,T ]

|f |pλ] < +∞ for all T ∈ R>0. Then, given n· ∈ Seq(N) tending to ∞, there exist

s ∈ [0, 1) and φ ∈ Seq(N, ↑) such that

lim supn→∞

E[

∫[0,T ]

|f([nφ(n)(· − s)]/nφ(n) + s, ?)− f(·, ?)|pλ] = 0 for all T ∈ R>0.

where f(t, ω) = 0 for t < 0 and [x] := maxi ∈ Z : i < x for x ∈ R.

Let δ ∈ R>0 and s ∈ R. We see that t 7→ f([(t− s)/δ]δ + s, ω) is left continuous for eachω ∈ Ω and that t− δ ≤ [(t− s)/δ]δ + s < t.

28.3 Lemma. Suppose that f : R≥0×Ω → R is Borel(R≥0)⊗F-measurable. If f(t, ·) is Ft+-measurable for all t ∈ R≥0 then there exists a Pred(F·)-measurable function g : R≥0×Ω → Rsuch that (λ⊗ P )(f 6= g) = 0.

Proof. We first assume that there exists K ∈ R≥0 such that |f(t, ω)| ≤ K for all t ∈ R≥0

and ω ∈ Ω. According to Corollary 28.2, there exist s ∈ R and φ ∈ Seq(N, ↑) such that

lim supn→∞

E[

∫[0,T ]

|f([φ(n)(· − s)]/φ(n) + s, ?)− f(·, ?)|λ] = 0 for all T ∈ R>0.

By choosing a subsequence if necessary, we may assume that f([φ(n)(· − s)]/φ(n) + s, ?)converges to f(·, ?) almost everywhere with respect to λ⊗P . Since the process with samplepath t 7→ f([φ(n)(t− s)]/φ(n) + s, ω) is left continuous and F·-adapted, it follows that

g : R≥0 × Ω → R, (t, ω) 7→ lim supn→∞

f([φ(n)(t− s)]/φ(n) + s, ω)

is Pred(F·)-measurable and (λ ⊗ P )(f 6= g) = 0. For general functions we consider thecontraction minmaxf,−n, n where n ∈ N. For each n ∈ N there exists a Pred(F·)-measurable function gn : R≥0 × Ω → R such that minmaxf,−n, n = gn a.e. It followsthat lim supn→∞ gn is Pred(F·)-measurable and (λ⊗ P )(f 6= lim supn→∞ gn) = 0.

140

Page 141: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

29 Quasi-left continuous increasing process

Let (Ω,F , P ) be a complete probability space and F· be a filtration of sub σ-fields. Wediscuss submartingales with continuous parameter. So the index space is R≥0.

29.1 Lemma. Let X· be an F·-supermartingale with almost sure right continuous samplepath, and σ, τ be F· ∨Null(P )-optional times with σ ≤ τ ≤ T a.s. for some T ∈ R>0. Then

P (supXs ; s ∈ R≥0, σ ≤ s ≤ τ > λ,B) ≤ (E[Xσ ;B]− E[minXτ , 0 ;B])/λ

for λ ∈ R>0 and B ∈ (F· ∨ Null(P ))σ+

Proof. There exists a right continuous stochastic process Y· such that Xt = Yt for all t ∈ R≥0

a.s. Then Y· is an F· ∨ Null(P )-supermartingale. Given λ ∈ R>0, we write

S := infs ∈ R≥0 : s ≥ σ ∧ T, Ys > λ,

which is an F· ∨ Null(P )-optional time. Indeed

S < t =⋃

s∈Q:0≤s<t

σ ∧ T ≤ s, Ys > λ ∈ Ft ∨ Null(P ) for all t ∈ R>0

due to the right continuity of sample path. We have that

supYs ; s ∈ R≥0, σ ∧ T ≤ s ≤ τ ∧ T > λ ⇔ S < τ ∧ T or ‘S = τ ∧ T , Yτ∧T > λ’

Note that (F· ∨ Null(P ))(σ∧T )+ = (F· ∨ Null(P ))σ+ and σ ∧ T ≤ S. We see that

E[Yσ∧T ;B] ≥ E[YS∧τ∧T ;B] for B ∈ (F· ∨ Null(P ))σ+

by Theorem 3.9. The right hand side equals

E[YS ;S < τ ∧ T,B] + E[Yτ∧T ;S = τ ∧ T, Yτ∧T > λ,B] + E[Yτ∧T ;S ≥ τ ∧ T, Yτ∧T ≤ λ,B].

The second term dominates λP (S = τ ∧ T, Yτ∧T > λ,B) and the last term dominatesE[minYτ∧T , 0 ;B]. On the other hand, since YS ≥ λ on S < +∞ due to the rightcontinuity of sample path, it follows that

λP (S < τ ∧ T,B) + λP (S = τ ∧ T, Yτ∧T > λ,B) ≤ E[Yσ∧T ;B]− E[minYτ∧T , 0 ;B].

Thus we get the claim.

Let A· be an integrable F·-increasing process.

29.2 Corollary. Suppose that q ∈ R>0 and M· is a stochastic process such that almost everysample path is right continuous and Mt ∈ E[Aq|Ft+] for all t ∈ R≥0. Then

P (supMs − As ; s ∈ R(p,q] > λ,B) ≤ E[Aq − Aσ ;σ < q,B]/λ

for all p ∈ R[0,q), λ ∈ R>0 and B ∈ Fp+ ∨ Null(P ) where σ := inft ∈ R>p ;Mt − At > λ.

Proof. There exists Ω0 ∈ F such that P (Ω0) = 1 and for each ω ∈ Ω0

141

Page 142: Note on martingales - Hiroshima University...Note on martingales May 29, 2004 Contents 1 Conditional expectation 3 2 Optional sampling 8 3 Optional sampling–continuous time 12 4

t 7→ At(ω) is finite valued, right continuous and non-decreasing,t 7→Mt(ω) is finite valued and right continuous.

Since Aq ∈ E[Aq|Ft+] if t ≥ q, we can choose an appropriate subset so that

Mt(ω) = Aq(ω) ≤ At(ω) for all t ∈ R≥q and ω ∈ Ω0.

Clearly M· is an F·+-martingale and A· is an F·+-increasing process. So the difference M·−A·is an F·+-supermartingale. Note that Mq = Aq a.s. It follows by Lemma 29.1 that

P (supMs − As ; s ∈ R≥0, σ ∧ q ≤ s ≤ q > λ,C) ≤ E[Mσ∧q − Aσ∧q ;C]/λ

for all C ∈ (F· ∨ Null(P ))(σ∧q)+. Since Mq(ω)− Aq(ω) = 0 for all ω ∈ Ω0,

sup{Ms − As ; s ∈ R(p,q]} > λ if and only if sup{Ms − As ; s ∈ R(p,q)} > λ

on Ω0. The latter is equivalent to σ < q. If σ < q and Mσ − Aσ > λ then, due to the right continuity of the sample paths, {t ∈ R≥0 ; σ < t < q, Mt − At > λ} ≠ ∅ on Ω0. Obviously σ < q and Mσ − Aσ ≤ λ imply the same conclusion. Therefore we have that

{sup{Ms − As ; s ∈ R(p,q]} > λ} ∩ Ω0 ⊂ {sup{Ms − As ; s ∈ R≥0, σ < s < q} > λ}

Since E[Mσ∧q ;C] = E[Mq ;C] by Theorem 3.9 and Mq = Aq a.s., it follows that

P (sup{Ms − As ; s ∈ R(p,q]} > λ, C) ≤ E[Aq − Aσ∧q ;C]/λ for all C ∈ (F· ∨ Null(P ))(σ∧q)+.

Finally, taking p ≤ σ into account, we see that Fp+ ∨ Null(P ) ⊂ (F· ∨ Null(P ))(σ∧q)+. Since Aq − Aσ∧q vanishes on {σ ≥ q}, the right hand side coincides with E[Aq − Aσ ; σ < q, C]/λ, and the claim follows.

We write Q(2) := ⋃_{n∈Z:n≥0} 2−nZ. For each q ∈ Q(2)>0 choose a stochastic process N^q_· such that almost every sample path is right continuous and

N^q_t ∈ E[Aq|Ft+] for all t ∈ R≥0,

which exists by Corollary 14.12. We set A^n_t := N^{⌈2^nt⌉/2^n}_t for t ∈ R>0 and n ∈ Z≥0 where ⌈x⌉ := min Z≥x.
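As a quick aside (an illustration of mine, not part of the note), the map t 7→ ⌈2^nt⌉/2^n simply rounds t up to the dyadic grid 2−nZ, and the rounded value decreases to t as n grows:

```python
# A small illustration (not from the note) of the dyadic rounding
# t -> ceil(2^n t)/2^n used to define A^n_t: it rounds t up to the grid
# 2^{-n} Z, and the rounded point decreases to t as n grows.
import math

def dyadic_up(t, n):
    """Round t up to the grid 2^{-n} Z, i.e. return ceil(2^n t) / 2^n."""
    return math.ceil((2 ** n) * t) / (2 ** n)

t = 0.3
for n in range(6):
    print(n, dyadic_up(t, n))  # 1.0, 0.5, 0.5, 0.375, 0.3125, 0.3125 -> 0.3
```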

29.3 Lemma. Let n ∈ Z≥0, ε ∈ R>0 and T ∈ N. Then τ := inf{t ∈ R>0 ; A^n_t − At > ε} ∧ T is an F· ∨ Null(P )-optional time and P (sup{A^n_s − As ; s ∈ R(0,T]} > ε) ≤ E[Aτ+2−n − Aτ ]/ε.

Proof. There exists Ω0 ∈ F such that P (Ω0) = 1 and for each ω ∈ Ω0

t 7→ At(ω) is finite valued, right continuous and non-decreasing,
t 7→ N^q_t(ω) is finite valued and right continuous for all q ∈ Q(2)>0,
N^q_t(ω) = Aq(ω) for all q ∈ Q(2)>0 and t ∈ R≥q.

For each p, q ∈ Q(2)>0 with p < q we introduce

σ(p, q) := inf{t ∈ R>p ; N^q_t − At > ε},

which is an F· ∨ Null(P )-optional time. Since σ(p, q) ∈ R[p,q) ∪ {+∞} on Ω0,

inf{t ∈ R>0 ; A^n_t − At > ε} = min{σ(q − 2−n, q) ; q ∈ 2−nZ>0} on Ω0.


This implies that τ is an F· ∨ Null(P )-optional time. Moreover

{τ < T} ∩ Ω0 = ⋃_{q∈2−nZ:0<q≤T} ({σ(q − 2−n, q) < q} ∩ ⋂_{p∈2−nZ:0<p<q} {σ(p − 2−n, p) ≥ p} ∩ Ω0).

We write B(q) := ⋂_{p∈2−nZ:0<p<q} {σ(p − 2−n, p) ≥ p} to save space. It follows that

P (τ < T ) = ∑_{q∈2−nZ:0<q≤T} P ({σ(q − 2−n, q) < q} ∩ B(q)).

Recall that sup{N^q_s − As ; s ∈ R(p,q]} > ε if and only if σ(p, q) < q on Ω0. On the other hand B(q) ∈ Fq−2−n ∨ Null(P ) for all q ∈ 2−nZ>0. Thus we see by Corollary 29.2 that

P ({σ(q − 2−n, q) < q} ∩ B(q)) ≤ E[Aq − Aσ(q−2−n,q) ; {σ(q − 2−n, q) < q} ∩ B(q)]/ε.

Since q ≤ σ(q − 2−n, q) + 2−n, the right hand side is dominated by

E[Aσ(q−2−n,q)+2−n − Aσ(q−2−n,q) ; {σ(q − 2−n, q) < q} ∩ B(q)]/ε,

which coincides with E[Aτ+2−n − Aτ ; {σ(q − 2−n, q) < q} ∩ B(q)]/ε. Consequently

P (τ < T ) ≤ E[Aτ+2−n − Aτ ; τ < T ]/ε ≤ E[Aτ+2−n − Aτ ]/ε.

Finally, since A^n_T = AT a.s., we have that P (sup{A^n_s − As ; s ∈ R(0,T]} > ε) = P (τ < T ).

29.4 Corollary. If A· is F·-quasi left continuous then the sequence A^n_· converges to A· uniformly on any bounded interval as n tends to ∞ a.s.

Proof. We fix ε ∈ R>0 and T ∈ N. For each n ∈ N we introduce

τ(n) := inf{t ∈ R>0 ; A^n_t − At > ε},

which is an F· ∨ Null(P )-optional time by Lemma 29.3. We have that A^n_t ≥ A^{n+1}_t ≥ At a.s. for each t ∈ R>0 since A^n_t ∈ E[A_{⌈2^nt⌉/2^n}|Ft+], A^{n+1}_t ∈ E[A_{⌈2^{n+1}t⌉/2^{n+1}}|Ft+] and ⌈2^nt⌉/2^n ≥ ⌈2^{n+1}t⌉/2^{n+1} ≥ t. The sample paths being right continuous off 2−n−1N a.s., it follows that

A^n_t ≥ A^{n+1}_t ≥ At for all t ∈ R>0 a.s.

Consequently we have that τ(n) ≤ τ(n + 1) a.s. for all n ∈ N. According to Lemma 4.8(iii), there exists a sequence τ′(·) of F·-optional times such that τ(n) = τ′(n) a.s. for all n ∈ N. We write σ := supn∈N τ(n). Since σ = supn∈N τ′(n) a.s., the quasi-left continuity reads that Aτ′(n)∧T converges to Aσ∧T as n tends to ∞ a.s. It follows that

Aτ(n)∧T converges to Aσ∧T as n tends to ∞ a.s.

On the other hand Aτ(n)∧T ≤ Aτ(n)∧T+2−n ≤ Aσ∧T+2−n. Taking into account that almost every sample path is right continuous, we infer that

Aτ(n)∧T+2−n also converges to Aσ∧T as n tends to ∞ a.s.


Recall that A^n_s − As ≥ 0 for all s ∈ R>0 a.s. Therefore Lemma 29.3 shows that

P (sup{|A^n_s − As| ; s ∈ R(0,T]} > ε) ≤ E[Aτ(n)∧T+2−n − Aτ(n)∧T ]/ε.

Since E[AT+1] < +∞ and 0 ≤ Aτ(n)∧T+2−n − Aτ(n)∧T ≤ AT+1 a.s., invoking the dominated convergence theorem, we thus get that

lim sup_{n→∞} P (sup{|A^n_s − As| ; s ∈ R(0,T]} > ε) = 0 for all ε ∈ R>0 and T ∈ N.

The convergence in probability implies the existence of an almost surely convergent subsequence. There exist φ ∈ Map(N,Z≥0) and Ω0 ∈ F such that φ(k) < φ(k + 1), P (Ω0) = 1 and

lim sup_{k→∞} sup{|A^{φ(k)}_s(ω) − As(ω)| ; s ∈ R(0,T]} = 0 for all ω ∈ Ω0.

Since A^n_t ≥ A^{n+1}_t for all t ∈ R>0 a.s., by choosing a proper subset if necessary, we see that

lim sup_{n→∞} sup{|A^n_s(ω) − As(ω)| ; s ∈ R(0,T]} = 0 for all ω ∈ Ω0.

Finally, T ∈ N being arbitrary, we get the claim.
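For intuition, consider a degenerate special case (my own illustrative assumption, not an example from the note): if A· is deterministic then the conditional expectations collapse to N^q_t = Aq, so A^n_t = A_{⌈2^nt⌉/2^n}, and the corollary reduces to uniform convergence of this time-discretization for continuous A·.

```python
# Toy deterministic check (illustrative assumption: A deterministic, so
# N^q_t = A_q and A^n_t = A(ceil(2^n t)/2^n)). Corollary 29.4 then just says
# that A^n converges to A uniformly on [0, T] when A is continuous.
import math

A = lambda t: t + math.sin(t)          # continuous and non-decreasing on [0, 3]

def An(t, n):
    return A(math.ceil((2 ** n) * t) / (2 ** n))

grid = [k * 3.0 / 1000 for k in range(1, 1001)]
for n in (1, 3, 5, 8):
    err = max(An(t, n) - A(t) for t in grid)
    print(n, round(err, 6))            # sup-norm error decreases to 0 in n
```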

29.5 Remark. Fix t ∈ R≥0. Then A^n_t converges to At as n tends to ∞ a.s. Indeed we have that A^n_t ≥ A^{n+1}_t a.s. and A^n_t ∈ E[A_{⌈2^nt⌉/2^n}|Ft+] for all n ∈ N. However one cannot expect that t 7→ lim inf_{n→∞} A^n_t is right continuous without an extra assumption such as quasi-left continuity.

29.6 Lemma. Let X· be an F·-finite variation process. If it is F·-quasi left continuous then so are var+(X·) and var−(X·) as F·+-increasing processes.

Proof. Let τ(·) be a sequence of F·-optional times such that τ(n) ≤ τ(n + 1) a.s. and τ(n) ≤ T a.s. for some T ∈ R>0. Set σ := supn∈N τ(n). There exists Ω0 ∈ F such that P (Ω0) = 1 and

t 7→ Xt(ω) is right continuous and of finite variation for each ω ∈ Ω0.

If ω ∈ Ω0 then inf_{s<t} var+(dX·(ω); R(s,t]) = max{Xt(ω) − Xt−(ω), 0}, i.e.,

var+(X·)t − var+(X·)t− = max{Xt − Xt−, 0} for all t ∈ R>0 on Ω0.

By choosing a proper subset of Ω0 if necessary, we may assume that

τ(n, ω) ≤ τ(n+ 1, ω) and τ(n, ω) ≤ T for all ω ∈ Ω0.

The quasi-left continuity of X· means

P ({ω ∈ Ω0 : Xσ(ω) ≠ Xσ−(ω), τ(n, ω) < σ(ω) for all n ∈ N}) = 0.

It follows that var+(X·)τ(n) converges to var+(X·)σ as n tends to ∞ a.s. Consequently var+(X·) is quasi-left continuous as an F·+-increasing process since Time(F·) = Time(F·+).
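The jump identity used above is easy to verify by hand for a pure-jump path; the following toy computation (my own example, not the note's) does exactly that:

```python
# For a right continuous pure-jump path X, the positive variation over (0, t]
# is the sum of the positive parts of the jumps, so the jump of var+(X) at a
# time s equals max{X_s - X_{s-}, 0}; we check this numerically.
jumps = [(0.5, 2.0), (1.0, -1.5), (1.7, 0.7)]      # (time, jump size) pairs

def var_plus(t):
    return sum(max(dx, 0.0) for (s, dx) in jumps if s <= t)

eps = 1e-9
for (s, dx) in jumps:
    jump_of_var = var_plus(s) - var_plus(s - eps)  # jump of var+ at time s
    print(s, jump_of_var, max(dx, 0.0))            # last two columns agree
```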

29.7 Theorem. Let A· and B· be F·-finite variation processes. If A· is bounded and F·-quasi left continuous and B· is natural then E[∫(0,t] As dBs] = E[∫(0,t] As− dBs] for all t ∈ R>0.

Proof. In view of Lemma 29.6, it suffices to prove the statement when A· is a bounded F·+-increasing process. There exist Ω0 ∈ F and K ∈ R>0 such that P (Ω0) = 1 and for each ω ∈ Ω0


t 7→ At(ω) is finite valued, right continuous and non-decreasing,
t 7→ Bt(ω) is right continuous and of finite variation,
t 7→ N^q_t(ω) is right continuous and admits left hand limits for all q ∈ Q(2)>0,
0 ≤ N^q_t(ω) ≤ K for all t ∈ R>0 and N^q_t(ω) = Aq(ω) for all t ∈ R≥q, for all q ∈ Q(2)>0.

Here the processes N^q_· are those chosen before Lemma 29.3. Note that (F·+)t+ = Ft+ for all t ∈ R≥0.

According to Corollary 29.4, by choosing a proper subset if necessary, we have that

lim sup_{n→∞} sup{|A^n_s(ω) − As(ω)| ; s ∈ R(0,T]} = 0 for all T ∈ N and ω ∈ Ω0

where A^n_s := N^{⌈2^ns⌉/2^n}_s. Let t ∈ R>0. It then follows that ∫(0,t] A^n_s(ω) dBs(ω) converges to ∫(0,t] As(ω) dBs(ω) for all ω ∈ Ω0. Since K var(B·)t serves as a dominating function, we have that

E[∫(0,t] A^n_s dBs ; Ω0] converges to E[∫(0,t] As dBs]

by the dominated convergence theorem. The same reasoning works for the following:

E[∫(0,t] A^n_{s−} dBs ; Ω0] converges to E[∫(0,t] As− dBs].

Fix n ∈ N for the time being. Observe that

∫(p−2−n,u] A^n_s dBs = ∫(p−2−n,u] N^p_s dBs = ∫(0,u] N^p_s dBs − ∫(0,p−2−n] N^p_s dBs

for all p ∈ 2−nN and u ∈ R(p−2−n,p] on Ω0. Consequently E[∫(0,t] A^n_s dBs ; Ω0] equals

∑_{p∈2−nN:p<q} (E[∫(0,p] N^p_s dBs] − E[∫(0,p−2−n] N^p_s dBs]) + E[∫(0,t] N^q_s dBs] − E[∫(0,q−2−n] N^q_s dBs]

where q := 2−n⌈2^nt⌉. On the other hand

∫(p−2−n,u] A^n_{s−} dBs = ∫(p−2−n,u] N^p_{s−} dBs = ∫(0,u] N^p_{s−} dBs − ∫(0,p−2−n] N^p_{s−} dBs

for all p ∈ 2−nN and u ∈ R(p−2−n,p] on Ω0. Thus E[∫(0,t] A^n_{s−} dBs ; Ω0] equals

∑_{p∈2−nN:p<q} (E[∫(0,p] N^p_{s−} dBs] − E[∫(0,p−2−n] N^p_{s−} dBs]) + E[∫(0,t] N^q_{s−} dBs] − E[∫(0,q−2−n] N^q_{s−} dBs].

Since B· is natural and all N^p_· are bounded F·+-martingales, the corresponding terms of the two sums coincide, so we conclude that

E[∫(0,t] A^n_s dBs ; Ω0] = E[∫(0,t] A^n_{s−} dBs ; Ω0] for all n ∈ N.

Letting n tend to ∞ we get the claim.
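To see concretely what the theorem rules out, note that for pure-jump paths the two integrals differ exactly by the co-jumps ∑ ΔAs ΔBs. The toy computation below (my own example, not the note's) checks this; the same mechanism drives the proof of Theorem 29.8 below.

```python
# For pure-jump paths A and B the Riemann-Stieltjes integrals satisfy
#   int_(0,t] A_s dB_s - int_(0,t] A_{s-} dB_s = sum_{s<=t} (dA_s)(dB_s),
# so equality of the two expectations forces the co-jumps to vanish on average.
A_jumps = [(0.5, 1.0), (1.2, 2.0)]                 # (time, jump) of A
B_jumps = [(0.5, 3.0), (0.9, 1.0), (1.2, -2.0)]    # (time, jump) of B

def path(jumps, t):
    return sum(dx for (s, dx) in jumps if s <= t)

def stieltjes(f, t):
    # integral over (0, t] of f(s) dB_s for the pure-jump integrator B
    return sum(f(s) * db for (s, db) in B_jumps if s <= t)

t, eps = 2.0, 1e-9
lhs = stieltjes(lambda s: path(A_jumps, s), t)         # integrand A_s
rhs = stieltjes(lambda s: path(A_jumps, s - eps), t)   # integrand A_{s-}
co = sum(da * db for (u, da) in A_jumps
         for (v, db) in B_jumps if u == v and u <= t)
print(lhs - rhs, co)   # both equal 1.0*3.0 + 2.0*(-2.0) = -1.0
```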

29.8 Theorem. Let A· be an F·-finite variation process. If A· is F·-quasi left continuous and natural then almost every sample path is continuous.


Proof. There exists Ω0 ∈ F such that P (Ω0) = 1 and for each ω ∈ Ω0

t 7→ At(ω) is right continuous and of finite variation.

Let n ∈ N and T ∈ N. The truncated process t 7→ var+(A·)t ∧ n is a quasi-left continuous F·+-increasing process by Lemma 29.6. It follows by the proof of Theorem 29.7 that

E[∫(0,T] (var+(A·)s ∧ n) dAs ; Ω0] = E[∫(0,T] (var+(A·)s− ∧ n) dAs ; Ω0].

The difference of the two integrals is non-negative. Indeed for each ω ∈ Ω0

∫(0,T] (var+(A·)s(ω) ∧ n − var+(A·)s−(ω) ∧ n) dAs(ω)
= ∑_{s≤T} (var+(A·)s(ω) ∧ n − var+(A·)s−(ω) ∧ n)(var+(A·)s(ω) − var+(A·)s−(ω)).

The right hand side converges to ∑_{s∈R:s>0} (var+(A·)s(ω) − var+(A·)s−(ω))² as n and T tend to ∞. The convergence being monotone, it follows that

E[∑_{s∈R:s>0} (var+(A·)s − var+(A·)s−)² ; Ω0] = 0.

The same argument works for the negative variation var−(A·). Thus, by choosing a subset of Ω0 if necessary, we infer that

∑_{s∈R:s>0} (As(ω) − As−(ω))² = 0 for all ω ∈ Ω0, that is, almost every sample path of A· is continuous.

29.9 Corollary. Given a quasi left continuous F·-finite variation process X· such that E[var(X·)t] < +∞ for all t ∈ R≥0, there exists an F·-finite variation process Y· such that Y0 = X0 a.s., almost every sample path is continuous and t 7→ Xt − Yt is an F·-martingale. Any such process Z· is also an F·+-natural projection of X·.

Proof. Let A· be an F·+-natural projection of X·, which exists by Corollary 18.9(i). We verify that A· is quasi-left continuous. Suppose that τ(·) is a sequence of F·+-optional times such that τ(n) ≤ τ(n + 1) a.s. and τ(n) ≤ T for some T ∈ R>0. Corollary 18.9(iii) shows that t 7→ var(X·)t − var(A·)t is an F·+-submartingale. According to Theorem 3.9,

E[var(A·)σ − var(A·)τ(n)] ≤ E[var(X·)σ − var(X·)τ(n)] ≤ E[var(X·)T ] < +∞.

where σ := supn∈N τ(n), which is an F·+-optional time. We see by Lemma 29.6 that var(X·) is quasi-left continuous as an F·+-increasing process. It then follows by the monotone convergence theorem that

E[var(A·)σ − supn∈N var(A·)τ(n)] ≤ E[var(X·)σ − supn∈N var(X·)τ(n)] = 0.

Since |Aτ(n) − Aσ| ≤ var(A·)σ − var(A·)τ(n) a.s., this means that Aτ(n) converges to Aσ a.s. The process A· being natural and quasi-left continuous, it follows by Theorem 29.8 that

almost every sample path of A· is continuous.


We see by Lemma 10.2 that there exists an F·-adapted process Y· such that At = Yt for all t ∈ R≥0 a.s. It follows that Y· is an F·-finite variation process almost every sample path of which is continuous. Being an F·-adapted modification of t 7→ Xt − At, the process t 7→ Xt − Yt is an F·-martingale. Finally we see that t 7→ Yt − Zt = Xt − Zt − (Xt − Yt) is an F·-martingale such that almost every path is continuous and of finite variation. It follows by Corollary 8.17 that Yt − Zt = Y0 − Z0 = 0 for all t ∈ R≥0 a.s. Therefore Zt = Yt = At for all t ∈ R≥0 a.s. Being F·-adapted, Z· is F·+-adapted and hence it is a natural F·+-finite variation process. Moreover t 7→ Xt − Zt is an F·+-martingale since it is F·+-adapted and a modification of t 7→ Xt − At. We thus infer that Z· is an F·+-natural projection of X·.

29.10 Definition. Let X· be an F·-submartingale whose almost every sample path is right continuous. It is said to be regular if for any sequence τ(·) of F·-optional times such that τ(n) ≤ τ(n + 1) a.s. and τ(n) ≤ T for some T ∈ R>0 the sequence E[Xτ(·)] converges to E[Xσ] where σ := supn∈N τ(n), which is an F·-optional time.

29.11 Lemma. (i) An integrable F·-increasing process is F·-quasi left continuous if and only if it is regular as an F·-submartingale.
(ii) Every martingale with almost sure right continuous path is regular.
(iii) Let X· be an F·-submartingale whose almost every sample path is right continuous. If it is of class DL and F·-quasi left continuous then it is regular.

Proof. Let τ(·) be a sequence of F·-optional times such that τ(n) ≤ τ(n + 1) a.s. and τ(n) ≤ T for some T ∈ R>0. We write σ := supn∈N τ(n). Since Aσ ≤ AT a.s., we have that E[Aσ] < +∞. Due to the non-decreasing property of t 7→ At we see that the sequence E[Aτ(·)] converges to E[supn∈N Aτ(n)], by the monotone convergence theorem, and supn∈N Aτ(n) ≤ Aσ a.s. Consequently the sequence E[Aτ(·)] converges to E[Aσ] if and only if supn∈N Aτ(n) = Aσ a.s. Thus we get (i). Theorem 3.9 immediately shows (ii). As for (iii), quasi-left continuity gives that Xτ(n) converges to Xσ a.s., while the class DL property makes the family {Xτ(n) ; n ∈ N} uniformly integrable, so E[Xτ(n)] converges to E[Xσ].
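The uniform integrability in (iii) cannot be dropped: almost sure convergence alone does not move expectations. A tiny numerical run (my own counterexample, not from the note) makes the point.

```python
# X_n := n * 1_{U < 1/n} with U uniform on (0,1) converges to 0 a.s., yet
# E[X_n] = 1 for every n: without uniform integrability (class DL in the
# lemma) the expectations need not converge to the expectation of the limit.
import random

random.seed(2)
n_samples = 200_000
for n in (1, 10, 100, 1000):
    mean = sum(n * (random.random() < 1 / n) for _ in range(n_samples)) / n_samples
    print(n, round(mean, 3))   # stays near 1 although X_n -> 0 a.s.
```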

We now prove the existence of a continuous compensator for regular submartingales.

29.12 Theorem. Suppose that X· is an F·-submartingale whose almost every sample path is right continuous. If X· is of class DL and regular then there exists an F·-increasing process A· such that almost every sample path is continuous and t 7→ Xt − At is an F·-martingale.

Proof. According to Theorem 18.6, there exists B· ∈ DM[X·,F·+]. Theorem 3.9 shows that E[Xτ ] = E[X0] + E[Bτ ] for all bounded τ ∈ Time(F·+). Since Time(F·+) = Time(F·), the regularity of X· implies that B· is F·+-quasi left continuous by Lemma 29.11(i). The rest of the discussion is the same as in the proof of Corollary 29.9.

Given a stochastic process X· we set

cDM[X·,F·] := {A· : F·-increasing process, t 7→ At continuous a.s.,
t 7→ Xt − At F·-martingale with stochastic right continuity}.

29.13 Theorem. Let X· be an F·-submartingale with almost sure right continuous path. Then cDM[X·,F·] ⊂ DM[X·,F·+], and cDM[X·,F·] ≠ ∅ if and only if X· is of class DL and regular.

Proof. Let A· ∈ cDM[X·,F·]. It follows that t 7→ Xt − At is an F·+-martingale by Theorem 14.11(ii). Observe that At ≤ |Xt| + |Xt − At|. Being an integrable increasing process with almost sure continuous path, A· is natural by Lemma 15.14. Hence A· ∈ DM[X·,F·+]. If cDM[X·,F·] ≠ ∅ then X· is of class DL, while Lemma 29.11(i)(ii) shows that X· = (X· − A·) + A· is regular; the converse implication is Theorem 29.12.


29.14 Lemma. Suppose that M· is a square integrable F·-martingale with stochastic right continuity. If M· is almost surely right continuous and F·-quasi left continuous then any A· ∈ 〈M ;F·+〉 has an F·-adapted and almost surely continuous modification.

Proof. We see by Lemma 29.11(iii) that the F·-submartingale t 7→ Mt², which has almost surely right continuous paths and is F·-quasi left continuous, is regular. Since t 7→ Mt² is of class DL, Theorem 29.13 shows that cDM[t 7→ Mt²,F·] ≠ ∅, and any of its elements provides the required modification.

Given a square integrable F·-martingale M· with stochastic right continuity, we set

c〈M ;F·〉 := cDM[t 7→ |Mt|²,F·].

29.15 Theorem. Let M· be a square integrable F·-martingale with a.s. right continuous path. Then c〈M ;F·〉 ⊂ 〈M ;F·+〉, and c〈M ;F·〉 ≠ ∅ if and only if M· is F·-quasi left continuous.

Proof. Theorem 29.13 claims that c〈M ;F·〉 ⊂ 〈M ;F·+〉. Suppose that c〈M ;F·〉 ≠ ∅ and choose A· ∈ c〈M ;F·〉. Let τ(·) be a sequence of F·-optional times such that τ(n) ≤ τ(n + 1) a.s. and τ(n) ≤ T for some T ∈ R>0. We write σ := supn∈N τ(n), which is an F·-optional time. It follows by Lemma 6.1 that

P (sup{(Ms − Mτ(n))² ; s ∈ R≥0, τ(n) ≤ s ≤ σ} ≥ ε) ≤ E[(Mσ − Mτ(n))²]/ε

for all n ∈ N and ε ∈ R>0. Since

(Mσ − Mτ(n))² = (Mσ)² − (Mτ(n))² − 2(Mσ − Mτ(n))Mτ(n),

we have that E[(Mσ − Mτ(n))²] = E[(Mσ)² − (Mτ(n))²] because E[(Mσ − Mτ(n))Mτ(n)] = 0 by Theorem 3.9. The right hand side coincides with E[Aσ − Aτ(n)]. We thus infer that

P (sup{(Ms − Mτ(n))² ; s ∈ R≥0, τ(n) ≤ s ≤ σ} ≥ ε) ≤ E[Aσ − Aτ(n)]/ε.

Since t 7→ At is continuous almost surely, the dominated convergence theorem shows that

lim sup_{n→∞} P (sup{(Ms − Mτ(n))² ; s ∈ R≥0, τ(n) ≤ s ≤ σ} ≥ ε) = 0 for all ε ∈ R>0.

This implies that Mτ(n) converges to Mσ a.s., that is, M· is quasi left continuous. The converse implication is discussed in Lemma 29.14.
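The key algebraic step above, E[(Mσ − Mτ(n))²] = E[(Mσ)² − (Mτ(n))²], is the orthogonality of martingale increments and can be checked by simulation; a hedged Monte Carlo sketch (my own toy stopping times, not from the note) follows.

```python
# Monte Carlo check of E[(M_sigma - M_tau)^2] = E[(M_sigma)^2 - (M_tau)^2]
# for a square integrable martingale: M is a symmetric random walk, tau is
# the first hitting time of {|M| = 2} capped at 10, and sigma = 10.
import random

random.seed(1)
n_paths, lhs, rhs = 100_000, 0.0, 0.0
for _ in range(n_paths):
    m, m_tau = 0, None
    for _ in range(10):
        m += random.choice((1, -1))
        if m_tau is None and abs(m) == 2:
            m_tau = m                    # freeze the value at time tau
    if m_tau is None:
        m_tau = m                        # tau = sigma if the level is never hit
    lhs += (m - m_tau) ** 2
    rhs += m ** 2 - m_tau ** 2
print(lhs / n_paths, rhs / n_paths)      # agree up to Monte Carlo error
```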

Given square integrable F·-martingales M· and N· with stochastic right continuity,

c〈M,N ;F·〉 := {A· : F·-finite variation process, A0 = 0 a.s., t 7→ At continuous a.s.,
t 7→ MtNt − At F·-martingale with stochastic right continuity}.

29.16 Corollary. If M·, N· are square integrable F·-martingales with a.s. right continuous paths and F·-quasi left continuity then c〈M,N ;F·〉 ≠ ∅ and c〈M,N ;F·〉 ⊂ 〈M,N ;F·+〉.

Proof. Let A· ∈ 〈M,N ;F·+〉, which exists by Lemma 21.7. We verify that t 7→ At is continuous a.s. According to Theorem 29.15, c〈M ;F·〉 ≠ ∅, c〈N ;F·〉 ≠ ∅, c〈M ;F·〉 ⊂ 〈M ;F·+〉 and c〈N ;F·〉 ⊂ 〈N ;F·+〉. Choose B· ∈ c〈M ;F·〉 and C· ∈ c〈N ;F·〉. We see that

(var(A·)t − var(A·)s)² ≤ (Bt − Bs)(Ct − Cs) for all t, s ∈ R≥0 a.s.

by Lemma 21.9(v). It follows that almost every sample path of A· is continuous. We see by Lemma 10.2 that there exists an F·-adapted process Y· such that At = Yt for all t ∈ R≥0 a.s. Then Y· is an F·-finite variation process almost every sample path of which is continuous. Being an F·-adapted modification of t 7→ MtNt − At,


the process t 7→ MtNt − Yt is an F·-martingale. Hence Y· ∈ c〈M,N ;F·〉 and in particular c〈M,N ;F·〉 ≠ ∅.

Finally suppose that Z· ∈ c〈M,N ;F·〉. Then the process t 7→ Yt − Zt is an F·-martingale such that almost every path is continuous and of finite variation. It follows by Corollary 8.17 that Yt − Zt = Y0 − Z0 = 0 for all t ∈ R≥0 a.s. Therefore Zt = Yt = At for all t ∈ R≥0 a.s. Being F·-adapted, Z· is F·+-adapted and hence it is a natural F·+-finite variation process. Moreover t 7→ MtNt − Zt is an F·+-martingale since it is F·+-adapted and a modification of t 7→ MtNt − At. We thus infer that Z· ∈ 〈M,N ;F·+〉.
