
Numerical Approximation of BSDEs by using Backward Euler schemes

Bruno Bouchard
Université Paris-Dauphine, PSL Research University
[email protected]

European Summer School in Mathematical Finance
Pushkin, St. Petersburg, 29 August – 2 September 2016

Page 2: Bruno Bouchard - CEREMADEbouchard/pdf/BSDE...2Generalities72 3Integrationbypartstechnique74 3.1ConditionalexpectationrepresentationintheGaussiancase. .74 3.2VarianceReductionintheGaussianCase:

Contents

I Motivation 7

1 Backward SDEs: definition and existence 7

2 Examples of application 122.1 Hedging in finance . . . . . . . . . . . . . . . . . . . . . . . . 122.2 Optimal control problem and stochastic maximum principle . . 142.3 Risk measures representation . . . . . . . . . . . . . . . . . . 162.4 Semi-linear PDEs . . . . . . . . . . . . . . . . . . . . . . . . 18

3 Extentions (in the Markovian case from now on) 203.1 BSDEs with jumps and IPDEs . . . . . . . . . . . . . . . . . 203.2 BSDEs with jumps and systems of PDEs . . . . . . . . . . . . 223.3 BSDEs in a domain . . . . . . . . . . . . . . . . . . . . . . . 243.4 Reflected BSDEs . . . . . . . . . . . . . . . . . . . . . . . . . 25

2

Page 3: Bruno Bouchard - CEREMADEbouchard/pdf/BSDE...2Generalities72 3Integrationbypartstechnique74 3.1ConditionalexpectationrepresentationintheGaussiancase. .74 3.2VarianceReductionintheGaussianCase:

II Discrete time approximation 26

1 A backward Euler type scheme 261.1 Construction of the scheme . . . . . . . . . . . . . . . . . . . 271.2 Important quantities and first error bound . . . . . . . . . . . 291.3 Modulus of regularity of Y . . . . . . . . . . . . . . . . . . . 381.4 Modulus of regularity of Z . . . . . . . . . . . . . . . . . . . 391.5 Strong convergence speed (conclusion) . . . . . . . . . . . . . 431.6 Weak convergence speed . . . . . . . . . . . . . . . . . . . . . 44

2 BSDEs with jumps 452.1 Approximation scheme . . . . . . . . . . . . . . . . . . . . . . 462.2 Error analysis . . . . . . . . . . . . . . . . . . . . . . . . . . 47

3 Reflected BSDEs 493.1 Approximation scheme . . . . . . . . . . . . . . . . . . . . . . 493.2 Regularity of Z : Ma and Zhang approach . . . . . . . . . . . 51

3

Page 4: Bruno Bouchard - CEREMADEbouchard/pdf/BSDE...2Generalities72 3Integrationbypartstechnique74 3.1ConditionalexpectationrepresentationintheGaussiancase. .74 3.2VarianceReductionintheGaussianCase:

3.3 Regularity of Z : Discretely reflected case . . . . . . . . . . . 523.4 Convergence speed : Discretely reflected case . . . . . . . . . . 563.5 Convergence speed : Continuously reflected case . . . . . . . . 57

4 BSDEs in a domain 584.1 Approximation scheme . . . . . . . . . . . . . . . . . . . . . . 584.2 Regularity of (Y, Z) . . . . . . . . . . . . . . . . . . . . . . . 604.3 Study of the exit time approximation error term (strong) . . . 634.4 Study of the exit time approximation error term (weak) . . . . 65

5 Irregular terminal condition 67

6 Additional references 70

III Computation of conditional expectations 71

1 Motivation 71

4

Page 5: Bruno Bouchard - CEREMADEbouchard/pdf/BSDE...2Generalities72 3Integrationbypartstechnique74 3.1ConditionalexpectationrepresentationintheGaussiancase. .74 3.2VarianceReductionintheGaussianCase:

2 Generalities 72

3 Integration by parts technique 743.1 Conditional expectation representation in the Gaussian case . . 743.2 Variance Reduction in the Gaussian Case: Control variate . . . 783.3 Variance Reduction in the Gaussian case : Localization . . . . 793.4 Full numerical scheme (f indep. of Z for simplicity) . . . . . . 843.5 The general case . . . . . . . . . . . . . . . . . . . . . . . . . 873.6 Importantly . . . . . . . . . . . . . . . . . . . . . . . . . . . 88

4 Non-parametric regression approach 894.1 Full numerical scheme (f indep. of Z for simplicity) . . . . . . 904.2 Choice of the basis functions . . . . . . . . . . . . . . . . . . 914.3 Error analysis . . . . . . . . . . . . . . . . . . . . . . . . . . 924.4 The multistep forward dynamic programming scheme . . . . . 934.5 Parallelized algorithm . . . . . . . . . . . . . . . . . . . . . . 94

5

Page 6: Bruno Bouchard - CEREMADEbouchard/pdf/BSDE...2Generalities72 3Integrationbypartstechnique74 3.1ConditionalexpectationrepresentationintheGaussiancase. .74 3.2VarianceReductionintheGaussianCase:

5 Quality test 955.1 The case of American options . . . . . . . . . . . . . . . . . . 955.2 Mean L2-discrete trajectory error . . . . . . . . . . . . . . . . 97

6 More... 98

6

Part I

Motivation

1 Backward SDEs: definition and existence

For a complete introduction to BSDEs, see the lecture notes [25, 5] and the book [26].

Probability space: (Ω, F, P), W a d-dimensional Brownian motion, F = (F_s, 0 ≤ s ≤ T) the filtration generated by W.

Process spaces:
- S²: adapted continuous processes Y such that ‖Y‖_{S²} := E[sup_{[0,T]} |Y|²]^{1/2} < ∞.
- L²_P: predictable processes Z such that ‖Z‖_{L²_P} := E[∫_0^T |Z_t|² dt]^{1/2} < ∞.

BSDE: Given ξ in L² and f : Ω × [0, T] × R^d × R^{d×d}, find (Y, Z) ∈ S² × L²_P such that

Y_t = ξ + ∫_t^T f_s(Y_s, Z_s) ds − ∫_t^T Z_s dW_s,  t ≤ T,  P-a.s.

It means that the process Y has the dynamics

dY_s = −f_s(Y_s, Z_s) ds + Z_s dW_s  with  Y_T = ξ.

Martingale representation #1: In the case f ≡ 0,

Y_t = ξ + ∫_t^T f_s(Y_s, Z_s) ds − ∫_t^T Z_s dW_s,  t ≤ T,  P-a.s.

can hold only if Y_t = E[ξ | F_t], and Z is uniquely given by the martingale representation theorem:

ξ = E[ξ] + ∫_0^T Z_s dW_s,  i.e.  E[ξ | F_t] = E[ξ] + ∫_0^t Z_s dW_s.

⇒ The component Z is here to ensure that the process Y is adapted. Unlike for deterministic ODEs, we cannot simply revert time, as the filtration goes in one direction.

Martingale representation #2: If f(ω, y, z) = ρ(ω) + α(ω)y + β(ω)z, then

Y_t = ξ + ∫_t^T (ρ_s + α_s Y_s + β_s Z_s) ds − ∫_t^T Z_s dW_s
    = ξ + ∫_t^T (ρ_s + α_s Y_s) ds − ∫_t^T Z_s dW^β_s

with W^β = W − ∫_0^· β_s ds a Brownian motion under Q^β, so that

e^{∫_0^t α_s ds} Y_t = e^{∫_0^T α_s ds} ξ + ∫_t^T e^{∫_0^s α_r dr} ρ_s ds − ∫_t^T e^{∫_0^s α_r dr} Z_s dW^β_s,

and (with Λ = e^{∫_0^· α_s ds} E(∫_0^· β_s dW_s), E denoting the stochastic exponential)

Y_t = E_{Q^β}[ e^{∫_t^T α_s ds} ξ + ∫_t^T e^{∫_t^s α_r dr} ρ_s ds | F_t ] = E[ Λ_T ξ + ∫_t^T Λ_s ρ_s ds | F_t ] Λ_t^{-1}.

Z is again uniquely given by the martingale representation theorem under Q^β.
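As a sanity check on this closed-form representation, here is a minimal Monte-Carlo sketch in Python; the constant coefficients ρ, α, β and the choice ξ = W_T are illustrative placeholders, not taken from the notes. It compares E[Λ_T ξ + ∫_0^T Λ_s ρ ds] with its analytic value.

```python
import numpy as np

# Monte-Carlo check of Y_0 = E[ Lambda_T xi + int_0^T Lambda_s rho ds ]
# for the linear driver f(y, z) = rho + alpha*y + beta*z and xi = W_T.
rho, alpha, beta, T = 0.1, 0.5, 0.3, 1.0   # illustrative constants
n, N = 200, 10**5                          # time steps, simulated paths
dt = T / n
rng = np.random.default_rng(0)

dW = rng.normal(0.0, np.sqrt(dt), (N, n))
W = np.cumsum(dW, axis=1)                  # W at t_1, ..., t_n
t = dt * np.arange(1, n + 1)

# With constant coefficients, Lambda_t = e^{alpha t} exp(beta W_t - beta^2 t / 2).
Lam = np.exp(alpha * t) * np.exp(beta * W - 0.5 * beta**2 * t)

mc = np.mean(Lam[:, -1] * W[:, -1] + rho * Lam.sum(axis=1) * dt)

# Analytic benchmark: under Q^beta, W_T has mean beta*T, so
# Y_0 = e^{alpha T} beta T + rho (e^{alpha T} - 1) / alpha.
exact = np.exp(alpha * T) * beta * T + rho * (np.exp(alpha * T) - 1) / alpha
print(mc, exact)   # agree up to Monte-Carlo and Riemann-sum error
```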

General existence result: Assume that there exists K such that

|f(y, z) − f(y′, z′)| ≤ K(|y − y′| + |z − z′|)  dt × dP-a.e.,

and that f(0) ∈ L²_P. Then the BSDE admits a unique solution.

Existence is obtained by a contraction argument (as for SDEs): given (Y, Z), let (Y′, Z′) be defined by

Y′_t = ξ + ∫_t^T f_s(Y_s, Z_s) ds − ∫_t^T Z′_s dW_s,

and set (Y′, Z′) = Φ(Y, Z). The map Φ is contracting on a suitably weighted version of S² × L²_P.

Existence can hold under weaker conditions; see [5] for a survey.

2 Examples of application

2.1 Hedging in finance

Stock price: dS_t = S_t μ_t dt + S_t σ_t dW_t.

Wealth process: Let π be the amount invested in the stock; the wealth Y then evolves according to

dY_t = (π_t/S_t) dS_t + r_t(Y_t − π_t) dt = (π_t(μ_t − r_t) + r_t Y_t) dt + π_t σ_t dW_t.

Hedging: In order to hedge the claim ξ at T, we need to find π such that Y_T = ξ, i.e.

Y_t = ξ − ∫_t^T (Z_s λ_s + r_s Y_s) ds − ∫_t^T Z_s dW_s,

after setting Z := πσ and λ := (μ − r)/σ.

Different interest rates for borrowing and lending:

Y_t = ξ − ∫_t^T ( π_s μ_s + r^l_s (Y_s − π_s)^+ − r^b_s (Y_s − π_s)^− ) ds − ∫_t^T π_s σ_s dW_s
    = ξ − ∫_t^T ( Z_s μ_s/σ_s + (r^l_s/σ_s)(σ_s Y_s − Z_s)^+ − (r^b_s/σ_s)(σ_s Y_s − Z_s)^− ) ds − ∫_t^T Z_s dW_s.

It is no longer linear...

2.2 Optimal control problem and stochastic maximum principle

Maximization problem:

J(ν) := E[ g(X^ν_T) + ∫_0^T f(X^ν_t, ν_t) dt ],

in which X^ν is the solution of the one-dimensional SDE

dX^ν_t = b(X^ν_t, ν_t) dt + σ(X^ν_t) dW_t,

with ν in the set U of predictable processes with values in R.

Associated Hamiltonian:

H(x, u, p, q) := b(x, u) p + σ(x) q + f(x, u).

Adjoint BSDE:

P_t = ∂_x g(X_T) + ∫_t^T ∂_x H(X_s, P_s, Q_s) ds − ∫_t^T Q_s dW_s,   (2.1)

where (with a slight abuse of notation) H(x, p, q) := sup_{u∈R} H(x, u, p, q).

Maximum principle: Assume that

x ↦ g(x) and x ↦ H(x, P_t, Q_t) := sup_{u∈R} H(x, u, P_t, Q_t) are P-a.s. concave.

Assume further that ν satisfies

H(X_τ, ν_τ, P_τ, Q_τ) = H(X_τ, P_τ, Q_τ),
∂_x H(X_τ, ν_τ, P_τ, Q_τ) = ∂_x H(X_τ, P_τ, Q_τ)

for all stopping times τ, and that (X, P, Q) solves the adjoint equation (2.1). Then ν is an optimal control.

Remarks:
- σ can also depend on the control, but the formulation is more complex.
- See [5] for exponential utility maximization problems, which can be treated differently and lead to quadratic BSDEs.

2.3 Risk measures representation

See Peng [38] for a complete treatment.

Definition: A non-linear F-expectation is an operator E : L² → R such that

• X′ ≥ X implies E[X′] ≥ E[X], with equality if and only if X′ = X.
• E[c] = c for c ∈ R.
• For each X ∈ L² and t ≤ T, there exists η^X_t ∈ L²(F_t) such that E[X 1_A] = E[η^X_t 1_A] for all A ∈ F_t. We write E_t[X] for η^X_t.

Remark: η^X_t is uniquely defined. It corresponds, in this non-linear framework, to the notion of conditional expectation.

Let us now consider the solution (Y, Z) of

Y_t = ξ + ∫_t^T f_s(Y_s, Z_s) ds − ∫_t^T Z_s dW_s,  t ≤ T,

and call the Y-component E^f_t[ξ].

Theorem: If f satisfies the Lipschitz and integrability conditions of the above existence result, then E^f is a non-linear F-expectation. Conversely, let E be a non-linear F-expectation such that, for all X, X′ ∈ L²,

E[X + X′] ≤ E[X] + E^{f_μ}[X′]  (with f_μ(y, z) = μ|z|)

and

E_t[X + X′] = E_t[X] + X′  if X′ ∈ L²(F_t).

Then there exists a random driver f (which does not depend on y) such that E = E^f.

2.4 Semi-linear PDEs

Consider the solution X of the SDE

X = x + ∫_0^· b_s(X_s) ds + ∫_0^· σ_s(X_s) dW_s,

in which b and σ are deterministic, Lipschitz in space, and continuous in time. Assume that v is a smooth solution of

0 = Lv + f(·, v, ∂_x v σ) on [0, T) × R,  with v(T, ·) = g,

for some continuous g with polynomial growth, and where

Lv = ∂_t v + b ∂_x v + (1/2) σ² ∂²_{xx} v.

Then

Y := v(·, X),  Z := ∂_x v(·, X) σ(X)

solves

Y = g(X_T) + ∫_·^T f_s(X_s, Y_s, Z_s) ds − ∫_·^T Z_s dW_s.

Indeed, by Itô's Lemma,

g(X_T) = v(t, X_t) + ∫_t^T Lv(s, X_s) ds + ∫_t^T ∂_x v(s, X_s) σ_s(X_s) dW_s
       = v(t, X_t) − ∫_t^T f_s(X_s, v(s, X_s), ∂_x v(s, X_s) σ_s(X_s)) ds + ∫_t^T ∂_x v(s, X_s) σ_s(X_s) dW_s,

in which one recognizes Y_t = v(t, X_t), Y_s = v(s, X_s) and Z_s = ∂_x v(s, X_s) σ_s(X_s).

In general, no smooth solution exists, but Y = v(·, X) where v is the unique viscosity solution of the corresponding PDE (see Crandall, Ishii & Lions [21] for the definition of viscosity solutions).

In particular, solving the BSDE and solving the PDE are equivalent.

3 Extensions (in the Markovian case from now on)

3.1 BSDEs with jumps and IPDEs

Consider the solution (X, Y, Z, U) ∈ S² × S² × L²_P × L²_λ of

X_t = X_0 + ∫_0^t b(X_r) dr + ∫_0^t σ(X_r) dW_r + ∫_0^t ∫_E β(X_{r−}, e) μ̄(de, dr),
Y_t = g(X_T) + ∫_t^T f(Θ_r) dr − ∫_t^T Z_r dW_r − ∫_t^T ∫_E U_r(e) μ̄(de, dr),

where Θ := (X, Y, Γ, Z) with Γ := ∫_E ρ(e) U(e) λ(de), and

• b, σ, β, f, g are Lipschitz in (x, y, γ, z), uniformly in e ∈ E.
• μ is a Poisson measure on E = R^ℓ with compensator ν(de, dt) = λ(de) dt, and μ̄ = μ − ν. ρ : E → R^m is bounded.
• L²_λ: P ⊗ B(E)-measurable maps U : Ω × [0, T] × E → R such that ‖U‖_{L²_λ} := E[∫_0^T ∫_E |U_s(e)|² λ(de) ds]^{1/2} < ∞.

If v solves

−Lv(t, x) − f(t, x, v(t, x), σ(t, x) ∂_x v(t, x), I[v](t, x)) = 0,
v(T, x) = g(x),

where

Lv(t, x) := ∂_t v(t, x) + ∂_x v(t, x) b(x) + (1/2) Σ_{i,j=1}^d (σσ^⊤(x))_{ij} ∂²v/∂x_i∂x_j (t, x)
          + ∫_E ( v(t, x + β(x, e)) − v(t, x) − ∂_x v(t, x) β(x, e) ) λ(de),

I[v](t, x) := ∫_E ( v(t, x + β(x, e)) − v(t, x) ) ρ(e) λ(de),

then

Y = v(·, X),  Z = ∂_x v(·, X) σ(·, X)  and  U = v(·, X_− + β(X_−, ·)) − v(·, X_−).

3.2 BSDEs with jumps and systems of PDEs

Idea coming from Pardoux, Pradeilles & Rao [37].

System of κ PDEs (i = 0, . . . , κ − 1):

0 = ∂_t v_i + b_i ∂_x v_i + (1/2) Tr[σ_i σ_i^⊤ ∂²_{xx} v_i] + f_i(·, v, ∂_x v_i σ_i),
g_i = v_i(T, ·).

Define, for i = 0, . . . , κ − 1,

f(i, t, x, y, γ, z) = f_i( t, x, (. . . , y + γ_{κ−2}, y + γ_{κ−1}, y, y + γ_1, y + γ_2, . . .), z ),

where y sits in the i-th slot. Set E = {1, . . . , κ − 1}, λ(de) = λ Σ_{k=1}^{κ−1} δ_k(e), and

M_t = ∫_0^t ∫_E e μ(de, ds)  modulo κ.

Then the corresponding BSDE is:

dX_t = b_{M_t}(X_t) dt + σ_{M_t}(X_t) dW_t,
−dY_t = f(M_t, t, X_t, Y_t, Γ_t, Z_t) dt − λ Σ_{k=1}^{κ−1} Γ^k_t dt − Z_t dW_t − ∫_E U_t(e) μ̄(de, dt),
Y_T = g_{M_T}(X_T),

with ρ(k)_i = λ^{-1} 1_{k=i}, i.e. Γ^k = U(k).

Indeed, it corresponds to

0 = ∂_t v(·, i) + b_i ∂_x v(·, i) + (1/2) Tr[σ_i σ_i^⊤ ∂²_{xx} v(·, i)] + f(·, v(·, i), ∂_x v(·, i) σ_i, I[v(·, i)]),

where

f(·, v, ·, I[v(·, i)]) = f_i(·, (. . . , v(·, i) + [v(·, i − 1) − v(·, i)], v(·, i), v(·, i) + [v(·, i + 1) − v(·, i)], . . .), ·),

i.e. f_i evaluated at (. . . , v(·, i − 1), v(·, i), v(·, i + 1), . . .).

Remark: Such systems arise in option pricing with possible default.

3.3 BSDEs in a domain

Consider the system:

X_t = X_0 + ∫_0^t b(X_s) ds + ∫_0^t σ(X_s) dW_s,
Y_t = g(τ, X_τ) + ∫_{t∧τ}^τ f_s(X_s, Y_s, Z_s) ds − ∫_{t∧τ}^τ Z_s dW_s,

where

τ := inf{t ≥ 0 : X_t ∉ O} ∧ T,

for some open set O.

Then it corresponds to the Dirichlet problem:

−Lv − f(·, v, ∂_x v σ) = 0 on [0, T) × O,
v = g on ([0, T) × ∂O) ∪ ({T} × O).

Remark: Barrier options in finance.

3.4 Reflected BSDEs

Consider the system:

X_t = X_0 + ∫_0^t b(X_s) ds + ∫_0^t σ(X_s) dW_s,
Y_t = g(X_T) + ∫_t^T f_s(X_s, Y_s, Z_s) ds − ∫_t^T Z_s dW_s + K_T − K_t,
Y_t ≥ g(X_t),

where K is a non-decreasing, adapted, continuous process such that

∫_0^T [Y_t − g(X_t)] dK_t = 0.

Then it corresponds to the obstacle problem:

min{ −Lv − f(·, v, ∂_x v σ), v − g } = 0 on [0, T) × R^d,
v = g on {T} × R^d.

Remark: American options in finance.

Part II

Discrete time approximation

See [8] for a survey.

1 A backward Euler type scheme

From now on, we set T = 1 and consider

X_t = X_0 + ∫_0^t b(X_s) ds + ∫_0^t σ(X_s) dW_s,
Y_t = g(X_1) + ∫_t^1 f(X_s, Y_s, Z_s) ds − ∫_t^1 Z_s dW_s,

with b, σ, f, g Lipschitz.

We construct a discrete time approximation on the grid π := {t_i := i/n, i ≤ n} of [0, T] (with t_0 = 0 and t_n = T). The mesh is |π| = n^{-1}.
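As a running illustration throughout this part, here is a minimal Python sketch of the Euler scheme X^π on the uniform grid; the coefficients b and σ below are illustrative stand-ins, not from the notes.

```python
import numpy as np

# Euler scheme for dX = b(X)dt + sigma(X)dW on the uniform grid t_i = i/n.
def b(x):     return 0.1 * x
def sigma(x): return 0.2 + 0.1 * np.tanh(x)   # Lipschitz, bounded away from 0

def euler_paths(x0, n, N, rng):
    """Simulate N paths of X^pi; returns X of shape (N, n+1) and the increments dW."""
    dt = 1.0 / n
    dW = rng.normal(0.0, np.sqrt(dt), (N, n))
    X = np.empty((N, n + 1))
    X[:, 0] = x0
    for i in range(n):
        X[:, i + 1] = X[:, i] + b(X[:, i]) * dt + sigma(X[:, i]) * dW[:, i]
    return X, dW

rng = np.random.default_rng(0)
X, dW = euler_paths(x0=1.0, n=50, N=10**4, rng=rng)
```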

1.1 Construction of the scheme

It is based on the simple approximation

Y_{t_i} = Y_{t_{i+1}} + ∫_{t_i}^{t_{i+1}} f(X_s, Y_s, Z_s) ds − ∫_{t_i}^{t_{i+1}} Z_t dW_t
        ≃ Y_{t_{i+1}} + (1/n) f(X_{t_i}, Y_{t_i}, Z̄_{t_i}) − ∫_{t_i}^{t_{i+1}} Z_t dW_t,

where

Z̄_{t_i} := n E_{t_i}[∫_{t_i}^{t_{i+1}} Z_t dt] ≃ n E_{t_i}[Y_{t_{i+1}} (W_{t_{i+1}} − W_{t_i})],

so that

Y_{t_i} ≃ E_{t_i}[Y_{t_{i+1}}] + (1/n) f(X_{t_i}, Y_{t_i}, Z̄_{t_i}),
Z̄_{t_i} ≃ n E_{t_i}[Y_{t_{i+1}} (W_{t_{i+1}} − W_{t_i})].

The discrete time approximation is obtained by forcing equality in the above.

We define (Y^π, Z^π) by

Y^π_{t_i} := E_{t_i}[Y^π_{t_{i+1}}] + (1/n) f(X^π_{t_i}, Y^π_{t_i}, Z^π_{t_i}),
Z^π_{t_i} := n E_{t_i}[Y^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i})],

where Y^π_{t_n} = g(X^π_{t_n}) and X^π is the Euler scheme of X.

We want to control

Err² := max_i E[ sup_{t_i≤t≤t_{i+1}} |Y_t − Y^π_{t_i}|² ] + ∫_0^1 E[|Z_t − Z^π_t|²] dt,

in which Z^π_t := Z^π_{t_i} on [t_i, t_{i+1}).

Remark: One needs to solve the first equation in Y^π_{t_i} (it is implicit). One could alternatively set

Y^π_{t_i} = E_{t_i}[Y^π_{t_{i+1}}] + (1/n) E_{t_i}[f(X^π_{t_i}, Y^π_{t_{i+1}}, Z^π_{t_i})],

without changing the nature of the convergence. A sketch of this backward induction is given below.
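To fix ideas, here is a minimal Python sketch of the explicit variant, written against an abstract conditional-expectation estimator `cond_exp` (a hypothetical placeholder: concrete estimators, by Malliavin weights or least-squares regression, are the subject of Part III).

```python
import numpy as np

def backward_euler(X, dW, f, g, cond_exp):
    """Explicit backward Euler scheme for the BSDE on the uniform grid t_i = i/n.

    X: (N, n+1) Euler paths, dW: (N, n) Brownian increments,
    cond_exp(U, X_i): estimator of E[U | X_{t_i} = X_i], returned pathwise.
    Returns arrays Y of shape (N, n+1) and Z of shape (N, n).
    """
    N, n = dW.shape
    dt = 1.0 / n
    Y = np.empty((N, n + 1))
    Z = np.empty((N, n))
    Y[:, n] = g(X[:, n])                       # terminal condition
    for i in range(n - 1, -1, -1):
        # Z_{t_i} = n * E_{t_i}[ Y_{t_{i+1}} (W_{t_{i+1}} - W_{t_i}) ]
        Z[:, i] = cond_exp(Y[:, i + 1] * dW[:, i], X[:, i]) / dt
        # explicit variant: Y_{t_i} = E_{t_i}[ Y_{t_{i+1}} + dt * f(X, Y_{t_{i+1}}, Z_{t_i}) ]
        Y[:, i] = cond_exp(Y[:, i + 1] + dt * f(X[:, i], Y[:, i + 1], Z[:, i]), X[:, i])
    return Y, Z
```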

1.2 Important quantities and first error bound

By construction,

Z̄_{t_i} := n E_{t_i}[∫_{t_i}^{t_{i+1}} Z_s ds]

is the best L²-approximation of Z by a process that is constant on each [t_i, t_{i+1}), and thus

∫_0^1 E[|Z_t − Z^π_t|²] dt ≥ R(Z) := ∫_0^1 E[|Z_t − Z̄_t|²] dt.

On the other hand, if f ≡ 0, then Y_{t_i} = E_{t_i}[Y_t], so that (Y_{t_i})_i also provides the best L²-approximation, and we expect that

max_i E[ sup_{t_i≤t≤t_{i+1}} |Y_t − Y^π_{t_i}|² ] ≳ R(Y) := max_i E[ sup_{t_i≤t≤t_{i+1}} |Y_t − Y_{t_i}|² ].

Not surprisingly, the error should depend on the regularity of (Y, Z)...

Theorem: There exists C > 0 such that

Err² ≤ C (|π| + R(Y) + R(Z)).

Proof (case f(X, Y, Z) = f(Z)).

• Continuous backward Euler scheme on [t_i, t_{i+1}]:

Y^π_t = Y^π_{t_{i+1}} + (t_{i+1} − t) f(Z^π_{t_i}) − ∫_t^{t_{i+1}} Z^π_s dW_s.

• Set δY = Y − Y^π, δZ = Z − Z^π. By Itô's Lemma,

E[|δY_t|²] + (1/2) ∫_t^{t_{i+1}} E[|δZ_s|²] ds = E[|δY_{t_{i+1}}|²] + ∫_t^{t_{i+1}} E[δY_s (f(Z_s) − f(Z^π_{t_i}))] ds.

• Then, using |δy (f(z) − f(z′))| ≤ (C/α)|δy|² + (α/(4C))|f(z) − f(z′)|²,

E[|δY_t|²] + (1/2) ∫_t^{t_{i+1}} E[|δZ_s|²] ds ≤ E[|δY_{t_{i+1}}|²] + (C/α) ∫_t^{t_{i+1}} E[|δY_s|²] ds
+ α ∫_t^{t_{i+1}} E[ |Z_s − Z̄_{t_i}|² + |Z̄_{t_i} − Z^π_{t_i}|² ] ds.

• Since

Z̄_{t_i} := n E_{t_i}[∫_{t_i}^{t_{i+1}} Z_s ds]  and  Z^π_{t_i} := n E_{t_i}[Y^π_{t_{i+1}}(W_{t_{i+1}} − W_{t_i})] = n E_{t_i}[∫_{t_i}^{t_{i+1}} Z^π_s ds],

we deduce from Jensen's inequality that

∫_t^{t_{i+1}} E[|Z̄_{t_i} − Z^π_{t_i}|²] ds ≤ ∫_t^{t_{i+1}} n ∫_{t_i}^{t_{i+1}} E[|Z_r − Z^π_r|²] dr ds
≤ ∫_{t_i}^{t_{i+1}} E[|Z_s − Z^π_s|²] ds = ∫_{t_i}^{t_{i+1}} E[|δZ_s|²] ds,

i.e. the term |Z̄_{t_i} − Z^π_{t_i}|² is controlled by |δZ_s|² in the integrated sense.

• For η := 1/2 − α > 0, and with the help of Gronwall's lemma,

E[|δY_{t_i}|²] + η ∫_{t_i}^{t_{i+1}} E[|δZ_s|²] ds ≤ e^{C|π|} E[|δY_{t_{i+1}}|²] + α ∫_{t_i}^{t_{i+1}} E[|Z_s − Z̄_{t_i}|²] ds.

• By the discrete Gronwall lemma,

max_i E[|δY_{t_i}|²] + Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} E[|δZ_s|²] ds ≤ C Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} E[|Z_s − Z̄_{t_i}|²] ds.

• Finally, as above,

∫_{t_i}^{t_{i+1}} E[|Z_s − Z^π_{t_i}|²] ds ≤ ∫_{t_i}^{t_{i+1}} E[|Z̄_{t_i} − Z^π_{t_i}|²] ds + ∫_{t_i}^{t_{i+1}} E[|Z_s − Z̄_{t_i}|²] ds
≤ ∫_{t_i}^{t_{i+1}} E[|δZ_s|²] ds + ∫_{t_i}^{t_{i+1}} E[|Z_s − Z̄_{t_i}|²] ds.  □

We have "shown" that:

Theorem: There exists C > 0 such that

Err² ≤ C (|π| + R(Y) + R(Z)),

in which

R(Y) := max_i E[ sup_{t_i≤t≤t_{i+1}} |Y_t − Y_{t_i}|² ],
R(Z) := ∫_0^1 E[|Z_t − Z̄_t|²] dt  with  Z̄_{t_i} := n E_{t_i}[∫_{t_i}^{t_{i+1}} Z_s ds].

It remains to study the moduli of regularity R(Y) and R(Z)...

1.3 Modulus of regularity of Y

This is the easy part... Standard estimates (Gronwall + BDG inequalities) lead to Y_t = v(t, X_t), in which v is 1/2-Hölder in time and Lipschitz in space. Then

|Y_t − Y_s|² ≤ C(|t − s| + |X_t − X_s|²).

Hence,

R(Y) = max_i E[ sup_{t_i≤t≤t_{i+1}} |Y_t − Y_{t_i}|² ] ≤ max_i C(t_{i+1} − t_i) = C|π|.

Theorem: There exists C > 0 such that

Err² ≤ C (|π| + R(Y) + R(Z)) ≤ C (|π| + R(Z)).

1.4 Modulus of regularity of Z

This is the difficult part... We use the representation of Z in terms of Malliavin derivatives. The initial proof is due to Ma & Zhang [35].

• Assume that the coefficients are smooth enough. Then (Y_t, Z_t) admits a Malliavin derivative for all t ≤ T, and (D_sY, D_sZ) solves

D_sY_t = ∇g(X_T) D_sX_T + ∫_t^T ∇f(Θ_r) D_sΘ_r dr − ∫_t^T D_sZ_r dW_r,  with Θ_r = (X_r, Y_r, Z_r).

• Since

Y_t = Y_0 − ∫_0^t f(Θ_r) dr + ∫_0^t Z_r dW_r,

we have (Λ as in the linear case above)

Z_t = D_tY_t = E[ Λ_T ∇g(X_T) D_tX_T + ∫_t^T Λ_r ∇_x f(Θ_r) D_tX_r dr | F_t ] Λ_t^{-1}
             = E[ Λ_T ∇g(X_T) ∇X_T + ∫_t^T Λ_r ∇_x f(Θ_r) ∇X_r dr | F_t ] (∇X_t)^{-1} σ(X_t) Λ_t^{-1},

using D_tX_r = ∇X_r (∇X_t)^{-1} σ(X_t).

Another way to put this: Y = v(·, X) and Z = ∂_x v(·, X) σ(X), with v solving

−Lv − f(·, v, ∂_x v σ) = 0.

Then u := ∂_x v solves a semi-linear equation of the form

−Lu − f(·, u, ∂_x u σ) = 0.

By looking at this equation, we obtain that (u(·, X)∇X, ∂_x u(·, X)σ(X)) = (Z σ^{-1}(X)∇X, ∂_x u(·, X)σ(X)) solves a linear BSDE whose solution is given by the above (for an appropriate process Λ).

• Then Z_t = (V_t − α_t) η_t, where

V_t = E[ Λ_T ∇g(X_T) ∇X_T + ∫_0^T Λ_r ∇_x f(Θ_r) ∇X_r dr | F_t ],
η_t = (∇X_t)^{-1} σ(X_t) Λ_t^{-1},
α_t = ∫_0^t Λ_r ∇_x f(Θ_r) ∇X_r dr.

• V is a square-integrable martingale, so there exists σ̃ ∈ L²_P such that

V_t = V_0 + ∫_0^t σ̃_s dW_s,

and thus

Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} E[|V_t − V_{t_i}|²] dt ≤ C Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} ∫_{t_i}^t E[|σ̃_s|²] ds dt ≤ C|π|.

The additional terms α and η behave like |π|^{1/2} in any S^p...

Hence,

R(Z) := ∫_0^1 E[|Z_t − Z̄_t|²] dt ≤ C|π|.

Theorem: There exists C > 0 such that

Err² ≤ C (|π| + R(Y) + R(Z)) ≤ C|π|.

1.5 Strong convergence speed (conclusion)

Theorem (B. & Touzi [11], Zhang [39]): There exists C > 0 such that

Err ≤ C|π|^{1/2}.

This is the convergence speed in the linear case; it cannot be improved.

1.6 Weak convergence speed

Under additional regularity assumptions, Gobet and Labart [29] obtain expansions of the form

Y_t − Y^π_t = ∂_x v(t, X_t)(X_t − X^π_t) + O(|π|) + O(|X_t − X^π_t|²),
Z_t − Z^π_t = ∂_x[∂_x v(t, X_t) σ]^⊤ (X_t − X^π_t) + O(|π|) + O(|X_t − X^π_t|²),

where (X^π_t, Y^π_t)_{t≤1} is the continuous version of the Euler scheme.

In particular, this implies that

Y_0 − Y^π_0 = O(|π|).

Moreover, when the grid is uniform, this allows to deduce from the weak convergence of the process √n(X − X^π) that the process √n(X − X^π, Y − Y^π, Z − Z^π) converges weakly too.

2 BSDEs with jumps

In the presence of jumps, the discrete time approximation is essentially the same.

We consider the BSDE

X_t = X_0 + ∫_0^t b(X_r) dr + ∫_0^t σ(X_r) dW_r + ∫_0^t ∫_E β(X_{r−}, e) μ̄(de, dr),
Y_t = g(X_1) + ∫_t^1 f(Θ_r) dr − ∫_t^1 Z_r dW_r − ∫_t^1 ∫_E U_r(e) μ̄(de, dr),

where Θ := (X, Y, Γ, Z) with Γ := ∫_E ρ(e) U(e) λ(de).

2.1 Approximation scheme

Step-constant driver:

Y^π_{t_i} = Y^π_{t_{i+1}} + (1/n) f(X^π_{t_i}, Y^π_{t_i}, Γ^π_{t_i}, Z^π_{t_i}) − ∫_{t_i}^{t_{i+1}} Z^π_t dW_t − ∫_{t_i}^{t_{i+1}} ∫_E U^π_t(e) μ̄(de, dt).

Best L²_P-approximation of Z^π and Γ^π = ∫_E U^π(e) ρ(e) λ(de) by F_{t_i}-measurable processes:

Z^π_{t_i} := n E_{t_i}[∫_{t_i}^{t_{i+1}} Z^π_s ds] = n E_{t_i}[Y^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i})],
Γ^π_{t_i} := n E_{t_i}[∫_{t_i}^{t_{i+1}} Γ^π_s ds] = n E_{t_i}[Y^π_{t_{i+1}} ∫_E ρ(e) μ̄(de, (t_i, t_{i+1}])].

Discrete scheme: Y^π_{t_i} = E_{t_i}[Y^π_{t_{i+1}}] + (1/n) f(X^π_{t_i}, Y^π_{t_i}, Γ^π_{t_i}, Z^π_{t_i}).

We want to control

Err² := max_i E[ sup_{t_i≤t≤t_{i+1}} |Y_t − Y^π_{t_i}|² ] + ∫_0^1 E[|Z_t − Z^π_t|²] dt + ∫_0^1 E[|Γ_t − Γ^π_t|²] dt.

2.2 Error analysis

We find the same initial error bound in terms of the moduli of regularity of Y and Z, plus an additional term related to Γ.

But Y = v(·, X) and U(e) = v(·, X_− + β(X_−, e)) − v(·, X_−), so that U (and therefore Γ) can be analyzed as Y.

The analysis of Z can be done by using the same martingale arguments as above, but it requires an additional invertibility condition on the flow of X.

Assumption: For each e ∈ E, the map x ∈ R^d ↦ β(x, e) admits a Jacobian matrix ∇β(x, e) such that the function

(x, ξ) ∈ R^d × R^d ↦ a(x, ξ; e) := ξ^⊤(∇β(x, e) + I_d)ξ

satisfies, uniformly in (x, ξ) ∈ R^d × R^d, one of the conditions

a(x, ξ; e) ≥ |ξ|² K^{-1}  or  a(x, ξ; e) ≤ −|ξ|² K^{-1}.

Proposition (B. & Elie [7]): Under the above condition, there exists C > 0 such that

Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} E[|Z_t − Z̄_{t_i}|²] dt ≤ C|π|.

Theorem (B. & Elie [7]): Under the above condition, there exists C > 0 such that

Err ≤ C|π|^{1/2}.

3 Reflected BSDEs

3.1 Approximation scheme

We want to approximate the solution (Y, Z, K) of

Y_t = g(X_1) + ∫_t^1 f(X_s, Y_s, Z_s) ds − ∫_t^1 Z_s dW_s + K_1 − K_t,
Y_t ≥ h(X_t),  0 ≤ t ≤ 1,

with K continuous, increasing, K_0 = 0, and such that

∫_0^1 (Y_t − h(X_t)) dK_t = 0.

References:
- f = 0: Clément, Lamberton and Protter [20].
- f independent of Z: Bally, Pagès and Printems ([1], ...), B. and Touzi [11].
- f depends on Z: Ma and Zhang [36].

Approximation scheme:

Z^π_{t_i} = n E_{t_i}[ Ỹ^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i}) ],
Y^π_{t_i} = E_{t_i}[ Ỹ^π_{t_{i+1}} ] + n^{-1} f(X^π_{t_i}, Y^π_{t_i}, Z^π_{t_i}),
Ỹ^π_{t_i} = J(t_i, X^π_{t_i}, Y^π_{t_i}),

with the terminal condition Ỹ^π_{t_n} = g(X^π_1), where, for the set of reflection dates ℜ = {r_j, 0 ≤ j ≤ κ} ⊂ π,

J(t, x, y) := y + [h(x) − y]^+ 1_{t∈ℜ\{0,T}},  (t, x, y) ∈ [0, T] × R^{d+1}.
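In code, the reflection operator is a one-liner; a sketch in Python, reusing the backward induction above, with a hypothetical obstacle function h:

```python
import numpy as np

def reflect(t, x, y, h, reflection_dates):
    """Discrete reflection J(t, x, y) = y + max(h(x) - y, 0) at the dates in R \\ {0, T}."""
    return y + np.maximum(h(x) - y, 0.0) if t in reflection_dates else y

# inside the backward loop of the scheme:
#   Y_tilde = reflect(t_i, X[:, i], Y[:, i], h, reflection_dates)
```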

3.2 Regularity of Z: Ma and Zhang approach

Representation of Z: The same ideas as above (but more delicate), combined with an integration by parts argument, lead to

Z_t = E_t[ g(X_T) N^t_T + ∫_t^T f(Θ_s) N^t_s ds + ∫_t^T N^t_s dK_s ],

where

N^t_r := ( (r − t)^{-1} ∫_t^r σ(X_s)^{-1} ∇X_s dW_s ) (∇X_t)^{-1} σ(X_t).

Theorem (Ma and Zhang [36]): If σ is uniformly elliptic, σ, b ∈ C¹_b and h ∈ C²_b, then R(Z) ≤ C|π|^{1/2}. If ℜ = π, then

E[ max_{i≤n} |Y_{t_i} − Y^π_{t_i}|² ] + max_{i≤n} E[ sup_{t∈[t_i,t_{i+1}]} |Y_t − Y^π_{t_{i+1}}|² ] + ‖Z^π − Z‖²_{L²_P} ≤ C|π|^{1/2}.

Remark: This requires stronger conditions and yields convergence of order |π|^{1/4} only (instead of |π|^{1/2}).

3.3 Regularity of Z: discretely reflected case

To try to improve the above, one first considers the discretely reflected BSDE

Y^ℜ_t = Ỹ^ℜ_{r_{j+1}} + ∫_t^{r_{j+1}} f(Θ^ℜ_s) ds − ∫_t^{r_{j+1}} Z^ℜ_s dW_s,
Ỹ^ℜ_t = J(t, X_t, Y^ℜ_t),

on each [r_j, r_{j+1}], j ≤ κ − 1, with Ỹ^ℜ_1 = g(X_1) and Θ^ℜ = (X, Y^ℜ, Z^ℜ).

Proposition: Let τ_j be the next reflection time after r_j. There is a version of Z^ℜ such that, for each j ≤ κ − 1 and t ∈ [r_j, r_{j+1}):

Z^ℜ_t = E_t[ ∇g(X_T) Λ_T ∇X_T 1_{τ_j=T} + ∇h(X_{τ_j})(Λ∇X)_{τ_j} 1_{τ_j<T} + ∫_t^{τ_j} ∇_x f(Θ^ℜ_s)(Λ∇X)_s ds ] Λ_t^{-1} (∇X_t)^{-1} σ(X_t).

This works as in the non-reflected case... because the reflection is performed only at fixed discrete times (do the same as above on each time interval).

Main idea to conclude: Assume h = g is C¹_L and f ≡ 0. For simplicity, take Λ ≡ (∇X_t)^{-1} σ(X_t) ≡ 1 (never true unless X = X_0 + W...). Then

Z^ℜ_t = V^j_t := E_t[ ∇h(X_{τ_j}) ∇X_{τ_j} ]

is a martingale on [r_j, r_{j+1}); thus, with i_j such that t_{i_j} = r_j, and using E[|V^j_t − V^j_{t_k}|²] = E[|V^j_t|²] − E[|V^j_{t_k}|²] ≤ E[|V^j_{t_{k+1}}|²] − E[|V^j_{t_k}|²],

Σ_{i=0}^{n−1} E[ ∫_{t_i}^{t_{i+1}} |Z^ℜ_t − Z^ℜ_{t_i}|² dt ] = Σ_{j=0}^{κ−1} Σ_{k=i_j}^{i_{j+1}−1} ∫_{t_k}^{t_{k+1}} E[|V^j_t − V^j_{t_k}|²] dt
≤ |π| Σ_{j=0}^{κ−1} Σ_{k=i_j}^{i_{j+1}−1} E[ |V^j_{t_{k+1}}|² − |V^j_{t_k}|² ]
= |π| ( E[ |V^{κ−1}_{r_κ}|² − |V^0_{r_0}|² ] + Σ_{j=1}^{κ−2} E[ |V^j_{r_{j+1}}|² − |V^{j+1}_{r_{j+1}}|² ] ),

where, by the Lipschitz continuity of ∇h (writing |a|² − |b|² ≤ |a + b| |a − b| and setting η_{r_{j+1}} := |V^j_{r_{j+1}} + V^{j+1}_{r_{j+1}}|),

E[ |V^j_{r_{j+1}}|² − |V^{j+1}_{r_{j+1}}|² ] ≤ E[ η_{r_{j+1}} |V^j_{r_{j+1}} − V^{j+1}_{r_{j+1}}| ]
= E[ η_{r_{j+1}} | E_{r_{j+1}}[ ∇h(X_{τ_j})∇X_{τ_j} − ∇h(X_{τ_{j+1}})∇X_{τ_{j+1}} ] | ]
≤ E[ η (τ_{j+1} − τ_j)^{1/2} ],

so that

Σ_{j=1}^{κ−2} E[ |V^j_{r_{j+1}}|² − |V^{j+1}_{r_{j+1}}|² ] ≤ √κ E[ η (τ_{κ−1} − τ_1)^{1/2} ].

• Similarly, if ∇h ∈ C², we apply Itô's lemma to obtain

E[ |V^j_{r_{j+1}}|² − |V^{j+1}_{r_{j+1}}|² ] ≤ E[ η (τ_{j+1} − τ_j) ],

so that

Σ_{j=1}^{κ−2} E[ |V^j_{r_{j+1}}|² − |V^{j+1}_{r_{j+1}}|² ] ≤ E[ η (τ_{κ−1} − τ_1) ].

Theorem (B. & Chassagneux [7]): If (H1): h ∈ C¹_L, then

Σ_{i=0}^{n−1} E[ ∫_{t_i}^{t_{i+1}} |Z^ℜ_t − Z^ℜ_{t_i}|² dt ] ≤ C √κ |π|.

If (H2): h ∈ C²_L and σ ∈ C¹_L, then

Σ_{i=0}^{n−1} E[ ∫_{t_i}^{t_{i+1}} |Z^ℜ_t − Z^ℜ_{t_i}|² dt ] ≤ C |π|.

3.4 Convergence speed: discretely reflected case

Theorem (B. & Chassagneux [7]): Let (H1) hold. Then

max_{i≤n−1} ‖ |Y^π_{t_i} − Y^ℜ_{t_i}| + sup_{t∈(t_i,t_{i+1}]} |Y^π_{t_{i+1}} − Y^ℜ_t| ‖_{L²} ≤ C (κ^{1/4}|π|^{1/2} + |π|^{1/4})

and

‖Z^π − Z^ℜ‖_{L²_P} ≤ C (κ^{1/2}|π|^{1/2} + |π|^{1/4}).

Theorem (B. & Chassagneux [7]): Let (H2) hold. Then

max_{i≤n−1} ‖ |Y^π_{t_i} − Y^ℜ_{t_i}| + sup_{t∈(t_i,t_{i+1}]} |Y^π_{t_{i+1}} − Y^ℜ_t| ‖_{L²} ≤ C|π|^{1/2}

and

‖Z^π − Z^ℜ‖_{L²_P} ≤ C (κ^{1/2}|π|^{1/2} + |π|^{1/2}).

Remark: One can do better when X^π = X on π.

3.5 Convergence speed: continuously reflected case

Theorem (B. & Chassagneux [7]): Take ℜ = π. Then

max_{i≤n−1} ‖ |Y^π_{t_i} − Y_{t_i}| + sup_{t∈(t_i,t_{i+1}]} |Y^π_{t_{i+1}} − Y_t| ‖_{L²} ≤ C_L α(π),

with α(π) = |π|^{1/4} under (H1) and α(π) = |π|^{1/2} under (H2). Moreover, under (H1),

‖Z^π − Z‖_{L²_P} ≤ C_L |π|^{1/4}.

If (H2) holds and X^π = X on π, then

‖Z^π − Z‖_{L²_P} ≤ C_L |π|^{1/2}.

4 BSDEs in a domain

4.1 Approximation scheme

We consider

X_t = X_0 + ∫_0^t b(X_s) ds + ∫_0^t σ(X_s) dW_s,
Y_t = g(X_τ) + ∫_{t∧τ}^τ f_s(X_s, Y_s, Z_s) ds − ∫_{t∧τ}^τ Z_s dW_s,

where

τ := inf{t ≥ 0 : X_t ∉ O} ∧ T,

for some open set O.

We approximate the first exit time τ by

τ^π := inf{t ∈ π : X^π_t ∉ O} ∧ 1.

The Euler scheme is defined as previously, with Y^π_{τ^π} = g(X^π_{τ^π}) and, for t_i < τ^π,

Z^π_{t_i} = n E_{t_i}[Y^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i})],
Y^π_{t_i} = E_{t_i}[Y^π_{t_{i+1}}] + (1/n) f(X^π_{t_i}, Y^π_{t_i}, Z^π_{t_i}).

For Lipschitz continuous coefficients, we get a similar error bound plus the exit time approximation error:

Err ≤ C ( |π|^{1/2} + R(Y) + R(Z) + Err(τ − τ^π) ),

where the first three terms are the previous ones.

4.2 Regularity of (Y, Z)

We cannot follow the Malliavin derivative approach anymore, because X_τ is not Malliavin differentiable in general...

However, one can follow the PDE approach (say f ≡ 0): ∂_x v(·, X)∇X is a martingale up to the exit time, which can be read from the PDE, and therefore

Z_t = ∂_x v(t, X_t) ∇X_t 1_{t≤τ} (∇X_t)^{-1} σ(X_t)
    = E_t[∂_x v(τ, X_τ) ∇X_τ] 1_{t≤τ} (∇X_t)^{-1} σ(X_t)
    ≠ E_t[∂_x g(X_τ) ∇X_τ] 1_{t≤τ} (∇X_t)^{-1} σ(X_t).

If ∂_x v is bounded, we can use the same martingale techniques as in the case O = R^d to bound R(Z) (and Z as well, leading to a bound on R(Y))!

Assume from now on that:

• All coefficients are Lipschitz.
• O := ⋂_{ℓ=1}^m O_ℓ, where each O_ℓ is C² with a compact boundary.
• Exterior sphere condition + interior truncated cone condition.
• The boundary satisfies a non-characteristic condition outside a neighborhood of C := ⋂_{ℓ≠k} ∂O_ℓ ∩ ∂O_k, and σ is uniformly elliptic on a neighborhood of C.
• g ∈ C²(O) and ‖∂_x g‖ + ‖∂²_{xx} g‖ ≤ L on O.

Proposition (B. & Menozzi [10]): Let v be such that Y = v(·, X). Then v is uniformly Lipschitz in space and 1/2-Hölder in time.

Corollary: Assume that the above conditions hold. Then

R(Y) + R(Z) = O(|π|^{1/2}).

4.3 Study of the exit time approximation error term (strong)

We have (on {τ^π ≤ τ})

Y^π_{τ^π} − Y_{τ^π} = g(X^π_{τ^π}) − g(X_τ) − ∫_{τ^π}^τ f(···) ds + ∫_{τ^π}^τ Z_s dW_s.

Thus, because of the Itô integral,

E[|Y^π_{τ^π} − Y_{τ^π}|²] = O( E[|g(X^π_{τ^π}) − g(X_τ)|²] + E[ξ|τ^π − τ|²] ) + O( E[ξ|τ^π − τ|] ).

This leads to Err(τ − τ^π) = O_ε( E[|τ^π − τ|]^{1/2−ε} ).

Theorem (B. & Menozzi [10], B., Geiss & Gobet [9]):

E[|τ^π − τ|] ≤ O(|π|^{1/2}).

By combining everything:

Theorem (B. & Menozzi [10]):

Err ≤ C ( |π|^{1/2} + R(Y) + R(Z) + Err(τ − τ^π) ) ≤ C_ε |π|^{1/4−ε},

where R(Y) + R(Z) = O(|π|^{1/2}) by the gradient bound, and Err(τ − τ^π) = O( E[ξ|τ^π − τ|]^{1/2} ) = O_ε(|π|^{1/4−ε}).

4.4 Study of the exit time approximation error term (weak)

We now stop at τ ∧ τ^π:

Y^π_{τ∧τ^π} − Y_{τ∧τ^π} = E_{τ∧τ^π}[g(X^π_{τ^π}) − g(X_τ)] − E_{τ∧τ^π}[∫_{τ∧τ^π}^{τ∨τ^π} f(···) ds]
= O(|π|^{1/2}, ω) + E_{τ∧τ^π}[∫_{τ∧τ^π}^{τ∨τ^π} F(ω, g, Dg, D²g)(···) ds] − E_{τ∧τ^π}[∫_{τ∧τ^π}^{τ∨τ^π} f(···) ds],

which leads to

E[|Y^π_{τ∧τ^π} − Y_{τ∧τ^π}|²] = O(|π|) + O( E[ E_{τ∧τ^π}[ξ|τ^π − τ|]² ] ) = O_ε(|π|^{1−ε}).

By combining everything:

Theorem (B. & Menozzi [10]):

max_{i<n} E[ sup_{t∈[t_i,t_{i+1}]} |Y_t − Y^π_{t_i}|² 1_{t≤τ∧τ^π} ]^{1/2} + E[ Σ_i ∫_{t_i}^{t_{i+1}} ‖Z_t − Z^π_{t_i}‖² 1_{t≤τ∧τ^π} dt ]^{1/2}
≤ C ( |π|^{1/2} + R(Y) + R(Z) + Err(τ − τ^π) ) ≤ C_ε |π|^{1/2−ε},

where R(Y) + R(Z) = O(|π|^{1/2}) (gradient bound) and Err(τ − τ^π) = O( E[ E_{τ∧τ^π}[ξ|τ^π − τ|]² ] ) = O_ε(|π|^{1/2−ε}).

In particular, |Y_0 − Y^π_0| ≤ C_ε |π|^{1/2−ε}.

Remark: One should be able to get rid of the ε, at least if g ∈ C²_b and f is bounded.

5 Irregular terminal condition

This is the case where g is not Lipschitz, possibly discontinuous. Based on earlier works of S. Geiss, it was first treated by Gobet & Makhlouf [31] (see also Geiss, Geiss & Gobet [28]).

Main idea: The important quantity is

sup_{t<1} V_{t,1}/(1 − t)^α,  where V_{t,1} := E[ |g(X_1) − E_t[g(X_1)]|² ].

We say that g ∈ L^{2,α}, 0 < α ≤ 1, if the above is finite and g(X_1) ∈ L² (if X_1 = W_1, this corresponds to a Besov space).

When g ∈ L^{2,α},

E[|∂²_{xx} v(t, X_t)|²] ≤ C V_{t,1}/(1 − t)².

This is obtained, under smoothness/uniform ellipticity conditions, by estimating the second-order derivative (representation + integration by parts).

Because Z is essentially ∂_x v(·, X) (up to nicely behaving terms), not surprisingly:

∫_0^1 E[|Z_t − Z̄_t|²] dt ≤ C ( ··· + Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} (t_{i+1} − s) E[|∂²_{xx} v(s, X_s)|²] ds ),

i.e.

∫_0^1 E[|Z_t − Z̄_t|²] dt ≤ C ( |π| + Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} (t_{i+1} − s) V_{s,1}/(1 − s)² ds ).

If one takes t_i = 1 − (1 − i/n)^{1/β}, with β < α, then (using V_{s,1} ≤ C(1 − s)^α)

Σ_i ∫_{t_i}^{t_{i+1}} (t_{i+1} − s) V_{s,1}/(1 − s)² ds = Σ_i ∫_{t_i}^{t_{i+1}} [(t_{i+1} − s)/(1 − s)^{1−β}] (1 − s)^{1−β} V_{s,1}/(1 − s)² ds
≤ C sup_i [ (t_{i+1} − t_i)/(1 − t_i)^{1−β} ] ∫_0^1 (1 − s)^{1−β+α−2} ds
= C sup_i (t_{i+1} − t_i)/(1 − t_i)^{1−β} ≤ C/(βn).

Conclusion: For t_i = 1 − (1 − i/n)^{1/β}, with β < α, we retrieve

∫_0^1 E[|Z_t − Z̄_t|²] dt ≤ C|π|.

If one takes a uniform grid, then one only has

∫_0^1 E[|Z_t − Z̄_t|²] dt ≤ C|π|^α.

Example: If g is α-Hölder with polynomial growth, then g(X_1) ∈ L^{2,α}. If g(x) = 1_{x≥x_o} and X_1 = W_1, then g(X_1) ∈ L^{2,1/2}.
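A two-line sketch of the tuned grid (Python; n and β are free parameters):

```python
import numpy as np

def tuned_grid(n, beta):
    """Grid t_i = 1 - (1 - i/n)^(1/beta), which accumulates points near t = 1
    to compensate for the blow-up of E[|d^2 v / dx^2|^2] when g is irregular."""
    i = np.arange(n + 1)
    return 1.0 - (1.0 - i / n) ** (1.0 / beta)

# e.g. for g(x) = 1_{x >= x_o} and X_1 = W_1 (so alpha = 1/2), take beta < 1/2:
grid = tuned_grid(n=100, beta=0.4)
```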

6 Additional references

More can be done; here are some additional contributions:

• Doubly reflected BSDEs: Chassagneux [15].
• Multi-dimensional BSDEs with oblique reflection: Chassagneux, Elie & Kharroubi [18].
• Quadratic BSDEs: Chassagneux & Richou [19].
• Coupled FBSDEs: Delarue & Menozzi [24], Bender & Zhang [4].
• Second order BSDEs: Fahim, Touzi & Warin [27]; see also B., Elie & Touzi [8].
• Linear multi-step or Runge-Kutta type schemes: Chassagneux [16], and Chassagneux & Crisan [17].

Part III

Computation of conditional expectations

1 Motivation

The typical numerical scheme is of the form

Y^π_{t_i} = E_{t_i}[Y^π_{t_{i+1}}] + (1/n) f(X^π_{t_i}, Y^π_{t_i}, Z^π_{t_i}),
Z^π_{t_i} = n E_{t_i}[Y^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i})],

where Y^π_{t_n} = g(X^π_{t_n}).

To compute these quantities, one needs to be able to estimate efficiently the different conditional expectations.

2 Generalities

In practice, we approximate

Y^π_1 = g(X^π_1),
Y^π_{t_i} = E[Y^π_{t_{i+1}} | X^π_{t_i}] + (1/n) f(X^π_{t_i}, Y^π_{t_i}, Z^π_{t_i})

by

Ŷ^π_1 = g(X^π_1),
Ŷ^π_{t_i} = Ê[Ŷ^π_{t_{i+1}} | X^π_{t_i}] + (1/n) f(X^π_{t_i}, Ŷ^π_{t_i}, Ẑ^π_{t_i}),
Ẑ^π_{t_i} = n Ê[Ŷ^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i}) | X^π_{t_i}],

where Ê[Ŷ^π_{t_{i+1}} | X^π_{t_i}] and Ê[Ŷ^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i}) | X^π_{t_i}] are estimators of E[Ŷ^π_{t_{i+1}} | X^π_{t_i}] and E[Ŷ^π_{t_{i+1}} (W_{t_{i+1}} − W_{t_i}) | X^π_{t_i}].

Theorem (B. & Touzi [11], ...):

‖Y^π_{t_i} − Ŷ^π_{t_i}‖_{L^p} ≤ n C_p max_{0≤j≤n−1} E_{j,p}(Ê, n)

with

E_{j,p}(Ê, n) := ‖E[Ŷ^π_{t_{j+1}} | X^π_{t_j}] − Ê[Ŷ^π_{t_{j+1}} | X^π_{t_j}]‖_{L^p}
+ ‖E[Ŷ^π_{t_{j+1}} (W_{t_{j+1}} − W_{t_j}) | X^π_{t_j}] − Ê[Ŷ^π_{t_{j+1}} (W_{t_{j+1}} − W_{t_j}) | X^π_{t_j}]‖_{L^p}.

Remarks:
1. At best, E_{j,p}(Ê, n) ∼ N^{−1/2} if computed by pure Monte-Carlo.
2. ‖Y_{t_i} − Y^π_{t_i}‖_{L^p} ∼ n^{−1/2}; hence, to get ‖Y^π_{t_i} − Ŷ^π_{t_i}‖_{L^p} ∼ n^{−1/2}, we need to take at least N = n³ simulations when the global error is of order n^{−1/2} + n N^{−1/2}.

3 Integration by parts technique

See B., Ekeland & Touzi [6], B. & Warin [12], Crisan, Manolarakis & Touzi [23].

3.1 Conditional expectation representation in the Gaussian case

Set v(W_{t_i}) = Y^π_{t_i} (we omit the dependence of v on the time variable).

Reduction of the problem:

E[v(W_{t_{i+1}}) | W_{t_i} = w] = E[δ_w(W_{t_i}) v(W_{t_{i+1}})] / E[δ_w(W_{t_i})],

with δ_w the Dirac mass at w.

• Integration by parts argument (f_δ = density of N(0, δ)):

E[δ_w(W_{t_i}) v(W_{t_{i+1}})]
= ∫∫ δ_w(x) v(x + y) f_{t_i}(x) f_{t_{i+1}−t_i}(y) dx dy
= ∫∫ 1_{x≥w} [ v(x + y) x/t_i − v′(x + y) ] f_{t_i}(x) f_{t_{i+1}−t_i}(y) dx dy
= ∫∫ 1_{x≥w} v(x + y) [ x/t_i − y/(t_{i+1} − t_i) ] f_{t_i}(x) f_{t_{i+1}−t_i}(y) dx dy
= E[ 1_{W_{t_i}≥w} v(W_{t_{i+1}}) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ) ].

Alternative formulation:

E[v(W_{t_{i+1}}) | W_{t_i} = w]
= E[ 1_{W_{t_i}≥w} v(W_{t_{i+1}}) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ) ] / E[ 1_{W_{t_i}≥w} W_{t_i}/t_i ],

where the denominator equals E[δ_w(W_{t_i})] = f_{W_{t_i}}(w).

Monte-Carlo estimator: With (W^{(ℓ)})_ℓ, N copies of W,

Ê[v(W_{t_{i+1}}) | W_{t_i} = w] :=
[ (1/N) Σ_ℓ 1_{W^{(ℓ)}_{t_i}≥w} v(W^{(ℓ)}_{t_{i+1}}) ( W^{(ℓ)}_{t_i}/t_i − (W^{(ℓ)}_{t_{i+1}} − W^{(ℓ)}_{t_i})/(t_{i+1} − t_i) ) ]
/ [ (1/N) Σ_ℓ 1_{W^{(ℓ)}_{t_i}≥w} W^{(ℓ)}_{t_i}/t_i ].
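A minimal Monte-Carlo sketch of this estimator (Python; the function v, the point w and the times are illustrative placeholders):

```python
import numpy as np

def cond_exp_malliavin(v, w, ti, ti1, N, rng):
    """Estimate E[v(W_{t_{i+1}}) | W_{t_i} = w] by the integration-by-parts
    (Malliavin weight) representation, without any regression."""
    Wti = rng.normal(0.0, np.sqrt(ti), N)                # W_{t_i}
    Wti1 = Wti + rng.normal(0.0, np.sqrt(ti1 - ti), N)   # W_{t_{i+1}}
    ind = (Wti >= w)
    weight = Wti / ti - (Wti1 - Wti) / (ti1 - ti)        # Malliavin weight
    num = np.mean(ind * v(Wti1) * weight)
    den = np.mean(ind * Wti / ti)      # estimates the density f_{W_{t_i}}(w)
    return num / den

rng = np.random.default_rng(0)
# sanity check: E[W_{t_{i+1}} | W_{t_i} = w] = w, so this should print ~0.5
print(cond_exp_malliavin(lambda x: x, w=0.5, ti=0.5, ti1=0.6, N=10**6, rng=rng))
```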

Variance estimation:

Var[ (W^{(ℓ)}_{t_{i+1}} − W^{(ℓ)}_{t_i})/(t_{i+1} − t_i) ]^{1/2} = (t_{i+1} − t_i)^{1/2}/(t_{i+1} − t_i) = n^{1/2},

leading to

Var[ (1/N) Σ_ℓ 1_{W^{(ℓ)}_{t_i}≥w} v(W^{(ℓ)}_{t_{i+1}}) ( W^{(ℓ)}_{t_i}/t_i − (W^{(ℓ)}_{t_{i+1}} − W^{(ℓ)}_{t_i})/(t_{i+1} − t_i) ) ]^{1/2} ∼ n^{1/2}/N^{1/2}.

3.2 Variance reduction in the Gaussian case: control variate

E[ v(W_{t_{i+1}}) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ) ]
= ∫∫ v(x + y) [ x/t_i − y/(t_{i+1} − t_i) ] f_{t_i}(x) f_{t_{i+1}−t_i}(y) dx dy
= ∫∫ ( v′(x + y) − v′(x + y) ) f_{t_i}(x) f_{t_{i+1}−t_i}(y) dx dy = 0

(integrating by parts in x for the first term and in y for the second).

We can then replace 1_{W_{t_i}≥w} by (1_{W_{t_i}≥w} − c(w)):

E[v(W_{t_{i+1}}) | W_{t_i} = w]
= E[ (1_{W_{t_i}≥w} − c(w)) v(W_{t_{i+1}}) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ) ] / E[ (1_{W_{t_i}≥w} − c(w)) W_{t_i}/t_i ].

3.3 Variance reduction in the Gaussian case: localization

Take ϕ smooth in L²(R) with ϕ(0) = 1:

E[δ_w(W_{t_i}) v(W_{t_{i+1}})]
= ∫∫ δ_w(x) ϕ(x − w) v(x + y) f_{t_i}(x) f_{t_{i+1}−t_i}(y) dx dy
= . . .
= E[ 1_{W_{t_i}≥w} v(W_{t_{i+1}}) { ϕ(W_{t_i} − w) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ) − ϕ′(W_{t_i} − w) } ].

This shows that

E[v(W_{t_{i+1}}) | W_{t_i} = w]
= E[ 1_{W_{t_i}≥w} v(W_{t_{i+1}}) { ϕ(W_{t_i} − w) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ) − ϕ′(W_{t_i} − w) } ]
/ E[ 1_{W_{t_i}≥w} { ϕ(W_{t_i} − w) W_{t_i}/t_i − ϕ′(W_{t_i} − w) } ].

One can then try to minimize an indicator of the variance:

min_{ϕ∈L²(R), ϕ(0)=1} ∫_R E[ 1_{W_{t_i}≥w} ( F ϕ(W_{t_i} − w) − G ϕ′(W_{t_i} − w) )² ] dw,

with

F = v(W_{t_{i+1}}) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ),  G = v(W_{t_{i+1}}).

Calculus of variations: ϕ is optimal iff, for all smooth φ with φ(0) = 0 and compact support, and ε > 0,

∫_R E[ 1_{W_{t_i}≥w} ( F ϕ(W_{t_i} − w) − G ϕ′(W_{t_i} − w) )² ] dw
≤ ∫_R E[ 1_{W_{t_i}≥w} ( F(ϕ ± εφ)(W_{t_i} − w) − G(ϕ′ ± εφ′)(W_{t_i} − w) )² ] dw.

Sending ε → 0:

0 = ∫_R E[ 1_{W_{t_i}≥w} ( F ϕ(W_{t_i} − w) − G ϕ′(W_{t_i} − w) )( F φ(W_{t_i} − w) − G φ′(W_{t_i} − w) ) ] dw
  = E[ ∫_0^∞ ( F ϕ(y) − G ϕ′(y) )( F φ(y) − G φ′(y) ) dy ]   (Fubini + change of variable y = W_{t_i}(ω) − w)
  = E[ ∫_0^∞ φ(y) ( F² ϕ(y) − G² ϕ″(y) ) dy ]   (integration by parts)
  = ∫_0^∞ φ(y) ( E[F²] ϕ(y) − E[G²] ϕ″(y) ) dy

⇒ E[F²] ϕ(y) − E[G²] ϕ″(y) = 0.

Optimal localizing function: ϕ(y) = e^{−ηy} with

η² = E[F²]/E[G²] = E[ v(W_{t_{i+1}})² ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) )² ] / E[ v(W_{t_{i+1}})² ] ∼ n.

For ϕ(y) = e^{−ηy} we have ϕ′(y) = −ηϕ(y), so that

E[ 1_{W_{t_i}≥w} v(W_{t_{i+1}}) { ϕ(W_{t_i} − w) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) ) − ϕ′(W_{t_i} − w) } ]
= E[ 1_{W_{t_i}≥w} v(W_{t_{i+1}}) ϕ(W_{t_i} − w) ( W_{t_i}/t_i − (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) + η ) ],

where

E[ ( (W_{t_{i+1}} − W_{t_i})/(t_{i+1} − t_i) )² ]^{1/2} = √n ∼ η.
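Continuing the Monte-Carlo sketch above, localization just multiplies the weight by e^{−η(W_{t_i}−w)} and shifts it by η, with η = √n as derived (Python; still an illustrative sketch):

```python
import numpy as np

def cond_exp_localized(v, w, ti, ti1, N, rng):
    """Same estimator as cond_exp_malliavin, with the exponential localization
    phi(y) = exp(-eta*y), eta ~ sqrt(n), which kills the variance blow-up
    coming from paths whose W_{t_i} is far above w."""
    eta = 1.0 / np.sqrt(ti1 - ti)                    # ~ sqrt(n) on a uniform grid
    Wti = rng.normal(0.0, np.sqrt(ti), N)
    Wti1 = Wti + rng.normal(0.0, np.sqrt(ti1 - ti), N)
    ind = (Wti >= w)
    loc = np.exp(-eta * (Wti - w))                   # localizing factor phi(W_{t_i} - w)
    weight = Wti / ti - (Wti1 - Wti) / (ti1 - ti) + eta
    num = np.mean(ind * v(Wti1) * loc * weight)
    den = np.mean(ind * loc * (Wti / ti + eta))      # same localization in the denominator
    return num / den
```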

3.4 Full numerical scheme (f indep. of Z for simplicity)

We consider N copies (X^{π(1)}, . . . , X^{π(N)}) of X^π.

Initialization: For all j, Ŷ^{π(j)}_1 = g(X^{π(j)}_1).

Backward induction: For i = n − 1, . . . , 1, we set, for all j,

Ŷ^{π(j)}_{t_i} = Ê[ Ŷ^π_{t_{i+1}} | X^π_{t_i} = X^{π(j)}_{t_i} ] + (1/n) f(X^{π(j)}_{t_i}, Ŷ^{π(j)}_{t_i}),

where

Ê[ Ŷ^π_{t_{i+1}} | X^π_{t_i} = X^{π(j)}_{t_i} ]
= Σ_{ℓ≤N} 1_{X^{π(j)}_{t_i} ≤ X^{π(ℓ)}_{t_i}} Ŷ^{π(ℓ)}_{t_{i+1}} Sh^{(ℓ)}( ϕ(X^{π(ℓ)}_{t_i} − X^{π(j)}_{t_i}) )
/ Σ_{ℓ≤N} 1_{X^{π(j)}_{t_i} ≤ X^{π(ℓ)}_{t_i}} Sh^{(ℓ)}( ϕ(X^{π(ℓ)}_{t_i} − X^{π(j)}_{t_i}) )

(up to an additional truncation given by a-priori bounds).

Remark: This requires only O(N ln N) operations (first sort the simulations), not N².

In dimension d, we can do the same by performing d integrations by parts. If f depends on Z, the corresponding conditional expectation is computed similarly.

Theorem (B. & Touzi [11]):

‖Y^π_{t_i} − Ŷ^π_{t_i}‖_{L^p} ≤ n C_p max_{0≤j≤n−1} E_{j,p}(Ê, n).

If we choose ϕ_n = ϕ(√n x), then

max_{1≤j≤n−1} E_{j,p}(Ê, n) ≤ C n^{d/(4p)} N^{−1/(2p)}.

Global error:

max_{0≤i≤n} ‖Ŷ^π_{t_i} − Y_{t_i}‖_{L^p} ≤ C_p ( n^{−1/2} + n · n^{d/(4p)} N^{−1/(2p)} ),

where
• n^{−1/2}: discretization error;
• n: number of regression estimations;
• N^{−1/(2p)}: convergence rate of the regression estimator in terms of the number of simulations N;
• n^{d/(4p)}: "variance" of the regression operator.

For an L^p error of order n^{−1/2}: N ∼ n^{3p + d/2}.

3.5 The general case

• The conditional expectations can be written similarly by using a Malliavin calculus integration by parts argument. The terms corresponding to the Sh are now Skorohod integrals that can be decomposed into Itô and Lebesgue integrals; see B., Ekeland & Touzi [6]. The analysis of the error is the same; see B. & Touzi [11].

• The sums in the estimators can be computed efficiently by using the equivalent of a sorting method in dimension d. Its complexity is of order O(N (ln N)^{(d−1)∨1}); see B. & Warin [12].

• An alternative formulation with reduced complexity in the computation of the Malliavin weights is given in Crisan, Manolarakis & Touzi [23]. It consists in neglecting a negligible term whose computational cost is, however, not negligible.

Page 88: Bruno Bouchard - CEREMADEbouchard/pdf/BSDE...2Generalities72 3Integrationbypartstechnique74 3.1ConditionalexpectationrepresentationintheGaussiancase. .74 3.2VarianceReductionintheGaussianCase:

3.6 Importantly

• The localization by the ϕ functions is crucial in practice.

• Variance reduction by control variate is also important. One can re-place Y π by Y π − Y such that the conditional expectations of Yti+1

andYti+1

(Wti+1−Wti) are explicit.

• Truncation is also crucial: a-priori bounds for Y and Z are often easyto compute (recall the conditional expectation expression for Z obtainedabove).


4 Non-parametric regression approach

Initiated by Carrière [14] and Longstaff & Schwartz [34] in the context of American option pricing (first made rigorous by Clément, Lamberton & Protter [20]).

General idea: replace $\mathbb{E}[Y^\pi_{t_{i+1}} \,|\, X^\pi_{t_i}]$ by an estimator of its projection on the space generated by $(\psi_m(X^\pi_{t_i}))_{m \le M}$ for some deterministic “basis functions” $(\psi_m)_{m \le M}$.
Given simulations $(\xi^{(j)}, X^{\pi(j)}_{t_i})_{j \le N}$ of $(\xi, X^\pi_{t_i})$, we set
\[
\hat{\mathbb{E}}\big[\xi \,\big|\, X^\pi_{t_i} = X^{\pi(j)}_{t_i}\big] := \sum_{m=1}^M \alpha_{t_i,m}\, \psi_m\big(X^{\pi(j)}_{t_i}\big)
\]
where $(\alpha_{t_i,m})_{m \le M}$ minimizes
\[
\sum_{j=1}^N \Big| \xi^{(j)} - \sum_{m=1}^M \alpha_m\, \psi_m\big(X^{\pi(j)}_{t_i}\big) \Big|^2
\]
over $(\alpha_m)_{m \le M} \in \mathbb{R}^M$.
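A minimal sketch of this least-squares projection, with a monomial basis as an illustrative choice (the function name and parameters are not from the text):

    import numpy as np

    def ls_cond_exp(x, xi, M=4):
        """Estimate E[xi | X = x] by least-squares projection of xi on the
        basis (psi_m(x))_{m <= M} = (1, x, ..., x**(M-1))."""
        psi = np.vander(x, M, increasing=True)            # regression matrix
        alpha, *_ = np.linalg.lstsq(psi, xi, rcond=None)  # minimizes sum_j |xi_j - psi @ alpha|^2
        return psi @ alpha                                # fitted values at the sample points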


4.1 Full numerical scheme (f indep. of Z for simplicity)

We consider $N$ copies $(X^{\pi(1)}, \ldots, X^{\pi(N)})$ of $X^\pi$.

Initialization: For all $j$: $Y^{\pi(j)}_1 = g(X^{\pi(j)}_1)$.

Backward induction (for the explicit scheme): For $i = n-1, \ldots, 1$, we set, for all $j$:
\[
Y^{\pi(j)}_{t_i} = \hat{\mathbb{E}}\Big[ Y^\pi_{t_{i+1}} + \frac{1}{n}\, f\big(X^\pi_{t_i}, Y^\pi_{t_{i+1}}\big) \,\Big|\, X^\pi_{t_i} = X^{\pi(j)}_{t_i} \Big]
\]
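Combining this induction with the least-squares estimator sketched above gives, under the same illustrative assumptions ($X^\pi$ a Brownian motion, g and f toy choices not taken from the text):

    import numpy as np

    rng = np.random.default_rng(1)
    n, N = 20, 10_000
    dW = rng.normal(0.0, np.sqrt(1.0 / n), size=(N, n))
    X = np.concatenate([np.zeros((N, 1)), np.cumsum(dW, axis=1)], axis=1)

    g = lambda x: np.maximum(x, 0.0)   # terminal condition (illustrative)
    f = lambda x, y: -0.05 * y         # driver, independent of Z (illustrative)

    Y = g(X[:, n])                     # initialization
    for i in range(n - 1, 0, -1):      # explicit backward induction
        Y = ls_cond_exp(X[:, i], Y + f(X[:, i], Y) / n)   # ls_cond_exp from the sketch above
    print("estimate of Y at t_1:", Y.mean())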


4.2 Choice of the basis functions

Ideally, one should use a first element $\psi_{t_i,1}$ that is supposed to be close to the real conditional expectation, and then complete the basis so that it is orthogonal for the law of $X^\pi_{t_i}$ (or a proxy).

Very often, people use polynomials. The choice is very difficult in practice in “high” dimension... and the induced error is difficult to quantify.

It turns out to be more stable to use local polynomials, e.g. piecewise linear maps on a space partition. When the size of the cells of the partition vanishes, we are certain that convergence holds (and we can quantify it, think of $v$ being Lipschitz).

This partition should be consistent with the law of $X^\pi_{t_i}$. It can be constructed in an adaptive way: simulate copies of $X^\pi_{t_i}$, then build a partition such that each cell contains approximately the same number of points. This can be done efficiently by sorting-like methods, see [12]. A minimal sketch follows.
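A minimal sketch of such an adaptive, equal-frequency partition with piecewise-linear regression (one-dimensional case; all names and parameters are illustrative):

    import numpy as np

    def local_linear_cond_exp(x, xi, K=10):
        """Piecewise-linear least-squares estimate of E[xi | X = x] on an
        adaptive partition: each of the K cells contains ~len(x)/K points."""
        edges = np.quantile(x, np.linspace(0.0, 1.0, K + 1))   # equal-frequency bins
        cells = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, K - 1)
        out = np.empty_like(xi)
        for k in range(K):
            mask = cells == k
            A = np.column_stack([np.ones(mask.sum()), x[mask]])  # basis (1, x) on the cell
            coef, *_ = np.linalg.lstsq(A, xi[mask], rcond=None)
            out[mask] = A @ coef
        return out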


4.3 Error analysis

The precise error analysis due to estimating the coefficients of the regression has been performed in Gobet, Lemor & Warin [33].

The expression of the error takes half a page... see the paper.


4.4 The multistep forward dynamic programming scheme

The MDP scheme consists in replacing
\[
Y^\pi_{t_i} := \mathbb{E}_{t_i}\big[Y^\pi_{t_{i+1}}\big] + \frac{1}{n}\, f\big(X^\pi_{t_i}, Y^\pi_{t_i}, Z^\pi_{t_i}\big),
\qquad
Z^\pi_{t_i} := n\, \mathbb{E}_{t_i}\big[Y^\pi_{t_{i+1}} (W_{t_{i+1}} - W_{t_i})\big],
\]
by
\[
Y^\pi_{t_i} := \mathbb{E}_{t_i}\Big[ g(X^\pi_{t_n}) + \frac{1}{n} \sum_{k=i}^{n-1} f\big(X^\pi_{t_k}, Y^\pi_{t_{k+1}}, Z^\pi_{t_k}\big) \Big]
\]
\[
Z^\pi_{t_i} := n\, \mathbb{E}_{t_i}\Big[ \Big( g(X^\pi_{t_n}) + \frac{1}{n} \sum_{k=i+1}^{n-1} f\big(X^\pi_{t_k}, Y^\pi_{t_{k+1}}, Z^\pi_{t_k}\big) \Big) (W_{t_{i+1}} - W_{t_i}) \Big].
\]

In terms of discrete time approximation, it is equivalent to the forward version of our scheme; the convergence is not impacted as $|\pi| \to 0$.

However, it is shown in Gobet & Turkedjiev [32] that it provides a better control on the conditional expectation estimation error: this procedure avoids the propagation of the estimation error along the backward algorithm. A minimal sketch is given below.
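Under the same illustrative assumptions as before (Brownian $X^\pi$, driver independent of $Z$, and ls_cond_exp, X, g, f, n, N from the earlier sketches), the multistep induction reads:

    import numpy as np

    # Reuses X, g, f, n, N and ls_cond_exp from the earlier sketches.
    Y_next = g(X[:, n])               # Y at t_{i+1}, initialized at the terminal time
    cum_f = np.zeros(N)               # (1/n) * sum_{k >= i} f(X_{t_k}, Y_{t_{k+1}})
    for i in range(n - 1, 0, -1):
        cum_f += f(X[:, i], Y_next) / n
        # Regress the terminal payoff plus the accumulated driver on X_{t_i};
        # the estimation error does not propagate through previous regressions.
        Y_next = ls_cond_exp(X[:, i], g(X[:, n]) + cum_f)
    print("MDP estimate of Y at t_1:", Y_next.mean())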


4.5 Parallelized algorithm

A version of the algorithm that can be parallelized on GPUs has been proposed in Gobet et al. [30]:

• Fix hyper-cubes $(H_k)_{k \le K}$.

• For each $k$, simulate $N$ paths of $X^\pi$ on $[t_i, 1]$, starting from points drawn i.i.d. in $H_k$ according to a conditional logistic distribution.

• Backward induction: For each $i = n-1, \ldots, 0$, use the MDP scheme to compute $Y^\pi_{t_i}$ given that $X^\pi_{t_i} \in H_k$, by using the simulated paths of $X^\pi$ on $[t_i, 1]$ starting from $H_k$ and the knowledge of the estimated functional form of $(Y^\pi_{t_j})_{i < j \le n}$.

Each hyper-cube can be treated in parallel at each backward step.


5 Quality test

5.1 The case of American options

For American options, the algorithm becomes:

Initialization: For all $j$: $Y^{\pi(j)}_1 = g(X^{\pi(j)}_1)$.

Backward induction: For $i = n-1, \ldots, 1$, we set, for all $j$:
\[
Y^{\pi(j)}_{t_i} = \max\Big\{ g\big(X^{\pi(j)}_{t_i}\big),\; \hat{\mathbb{E}}\big[Y^\pi_{t_{i+1}} \,\big|\, X^\pi_{t_i} = X^{\pi(j)}_{t_i}\big] \Big\}
\]

Because of the max operator, one can expect the estimator to be upper-biased: $Y^{\pi(1)}_0 \ge Y^{<}_0$, where $Y^{<}$ is the price of the Bermudan option with discrete exercise times.


Alternatively, one can use:

Initialization: For all $j$: $Y^{\pi(j)}_1 = g(X^{\pi(j)}_1)$ and $\tau^{(j)} = T$.

Backward induction: For $i = n-1, \ldots, 1$, we set, for all $j$:

• Case 1: If $g(X^{\pi(j)}_{t_i}) < \hat{\mathbb{E}}\big[g(X^\pi_\tau) \,\big|\, X^\pi_{t_i} = X^{\pi(j)}_{t_i}\big]$, then $\tau^{(j)} \leftarrow \tau^{(j)}$.

• Case 2: If $g(X^{\pi(j)}_{t_i}) \ge \hat{\mathbb{E}}\big[g(X^\pi_\tau) \,\big|\, X^\pi_{t_i} = X^{\pi(j)}_{t_i}\big]$, then $\tau^{(j)} \leftarrow t_i$.

This provides an estimate of the optimal stopping policy along the different paths. Then, one computes
\[
Y^\pi_0 := \frac{1}{M} \sum_{j=1}^M g\big(X^{\pi(j)}_{\tau^{(j)}}\big).
\]

Because $\tau$ is suboptimal, one expects that $Y^\pi_0 \le Y^{<}_0$.

Quality test: $Y^\pi_0$ should be below, but close to, $Y^{\pi(1)}_0$. A minimal sketch of both estimators is given below.
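A compact sketch of the quality test, with the two estimators side by side ($X^\pi$ a one-dimensional Brownian motion started at 1, a put payoff, no discounting; every choice is illustrative, and ls_cond_exp is taken from the sketch of Section 4):

    import numpy as np

    rng = np.random.default_rng(2)
    n, N = 20, 10_000
    dW = rng.normal(0.0, np.sqrt(1.0 / n), size=(N, n))
    X = 1.0 + np.concatenate([np.zeros((N, 1)), np.cumsum(dW, axis=1)], axis=1)
    g = lambda x: np.maximum(1.0 - x, 0.0)     # put payoff (illustrative)

    # (i) Backward induction with the max operator (upper-biased).
    Y = g(X[:, n])
    for i in range(n - 1, 0, -1):
        Y = np.maximum(g(X[:, i]), ls_cond_exp(X[:, i], Y))
    Y_up = max(g(1.0), Y.mean())               # value at time 0 (X_0 = 1 is deterministic)

    # (ii) Estimated stopping policy (lower-biased).
    tau = np.full(N, n)                        # exercise index, initialized at T
    for i in range(n - 1, 0, -1):
        cont = ls_cond_exp(X[:, i], g(X[np.arange(N), tau]))  # E[g(X_tau) | X_{t_i}]
        tau[g(X[:, i]) >= cont] = i            # Case 2: exercise at t_i
    Y_low = g(X[np.arange(N), tau]).mean()

    print("quality test:", Y_low, "should be below but close to", Y_up)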


5.2 Mean L2-discrete trajectory error

This was proposed by Bender & Steiner [3] in the context of the non-parametric approach. It consists in computing
\[
\frac{1}{M} \sum_{j=1}^M \max_{k \le n} \Big| \sum_{i=0}^{k-1} \Big( Y^{\pi(j)}_{t_{i+1}} - Y^{\pi(j)}_{t_i} - \frac{1}{n}\, f\big(X^{\pi(j)}_{t_i}, Y^{\pi(j)}_{t_i}, Z^{\pi(j)}_{t_i}\big) - Z^{\pi(j)}_{t_i}\big(W^{(j)}_{t_{i+1}} - W^{(j)}_{t_i}\big) \Big) \Big|^2.
\]
By the law of large numbers, it converges to
\[
\mathrm{Err}^2_\psi := \max_{k \le n}\, \mathbb{E}\Big[ \Big| \sum_{i=0}^{k-1} \Big( Y^\pi_{t_{i+1}} - Y^\pi_{t_i} - \frac{1}{n}\, f\big(X^\pi_{t_i}, Y^\pi_{t_i}, Z^\pi_{t_i}\big) - Z^\pi_{t_i}\big(W_{t_{i+1}} - W_{t_i}\big) \Big) \Big|^2 \Big].
\]

If the above is computed with the true coefficients corresponding to the projection on the basis $(\psi_{t_i,m})_{m \le M}$, then $\mathrm{Err}_\psi$ provides, up to an additional $O(|\pi|)$ term, an upper bound on the error due to replacing the true conditional expectations by the corresponding projections.
In practice, one estimates the coefficients with a first batch of simulations, and then uses independent simulations to compute the above error criterion. A minimal sketch follows.
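A sketch of the criterion as a function of precomputed arrays (all names are illustrative; the driver is taken independent of $Z$ for brevity, and the signs follow the display above):

    import numpy as np

    def bs_criterion(Ypath, Zpath, Xpath, dWpath, f, n):
        """Ypath: (M, n+1) estimated Y along M independent test paths;
        Zpath, dWpath, Xpath[:, :-1]: (M, n).  Returns the empirical Err_psi**2."""
        incr = (Ypath[:, 1:] - Ypath[:, :-1]
                - f(Xpath[:, :-1], Ypath[:, :-1]) / n   # driver term
                - Zpath * dWpath)                       # martingale term
        partial = np.cumsum(incr, axis=1)               # inner sums over i < k
        return np.mean(np.max(partial ** 2, axis=1))    # mean over paths of the max over k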


6 More...

One can also use more “deterministic” techniques:

• The quantization approach: Bally, Pagès & Printems [2], Bronstein, Pagès & Portès [13].

• The cubature approach: Crisan & Manolarakis [22].


References

[1] V. Bally and G. Pagès. Error analysis of the optimal quantization algorithm for obstacle problems. Stochastic Processes and their Applications, 106(1):1–40, 2003.

[2] V. Bally, G. Pagès, and J. Printems. A quantization method for pricing and hedging multi-dimensional American style options. Mathematical Finance, 15(1):119–168, 2005.

[3] C. Bender and J. Steiner. Error criteria for numerical solutions of backward SDEs. Preprint, 42, 2010.

[4] C. Bender and J. Zhang. Time discretization and Markovian iteration for coupled FBSDEs. The Annals of Applied Probability, 18(1):143–177, 2008.

[5] B. Bouchard. Lecture notes on BSDEs: main existence and stability results. HAL preprint hal-01158912, 2015.

[6] B. Bouchard, I. Ekeland, and N. Touzi. On the Malliavin approach to Monte Carlo approximation of conditional expectations. Finance and Stochastics, 8(1):45–71, 2004.

[7] B. Bouchard and R. Elie. Discrete-time approximation of decoupled forward–backward SDEs with jumps. Stochastic Processes and their Applications, 118(1):53–75, 2008.

[8] B. Bouchard, R. Elie, and N. Touzi. Discrete-time approximation of BSDEs and probabilistic schemes for fully nonlinear PDEs. Advanced Financial Modelling, Radon Ser. Comput. Appl. Math., 8:91–124, 2009.

[9] B. Bouchard, S. Geiss, and E. Gobet. First time to exit of a continuous Itô process: general moment estimates and L1-convergence rate for discrete time approximations. arXiv preprint arXiv:1307.4247, 2013.

[10] B. Bouchard and S. Menozzi. Strong approximations of BSDEs in a domain. Bernoulli, 15(4):1117–1147, 2009.

[11] B. Bouchard and N. Touzi. Discrete-time approximation and Monte-Carlo simulation of backward stochastic differential equations. Stochastic Processes and their Applications, 111(2):175–206, 2004.

[12] B. Bouchard and X. Warin. Monte-Carlo valuation of American options: facts and new algorithms to improve existing methods. In Numerical Methods in Finance, pages 215–255. Springer, 2012.


[13] A. L. Bronstein, G. Pagès, and J. Portès. Multi-asset American options and parallel quantization. Methodology and Computing in Applied Probability, 15(3):547–561, 2013.

[14] J. F. Carrière. Valuation of the early-exercise price for options using simulations and nonparametric regression. Insurance: Mathematics and Economics, 19(1):19–30, 1996.

[15] J.-F. Chassagneux. A discrete-time approximation for doubly reflected BSDEs. Advances in Applied Probability, 41(1):101–130, 2009.

[16] J.-F. Chassagneux. Linear multistep schemes for BSDEs. SIAM Journal on Numerical Analysis, 52(6):2815–2836, 2014.

[17] J.-F. Chassagneux and D. Crisan. Runge–Kutta schemes for backward stochastic differential equations. The Annals of Applied Probability, 24(2):679–720, 2014.

[18] J.-F. Chassagneux, R. Elie, and I. Kharroubi. Discrete-time approximation of multidimensional BSDEs with oblique reflections. The Annals of Applied Probability, 22(3):971–1007, 2012.

[19] J.-F. Chassagneux and A. Richou. Numerical simulation of quadratic BSDEs. The Annals of Applied Probability, 26(1):262–304, 2016.

[20] E. Clément, D. Lamberton, and P. Protter. An analysis of a least squares regression method for American option pricing. Finance and Stochastics, 6(4):449–471, 2002.

[21] M. G. Crandall, H. Ishii, and P.-L. Lions. User's guide to viscosity solutions of second order partial differential equations. Bulletin of the American Mathematical Society, 27(1):1–67, 1992.

[22] D. Crisan and K. Manolarakis. Second order discretization of backward SDEs and simulation with the cubature method. The Annals of Applied Probability, 24(2):652–678, 2014.

[23] D. Crisan, K. Manolarakis, and N. Touzi. On the Monte Carlo simulation of BSDEs: an improvement on the Malliavin weights. Stochastic Processes and their Applications, 120(7):1133–1158, 2010.

[24] F. Delarue and S. Menozzi. A forward-backward algorithm for quasi-linear PDEs. The Annals of Applied Probability, 16(1):140–184, 2006.


[25] N. El Karoui, S. Hamadène, and A. Matoussi. Backward stochastic differential equations and applications. In Indifference Pricing: Theory and Applications, pages 267–320, 2008.

[26] N. El Karoui and L. Mazliak. Backward Stochastic Differential Equations, volume 364. CRC Press, 1997.

[27] A. Fahim, N. Touzi, and X. Warin. A probabilistic numerical method for fully nonlinear parabolic PDEs. The Annals of Applied Probability, pages 1322–1364, 2011.

[28] C. Geiss, S. Geiss, and E. Gobet. Generalized fractional smoothness and Lp-variation of BSDEs with non-Lipschitz terminal condition. Stochastic Processes and their Applications, 122(5):2078–2116, 2012.

[29] E. Gobet and C. Labart. Error expansion for the discretization of backward stochastic differential equations. Stochastic Processes and their Applications, 117(7):803–829, 2007.

[30] E. Gobet, J. López-Salas, P. Turkedjiev, and C. Vázquez. Stratified regression Monte-Carlo scheme for semilinear PDEs and BSDEs with large scale parallelization on GPUs. Preprint, 2015.

[31] E. Gobet and A. Makhlouf. Time regularity of BSDEs with irregular terminal functions. Stochastic Processes and their Applications, 120(7):1105–1132, 2010.

[32] E. Gobet and P. Turkedjiev. Linear regression MDP scheme for discrete backward stochastic differential equations under general conditions. Mathematics of Computation, 85(299):1359–1391, 2016.

[33] J.-P. Lemor, E. Gobet, and X. Warin. Rate of convergence of an empirical regression method for solving generalized backward stochastic differential equations. Bernoulli, 12(5):889–916, 2006.

[34] F. A. Longstaff and E. S. Schwartz. Valuing American options by simulation: a simple least-squares approach. Review of Financial Studies, 14(1):113–147, 2001.

[35] J. Ma and J. Zhang. Path regularity for solutions of backward stochastic differential equations. Probability Theory and Related Fields, 122(2):163–190, 2002.

[36] J. Ma and J. Zhang. Representations and regularities for solutions to BSDEs with reflections. Stochastic Processes and their Applications, 115(4):539–569, 2005.


[37] E. Pardoux, F. Pradeilles, and Z. Rao. Probabilistic interpretation of a system of semi-linear parabolic partial differential equations. Annales de l'IHP Probabilités et Statistiques, 33:467–490, 1997.

[38] S. Peng. Backward stochastic differential equation, nonlinear expectation and their applications. In Proceedings of the International Congress of Mathematicians, volume 1, pages 393–432, 2011.

[39] J. Zhang. A numerical scheme for BSDEs. The Annals of Applied Probability, 14(1):459–488, 2004.
