CONDITIONED STOCHASTIC DIFFERENTIAL EQUATIONS: THEORY AND APPLICATIONS

FABRICE BAUDOIN

Laboratoire de Probabilités et Modèles aléatoires, Université Paris 6 et Paris 7
Laboratoire de Finance-Assurance, CREST

[email protected]

Abstract. We generalize the notion of brownian bridge. More precisely, we study a standard brownian motion for which a certain functional is conditioned to follow a given law. Such processes appear as weak solutions of stochastic differential equations which we call conditioned stochastic differential equations. The link with the theory of initial enlargement of filtration is made and, after a general presentation, several examples are studied: the conditioning of a standard brownian motion by its value at a given date, the conditioning of a geometric brownian motion with negative drift by its quadratic variation, and finally the conditioning of a standard brownian motion by its first hitting time of a given level. The conditioned stochastic differential equation associated with the quadratic variation of the geometric brownian motion allows us to give a new proof of the extension of the Matsumoto-Yor ⟨X⟩/X theorem. Moreover, we show that the set of all the bridges over a given diffusion Z can be parametrized by a generalized Burger's equation whose solutions are related by the Hopf-Cole transformation to the positive space-time harmonic functions of Z. As a consequence of this, we deduce that the set of diffusions which have the same bridges as Z is parametrized by the positive eigenfunctions of the generator of Z.

Key words and phrases. Brownian bridge, Stochastic differential equation, Initial enlargement of filtrations, Filtering, Burger's equation, Matsumoto-Yor ⟨X⟩/X property.

Mathematics Subject Classification. 60H10, 60J65, 60J60.


1. Introduction

In this paper, we present a natural generalization of the brownian bridges. More precisely, given
1. a horizon time T ∈ ]0,+∞],
2. a functional Y on the Wiener space, measurable with respect to the events which can occur before time T and valued in some Polish space S,
3. a probability measure ν on B(S),
we construct a probability P^ν on the Wiener space which satisfies:
1. P^ν and P coincide on the events independent of Y;
2. the law of Y under P^ν is precisely ν.
The dynamics of the coordinate process under P^ν define a stochastic differential equation which we call a conditioned stochastic differential equation (CSDE for short). Under some pathwise uniqueness conditions, this CSDE hence defines, on a general filtered probability space, a process whose law is P^ν.

The present paper is organized as follows.
In Section 2, we show how to construct the probability P^ν and we give explicitly the associated CSDE. Then we show the link between the classical theory of initial enlargement of filtration (see [3], [15], [17], [18] and [19]) and the theory of CSDEs. First, the decomposition of X in its natural filtration enlarged by Y does not depend on the probability ν, and we can then recover the CSDE satisfied by X from this decomposition by a simple filtering formula. In financial applications this link is very interesting because several authors (see [3] and [30]) have considered securities markets in which an insider possesses from the beginning extra information about the outcome of some variable (this is a higher information level). Moreover, we show how our probability P^ν is related to the martingale preserving measure considered in [3]. To conclude the section, we show how Malliavin calculus and Fourier analysis can be used in the computations when S = R^N and the functional Y is differentiable in Malliavin's sense.

In Section 3, we show that the set of probabilities associated with the conditioning of the terminal value of a markovian diffusion can be parametrized by a generalized Burger's equation whose solutions are related to the positive space-time harmonic functions by the Hopf-Cole transformation (see [11] and [16]). From this, we deduce a set of diffusions which have the same bridges as the given diffusion. This generalizes [8].

In Section 4, we provide several examples. First, we consider the important case where the functional Y is the value of the process at a given date. In this case the CSDE obtained is nothing else but the well-known Doob h-transform of a brownian motion, seen as a stochastic differential equation. This equation is studied and many particular examples are given. As a second example, we study the case where the conditioned functional is the quadratic variation of a geometric brownian motion with negative drift. In this case, the set of processes obtained is nothing else but the set of processes X considered in [4], for which the process ⟨X⟩/X is a diffusion in its own filtration. This hence gives a new look at this property, completely different from the Matsumoto-Yor approach [4], [23], [25]. We give a last example, directly deduced from the previous one by a Lamperti representation, which is the conditioning by the first hitting time of a level.


2. Conditioned stochastic differential equation locally equivalent to a brownian motion

2.1. The setting. We work on the Wiener space

W = (C_∞, (F_t)_{t≥0}, (B_t)_{t≥0}, P)

where:
1. C_∞ is the space of continuous functions R_+ → R (for T > 0, C_T will denote the space of continuous functions [0, T] → R);
2. (B_t)_{t≥0} is the coordinate process, defined by B_t(f) = f(t);
3. (F_t)_{t≥0} is the natural filtration of (B_t)_{t≥0};
4. P is the Wiener measure.

Definition 2.1. A conditioning on the Wiener space is a triplet (T, Y, ν) with:
1. T ∈ ]0,+∞] a time horizon; it corresponds to the period of time on which the conditioning is made;
2. Y an F_T-measurable random variable valued in a Polish space S endowed with its Borel σ-algebra B(S); it corresponds to the functional of the trajectories being conditioned;
3. ν a probability measure on B(S); it corresponds to the conditioning.

We will write simply P for P/F_T, as the following study only involves the time interval [0, T], and we shall make the following assumptions:

• (A1) For 0 ≤ t < T, on the probability space C_T × S endowed with the filtration (F_t ⊗ B(S))_{0≤t<T}, the law of ((B_u)_{u≤t}, Y) under P is absolutely continuous with respect to the law

P/F_t ⊗ P_Y,

where P_Y is the law of Y under P. The Radon-Nikodym density will be denoted by

η_t^y , 0 ≤ t < T , y ∈ S.

• (A2)

Supp ν ⊂ Supp P_Y

and

L^1(S, P_Y) ⊂ L^1(S, ν).

2.2. Minimal probability. We first note the following immediate consequence of the existence of a regular conditional probability given Y.

Proposition 2.1. On F_T there exists a unique probability measure P^ν such that:
1. if X : (C_T, F_T) → (S, B(S)) is a bounded random variable, then E^ν(X | Y) = E(X | Y);
2. the law of Y under P^ν is ν.


P^ν is given by the following disintegration formula: for A ∈ F_T,

P^ν(A) = ∫_S P(A | Y = y) ν(dy).

The following remarks are worth recording:

Remark 2.1.
1. P^ν = P ⇔ ν = P_Y.
2. The map ν → P^ν is continuous for the weak topology.
3. If A ∈ F_T is P-independent of Y, then it is also P^ν-independent of Y.
4. Supp P^ν ⊂ Supp P and L^1(P) ⊂ L^1(P^ν).

Definition 2.2. We call P^ν the minimal probability associated with the conditioning (T, Y, ν).

Let E denote the set of probability measures Q on F_T, absolutely continuous with respect to P, such that:

1. E((dQ/dP)²) < +∞;
2. the law of Y under Q is ν.

The terminology "minimal" for P^ν comes from the two following minimizing properties.

Proposition 2.2. Assume that ν ≪ P_Y and

∫_S (dν/dP_Y)² dP_Y < +∞.

Then:
1. P^ν is the minimal variance probability, i.e.

inf_{Q∈E} E((dQ/dP)²) = E((dP^ν/dP)²);

2. P^ν is the minimal relative entropy probability, i.e.

inf_{Q∈E} H(Q | P) = H(P^ν | P),

where H(Q | P) is the relative entropy of Q with respect to P, defined by

H(Q | P) := E((dQ/dP) ln(dQ/dP)).

Proof.
1. It suffices to make an orthogonal decomposition in L²(F_T, P).

2. See [14] and [28], p. 166.

The following proposition shows the heredity of the negligible sets under the map ν → P^ν, which is trivially extended to general measures.

Proposition 2.3. Let

ν = ν_a + ν_s , with ν_a ≪ P_Y and ν_s ⊥ P_Y,

be the Lebesgue decomposition of ν. Then

dP^ν = (dν_a/dP_Y)(Y) dP + dP^{ν_s}

is the Lebesgue decomposition of P^ν with respect to P. In particular,

P^ν ∼ P ⇔ ν ∼ P_Y.

Our aim now is to develop stochastic calculus under the minimal probability P^ν. To do this, the first step is to exhibit the martingale density process of P^ν with respect to P.

Proposition 2.4. For t < T, P^ν/F_t is absolutely continuous with respect to P/F_t and

dP^ν/F_t = (∫_S η_t^y ν(dy)) dP/F_t.

Proof. Let t < T and let X be a bounded F_t-measurable random variable. Then

E^ν(X) = ∫_S E(X | Y = y) ν(dy),

but, from (A1), for P_Y-a.s. y ∈ S,

E(X | Y = y) = E(η_t^y X).

Now, because of (A2), we can apply Fubini's theorem to get the expected result:

E^ν(X) = E((∫_S η_t^y ν(dy)) X).

Example 2.1. Let us consider the conditioning (T, Z_T, ν), where Z is a markovian diffusion started at 0. In this case the martingale density process of the minimal probability associated with this conditioning is given by

∫_{-∞}^{+∞} (P_{T-t}(Z_t, dy)/P_T(0, dy)) ν(dy) , t < T,

where P_t(z, dy) is the semigroup of Z.
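For instance, when Z = B is a standard brownian motion, the Gaussian transition kernels make this ratio explicit; it is the same expression as the density η_t^y computed later in the proof of Theorem 4.1:

\[
\int_{-\infty}^{+\infty}\frac{P_{T-t}(B_t,dy)}{P_T(0,dy)}\,\nu(dy)
=\int_{-\infty}^{+\infty}\sqrt{\frac{T}{T-t}}\,
\exp\!\left[\frac{y^{2}}{2T}-\frac{(y-B_t)^{2}}{2(T-t)}\right]\nu(dy),
\qquad t<T .
\]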

Proposition 2.5. The martingale density process

D_t = ∫_S η_t^y ν(dy) , t < T,

is not in general uniformly integrable, but it has the following P-a.s. limit:

lim_{t→T} ∫_S η_t^y ν(dy) = (dν_a/dP_Y)(Y).


In order to make explicit the semimartingale decomposition of B under P^ν by Girsanov's theorem, we need the following well-known lemma from the theory of initial enlargement of a brownian filtration.

Lemma 2.6 (see [3], [15], [17]). There exists a jointly measurable process

[0, T] × C_T × S → R , (t, ω, y) → α_t^y(ω),

such that:
1. for P_Y-a.s. y ∈ S, the process (α_t^y)_{0≤t≤T} is F-predictable;
2. for P_Y-a.s. y ∈ S and for 0 ≤ t < T,

P(∫_0^t (α_s^y)² ds < +∞) = 1;

3. for P_Y-a.s. y ∈ S and for 0 ≤ t < T,

⟨η^y, B⟩_t = ∫_0^t α_s^y η_s^y ds.

Remark 2.2. We can choose α such that for P_Y-a.s. y ∈ S and for 0 ≤ t < T,

η_t^y = exp[∫_0^t α_s^y dB_s − (1/2) ∫_0^t (α_s^y)² ds] on {η_t^y > 0};

see [15].

With this lemma, we can now state

Theorem 2.7. (B_t)_{0≤t<T} is an (F_t, P^ν) semimartingale whose decomposition is given by

dB_t = (∫_S α_t^y η_t^y ν(dy) / ∫_S η_t^y ν(dy)) dt + dW_t , t < T,    (2.1)

where (W_t)_{0≤t<T} is a standard (F_t, P^ν) brownian motion.

Proof. The process

D_t = ∫_S η_t^y ν(dy) , 0 ≤ t < T,

is the density process of P^ν with respect to P. By Lemma 2.6 and Fubini's theorem, we have

d⟨D, B⟩_t = (∫_S α_t^y η_t^y ν(dy)) dt.

The result is then a consequence of Girsanov's theorem.

As the process (∫_S α_t^y η_t^y ν(dy) / ∫_S η_t^y ν(dy))_{0≤t<T} is F-adapted, there exists a predictable function F such that for all t < T,

∫_S α_t^y η_t^y ν(dy) / ∫_S η_t^y ν(dy) = F(t, (B_s)_{s≤t}).


Definition 2.3. Let (Ω, (H_t)_{0≤t<T}, Q) be a filtered probability space on which a standard brownian motion (β_t)_{0≤t<T} is defined. The stochastic differential equation

X_t = ∫_0^t F(s, (X_u)_{u≤s}) ds + β_t , t < T,    (2.2)

will be called the conditioned stochastic differential equation (CSDE for short) associated with the conditioning (T, Y, ν).

Remark 2.3.
1. By construction, the stochastic differential equation (2.2) always has the weak solution (B, W), defined on the filtered probability space (C_T, (F_t)_{0≤t<T}, P^ν).
2. The previous definition is not rigid. We mean that if the conditioned functional Y can be simply expressed through an F-adapted semimartingale (Z_t)_{0≤t≤T}, then it will be more convenient to work with the semimartingale decomposition of Z in the filtered probability space (C_T, (F_t)_{0≤t<T}, P^ν). And if this decomposition can be seen as a stochastic differential equation for Z, we will again speak of a CSDE (see the examples).

It is easy to prove, by the usual techniques of ordinary differential equations (Gronwall's lemma), a convenient criterion which ensures that pathwise uniqueness holds for (2.2); thanks to the Yamada-Watanabe theorem (see [32], p. 368), this leads to

Theorem 2.8. Assume that F is such that for all τ ∈ [0, T[ and all a > 0 there exists a constant M > 0 such that for all t ∈ [0, τ] and all f, g ∈ C_τ with sup_{[0,τ]} |f| ≤ a and sup_{[0,τ]} |g| ≤ a,

|F(t, (f(u))_{u≤t}) − F(t, (g(u))_{u≤t})| ≤ M sup_{[0,t]} |f − g|.

Then the stochastic differential equation (2.2) enjoys the pathwise uniqueness property. Hence (2.2) has a unique strong solution associated with the initial condition X_0 = 0, and the law of (X_t)_{0≤t<T} is the minimal probability associated with the conditioning (T, Y, ν).
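As an illustration only, here is a minimal Euler-type simulation sketch for an equation of the form (2.2). The function names simulate_csde and drift_F are ours (not the paper's); drift_F is a user-supplied path functional playing the role of F, and no claim is made beyond a rough numerical sketch.

import numpy as np

def simulate_csde(drift_F, T, n_steps, rng=None):
    """Euler scheme for X_t = int_0^t F(s, (X_u)_{u<=s}) ds + beta_t on [0, T).

    drift_F(t, past) should return F(t, (X_u)_{u<=t}) from the discretized past path.
    Purely illustrative sketch of equation (2.2), not the paper's construction.
    """
    rng = rng or np.random.default_rng()
    dt = T / n_steps
    X = np.zeros(n_steps)                     # X_0 = 0, as in Theorem 2.8
    for k in range(n_steps - 1):
        drift = drift_F(k * dt, X[: k + 1])
        X[k + 1] = X[k] + drift * dt + np.sqrt(dt) * rng.standard_normal()
    return X

# Example of use, with a drift depending only on the current value,
# e.g. F(t, x) = alpha * tanh(alpha * x) as in equation (4.9) below:
alpha = 1.0
path = simulate_csde(lambda t, past: alpha * np.tanh(alpha * past[-1]), T=1.0, n_steps=1000)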

2.3. Initial enlargement of the natural filtration. As a consequence of the previous paragraph, we can prove the well-known theorem of Jacod (see for example [15], [17], [18], [19]) about the initial enlargement of a brownian filtration.

Theorem 2.9. The process (M_t)_{0≤t<T} defined by

dM_t = −α_t^Y dt + dB_t , 0 ≤ t < T,    (2.3)

is a standard brownian motion, not only under P but also under P^ν, in the enlarged filtration

G_t = F_t ∨ σ(Y) , 0 ≤ t ≤ T.

Proof. We first show that (M_t)_{0≤t<T} is a (G_t, P) standard brownian motion. As

d⟨M⟩_t = dt,

it is enough to show that (M_t)_{0≤t<T} is a (G_t, P) martingale, according to Paul Lévy's characterization of brownian motion.


For this, we note, as a consequence of the previous theorem, that (M_t)_{0≤t<T} is, for P_Y-a.s. y ∈ S, an (F_t, P^{δ_y}) martingale, where δ_y is the Dirac measure at y. So, for s < t < T, A ∈ F_s and Λ ∈ B(S), we have

E((M_t − M_s) 1_{A∩{Y∈Λ}}) = ∫_Λ E^{δ_y}((M_t − M_s) 1_A) P_Y(dy) = 0.

This shows that (M_t)_{0≤t<T} is a (G_t, P) standard brownian motion.

To see that (M_t)_{0≤t<T} is also a (G_t, P^ν) standard brownian motion, it suffices to note that P and P^ν are equal on the events independent of σ(Y).

Remark 2.4.
1. Formally, we recover the decomposition (2.3) from (2.1) with ν = δ_Y. This shows the analogy between Jacod's and Girsanov's theorems.
2. Assume that ν is equivalent to P_Y. Let Q be a probability on F_T such that Q ∼ P and such that the law of Y under Q is ν. Then P^ν is the unique probability on the orthogonal decomposition σ(M_s, s < T) ∨ σ(Y) such that

P^ν = P on σ(M_s, s < T)

and

P^ν = Q on σ(Y).

With the terminology of [3], P^ν is hence the martingale preserving measure associated with Q.
3. Roughly speaking, all the CSDEs associated with the same functional have the same bridges over this functional.

The relation between the decompositions (2.1) and (2.3) is given by the following interesting filtering formula, which is a consequence of the Bayes formula.

Proposition 2.10. For 0 ≤ t < T,

E^ν(α_t^Y | F_t) = ∫_S α_t^y η_t^y ν(dy) / ∫_S η_t^y ν(dy).

2.4. Computations with Malliavin calculus and Fourier analysis. In this paragraph, we give a tool to obtain explicit computations in some special cases.

We assume that S = R^N, with N ∈ N*, and that the functional Y admits a Malliavin differential D in the following strong sense:

1. Y : C_T → R^N has a directional derivative in all directions γ of the form

γ(t) = ∫_0^t g(s) ds with g ∈ L²([0, T]),

in the sense that

D_γ Y(ω) := lim_{ε→0} (Y(ω + εγ) − Y(ω))/ε

exists in L²(C_T, P).
2. There exists ψ ∈ L²([0, T] × C_T) such that

D_γ Y(ω) = ∫_0^T ψ(t, ω) g(t) dt.

We set

D_t Y(ω) := ψ(t, ω).

For further details on Malliavin calculus we refer to [10], [22] and [29].

Proposition 2.11. Assume that:
1. ν ∼ P_Y;
2. ξ := dν/dP_Y admits a continuously differentiable version.
Then, for t < T,

∫_{R^N} α_t^y η_t^y ν(dy) / ∫_{R^N} η_t^y ν(dy) = E^ν((∇ξ/ξ)(Y) · D_t Y | F_t).

Proof. Under these assumptions, we have

dP^ν = ξ(Y) dP,

and so for t ≤ T,

dP^ν/F_t = E(ξ(Y) | F_t) dP/F_t.

Now, from the Clark-Ocone formula (see [10]),

E(ξ(Y) | F_t) = 1 + ∫_0^t E(D_s ξ(Y) | F_s) dB_s = 1 + ∫_0^t E(∇ξ(Y) · D_s Y | F_s) dB_s.

Hence

∫_{R^N} α_t^y η_t^y ν(dy) / ∫_{R^N} η_t^y ν(dy) = E(∇ξ(Y) · D_t Y | F_t) / E(ξ(Y) | F_t),

and the Bayes formula gives the expected result.

Proposition 2.12. For P_Y-a.s. y ∈ R^N and for all t < T,

η_t^y = ∫_{R^N} e^{−iy·λ} E(e^{iλ·Y} | F_t) dλ    (2.4)

and

α_t^Y = i ∫_{R^N} e^{−iy·Y} E((D_t Y · y) e^{iy·Y} | F_t) dy / ∫_{R^N} e^{−iy·Y} E(e^{iy·Y} | F_t) dy.

Proof. Let m be a signed measure on R^N such that

∫_{R^N} |m|(dy) < +∞,

and let m̂ be the Fourier transform of m, defined on R^N by

m̂(y) = ∫_{R^N} e^{iy·λ} m(dλ).

We have, for t < T,

∫_{R^N} η_t^y m̂(y) P_Y(dy) = E(m̂(Y) | F_t).

But

E(m̂(Y) | F_t) = ∫_{R^N} E(e^{iY·λ} | F_t) m(dλ),

hence

∫_{R^N} η_t^y m̂(y) P_Y(dy) = ∫_{R^N} E(e^{iY·λ} | F_t) m(dλ).

As the previous equality holds for all m, this implies that for P_Y-a.s. y ∈ R^N and for all t < T,

η_t^y = ∫_{R^N} e^{−iy·λ} E(e^{iλ·Y} | F_t) dλ.

The formula for α^Y is a consequence of the Clark-Ocone formula, applied as in the proof of the previous proposition.

Remark 2.5. Of course, formula (2.4) remains true even if Y is not differentiable in Malliavin's sense. This formula will be very interesting in financial applications.

3. Bridges over a diffusion and generalized Burger’s equation

Let T > 0 be a finite time horizon.
Our aim in this paragraph is to show that the set of the minimal probabilities associated with the conditionings (T, Z_T, ν), where Z is a diffusion, can be parametrized by a non-linear partial differential equation whose solutions are related to the space-time harmonic functions of Z by a Hopf-Cole transformation (see [11] and [16]).

Let us consider on the Wiener space the diffusion

dZ_t = b(Z_t) dt + σ(Z_t) dB_t , t ≤ T,    (3.1)

where b and σ are C^∞ functions such that:
1. for x ∈ R, the stochastic differential equation (3.1) has a strong solution associated with the initial condition Z_0 = x;
2. the semigroup P_t(x, dy) of (3.1) has smooth densities.
Such conditions (Hörmander) are well known.

Let ν be a probability measure on R. We assume that the function

ϕ(t, x) = ∫_{-∞}^{+∞} (P_{T-t}(x, dy)/P_T(0, dy)) ν(dy) , t < T, x ∈ R,

is well defined and C^∞.

Proposition 3.1. Let Z be the solution of (3.1) associated with the initial condition Z_0 = 0. The CSDE associated with the conditioning (T, Z_T, ν) can be written

dX_t = [b(X_t) + σ²(X_t) φ(t, X_t)] dt + σ(X_t) dβ_t , t < T,    (3.2)

where φ : [0, T[ × R → R is a solution of the partial differential equation

∂φ/∂t + ∂/∂x (bφ) + (1/2) ∂/∂x (σ²φ²) + (1/2) ∂/∂x (σ² ∂φ/∂x) = 0.    (3.3)

Proof. Here, the density process of P^ν with respect to P is given by

ϕ(t, Z_t).

Since ϕ is a C^∞ space-time harmonic function of Z,

φ = ∂/∂x ln ϕ

is a solution of (3.3), and we conclude by Girsanov's theorem.

Remark 3.1. The solutions of (3.3) are related to the positive space-time harmonic functions of (3.1) by the Hopf-Cole transformation

φ = ∂/∂x ln ϕ.
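For completeness, the computation behind this transformation is short. If ϕ is a positive space-time harmonic function of (3.1), then with u = ln ϕ,

\[
\partial_t u + b\,\partial_x u + \tfrac{\sigma^2}{2}\left(\partial_{xx} u + (\partial_x u)^2\right) = 0 ,
\]

and differentiating this identity with respect to x, with φ = ∂_x u = ∂_x ln ϕ, gives

\[
\partial_t \phi + \partial_x (b\,\phi) + \tfrac12\,\partial_x\!\left(\sigma^2 \phi^2\right) + \tfrac12\,\partial_x\!\left(\sigma^2 \partial_x \phi\right) = 0 ,
\]

which is exactly (3.3).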

As a first corollary, we deduce

Corollary 3.2. Let Z be the solution of (3.1) associated with the initial condition Z_0 = 0. The decomposition of Z in the filtration enlarged by Z_T is given by

dZ_t = [b(Z_t) + σ²(Z_t) φ(t, Z_t, Z_T)] dt + σ(Z_t) dγ_t , t < T,

where, for all y ∈ R, φ(·, ·, y) : [0, T[ × R → R is a solution of (3.3).

As a second corollary, we deduce the following generalization of [8].

Corollary 3.3. Let h be a strictly positive C^∞ function such that

∫_{-∞}^{+∞} h(x) P_T(0, dx) < +∞ , ∫_0^T ∫_{-∞}^{+∞} σ²(x) h'(x)² P_t(0, dx) dt < +∞

and

Lh = µh

for some µ ∈ R (L is the generator of Z). Let us now consider the diffusion

dZ_t^h = (b(Z_t^h) + σ²(Z_t^h) (h'/h)(Z_t^h)) dt + σ(Z_t^h) dB_t,

for which we assume there exists a unique solution (Z_t^h)_{0≤t≤T} associated with the initial condition Z_0^h = 0. Then Z and Z^h have the same bridges, i.e. the law of Z and the law of Z^h coincide on the events independent of B_T.

Conversely, if the drift in (3.2) is homogeneous, then there exists a strictly positive C^∞ function h such that

Lh = µh

with µ ∈ R and

φ(t, x) = (h'/h)(x).

Proof. Let h be a function which satisfies the assumptions of the corollary. Consider the process

D_t = h(Z_t) e^{−µt}.

From Itô's formula and the integrability assumptions above, D is a true martingale. From Girsanov's theorem, αD is then the density process of Z^h with respect to Z, where α is a normalization constant.

Conversely, assume that φ given by (3.2) is homogeneous. Then

∂²/∂t∂x ln ϕ = 0,


hence there exist strictly positive C^∞ functions f and h such that

ϕ(t, x) = f(t) h(x).

This implies, for f and h,

f'(t) h(x) + b(x) f(t) h'(x) + (1/2) σ(x)² f(t) h''(x) = 0,

and hence

−(f'/f)(t) = (Lh/h)(x) = µ

with µ ∈ R.

Example 3.1.

1. For the brownian motion, we get the class of diffusions with generator

L = (1/2) d²/dx² + α tanh(αx + β) d/dx.

See [8].
2. For the Bessel process with index −µ, we get a class of diffusions with generator

L = (1/2) d²/dx² + ((µ + 1/2)/x − α K_{1+µ}(αx)/K_µ(αx)) d/dx

with α > 0, known (see [31], [33] and [34]) as a generalized Bessel process with ↓ drift. K_µ denotes the Macdonald function with index µ.
3. For the Bessel process with index +µ, we get a class of diffusions with generator

L = (1/2) d²/dx² + ((µ + 1/2)/x + α I_{1+µ}(αx)/I_µ(αx)) d/dx

with α > 0, known (see [31], [33] and [34]) as a generalized Bessel process with ↑ drift. I_µ denotes the modified Bessel function with index µ.
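A small numerical sketch (ours, not the paper's) of the drift coefficients appearing in items 2 and 3, using SciPy's Bessel functions; the helper names drift_bessel_down and drift_bessel_up are hypothetical.

import numpy as np
from scipy.special import iv, kv  # modified Bessel function I_mu and Macdonald function K_mu

def drift_bessel_down(x, mu, alpha):
    """Drift of the generalized Bessel process with "down" drift (Example 3.1, item 2)."""
    return (mu + 0.5) / x - alpha * kv(1.0 + mu, alpha * x) / kv(mu, alpha * x)

def drift_bessel_up(x, mu, alpha):
    """Drift of the generalized Bessel process with "up" drift (Example 3.1, item 3)."""
    return (mu + 0.5) / x + alpha * iv(1.0 + mu, alpha * x) / iv(mu, alpha * x)

x = np.linspace(0.1, 5.0, 50)
print(drift_bessel_down(x, mu=0.5, alpha=1.0)[:3])
print(drift_bessel_up(x, mu=0.5, alpha=1.0)[:3])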

Remark 3.2. The previous study can be generalized by taking for T a stopping time(see subsection 4.3.).

4. Examples

4.1. CSDE associated with the conditioning of a marginal law.
Let (Ω, (H_t)_{0≤t≤T}, Q) be a filtered probability space on which a standard brownian motion (β_t)_{0≤t≤T} is defined. Let us write the CSDE associated with the conditioning (T, B_T, ν). The conditioned functional is then the value of the process at the given date T < +∞.

We assume that the probability ν is such that

∫_{-∞}^{+∞} y² ν(dy) < +∞.    (4.1)


Theorem 4.1. The stochastic differential equation

dX_t = (∫_{-∞}^{+∞} ((y − X_t)/(T − t)) e^{y²/(2T) − (y−X_t)²/(2(T−t))} ν(dy) / ∫_{-∞}^{+∞} e^{y²/(2T) − (y−X_t)²/(2(T−t))} ν(dy)) dt + dβ_t , t < T,    (4.2)

has a unique strong solution (X_t)_{0≤t<T} associated with the initial condition

X_0 = 0,

and the law of (X_t)_{0≤t<T} is the minimal probability associated with the conditioning (T, B_T, ν).

Remark 4.1. We note here that X is gaussian if and only if ν is (see below).

Before we give the proof of this theorem, we present a number of interesting properties of the solution of (4.2).

Theorem 4.1 gives the convergence in law of (X_t)_{0≤t<T} when t → T; the first question is then to study the Q-a.s. convergence.

Proposition 4.2. For the solution (X_t)_{0≤t<T} of (4.2) we have:

1. For all g ∈ L²([0, T], R), the process (∫_0^t g(s) dX_s)_{0≤t<T} converges Q-a.s. and in L² when t → T to a random variable ∫_0^T g(s) dX_s such that

E(∫_0^T g(s) dX_s) = ((∫_{-∞}^{+∞} y ν(dy))/T) ∫_0^T g(s) ds

and

E((∫_0^T g(s) dX_s)²) = ∫_0^T g(u)² du + ((∫_{-∞}^{+∞} y² ν(dy) − T)/T²) (∫_0^T g(u) du)².

2. The law of X_T under Q is ν.

Furthermore, we have the following decomposition in the enlarged filtration, and a related non-canonical representation which is well known for a standard brownian motion (see [27] and [35]).

Proposition 4.3. For the solution (X_t)_{0≤t<T} of (4.2), the process

γ_t := X_t − ∫_0^t ((X_T − X_s)/(T − s)) ds , t < T,    (4.3)

is a standard brownian motion in the enlarged filtration X ∨ σ(X_T) (X is the natural filtration of X), and the following orthogonal decomposition takes place:

X_T = σ(γ_s, s < T) ∨ σ(X_T).    (4.4)

Moreover, the process

γ̃_t := X_t − ∫_0^t (X_s/s) ds , t < T,

is well defined and is a standard brownian motion in its own filtration, which is strictly included in X. This brownian motion is independent of X_T.


Remark 4.2.
1. If ν ≪ e^{−x²/(2T)} dx, with a density which admits a differentiable version ξ, then (4.2) can be written, after an integration by parts,

dX_t = (∫_{-∞}^{+∞} ξ'(y) e^{−(y−X_t)²/(2(T−t))} dy / ∫_{-∞}^{+∞} ξ(y) e^{−(y−X_t)²/(2(T−t))} dy) dt + dβ_t , t < T.

This is the classical Doob h-transform of a brownian motion, associated with the space-time harmonic function

h(t, B_t) = (1/√(T − t)) ∫_{-∞}^{+∞} ξ(y) e^{−(y−B_t)²/(2(T−t))} dy.

This last result could have been derived directly from Malliavin calculus.
2. For further details on the orthogonal decomposition (4.4) when ν = δ_y for some y ∈ R, we refer to [20].

Proof of Theorem 4.1. Here we have

η_t^y = √(T/(T − t)) exp[y²/(2T) − (y − B_t)²/(2(T − t))] , t < T , y ∈ R,

and

α_t^y = (y − B_t)/(T − t) , t < T , y ∈ R.

Hence, equation (4.2) corresponds to equation (2.2). The assumption (4.1) implies that the function

φ(t, x) = ∫_{-∞}^{+∞} ((y − x)/(T − t)) e^{y²/(2T) − (y−x)²/(2(T−t))} ν(dy) / ∫_{-∞}^{+∞} e^{y²/(2T) − (y−x)²/(2(T−t))} ν(dy) , t < T , x ∈ R,

is of class C¹ in the space variable, and hence locally Lipschitz. We are hence within the assumptions of Theorem 2.8.

Proof of Proposition 4.2. Let g ∈ L²([0, T], R) and y ∈ R. Under P^{δ_y}, (B_t)_{0≤t<T} is a solution of the linear equation

dB_t = ((y − B_t)/(T − t)) dt + dW_t^y , t < T,

where (W_t^y)_{0≤t<T} is a standard P^{δ_y} brownian motion. This implies that under P^{δ_y} the process (∫_0^t g(u) dB_u)_{0≤t<T} is gaussian.

Its mean is given by

m(t) = (y/T) ∫_0^t g(u) du , t < T,    (4.5)

and its variance by

σ²(t) = ∫_0^t g(u)² du − (1/T) (∫_0^t g(u) du)² , t < T.    (4.6)

The P^{δ_y}-a.s. convergence of the process (∫_0^t g(u) dB_u)_{0≤t<T} when t → T is hence easily checked.


As

P^ν = ∫_{-∞}^{+∞} P^{δ_y} ν(dy),

we deduce the a.s. convergence (and also the convergence in L²(P^ν)) of (∫_0^t g(u) dB_u)_{0≤t<T} under P^ν. The computation of the mean and of the second moment of the limit is a direct consequence of (4.5) and (4.6).

Proof of Proposition 4.3. The process γ defined by

γ_t := X_t − ∫_0^t ((X_T − X_s)/(T − s)) ds , t < T,

is a brownian motion by Jacod's theorem. As this decomposition implies

X_t = (t/T) X_T + (T − t) ∫_0^t dγ_s/(T − s) , t < T,    (4.7)

we have for t < T

X_t ⊂ σ(γ_s, s < t) ∨ σ(X_T).

But, on the other hand, we trivially have

σ(γ_s, s < t) ⊂ X_t ∨ σ(X_T).

This immediately implies

X_T = σ(γ_s, s < T) ∨ σ(X_T).

Let now (P_t)_{0≤t<T} be the standard brownian bridge from 0 to 0 defined by

P_t := (T − t) ∫_0^t dγ_s/(T − s) , t < T.

Let 0 < t < T. Decomposition (4.7) implies that for 0 < ε < t,

X_t − ∫_ε^t (X_s/s) ds = (X_T/T) ε + P_t − ∫_ε^t (P_s/s) ds.

Now, as the process P − ∫_ε^· (P_s/s) ds is gaussian, it is easy to conclude.

As seen before in Section 2, the drift of (4.2) is not as innocent (or complicated!) as it might seem: from an analytical point of view, the set of all the CSDEs associated with the conditioning (T, B_T) is parametrized by a non-linear partial differential equation well known in potential theory and fluid mechanics: the Burger's equation.

Proposition 4.4. Let φ : [0, T[ × R → R be the function defined by

φ(t, x) = ∫_{-∞}^{+∞} ((y − x)/(T − t)) e^{y²/(2T) − (y−x)²/(2(T−t))} ν(dy) / ∫_{-∞}^{+∞} e^{y²/(2T) − (y−x)²/(2(T−t))} ν(dy).

Then φ is a weak (strong if ∫_{-∞}^{+∞} |y|³ ν(dy) < +∞) solution of the Burger's equation

∂φ/∂t + (1/2) ∂²φ/∂x² + φ ∂φ/∂x = 0    (4.8)

and we have the limit condition

e^{∫_0^x φ(t,s) ds − x²/(2T)} dx converges weakly to C ν(dx) when t → T,

where C > 0 is a normalization constant. Moreover, if (X_t)_{0≤t<T} is the solution of (4.2), then

N_t = φ(t, X_t) , t < T,

is a Q local martingale.

Remark 4.3. The solutions of Burger's equation are related to the positive solutions of the heat equation by the Hopf-Cole transformation

φ = ∂/∂x ln ϕ.

As a first generalization, let us show how to quickly recover the well-known enlargement formula for the enlargement of the brownian filtration by a Wiener integral (see [2], [1], [9]). Let (X_t)_{0≤t<T} be the solution of (4.2); by the time change

Y_t = X_{∫_0^t f(s)² ds} , t ∈ R_+,

with f ∈ L²(R_+, dx) such that

∫_0^{+∞} f(s)² ds = T,

we immediately deduce from (4.2):

Proposition 4.5. For the conditioning (+∞, ∫_0^{+∞} f(s) dB_s, ν), for t ∈ R_+ and y ∈ R,

η_t^y = √(∫_0^{+∞} f(s)² ds / ∫_t^{+∞} f(s)² ds) exp[y²/(2 ∫_0^{+∞} f(s)² ds) − (y − ∫_0^t f(s) dB_s)²/(2 ∫_t^{+∞} f(s)² ds)]

and

α_t^Y = ∫_t^{+∞} f(s) dB_s / ∫_t^{+∞} f(s)² ds.

Remark 4.4. For the CSDE associated with the functional

Y = (∫_0^{+∞} f_i(s) dB_s)_{1≤i≤N},

where f_i ∈ L²(R_+, dx), the computations are easily made by the techniques developed in subsection 2.4, because

D_t Y = (f_i(t))_{1≤i≤N},

but, as the expressions are complicated, we refer the interested reader to [2] and [9]. In particular, Alili [2] uses different techniques, involving linear Volterra transforms of the brownian motion, to obtain the CSDE.


We now focus our attention on some examples.

Example 4.1.

1. (Gaussian bridge) Let us take ν(dx) = (1/(√(2π) s)) e^{−(x−m)²/(2s²)} dx with m ∈ R and 0 < s² ≤ T. Then equation (4.2) becomes

dX_t = (((s² − T) X_t + mT) / ((s² − T) t + T²)) dt + dβ_t.

The solution associated with the initial condition X_0 = 0 is

X_t = (m/T) t + ((s² − T) t + T²) ∫_0^t dβ_u / ((s² − T) u + T²),

and so (X_t)_{0≤t<T} is a gaussian process such that:
(a) X_t converges Q-a.s. when t → T to a random variable X_T such that X_T ∼ N(m, s²);
(b) E(X_t) = (m/T) t;
(c) E((X_u − (m/T) u)(X_v − (m/T) v)) = u ∧ v + ((s² − T)/T²) uv.

2. Let a > 0. For ν = (1/2) δ_{−a} + (1/2) δ_a, equation (4.2) becomes

dX_t = ((a tanh(aX_t/(T − t)) − X_t)/(T − t)) dt + dβ_t , t < T.

The solution (X_t)_{0≤t<T} associated with the initial condition X_0 = 0 converges Q-a.s. to a random variable X_T such that

Q(X_T = a) = Q(X_T = −a) = 1/2.

Furthermore, we can show the following property:

Q(X_T = a | X_t = x) = 1/(1 + e^{−2ax/(T−t)}).

3. Let α ≥ 0. With ν(dx) = (cosh(αx)/√(2πT)) e^{−x²/(2T) − α²T/2} dx, equation (4.2) becomes

dX_t = α tanh(αX_t) dt + dβ_t.    (4.9)

This equation has only one solution associated with the initial condition

X_0 = 0,

and for this solution we have

Q(X_t ∈ dx) = (cosh(αx)/√(2πt)) e^{−x²/(2t) − α²t/2} dx , t ≥ 0.

See [8] and Section 3.
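The following minimal sketch (ours, not the paper's; the function name simulate_two_point_bridge is hypothetical) simulates equation (4.2) for the two-point measure ν = (1/2) δ_{−a} + (1/2) δ_a of item 2, using the closed-form drift (a tanh(aX_t/(T − t)) − X_t)/(T − t) derived there.

import numpy as np

def simulate_two_point_bridge(a=1.0, T=1.0, n=2000, rng=None):
    """Euler scheme for dX = [a*tanh(a*X/(T-t)) - X]/(T-t) dt + dbeta, X_0 = 0.

    This is the CSDE (4.2) for nu = (1/2) delta_{-a} + (1/2) delta_{a}
    (Example 4.1, item 2); the scheme stops just before T, where the drift blows up.
    """
    rng = rng or np.random.default_rng()
    dt = T / n
    X = np.zeros(n)
    for k in range(n - 1):
        tau = T - k * dt                       # remaining time T - t
        drift = (a * np.tanh(a * X[k] / tau) - X[k]) / tau
        X[k + 1] = X[k] + drift * dt + np.sqrt(dt) * rng.standard_normal()
    return X

# The terminal value should be close to +a or -a with probability 1/2 each.
ends = np.array([simulate_two_point_bridge()[-1] for _ in range(200)])
print(np.mean(ends > 0.0))   # roughly 0.5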


4.2. CSDE associated with the quadratic variation of a geometric brownian motion.
Let (Ω, (H_t)_{t≥0}, Q) be a filtered probability space satisfying the usual conditions, on which a standard (H_t, Q) brownian motion (β_t)_{t≥0} is defined.

Let µ > 0 and x_0 > 0. It is well known (see [12]) that under the Wiener measure the functional

∫_0^{+∞} e^{2B_t − 2µt} dt

is distributed, up to a multiplicative constant, as the inverse of a gamma law. More precisely,

∫_0^{+∞} e^{2B_t − 2µt} dt ∼ 1/(2γ_µ).

The computations made in [4] allow us to obtain the CSDE associated with the conditioning (+∞, ∫_0^{+∞} e^{2B_t − 2µt} dt, ν). We assume that ν admits, with respect to the law of 1/(2γ_µ), a density ξ : R*_+ → R_+ which is almost surely bounded.

Theorem 4.6. The stochastic differential equation

dX_t = X_t [(µ + 1/2 − 2 ∫_0^{+∞} e^{−u} u^µ ξ(⟨X⟩_t + X_t²/(2u)) du / ∫_0^{+∞} e^{−u} u^{µ−1} ξ(⟨X⟩_t + X_t²/(2u)) du) dt + dβ_t] , t ≥ 0,    (4.10)

has a unique strong solution (X_t)_{t≥0} associated with the initial condition

X_0 = x_0.

Moreover, X is such that:
1. Q(∀ t ≥ 0, X_t > 0) = 1;
2. the law of the process (µt + ln(X_t/x_0))_{t≥0} is the minimal probability associated with the conditioning (+∞, ∫_0^{+∞} e^{2B_t − 2µt} dt, ν).

As in the previous subsection, we have to check a Q-a.s. convergence, which can be obtained by the dominated convergence theorem.

Proposition 4.7. For the solution (X_t)_{t≥0} of (4.10), the process ⟨X⟩_t converges Q-a.s. when t → +∞ to a random variable ⟨X⟩_∞ and

Q(⟨X⟩_∞ ∈ dx) = C ξ(x) e^{−x_0²/(2x)}/x^{1+µ} dx , x > 0,

where C > 0 is a normalization constant.

As in the case Y = B_T, we have an explicit formula for the decomposition in the enlarged filtration and a related non-canonical representation, for which we give a new proof.


Proposition 4.8. For the solution (X_t)_{t≥0} of (4.10), in X ∨ σ(⟨X⟩_∞) (X is the natural filtration of X), we have the following decomposition:

dX_t = X_t [(µ + 1/2 − X_t²/(⟨X⟩_∞ − ⟨X⟩_t)) dt + dγ_t] , t ≥ 0,    (4.11)

where (γ_t)_{t≥0} is a Q-standard brownian motion in the enlarged filtration. And the following orthogonal decomposition takes place:

X_∞ = (∨_{t≥0} σ(γ_u, u ≤ t)) ∨ σ(⟨X⟩_∞).

Moreover, the process (⟨X⟩_t/X_t)_{t≥0} is a diffusion in its own filtration, which is strictly included in X, and this diffusion is independent of ⟨X⟩_∞.

Proof. The decomposition (4.11) is well known (see [4] and [24]); it easily implies first the orthogonal decomposition and then

X_t = e^{γ_t + µt} (1 − ⟨X⟩_t/⟨X⟩_∞) , t ≥ 0.    (4.12)

Hence

X_t²/(1 − ⟨X⟩_t/⟨X⟩_∞)² = e^{2(γ_t + µt)}.

This implies, by integrating along the trajectories,

⟨e^{(µ)}⟩_t = ⟨X⟩_∞ ⟨X⟩_t/(⟨X⟩_∞ − ⟨X⟩_t),    (4.13)

with

e_t^{(µ)} := e^{γ_t + µt}.

If we eliminate ⟨X⟩_∞ between the two relations (4.13) and (4.12), we find

⟨X⟩_t/X_t = ⟨e^{(µ)}⟩_t/e_t^{(µ)}.

We then get the expected result from the classical Matsumoto-Yor result (see [4] and [23]).

The diffusion ⟨X⟩/X is independent of ⟨X⟩_∞ because γ is a brownian motion under all the disintegrated probabilities Q(· | ⟨X⟩_∞ = x), x ∈ R*_+.

Here again, the set of the CSDEs associated with the conditioning (+∞, ∫_0^{+∞} e^{2B_t − 2µt} dt) is parametrized by a non-linear partial differential equation.

Proposition 4.9. Let φ : R*_+ × R*_+ → R be the function defined by

φ(t, x) = 2µ/x − (2/x) ∫_0^{+∞} e^{−u} u^µ ξ(t + x²/(2u)) du / ∫_0^{+∞} e^{−u} u^{µ−1} ξ(t + x²/(2u)) du.

Then φ is a solution of the modified Burger's equation

∂φ/∂t + (1/2) ∂²φ/∂x² + φ ∂φ/∂x + (−µ + 1/2) ∂/∂x (φ/x) = 0

and we have the limit condition

lim_{x→0} e^{∫ φ(t,x) dx} = C ξ(t),

where C > 0 is a normalization constant.

Remark 4.5.
1. As in the case Y = B_T, there is also a Hopf-Cole transformation:

φ = ∂/∂x ln ϕ

with

ϕ(t, x) = ∫_0^{+∞} e^{−u} u^{µ−1} ξ(t + x²/(2u)) du

a positive solution of

∂ϕ/∂t + (1/2) ∂²ϕ/∂x² + ((−µ + 1/2)/x) ∂ϕ/∂x = 0.

2. If ξ has a continuously differentiable version, then we have

φ(t, x) = x ∫_0^{+∞} e^{−u} u^{µ−2} ξ'(t + x²/(2u)) du / ∫_0^{+∞} e^{−u} u^{µ−1} ξ(t + x²/(2u)) du.

Proposition 4.10. Let α > 0 and x_0 > 0. Taking ξ(x) = C e^{−α²x/2}, we get: the stochastic differential equation

dX_t = X_t [(µ + 1/2 − α X_t K_{1+µ}(αX_t)/K_µ(αX_t)) dt + dβ_t] , t ≥ 0,

admits one and only one non-explosive solution (X_t)_{t≥0} associated with the initial condition X_0 = x_0, and we have:

1. Q(∀ t ≥ 0, X_t > 0) = 1;
2. the process ∫_0^t X_s² ds converges Q-a.s. when t → +∞ to a random variable ∫_0^{+∞} X_s² ds which satisfies

Q(∫_0^{+∞} X_s² ds ∈ dx) = (x_0^µ/(2 α^µ K_µ(αx_0))) e^{−α²x/2 − x_0²/(2x)}/x^{1+µ} dx , x > 0.

For further details on this class of diffusions we refer to [4].
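As a non-authoritative numerical cross-check of the closed-form drift above against equation (4.10): for ξ(x) = e^{−α²x/2}, the factor e^{−α²⟨X⟩_t/2} cancels in the ratio of integrals of (4.10), and twice that ratio should equal αx K_{1+µ}(αx)/K_µ(αx). The helper names ratio_by_quadrature and ratio_closed_form below are ours.

import numpy as np
from scipy.integrate import quad
from scipy.special import kv

def ratio_by_quadrature(x, mu, alpha):
    """2 * int e^{-u} u^mu xi(x^2/(2u)) du / int e^{-u} u^{mu-1} xi(x^2/(2u)) du
    with xi(y) = exp(-alpha^2 y / 2), as in the drift of (4.10)."""
    xi = lambda y: np.exp(-0.5 * alpha**2 * y)
    num, _ = quad(lambda u: np.exp(-u) * u**mu * xi(x**2 / (2.0 * u)), 0.0, np.inf)
    den, _ = quad(lambda u: np.exp(-u) * u**(mu - 1.0) * xi(x**2 / (2.0 * u)), 0.0, np.inf)
    return 2.0 * num / den

def ratio_closed_form(x, mu, alpha):
    """alpha * x * K_{1+mu}(alpha x) / K_mu(alpha x), as in Proposition 4.10."""
    return alpha * x * kv(1.0 + mu, alpha * x) / kv(mu, alpha * x)

for x in (0.5, 1.0, 2.0):
    print(ratio_by_quadrature(x, mu=1.5, alpha=0.8), ratio_closed_form(x, mu=1.5, alpha=0.8))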

4.3. CSDE associated with the first hitting time of a level.
Let (Ω, (H_t)_{t≥0}, Q) be a filtered probability space satisfying the usual conditions, on which a standard brownian motion (β_t)_{t≥0} is defined.
To give the CSDE associated with

T_a = inf{t ≥ 0 : B_t = a} , a > 0,

we can use the results of the previous subsection, because of the following Lamperti representation:


Lemma 4.11. Let (X_t)_{t≥0} be the solution of (4.10) with µ = 1/2, associated with the initial condition X_0 = a. Then the law P_Z of the process (Z_t)_{t≤T_a} defined by

Z_{∫_0^t X_s² ds} = a − X_t , t ≥ 0,

is the minimal probability associated with the conditioning (T_a, ν); more precisely, it satisfies

dP_Z/F_{T_a} = ξ(T_a) dP/F_{T_a}.

Proof. The following absolute continuity relation takes place:

dP_X/F_∞ = ξ(∫_0^{+∞} B_s² ds) dP^{−1/2, a}/F_∞,

where P_X is the law of X and P^{−1/2, a} is the law of (a e^{B_t − t/2}, t ≥ 0) under the Wiener measure P.

Now, the process Z defined on the Wiener space by

Z_{∫_0^t (a e^{B_s − s/2})² ds} = a − a e^{B_t − t/2} , t ≥ 0,

is a standard brownian motion under the Wiener measure, considered up to its first hitting time of a.

In order to be homogeneous in our computations, we write the CSDE associated with the conditioning (T_a, ν), where ν is the Borel measure defined on R*_+ by

ν(dt) = (∫_0^{+∞} e^{−tδ²/2 + δa} m(dδ)) γ(dt),

with m a probability measure on R*_+ such that ∫_0^{+∞} δ² m(dδ) < +∞, and

γ(dt) = (a/√(2πt³)) e^{−a²/(2t)} dt.

With the change of variable of the previous lemma, we deduce:

Theorem 4.12. The stochastic differential equation

dX_t = (∫_0^{+∞} δ e^{−tδ²/2 + δX_t} m(dδ) / ∫_0^{+∞} e^{−tδ²/2 + δX_t} m(dδ)) dt + dβ_t , t ≥ 0,    (4.14)

has a unique strong solution (X_t)_{t≥0} associated with the initial condition

X_0 = 0,

and the law of (X_t)_{t≥0} is the minimal probability associated with the conditioning (T_a, ν).

Corollary 4.13. For the solution (X_t)_{t≥0} of (4.14), the stopping time

τ_a = inf{t ≥ 0 : X_t = a}

is Q-a.s. finite and satisfies

Q(τ_a ∈ dt) = ν(dt) , t > 0.


Corollary 4.14. For the solution (X_t)_{t≥0} of (4.14), in the filtration ((X_t ∩ {t < τ_a}) ∨ σ(τ_a))_{t≥0} (X is the natural filtration of X), we have the following decomposition:

dX_t = (−1/(a − X_t) + (a − X_t)/(τ_a − t)) dt + dγ_t , 0 ≤ t < τ_a,

where γ is a standard brownian motion in the enlarged filtration.

Remark 4.6. More generally, we can give the CSDE associated with the first hittingtime of 0 for a Bessel process with index −µ where µ is a strictly positive constant:it suffices to apply the same Lamperti representation to the process defined by(4.10) for a general µ.

As in the previous examples, the drift of the CSDE is a solution of a Burger’sequation.

Proposition 4.15. Let φ : ]0,+∞[ × R → R be the function defined by

φ(t, x) = ∫_0^{+∞} δ e^{−tδ²/2 + δx} m(dδ) / ∫_0^{+∞} e^{−tδ²/2 + δx} m(dδ).

Then φ is a solution of the Burger's equation

∂φ/∂t + (1/2) ∂²φ/∂x² + φ ∂φ/∂x = 0

and we have the limit condition

lim_{x→a} e^{∫ φ(t,x) dx} = C ν(dt)/γ(dt),

where C > 0 is a normalization constant.

Example 4.2. Let α ≥ 0. With m = (1/2) δ_α + (1/2) δ_{−α}, equation (4.14) becomes

dX_t = α tanh(αX_t) dt + dβ_t.

This equation has only one solution associated with the initial condition

X_0 = 0,

and for this solution we have

Q(τ_a ∈ dt) = (a cosh(αa)/√(2πt³)) e^{−a²/(2t) − tα²/2} dt , t > 0.

We recover the diffusion (4.9); this is not surprising because of Remark 3.2 at the end of Section 3.
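For this choice of m, the identification of ν can be spelled out directly from its definition:

\[
\int e^{-t\frac{\delta^{2}}{2}+\delta a}\, m(d\delta)
= \tfrac12\left(e^{\alpha a}+e^{-\alpha a}\right) e^{-\frac{t\alpha^{2}}{2}}
= \cosh(\alpha a)\, e^{-\frac{t\alpha^{2}}{2}},
\]

so that

\[
\nu(dt) = \cosh(\alpha a)\, e^{-\frac{t\alpha^{2}}{2}}\, \gamma(dt)
= \frac{a\cosh(\alpha a)}{\sqrt{2\pi t^{3}}}\,
e^{-\frac{a^{2}}{2t}-\frac{t\alpha^{2}}{2}}\, dt ,
\]

which is precisely the expression for Q(τ_a ∈ dt) displayed above.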

5. Opening

We will show in another paper [6] how conditioned stochastic differential equations provide portfolio optimization models for an insider who is in the following position:

1. his portfolio decisions are based on a public information flow;
2. he possesses extra information about the law of some functional of the future prices of a stock.


Acknowledgments. I would like to thank Marc Yor and Huyen Pham for their judicious references and remarks.

References

[1] L. Alili, C.T. Wu: Further results on some singular linear stochastic differential equations, submitted to Stochastic Processes and their Applications (2000).
[2] L. Alili: Canonical decomposition of certain generalized Brownian bridges, preprint (2001).
[3] J. Amendinger, P. Imkeller, M. Schweizer: Additional logarithmic utility of an insider, Stochastic Processes and their Applications 75 (1998), 263-286.
[4] F. Baudoin: Bessel process with random drift and extension of the Matsumoto-Yor ⟨X⟩/X property, submitted to Electronic Journal of Probability (2001).
[5] F. Baudoin: Generalized Brownian bridges, CCF Research and Innovation Note.
[6] F. Baudoin: Portfolio optimization associated with a weak information, preprint (2001).
[7] J.D. Benarous, L. Mazliak, R. Rouge: Filtering and control with information increasing, Methodology and Computing in Applied Probability, 123-135, 2000.
[8] I. Benjamini, S. Lee: Conditioned diffusions which are Brownian bridges, J. Theoret. Probab., Vol. 10, No. 3, pp. 733-741 (1997).
[9] M. Chaleyat-Maurel, T. Jeulin: Grossissement gaussien de la filtration brownienne, in Grossissements de filtrations: exemples et applications, ed. T. Jeulin and M. Yor, LNM 1118, 59-109, Springer.
[10] J.M.C. Clark: The representation of functionals of Brownian motion by stochastic integrals, Ann. Math. Stat. 41 (1970), 1282-1295; 42 (1971), 1778.
[11] J.D. Cole: On a quasi-linear parabolic equation occurring in aerodynamics, Quart. Appl. Math., 9(3):225-236, 1951.
[12] D. Dufresne: The distribution of a perpetuity, with application to risk theory and pension funding, Scand. Actuarial J., 1990, 39-79.
[13] P. Fitzsimmons, J. Pitman, M. Yor: Markov bridges: construction, Palm interpretation and splicing, Seminar on Stochastic Processes, Birkhäuser, 1993, pp. 101-134.
[14] H. Föllmer: Random fields and diffusion processes, École d'été de Saint-Flour XV-XVII (85-87), LNM 1158, 119-129.
[15] H. Föllmer, P. Imkeller: Anticipation cancelled by a Girsanov transformation: a paradox on Wiener space, Ann. IHP, Vol. 29, pp. 569-586.
[16] E. Hopf: The partial differential equation u_t + uu_x = µu_xx, Comm. Pure Appl. Math., 3:201-230, 1950.
[17] J. Jacod: Grossissement initial, hypothèse (H′) et théorème de Girsanov, in Grossissements de filtrations: exemples et applications, LNM 1118, 15-35, Springer, Berlin, Heidelberg, New York, 1985.
[18] T. Jeulin: Semi-martingales et grossissement d'une filtration, LNM 833, Springer-Verlag, 1980.
[19] T. Jeulin, M. Yor (eds.): Grossissements de filtrations: exemples et applications, LNM 1118, Springer-Verlag, 1985.
[20] T. Jeulin, M. Yor: Filtration des ponts browniens et équations différentielles stochastiques linéaires, Séminaire de Probabilités XXIV, LNM 1426, pp. 227-265.
[21] I. Karatzas, J.P. Lehoczky, S. Shreve: Optimal portfolio and consumption decisions for a "small investor" on a finite horizon, SIAM Journal on Control and Optimization 27, 1557-1586, 1987.
[22] P. Malliavin: Stochastic Analysis, Grundlehren der mathematischen Wissenschaften, Vol. 313, Springer, 1997.
[23] H. Matsumoto, M. Yor: Some changes of probabilities related to a geometric Brownian motion version of Pitman's 2M − X theorem, Elect. Comm. in Probab. 4 (1999), 15-23.
[24] H. Matsumoto, M. Yor: A relationship between Brownian motions with opposite drifts, to appear in Osaka J. Math.
[25] H. Matsumoto, M. Yor: An analogue of Pitman's 2M − X theorem for exponential Wiener functionals, Part I: A time-inversion approach, Nagoya Math. J., 159 (2000), 1067-1074.
[26] H. Matsumoto, M. Yor: An analogue of Pitman's 2M − X theorem for exponential Wiener functionals, Part II: The role of the generalized inverse Gaussian laws, to appear in Nagoya Math. J.
[27] P.A. Meyer: Sur une transformation du mouvement brownien due à Jeulin et Yor, Sém. Prob. XXVIII, 98-101, LNM 1583, Springer, 1994.
[28] M. Nagasawa: Stochastic Processes in Quantum Physics, Monographs in Mathematics, Vol. 94, Birkhäuser, 2000.
[29] B. Øksendal: An introduction to Malliavin calculus with applications to economics, course at the Norwegian School of Economics and Business Administration (1996).
[30] I. Pikovsky, I. Karatzas: Anticipative portfolio optimization, Adv. Appl. Prob., 28, 1996, 1095-1122.
[31] J.W. Pitman, M. Yor: Bessel processes and infinitely divisible laws, in Stochastic Integrals, ed. D. Williams, Lecture Notes in Math. 851, 285-370, Springer-Verlag, Berlin, 1981.
[32] D. Revuz, M. Yor: Continuous Martingales and Brownian Motion, third edition, Springer-Verlag, Berlin, 1999.
[33] S. Watanabe: On time inversion of one-dimensional diffusion processes, Zeitschrift für Wahr., 31, 1975, pp. 115-124.
[34] S. Watanabe: Bilateral Bessel diffusion processes with drift and time inversion, preprint.
[35] M. Yor: Some Aspects of Brownian Motion, Part One: Some Special Functionals, Lectures in Math., ETH Zürich, Birkhäuser, Basel, 1997.