

The size of the largest fluctuations of nonlinear finite-dimensional SDEs with Markovian switching

Terry Lynch (joint work with Dr. John Appleby)

Dublin City University, Ireland.

WCNA Presentation, Orlando, Florida, USA.

July 4th 2008

Supported by the Irish Research Council for Science, Engineering and Technology


Outline

1 Introduction
   Motivation
   Regular Variation

2 Bounded Noise
   Main Results
   Outline of Proofs

3 Extensions
   Unbounded noise

4 Comments and future work



Introduction

• We consider the rate of growth of the partial maxima of the solution (X(t))_{t≥0} of a nonlinear finite-dimensional stochastic differential equation (SDE) with Markovian switching.

• We find upper and lower estimates on this rate of growth by finding constants α and β and an increasing function ρ s.t.
\[
0 < \alpha \ \le\ \limsup_{t\to\infty} \frac{\|X(t)\|}{\rho(t)} \ \le\ \beta, \quad \text{a.s.}
\]

• We look at processes which have mean-reverting drift terms and which have either: (a) bounded noise intensity, or (b) unbounded noise intensity.



Motivation

• The ability to model a self-regulating economic system which is subjected to persistent stochastic shocks is facilitated by the use of mean-reverting drift terms.

• The type of finite-dimensional system studied allows for the analysis of heavy-tailed returns distributions in stochastic volatility market models in which many assets are traded.

• Equations with Markovian switching are motivated by econometric evidence which suggests that security prices often move from confident to nervous (or other) regimes.



Motivational case study

To demonstrate the potential impact of swift changes in investor sentiment, we look at a case study of Elan Corporation.



Regular Variation

• Our analysis is aided by the use of regularly varying functions.

• In its basic form, regular variation may be viewed through relations such as
\[
\lim_{x\to\infty}\frac{f(\lambda x)}{f(x)} = \lambda^{\zeta} \in (0,\infty) \quad \forall\ \lambda > 0.
\]

• We say that f is regularly varying at infinity with index ζ, or f ∈ RV_∞(ζ).

• Regularly varying functions have useful properties such as:
\[
f \in RV_\infty(\zeta) \ \Rightarrow\ F(x) := \int_1^x f(t)\,dt \in RV_\infty(\zeta+1),
\]
\[
f \in RV_\infty(\zeta) \ \Rightarrow\ f^{-1}(x) \in RV_\infty(1/\zeta).
\]
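As a concrete illustration (ours, not from the talk), take the pure power function f(x) = x^ζ with ζ > 0; both properties can be read off directly:
\[
\lim_{x\to\infty}\frac{(\lambda x)^{\zeta}}{x^{\zeta}} = \lambda^{\zeta}, \qquad
F(x)=\int_1^x t^{\zeta}\,dt = \frac{x^{\zeta+1}-1}{\zeta+1}\in RV_\infty(\zeta+1), \qquad
f^{-1}(x)=x^{1/\zeta}\in RV_\infty(1/\zeta).
\]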




SDE with Markovian switching

We study the finite-dimensional SDE with Markovian switching
\[
dX(t) = f(X(t),Y(t))\,dt + g(X(t),Y(t))\,dB(t) \qquad (1)
\]
where

• Y is a Markov chain with finite irreducible state space S,

• f : R^d × S → R^d and g : R^d × S → R^{d×r} are locally Lipschitz continuous functions,

• B is an r-dimensional Brownian motion, independent of Y.

Under these conditions, there is a unique continuous and adapted process which satisfies (1).
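For intuition, here is a minimal Euler–Maruyama sketch of an equation of the form (1); the two-state chain, the mean-reverting drift f(x, y) = −c(y)x and the bounded diffusion g(x, y) = K(y) are illustrative choices of ours, not the talk's.

```python
import numpy as np

# Minimal Euler-Maruyama sketch of dX = f(X,Y) dt + g(X,Y) dB with Markovian
# switching. Illustrative choices (not from the talk): scalar case d = r = 1,
# two-state chain S = {0, 1}, drift f(x, y) = -c[y] * x, diffusion g(x, y) = K[y].

rng = np.random.default_rng(42)

Q = np.array([[-1.0,  1.0],        # generator of the Markov chain Y
              [ 2.0, -2.0]])
c = np.array([0.5, 2.0])           # regime-dependent mean-reversion speeds
K = np.array([1.0, 0.3])           # regime-dependent (bounded) noise intensities

T, n = 50.0, 50_000
dt = T / n

X, Y = 0.0, 0
path_max = 0.0
for _ in range(n):
    # switch regime with probability ~ -Q[Y, Y] * dt (first-order approximation)
    if rng.random() < -Q[Y, Y] * dt:
        Y = 1 - Y
    dB = np.sqrt(dt) * rng.standard_normal()
    X += -c[Y] * X * dt + K[Y] * dB
    path_max = max(path_max, abs(X))

print(f"running maximum of |X(t)| on [0, {T:.0f}]: {path_max:.3f}")
```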



Bounded Noise

Consider the i-th component of our SDE:
\[
dX_i(t) = f_i(X(t),Y(t))\,dt + \sum_{j=1}^{r} g_{ij}(X(t),Y(t))\,dB_j(t).
\]

We suppose that the noise intensity g is bounded in the sense that

• there exists K_2 > 0 s.t. ‖g(x, y)‖_F ≤ K_2 for all x ∈ R^d, y ∈ S,

• there exists K_1 > 0 s.t.
\[
\inf_{x \in \mathbb{R}^d,\ y \in S}\ \frac{\sum_{j=1}^{r}\bigl(\sum_{i=1}^{d} x_i\, g_{ij}(x,y)\bigr)^{2}}{\|x\|^{2}} \ \ge\ K_1^{2}.
\]

By the Cauchy–Schwarz inequality we get the following upper and lower estimate (the one-line derivation is sketched below):
\[
K_1^{2} \ \le\ \|g(x,y)\|_F^{2} \ \le\ K_2^{2}, \quad \text{for all } y \in S.
\]
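The step from the second bullet to the Frobenius-norm estimate is a one-line Cauchy–Schwarz argument; spelled out (our reconstruction of the asserted inequality):
\[
K_1^{2}\,\|x\|^{2}
\ \le\ \sum_{j=1}^{r}\Bigl(\sum_{i=1}^{d} x_i\, g_{ij}(x,y)\Bigr)^{2}
\ \le\ \sum_{j=1}^{r} \|x\|^{2} \sum_{i=1}^{d} g_{ij}(x,y)^{2}
\ =\ \|x\|^{2}\,\|g(x,y)\|_F^{2},
\]
so dividing by ‖x‖² gives K_1² ≤ ‖g(x, y)‖_F², while ‖g(x, y)‖_F ≤ K_2 is just the first bullet.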



Upper bound

The strength of the restoring force is characterised by a locally Lipschitz continuous function φ : [0,∞) → (0,∞) with xφ(x) → ∞ as x → ∞, where
\[
\limsup_{\|x\|\to\infty}\ \sup_{y\in S}\ \frac{\langle x, f(x,y)\rangle}{\|x\|\,\phi(\|x\|)} \ =\ -c_2 \in (-\infty, 0). \qquad (2)
\]

Theorem 1

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (2) and that g satisfies the bounded noise conditions. Then X satisfies
\[
\limsup_{t\to\infty}\ \frac{\|X(t)\|}{\Phi^{-1}\!\left(\frac{K_2^{2}(1+\varepsilon)}{2c_2}\,\log t\right)} \ \le\ 1, \quad \text{a.s. on } \Omega_\varepsilon,
\]
where Φ(x) = ∫_1^x φ(u) du and Ω_ε is an almost sure event.



Lower bound

Moreover, we ensure that the degree of non-linearity in f is characterised by φ also, by means of the assumption
\[
\limsup_{\|x\|\to\infty}\ \sup_{y\in S}\ \frac{|\langle x, f(x,y)\rangle|}{\|x\|\,\phi(\|x\|)} \ =\ c_1 \in (0,\infty). \qquad (3)
\]

Theorem 2

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (3) and that g satisfies the bounded noise conditions. Then X satisfies
\[
\limsup_{t\to\infty}\ \frac{\|X(t)\|}{\Phi^{-1}\!\left(\frac{K_1^{2}(1-\varepsilon)}{2c_1}\,\log t\right)} \ \ge\ 1, \quad \text{a.s. on } \Omega_\varepsilon,
\]
where Φ(x) = ∫_1^x φ(u) du and Ω_ε is an almost sure event.



Growth rate of large fluctuations

Taking both theorems together, in the special case where φ is a regularly varying function, we get the following:

Theorem 3

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ ∈ RV_∞(ζ) satisfying (2) and (3) and that g satisfies the bounded noise conditions. Then X satisfies
\[
\left(\frac{K_1^{2}}{2c_1}\right)^{\frac{1}{\zeta+1}}
\ \le\ \limsup_{t\to\infty}\ \frac{\|X(t)\|}{\Phi^{-1}(\log t)}
\ \le\ \left(\frac{K_2^{2}}{2c_2}\right)^{\frac{1}{\zeta+1}} \quad \text{a.s.},
\]
where Φ(x) = ∫_1^x φ(u) du ∈ RV_∞(ζ + 1).
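As a worked illustration of Theorem 3 (a hypothetical example of ours, with c_1 = c_2 = c and K_1 = K_2 = K), take cubic mean reversion φ(x) = x³, so that ζ = 3:
\[
\Phi(x)=\int_1^x u^{3}\,du=\frac{x^{4}-1}{4}, \qquad
\Phi^{-1}(\log t)\sim (4\log t)^{1/4},
\]
and the theorem gives
\[
\limsup_{t\to\infty}\frac{\|X(t)\|}{(4\log t)^{1/4}} \ =\ \left(\frac{K^{2}}{2c}\right)^{1/4} \quad \text{a.s.},
\]
so stronger nonlinear mean reversion compresses the largest fluctuations from the order (log t)^{1/2} of the linear case below to order (log t)^{1/4}.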



Comments and Example

• Note that φ small ⇒ Φ small ⇒ Φ⁻¹ large. Thus, as expected, weak mean-reversion results in large fluctuations.

• Take a simple one-dimensional Ornstein–Uhlenbeck process
\[
dX(t) = -cX(t)\,dt + K\,dB(t)
\]
where, in our notation,
   • c_1 = c_2 = c and K_1 = K_2 = K,
   • φ(x) = x ∈ RV_∞(1) ⇒ ζ = 1,
   • Φ(x) = x²/2 and Φ⁻¹(x) = √(2x).

Then applying Theorem 3, the two-sided bound
\[
\left(\frac{K_1^{2}}{2c_1}\right)^{\frac{1}{\zeta+1}}
\ \le\ \limsup_{t\to\infty}\ \frac{\|X(t)\|}{\Phi^{-1}(\log t)}
\ \le\ \left(\frac{K_2^{2}}{2c_2}\right)^{\frac{1}{\zeta+1}}
\]
collapses to the well-known result
\[
\limsup_{t\to\infty}\ \frac{|X(t)|}{\sqrt{2\log t}} \ =\ \left(\frac{K^{2}}{2c}\right)^{\frac{1}{2}}, \quad \text{a.s.}
\]
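A quick numerical sanity check of this Ornstein–Uhlenbeck rate is easy to sketch; the script below (with arbitrarily chosen c, K and horizon, not part of the talk) compares the running maximum of |X(t)| with the envelope √(K²/(2c)) · √(2 log t).

```python
import numpy as np

# Euler-Maruyama simulation of dX = -c X dt + K dB, comparing the running
# maximum of |X(t)| with the predicted envelope sqrt(K^2/(2c)) * sqrt(2 log t).
# c, K, the horizon T and the step count n are illustrative choices.

rng = np.random.default_rng(1)
c, K = 1.0, 1.0
T, n = 1.0e4, 1_000_000
dt = T / n

X = 0.0
running_max = 0.0
checkpoints = np.geomspace(10.0, T, 6)   # report at a few logarithmically spaced times
next_cp = 0

for k in range(1, n + 1):
    X += -c * X * dt + K * np.sqrt(dt) * rng.standard_normal()
    running_max = max(running_max, abs(X))
    t = k * dt
    if next_cp < len(checkpoints) and t >= checkpoints[next_cp]:
        envelope = np.sqrt(K**2 / (2 * c)) * np.sqrt(2 * np.log(t))
        print(f"t = {t:>8.0f}   max|X| = {running_max:.3f}   envelope = {envelope:.3f}")
        next_cp += 1
```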



Outline of proofs

• Apply an Itô transformation to reduce dimensionality and facilitate the use of one-dimensional theory,

• Use the bounding conditions on f and g to construct upper and lower comparison processes whose dynamics are not determined by Y,

• Use the stochastic comparison theorem of Ikeda and Watanabe to prove that our transformed process is bounded above and below by the comparison processes for all t ≥ 0 (a toy illustration of this ordering is sketched below),

• The large deviations of the comparison processes are determined by means of a classical theorem of Motoo.
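The comparison step can be illustrated in a toy scalar setting (our own illustration, not the paper's construction): two SDEs driven by the same Brownian path, with the same diffusion coefficient and pointwise-ordered drifts, produce pathwise-ordered solutions.

```python
import numpy as np

# Toy illustration of the Ikeda-Watanabe comparison idea: with the SAME driving
# Brownian increments and the same diffusion coefficient, pointwise-ordered
# drifts b_lower(x) <= b_upper(x) keep the solutions ordered, L(t) <= U(t).
# The drifts and sigma below are illustrative choices, not taken from the paper.

rng = np.random.default_rng(7)

def b_lower(x):
    return -x - 1.0          # dominated drift

def b_upper(x):
    return -x + 1.0          # dominating drift: b_lower(x) <= b_upper(x) for all x

sigma = 0.5                  # common diffusion coefficient

T, n = 20.0, 200_000
dt = T / n
L = U = 0.0
always_ordered = True

for _ in range(n):
    dB = np.sqrt(dt) * rng.standard_normal()   # shared Brownian increment
    L += b_lower(L) * dt + sigma * dB
    U += b_upper(U) * dt + sigma * dB
    always_ordered &= (L <= U + 1e-12)

print("L(t) <= U(t) along the whole path:", always_ordered)
```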




Unbounded noise

We suppose that the noise intensity g is unbounded in the sense that there is an increasing function γ with γ(x) → ∞ as x → ∞ s.t.

• there exists K_2 > 0 s.t.
\[
\limsup_{\|x\|\to\infty}\ \sup_{y\in S}\ \frac{\|g(x,y)\|_F}{\gamma(\|x\|)} \ =\ K_2,
\]

• there exists K_1 > 0 s.t.
\[
\liminf_{\|x\|\to\infty}\ \inf_{y\in S}\ \frac{\sum_{j=1}^{r}\bigl(\sum_{i=1}^{d} x_i\, g_{ij}(x,y)\bigr)^{2}}{\gamma^{2}(\|x\|)\,\|x\|^{2}} \ =\ K_1^{2}.
\]

By the Cauchy–Schwarz inequality we get the following upper and lower estimate:
\[
K_1^{2}(1-\varepsilon) \ \le\ \frac{\|g(x,y)\|_F^{2}}{\gamma^{2}(\|x\|)} \ \le\ K_2^{2}(1+\varepsilon)
\quad \text{for all } y \in S \text{ and all } \|x\| \ge X(\varepsilon),
\]
where X(ε) is a deterministic threshold.

Since we are interested in recurrent stationary processes, we ensure that xφ(x)/γ²(x) → ∞ as x → ∞.



Unbounded noise deviations

Using the same conditions on the drift, we get upper and lower bound theorems analogous to Theorems 1 and 2. Taking both theorems together, in the special case where φ(x)/γ²(x) ∈ RV_∞(ζ), we get the following:

Theorem 4

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (2) and (3) and a function γ satisfying the unbounded noise conditions. Then X satisfies
\[
\left(\frac{K_1^{2}}{2c_1}\right)^{\frac{1}{\zeta+1}}
\ \le\ \limsup_{t\to\infty}\ \frac{\|X(t)\|}{\Psi^{-1}(\log t)}
\ \le\ \left(\frac{K_2^{2}}{2c_2}\right)^{\frac{1}{\zeta+1}} \quad \text{a.s.},
\]
where Ψ(x) = ∫_1^x φ(u)/γ²(u) du ∈ RV_∞(ζ + 1).
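As an illustration of Theorem 4 (again a hypothetical example of ours, with c_1 = c_2 = c and K_1 = K_2 = K), take linear mean reversion φ(x) = x with noise intensity growing like γ(x) = √x:
\[
\frac{\phi(x)}{\gamma^{2}(x)} = 1 \in RV_\infty(0), \qquad
\Psi(x) = \int_1^x 1\,du = x - 1, \qquad \Psi^{-1}(\log t) \sim \log t,
\]
so with ζ = 0 the bound reads
\[
\limsup_{t\to\infty}\frac{\|X(t)\|}{\log t} \ =\ \frac{K^{2}}{2c} \quad \text{a.s.}
\]
The state-dependent noise thus inflates the largest fluctuations from the (log t)^{1/2} of the bounded-noise linear case to order log t; note that xφ(x)/γ²(x) = x → ∞, so the recurrence condition above is respected.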



Other Extensions

• In an economic system, it is often more useful to study the size of the largest component of the system rather than the norm.

• Consequently, using norm equivalence in R^d, we can extend our results to get upper and lower bounds of the form (see the one-line argument below)
\[
0 \ <\ \frac{1}{\sqrt{d}}\,\alpha \ \le\ \limsup_{t\to\infty}\ \frac{\max_{1\le j\le d} |X_j(t)|}{\rho(t)} \ \le\ \beta \ <\ +\infty, \quad \text{a.s.}
\]

• Rather than having our process switch between a finite number of regimes, we can extend the state space of the Markov chain to a countable state space.
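The constants in the displayed bound follow from the standard norm equivalence in R^d (our reconstruction of the step the slide invokes):
\[
\frac{\|x\|}{\sqrt{d}} \ \le\ \max_{1\le j\le d}|x_j| \ \le\ \|x\| \qquad \text{for all } x \in \mathbb{R}^{d},
\]
so dividing by ρ(t) and taking the limsup transfers the bounds α and β for ‖X(t)‖/ρ(t) into α/√d and β for the largest component.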




Comments

• We emphasise the importance of the degree of nonlinearity in f and g in producing the essential growth rate of the deviations.

• The presence of Markovian switching does not play a significant role in determining the rate of growth, but may have a significant impact on the constants α and β.

• We pay particular attention to equations in which the most influential factor driving each component of the system is the degree of self mean-reversion of that component.



Future work

• To study finite-dimensional processes which are always positive, e.g. the Cox–Ingersoll–Ross (CIR) model.

• To investigate the growth, explosion and simulation of our developed results. This will involve stochastic numerical analysis and Monte Carlo simulation.

• This simulation will allow us to determine whether the qualitative features of our dynamical systems are preserved when analysed in discrete time.

• To investigate whether our results can be extended to SDEs with delay.
