On the Growth of the Extreme Fluctuations of SDEs with Markovian Switching

Terry Lynch (joint work with Dr. John Appleby)
Dublin City University, Ireland.

Leverhulme International Network, University of Chester, UK.
November 7th 2008

Supported by the Irish Research Council for Science, Engineering and Technology


Outline

1 Introduction
   Motivation
   Regular Variation

2 Bounded Noise
   Main Results
   Outline of Proofs

3 Extensions

4 Future work


Recap - one year ago

Theorem 1

Let X be the unique adapted continuous solution satisfying

    dX(t) = f(X(t), t) dt + g(X(t), t) dB(t).

If there exist ρ > 0 and real numbers K1 and K2 such that, for all (x, t) ∈ ℝ × [0, ∞),

    x f(x, t) ≤ ρ,   0 < K2 ≤ g²(x, t) ≤ K1,

then X satisfies

    lim sup_{t→∞} |X(t)| / √(2t log log t) ≤ √K1, a.s.

Question: why does the iterated logarithm keep appearing?
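A quick numerical illustration (not from the talk) of where the iterated logarithm comes from: in the simplest case f ≡ 0, g ≡ 1 of Theorem 1 (so K1 = 1), X is just Brownian motion, and |B(t)| should remain of order √(2t log log t). The horizon and seed below are arbitrary choices.

```python
import numpy as np

# Simulate Brownian motion on [0, T] and compare its running absolute
# value to the iterated-logarithm envelope sqrt(2 t log log t).
rng = np.random.default_rng(0)
T, n = 10_000.0, 1_000_000
dt = T / n
t = np.linspace(dt, T, n)
B = np.cumsum(rng.normal(0.0, np.sqrt(dt), n))

# Restrict to the second half of the path, where log log t is well defined
# and the asymptotic scaling has begun to take hold.
half = n // 2
running = np.max(np.abs(B[half:]) / np.sqrt(2 * t[half:] * np.log(np.log(t[half:]))))
print(f"max over [T/2, T] of |B(t)| / sqrt(2 t log log t): {running:.3f}")
```

By the law of the iterated logarithm the printed ratio should be of order 1 (typically below it at any finite horizon, since the lim sup is only attained along rare excursions).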


Introduction

We consider the rate of growth of the partial maxima of the solution (X(t))_{t≥0} of a nonlinear finite-dimensional stochastic differential equation (SDE) with Markovian switching.

We find upper and lower estimates on this rate of growth by finding constants α and β and an increasing function ρ such that

    0 < α ≤ lim sup_{t→∞} ‖X(t)‖ / ρ(t) ≤ β, a.s.

We look at processes which have mean-reverting drift terms and which have either (a) bounded noise intensity, or (b) unbounded noise intensity.


Motivation

Mean-reverting drift terms make it possible to model a self-regulating economic system which is subjected to persistent stochastic shocks.

The type of finite-dimensional system studied allows for the analysis of heavy-tailed returns distributions in stochastic volatility market models in which many assets are traded.

Equations with Markovian switching are motivated by econometric evidence which suggests that security prices often move from confident to nervous (or other) regimes.


Motivational case study

To demonstrate the potential impact of swift changes in investor sentiment, we look at a case study of Elan Corporation.


Regular Variation

Our analysis is aided by the use of regularly varying functions.

In its basic form, regular variation may be viewed as the relation

    lim_{x→∞} f(λx) / f(x) = λ^ζ ∈ (0, ∞)   for all λ > 0.

We say that f is regularly varying at infinity with index ζ, or f ∈ RV_∞(ζ).

Regularly varying functions have useful properties such as:

    f ∈ RV_∞(ζ)  ⇒  F(x) := ∫_1^x f(t) dt ∈ RV_∞(ζ + 1),

    f ∈ RV_∞(ζ)  ⇒  f⁻¹ ∈ RV_∞(1/ζ).
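These relations can be checked numerically for a concrete choice. The sketch below takes the hypothetical example f(x) = x^ζ (which lies in RV_∞(ζ)); F is then available in closed form, so both the defining limit and the integral property can be verified directly.

```python
# Numerical illustration of the regular-variation relations above,
# for the illustrative choice f(x) = x**zeta in RV_inf(zeta).
zeta, lam = 1.5, 3.0

def f(x):
    return x ** zeta

def F(x):
    # F(x) = integral from 1 to x of f(t) dt, computed in closed form
    return (x ** (zeta + 1) - 1.0) / (zeta + 1)

for x in (1e2, 1e4, 1e6):
    print(x, f(lam * x) / f(x), F(lam * x) / F(x))
# The f-ratio equals lam**zeta exactly; the F-ratio tends to
# lam**(zeta + 1) as x grows, consistent with F in RV_inf(zeta + 1).
```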


SDE with Markovian switching

We study the finite-dimensional SDE with Markovian switching

    dX(t) = f(X(t), Y(t)) dt + g(X(t), Y(t)) dB(t)    (1)

where

- Y is a Markov chain with finite irreducible state space S,
- f : ℝ^d × S → ℝ^d and g : ℝ^d × S → ℝ^{d×r} are locally Lipschitz continuous functions,
- B is an r-dimensional Brownian motion, independent of Y.

Under these conditions, there is a unique continuous and adapted process which satisfies (1).
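A minimal Euler–Maruyama sketch of equation (1) in the scalar case d = r = 1, with a two-state chain Y, a regime-dependent mean-reverting drift f(x, y) = −c_y x and constant regime-dependent noise g(x, y) = σ_y. All parameters (generator Q, rates c, intensities σ) are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Euler-Maruyama simulation of dX = f(X, Y) dt + g(X, Y) dB with a
# two-state Markov chain Y; switching is approximated to first order
# by flipping state with probability -Q[Y, Y] * dt per step.
rng = np.random.default_rng(1)
Q = np.array([[-0.5, 0.5],    # generator of Y (illustrative rates)
              [1.0, -1.0]])
c = np.array([1.0, 3.0])       # mean-reversion speed per regime
sigma = np.array([0.5, 1.5])   # noise intensity per regime

dt, n = 1e-3, 200_000
X, Y = 0.0, 0
path = np.empty(n)
for k in range(n):
    if rng.random() < -Q[Y, Y] * dt:   # regime switch
        Y = 1 - Y
    X += -c[Y] * X * dt + sigma[Y] * np.sqrt(dt) * rng.normal()
    path[k] = X
print("running max |X|:", np.abs(path).max())
```

Because both regimes are mean-reverting, the path stays bounded in probability; only the occasional large excursions, whose size Theorems 1–3 quantify, push the running maximum up.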


Bounded Noise

Consider the i-th component of our SDE:

    dX_i(t) = f_i(X(t), Y(t)) dt + Σ_{j=1}^r g_ij(X(t), Y(t)) dB_j(t).

We suppose that the noise intensity g is bounded in the sense that

- there exists K2 > 0 such that ‖g(x, y)‖_F ≤ K2 for all x ∈ ℝ^d, y ∈ S,
- there exists K1 > 0 such that

    inf_{x ∈ ℝ^d, y ∈ S}  Σ_{j=1}^r ( Σ_{i=1}^d x_i g_ij(x, y) )² / ‖x‖²  ≥ K1².

By Cauchy–Schwarz we get the following upper and lower estimates:

    K1² ≤ ‖g(x, y)‖_F² ≤ K2², for all y ∈ S.


Upper bound

The strength of the restoring force is characterised by a locally Lipschitz continuous function φ : [0, ∞) → (0, ∞) with xφ(x) → ∞ as x → ∞, where

    lim sup_{‖x‖→∞} sup_{y∈S} ⟨x, f(x, y)⟩ / (‖x‖ φ(‖x‖)) ≤ −c2 ∈ (−∞, 0).    (2)

Theorem 1

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (2) and that g satisfies the bounded noise conditions. Then X satisfies

    lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹( (K2² (1+ε) / (2 c2)) log t ) ≤ 1, a.s. on Ω_ε,

where Φ(x) = ∫_1^x φ(u) du and Ω_ε is an almost sure event.
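For a concrete feel for the rate Φ⁻¹((K2²(1+ε)/(2c2)) log t), take the illustrative power-law strength φ(x) = x^ζ (the parameter values below are assumptions for the sketch, not the talk's). Then Φ and its inverse are available in closed form, and for ζ = 1 the rate reduces to √(2u + 1) with u proportional to log t, i.e. logarithmic growth to the power 1/(ζ+1).

```python
import math

# Closed-form rate function for the illustrative choice phi(x) = x**zeta:
# Phi(x) = (x**(zeta+1) - 1) / (zeta + 1), so
# Phi_inv(u) = ((zeta + 1) u + 1)**(1 / (zeta + 1)).
zeta, K2, c2, eps = 1.0, 1.0, 0.5, 0.01

def Phi(x):
    return (x ** (zeta + 1) - 1.0) / (zeta + 1)

def Phi_inv(u):
    return ((zeta + 1) * u + 1.0) ** (1.0 / (zeta + 1))

for t in (1e2, 1e4, 1e8):
    u = K2 ** 2 * (1 + eps) / (2 * c2) * math.log(t)
    print(f"t = {t:.0e}: upper-bound rate Phi_inv(u) = {Phi_inv(u):.3f}")
```

The printed rates grow very slowly in t, reflecting that mean reversion confines the extreme fluctuations to a power of log t.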


Lower bound

Moreover, we ensure that the degree of non-linearity in f is also characterised by φ, by means of the assumption

    lim sup_{‖x‖→∞} sup_{y∈S} |⟨x, f(x, y)⟩| / (‖x‖ φ(‖x‖)) ≤ c1 ∈ (0, ∞).    (3)

Theorem 2

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (3) and that g satisfies the bounded noise conditions. Then X satisfies

    lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹( (K1² (1−ε) / (2 c1)) log t ) ≥ 1, a.s. on Ω_ε,

where Φ(x) = ∫_1^x φ(u) du and Ω_ε is an almost sure event.


Growth rate of large fluctuations

Taking both theorems together, in the special case where φ is a regularly varying function, we get the following:

Theorem 3

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ ∈ RV_∞(ζ) satisfying (2) and (3) and that g satisfies the bounded noise conditions. Then X satisfies

    (K1² / (2 c1))^{1/(ζ+1)} ≤ lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹(log t) ≤ (K2² / (2 c2))^{1/(ζ+1)}, a.s.,

where Φ(x) = ∫_1^x φ(u) du ∈ RV_∞(ζ + 1).


Comments and Example

Note that φ small ⇒ Φ small ⇒ Φ⁻¹ large. Thus, as expected, weak mean-reversion results in large fluctuations.

Take a simple one-dimensional Ornstein–Uhlenbeck process

    dX(t) = −c X(t) dt + K dB(t)

where, in our notation,

- c1 = c2 = c and K1 = K2 = K,
- φ(x) = x ∈ RV_∞(1), so ζ = 1,
- Φ(x) = x²/2 and Φ⁻¹(x) = √(2x).

Then applying Theorem 3,

    (K1² / (2 c1))^{1/(ζ+1)} ≤ lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹(log t) ≤ (K2² / (2 c2))^{1/(ζ+1)},

recovers the well-known result

    lim sup_{t→∞} |X(t)| / √(2 log t) = (K² / (2c))^{1/2}, a.s.
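The OU result can be checked by Monte Carlo, using the exact one-step transition of the OU process. The parameters and horizon below are illustrative; since a pathwise lim sup converges very slowly, the estimate is only expected to land in the same ballpark as the theoretical constant √(K²/(2c)), not to match it closely.

```python
import numpy as np

# Monte Carlo sanity check of
#   limsup_{t -> inf} |X(t)| / sqrt(2 log t) = sqrt(K^2 / (2c)) a.s.
# for the OU process dX = -cX dt + K dB, simulated exactly:
#   X_k = a X_{k-1} + s xi_k,  a = exp(-c dt),  s = K sqrt((1-a^2)/(2c)).
rng = np.random.default_rng(2)
c, K = 1.0, 1.0
dt, n = 0.01, 500_000            # horizon T = 5000
a = np.exp(-c * dt)
s = K * np.sqrt((1.0 - a * a) / (2.0 * c))
X = s * rng.normal(size=n)       # pre-draw the innovations s * xi_k
for k in range(1, n):
    X[k] += a * X[k - 1]
t = dt * np.arange(1, n + 1)
burn = 1000                      # skip small t, where log t is tiny
est = np.max(np.abs(X[burn:]) / np.sqrt(2.0 * np.log(t[burn:])))
print("empirical sup:", est, "  theoretical limsup:", np.sqrt(K**2 / (2 * c)))
```

For c = K = 1 the theoretical constant is 1/√2 ≈ 0.707, and the empirical supremum over a single long path should be of that order.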


Outline of proofs

- Apply a time-change and an Itô transformation to reduce dimensionality and facilitate the use of one-dimensional theory.

- Use the bounding conditions on f and g to construct upper and lower comparison processes whose dynamics are not determined by the switching parameter Y.

- Use a stochastic comparison theorem to prove that our transformed process is bounded above and below by the comparison processes for all t ≥ 0 a.s.

- The large deviations of the comparison processes are determined by means of a classical theorem of Motoo.


Extensions

We can impose a rate of nonlinear growth of g with ‖x‖ as ‖x‖ → ∞ through an increasing scalar function γ.

Then the growth rate of the large deviations is of the order Ψ⁻¹(log t), where Ψ(x) = ∫_1^x φ(u)/γ²(u) du.

Using norm equivalence in ℝ^d we can study the size of the largest component of the system rather than the norm, to get upper and lower bounds of the form

    0 < (1/√d) α ≤ lim sup_{t→∞} max_{1≤j≤d} |X_j(t)| / ρ(t) ≤ β ≤ +∞, a.s.

We can extend the state space of the Markov chain to a countable state space.


Future work

- To study finite-dimensional processes which are always positive, e.g. the Cox–Ingersoll–Ross (CIR) model.

- To investigate the growth and explosion of solutions, and to simulate our developed results; this will involve stochastic numerical analysis and Monte Carlo simulation.

- Such simulation will allow us to determine whether the qualitative features of our dynamical systems are preserved when analysed in discrete time.

- To investigate whether our results can be extended to SDEs with delay.
