Introduction Bounded Noise Extensions Comments and future work
The size of the largest fluctuations of nonlinear finite-dimensional SDEs with Markovian switching
Terry Lynch (joint work with Dr. John Appleby)
Dublin City University, Ireland.
WCNA Presentation, Orlando, Florida, USA.
July 4th 2008
Supported by the Irish Research Council for Science, Engineering and Technology
Outline
1 Introduction: Motivation; Regular Variation
2 Bounded Noise: Main Results; Outline of Proofs
3 Extensions: Unbounded noise
4 Comments and future work
Introduction
• We consider the rate of growth of the partial maxima of the solution (X(t))_{t≥0} of a nonlinear finite-dimensional stochastic differential equation (SDE) with Markovian switching.
• We find upper and lower estimates on this rate of growth by finding constants α and β and an increasing function ρ such that

0 < α ≤ lim sup_{t→∞} ‖X(t)‖ / ρ(t) ≤ β, a.s.

• We look at processes which have mean-reverting drift terms and which have either (a) bounded noise intensity, or (b) unbounded noise intensity.
Motivation
• The ability to model a self-regulating economic system which is subjected to persistent stochastic shocks is facilitated by the use of mean-reverting drift terms.
• The type of finite-dimensional system studied allows for the analysis of heavy-tailed returns distributions in stochastic volatility market models in which many assets are traded.
• Equations with Markovian switching are motivated by econometric evidence which suggests that security prices often move from confident to nervous (or other) regimes.
Motivational case study
To demonstrate the potential impact of swift changes in investor sentiment, we look at a case study of Elan Corporation.
Regular Variation
• Our analysis is aided by the use of regularly varying functions.
• In its basic form, regular variation may be viewed as relations such as

lim_{x→∞} f(λx) / f(x) = λ^ζ ∈ (0, ∞) for all λ > 0.

• We say that f is regularly varying at infinity with index ζ, or f ∈ RV∞(ζ).
• Regularly varying functions have useful properties such as:

f ∈ RV∞(ζ) ⇒ F(x) := ∫_1^x f(t) dt ∈ RV∞(ζ + 1),
f ∈ RV∞(ζ) ⇒ f^{-1}(x) ∈ RV∞(1/ζ).
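These properties can be checked numerically. Below is a minimal sketch, not part of the original slides: the choice f(x) = x^ζ log x with ζ = 2 is purely illustrative (it lies in RV∞(2) because log is slowly varying), and the tolerances reflect the slow convergence caused by the log factor.

```python
import math

# Illustrative regularly varying function: f(x) = x^zeta * log(x),
# which is in RV_infinity(zeta) since log is slowly varying.
zeta = 2.0

def f(x):
    return x**zeta * math.log(x)

# Defining limit: f(lambda * x) / f(x) -> lambda^zeta as x -> infinity.
lam, x = 3.0, 1e8
ratio = f(lam * x) / f(x)
print(ratio, lam**zeta)  # ratio is close to 9

# Integration property: F(x) = int_1^x f(t) dt should be in RV_infinity(zeta + 1),
# checked with a crude trapezoidal integral.
def F(x, n=100_000):
    h = (x - 1.0) / n
    s = 0.5 * (f(1.0) + f(x))
    for k in range(1, n):
        s += f(1.0 + k * h)
    return s * h

ratio_F = F(lam * x) / F(x)
print(ratio_F, lam ** (zeta + 1))  # ratio_F is close to 27
```

The ratios approach λ^ζ and λ^{ζ+1} only slowly here because of the slowly varying log factor, which is exactly the point of the RV framework.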
SDE with Markovian switching
We study the finite-dimensional SDE with Markovian switching

dX(t) = f(X(t), Y(t)) dt + g(X(t), Y(t)) dB(t)    (1)

where
• Y is a Markov chain with finite irreducible state space S,
• f : R^d × S → R^d and g : R^d × S → R^{d×r} are locally Lipschitz continuous functions,
• B is an r-dimensional Brownian motion, independent of Y.
Under these conditions, there is a unique continuous and adapted process which satisfies (1).
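A path of (1) can be sketched by an Euler–Maruyama scheme. The following is a minimal illustration, not the paper's method: it assumes a scalar setting (d = r = 1), two regimes with made-up mean-reversion speeds and noise levels, and approximates the chain Y by a small per-step switching probability.

```python
import math
import random

random.seed(42)

# Illustrative scalar SDE with Markovian switching:
# regimes S = {0, 1}, drift f(x, y) = -c[y] * x, diffusion g(x, y) = K[y].
c = {0: 1.0, 1: 3.0}   # assumed mean-reversion speeds per regime
K = {0: 0.5, 1: 1.0}   # assumed noise intensities per regime
q = 0.5                # assumed switching rate of the chain Y

dt, n_steps = 1e-3, 200_000   # horizon t = 200
x, y = 0.0, 0
path_max = 0.0
for _ in range(n_steps):
    # approximate the continuous-time chain: switch w.p. ~ q * dt
    if random.random() < q * dt:
        y = 1 - y
    dB = random.gauss(0.0, math.sqrt(dt))
    x += -c[y] * x * dt + K[y] * dB
    path_max = max(path_max, abs(x))

print(path_max)  # size of the largest fluctuation observed on [0, 200]
```

The running maximum `path_max` is exactly the "partial maxima" quantity whose growth rate the results below describe.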
Bounded Noise
Consider the i-th component of our SDE

dX_i(t) = f_i(X(t), Y(t)) dt + Σ_{j=1}^r g_{ij}(X(t), Y(t)) dB_j(t).

We suppose that the noise intensity g is bounded in the sense that
• there exists K2 > 0 s.t. ‖g(x, y)‖_F ≤ K2 for all x ∈ R^d, y ∈ S,
• there exists K1 > 0 s.t.

inf_{x ∈ R^d, y ∈ S} Σ_{j=1}^r ( Σ_{i=1}^d x_i g_{ij}(x, y) )² / ‖x‖² ≥ K1².

By Cauchy–Schwarz we get the following upper and lower estimate:

K1² ≤ ‖g(x, y)‖_F² ≤ K2², for all x ∈ R^d, y ∈ S.
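The Cauchy–Schwarz step can be seen concretely: Σ_j (Σ_i x_i g_{ij})² = ‖xᵀg‖² ≤ ‖x‖² ‖g‖_F², so the lower condition forces K1² ≤ ‖g‖_F². A small stdlib-only check of the inequality (the random x and g below are illustrative, not from the slides):

```python
import math
import random

random.seed(0)
d, r = 3, 2

# random vector x in R^d and d-by-r matrix g (illustrative values)
x = [random.uniform(-1, 1) for _ in range(d)]
g = [[random.uniform(-1, 1) for _ in range(r)] for _ in range(d)]

# left side: sum over j of (sum over i of x_i * g_ij)^2 = ||x^T g||^2
lhs = sum(sum(x[i] * g[i][j] for i in range(d)) ** 2 for j in range(r))

# right side: ||x||^2 * ||g||_F^2 (Cauchy-Schwarz applied column by column)
norm_x_sq = sum(xi * xi for xi in x)
frob_sq = sum(g[i][j] ** 2 for i in range(d) for j in range(r))

print(lhs, norm_x_sq * frob_sq)  # lhs never exceeds the right side
```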
Upper bound
The strength of the restoring force is characterised by a locally Lipschitz continuous function φ : [0, ∞) → (0, ∞) with xφ(x) → ∞ as x → ∞, where

lim sup_{‖x‖→∞} sup_{y∈S} ⟨x, f(x, y)⟩ / (‖x‖ φ(‖x‖)) = −c2 ∈ (−∞, 0).    (2)

Theorem 1
Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (2) and that g satisfies the bounded noise conditions. Then X satisfies

lim sup_{t→∞} ‖X(t)‖ / Φ^{-1}( (K2² (1 + ε) / (2c2)) log t ) ≤ 1, a.s. on Ω_ε,

where Φ(x) = ∫_1^x φ(u) du and Ω_ε is an almost sure event.
Lower bound
Moreover, we ensure that the degree of non-linearity in f is also characterised by φ, by means of the assumption

lim sup_{‖x‖→∞} sup_{y∈S} |⟨x, f(x, y)⟩| / (‖x‖ φ(‖x‖)) = c1 ∈ (0, ∞).    (3)

Theorem 2
Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (3) and that g satisfies the bounded noise conditions. Then X satisfies

lim sup_{t→∞} ‖X(t)‖ / Φ^{-1}( (K1² (1 − ε) / (2c1)) log t ) ≥ 1, a.s. on Ω_ε,

where Φ(x) = ∫_1^x φ(u) du and Ω_ε is an almost sure event.
Growth rate of large fluctuations
Taking both theorems together, in the special case where φ is a regularly varying function, we get the following:

Theorem 3
Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ ∈ RV∞(ζ) satisfying (2) and (3) and that g satisfies the bounded noise conditions. Then X satisfies

(K1² / (2c1))^{1/(ζ+1)} ≤ lim sup_{t→∞} ‖X(t)‖ / Φ^{-1}(log t) ≤ (K2² / (2c2))^{1/(ζ+1)}, a.s.,

where Φ(x) = ∫_1^x φ(u) du ∈ RV∞(ζ + 1).
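Theorem 3 is easy to evaluate for concrete parameters. The numbers below are purely illustrative (not from the slides): take φ(x) = x³ ∈ RV∞(3), so ζ = 3 and the exponent is 1/(ζ + 1) = 1/4, with assumed constants K1, K2, c1, c2.

```python
# Illustrative evaluation of the Theorem 3 bounds for phi(x) = x^3 (zeta = 3).
zeta = 3.0
K1, K2 = 0.8, 1.2   # assumed bounded-noise constants
c1, c2 = 2.0, 1.0   # assumed drift constants from (3) and (2)

lower = (K1**2 / (2 * c1)) ** (1 / (zeta + 1))
upper = (K2**2 / (2 * c2)) ** (1 / (zeta + 1))
print(lower, upper)  # lim sup ||X(t)|| / Phi^{-1}(log t) lies in [lower, upper] a.s.
```

Here Φ(x) = (x⁴ − 1)/4, so Φ^{-1}(log t) grows like (4 log t)^{1/4}: stronger nonlinear mean-reversion (larger ζ) makes the fluctuation envelope grow more slowly in t.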
Comments and Example
• Note that φ small ⇒ Φ small ⇒ Φ^{-1} large. Thus, as expected, weak mean-reversion results in large fluctuations.
• Take a simple one-dimensional Ornstein–Uhlenbeck process

dX(t) = −cX(t) dt + K dB(t)

where, in our notation,
• c1 = c2 = c and K1 = K2 = K,
• φ(x) = x ∈ RV∞(1) ⇒ ζ = 1,
• Φ(x) = x²/2 and Φ^{-1}(x) = √(2x).
Then applying Theorem 3, the upper and lower bounds coincide, recovering the well-known result

lim sup_{t→∞} |X(t)| / √(2 log t) = (K² / (2c))^{1/2}, a.s.
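A quick simulation suggests the running maximum does track this envelope. This is only a finite-horizon sketch (the a.s. limit is slow); the parameters, step size, and horizon below are illustrative choices, not from the slides.

```python
import math
import random

random.seed(1)

# Euler-Maruyama for dX = -c X dt + K dB, comparing the running maximum
# of |X| against the predicted envelope (K^2/(2c))^{1/2} * sqrt(2 log t).
c, K = 1.0, 1.0
dt, n_steps = 1e-3, 1_000_000   # horizon t = 1000

x, running_max = 0.0, 0.0
for _ in range(n_steps):
    x += -c * x * dt + K * random.gauss(0.0, math.sqrt(dt))
    running_max = max(running_max, abs(x))

t = n_steps * dt
envelope = math.sqrt(K**2 / (2 * c)) * math.sqrt(2 * math.log(t))
ratio = running_max / envelope
print(running_max, envelope, ratio)  # ratio should be of order 1
```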
Outline of proofs
• Apply an Itô transformation to reduce dimensionality and facilitate the use of one-dimensional theory.
• Use the bounding conditions on f and g to construct upper and lower comparison processes whose dynamics are not determined by Y.
• Use the stochastic comparison theorem of Ikeda and Watanabe to prove that our transformed process is bounded above and below by the comparison processes for all t ≥ 0.
• The large deviations of the comparison processes are determined by means of a classical theorem of Motoo.
Unbounded noise
We suppose that the noise intensity g is unbounded in the sense that there is an increasing function γ with γ(x) → ∞ as x → ∞ s.t.
• ∃ K2 > 0 s.t. lim sup_{‖x‖→∞} sup_{y∈S} ‖g(x, y)‖_F / γ(‖x‖) = K2,
• ∃ K1 > 0 s.t.

lim inf_{‖x‖→∞} inf_{y∈S} Σ_{j=1}^r ( Σ_{i=1}^d x_i g_{ij}(x, y) )² / (γ²(‖x‖) ‖x‖²) = K1².

By Cauchy–Schwarz we get the following upper and lower estimate:

K1² (1 − ε) ≤ ‖g(x, y)‖_F² / γ²(‖x‖) ≤ K2² (1 + ε), for all y ∈ S and ‖x‖ ≥ X(ε).

Since we are interested in recurrent stationary processes we ensure that xφ(x)/γ²(x) → ∞ as x → ∞.
Unbounded noise deviations
Using the same conditions on the drift, we get upper and lower bound theorems analogous to Theorems 1 and 2. Taking both theorems together, in the special case where φ(x)/γ²(x) ∈ RV∞(ζ), we get the following:

Theorem 4
Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (2) and (3) and a function γ satisfying the unbounded noise conditions. Then X satisfies

(K1² / (2c1))^{1/(ζ+1)} ≤ lim sup_{t→∞} ‖X(t)‖ / Ψ^{-1}(log t) ≤ (K2² / (2c2))^{1/(ζ+1)}, a.s.,

where Ψ(x) = ∫_1^x φ(u)/γ²(u) du ∈ RV∞(ζ + 1).
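For a concrete Ψ (the choices of φ and γ below are illustrative, not from the slides): with φ(x) = x and γ(x) = x^{1/4}, we get φ(u)/γ²(u) = u^{1/2} ∈ RV∞(1/2), so ζ = 1/2 and Ψ(x) = (2/3)(x^{3/2} − 1) ∈ RV∞(3/2), giving a fluctuation scale Ψ^{-1}(log t) of order (log t)^{2/3}.

```python
import math

# Illustrative Psi for unbounded noise: phi(x) = x, gamma(x) = x**0.25,
# so phi(u)/gamma(u)^2 = u**0.5 (zeta = 1/2) and Psi is in RV_infinity(3/2).
def Psi(x):
    return (2.0 / 3.0) * (x**1.5 - 1.0)

def Psi_inv(y):
    return (1.5 * y + 1.0) ** (2.0 / 3.0)

# Regular-variation ratio check for Psi: Psi(2x)/Psi(x) -> 2**1.5
x = 1e9
print(Psi(2 * x) / Psi(x))   # close to 2**1.5 ~ 2.828

# Growth scale of the fluctuations: Psi_inv(log t) ~ (1.5 * log t)**(2/3)
t = 1e6
print(Psi_inv(math.log(t)))
```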
Other Extensions
• In an economic system, it is often more useful to study the size of the largest component of the system rather than the norm.
• Consequently, using norm equivalence in R^d, we can extend our results to get upper and lower bounds of the form

0 < α/√d ≤ lim sup_{t→∞} max_{1≤j≤d} |X_j(t)| / ρ(t) ≤ β < +∞, a.s.

• Rather than having our process switch between a finite number of regimes, we can extend the state space of the Markov chain to a countable state space.
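The norm-equivalence step behind the first two points is ‖x‖_∞ ≤ ‖x‖_2 ≤ √d ‖x‖_∞ in R^d, which is where the 1/√d factor on the lower constant comes from. A quick check on a sample vector (the values are illustrative):

```python
import math

x = [3.0, -4.0, 12.0]   # illustrative vector in R^3
d = len(x)

max_comp = max(abs(v) for v in x)             # ||x||_inf = 12
two_norm = math.sqrt(sum(v * v for v in x))   # ||x||_2  = 13

# norm equivalence: ||x||_inf <= ||x||_2 <= sqrt(d) * ||x||_inf
print(max_comp, two_norm, math.sqrt(d) * max_comp)
```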
Comments
• We emphasise the importance of the degree of nonlinearity in f and g in producing the essential growth rate of the deviations.

• The presence of Markovian switching does not play a significant role in determining the rate of growth, but may have a significant impact on the constants α and β.

• We pay particular attention to equations in which the most influential factor driving each component of the system is the degree of self-mean-reversion of that component.
Future work
• To study finite-dimensional processes which are always positive, e.g. the Cox-Ingersoll-Ross (CIR) model.

• To investigate the growth, explosion and simulation of our developed results. This will involve stochastic numerical analysis and Monte Carlo simulation.

• This simulation will allow us to determine whether the qualitative features of our dynamical systems are preserved when analysed in discrete time.

• To investigate whether our results can be extended to SDEs with delay.
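A minimal Euler-Maruyama sketch of the kind of Monte Carlo experiment envisaged above: simulate a scalar mean-reverting SDE dX(t) = −aX(t) dt + σ dB(t) in discrete time and track the running maximum of |X(t)|. The coefficients a and σ are hypothetical stand-ins, not the f and g of the talk:

```python
import math
import random

random.seed(1)

# Euler-Maruyama discretisation of dX = -a*X dt + sigma dB
# (a, sigma are illustrative constants, not the talk's f and g).
a, sigma = 1.0, 1.0
dt = 1e-3
n_steps = 100_000  # simulate up to t = 100

x = 0.0
running_max = 0.0
for _ in range(n_steps):
    dB = random.gauss(0.0, math.sqrt(dt))
    x += -a * x * dt + sigma * dB
    running_max = max(running_max, abs(x))

# Mean reversion keeps the path tight: for this Ornstein-Uhlenbeck
# example the partial maxima grow only logarithmically in t, so the
# running maximum stays of moderate size even over long horizons.
print("running max of |X| up to t =", n_steps * dt, ":", running_max)
```

Comparing such discrete-time running maxima against the theoretical rate ρ(t) is one way to check whether the qualitative features survive discretisation.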