Applied Probability

School of Mathematics and Statistics, University of Sheffield

2019–20


Modelling sets of points

Aspects of many phenomena can be represented by sets of points, so point process models are widely useful.

These sets of points may refer to moments in time, in which case they are thought of as points in one-dimensional space, such as events in a Poisson process.

More generally they may refer to locations in space, often two- or three-dimensional.

Example: Floods in Burbage Brook


Example: Insurance claims

Major fire insurance claims in Denmark, 1980–1990


Example: Japanese pine saplings

Locations of saplings of Japanese black pines


Example: East Yorkshire leukaemia cases


Fitting a Poisson process model to data

In this section we consider how to fit a Poisson process model to some data.

Given observation of a point process over an interval [0, t], how can we fit a Poisson process model?

That is, how do we estimate the rate λ?

Having fitted a Poisson process, how can we assess the adequacy of the model?


Two possibilities

Two initial possibilities for the estimation:

1. from the observed number of events N(t) = n;

2. by the methods developed for continuous-time Markov chains.


Number of events method

Fitting using N(t) ∼ Po(λt): the log-likelihood having observed N(t) = n is

l = −λt + n log(λt) + constant
  = −λt + n log(λ) + n log(t) + constant
  = −λt + n log(λ) + constant,

so the derivative is ∂l/∂λ = −t + n/λ.

Setting this to zero gives the MLE λ̂ = n/t.

Also ese(λ̂) = λ̂/√n.
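As an illustration (our own sketch, not part of the notes; the function name and return format are assumptions), the estimator and its standard error in R:

# MLE and estimated standard error for a homogeneous Poisson process,
# based only on the observed count N(t) = n over [0, t].
fit_poisson_rate <- function(n, t) {
  lambda_hat <- n / t                    # maximises -lambda*t + n*log(lambda)
  list(lambda_hat = lambda_hat,
       ese = lambda_hat / sqrt(n))       # from observed information n / lambda^2
}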


Markov chain method

Fitting using the Markov chain method:

The durations in states i = 0, . . . , n are

a_0 = T_1, . . . , a_{n−1} = T_n,  a_n = t − ∑_{i=1}^{n} T_i.

Also, the transition rates in the chain are simply

g_{i,i+1} = λ = −g_{ii} = g_i.


Log likelihood

So the log-likelihood is

l = −∑_i g_i a_i + ∑_{i≠j} n_{ij} log g_{ij}
  = −λ ∑_i a_i + log λ ∑_i n_{i,i+1}
  = −λt + n log λ,

since ∑_i a_i = ∑_{i=1}^{n} T_i + (t − ∑_{i=1}^{n} T_i) = t and ∑_i n_{i,i+1} = n, the total number of points in [0, t].

Since this is the same as the log-likelihood from approach 1, inferences from the two approaches are the same.


Conclusion

Thus, slightly surprisingly at first sight, what appears to be extra information here, the values of the individual T_i, doesn't actually make any difference.

Checks on model adequacy: in notes.


Example: Burbage Brook floods

Between 1925 and 1983 (inclusive) there were 48 flood events (flows ≥ 4 cumecs) in Burbage Brook.

Since the observation period is t = 59 years, the maximum likelihood estimate of the rate of occurrence is

λ̂ = 48/59 = 0.81 events/year, with ese(λ̂) = 0.12 events/year.
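Applying the sketch above to these data:

fit_poisson_rate(n = 48, t = 59)
# lambda_hat = 0.814, ese = 0.117: about 0.81 and 0.12 events/year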


Inhomogeneous one-dimensional Poisson processes

The rate λ governs the probability that a point occurs in an arbitrarily short interval.

So far we have assumed that it is constant.

However, in many applications it is plausible that events could occur randomly and independently, but that the probability of occurrence could change over time.

A flood, for example, might be more likely in winter, arrivals at a casualty department more likely in the rush hour, etc.


Definition and properties

We now allow λ = λ(t) to depend on time. Assume that the counting process N(t) satisfies

P(N(t + h) = i + 1 | N(t) = i) = λ(t)h + o(h)

and

P(N(t + h) = i | N(t) = i) = 1 − λ(t)h + o(h),

and that the probabilities of all other changes are o(h).

The resulting process is called an inhomogeneous Poisson process of rate λ(t).


Time change

Let N(t) be an inhomogeneous Poisson process of intensity λ(t), and define

Λ(t) = ∫_0^t λ(u) du.

Suppose we change the time-scale and define a new counting process

M(s) = N(t), where s = Λ(t).

Let t(s) = Λ⁻¹(s) be the inverse transformation of the time scale, taking the new time s back to t.

Then M(s) is a homogeneous Poisson process with intensity 1. (Proof in notes.)
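To make the time change concrete, here is a minimal R sketch (our own illustration; the function simulate_ipp and the example intensity are assumptions, not from the notes). It draws N(t_max) ∼ Po(Λ(t_max)) points of the unit-rate process M uniformly on [0, Λ(t_max)] and maps each back through Λ⁻¹:

# Simulate an inhomogeneous Poisson process on [0, t_max] via the time change:
# generate the unit-rate process M on [0, Lambda(t_max)], then map each of
# its points s back to t = Lambda^{-1}(s).
simulate_ipp <- function(t_max, Lambda) {
  s_max <- Lambda(t_max)
  s <- sort(runif(rpois(1, s_max), 0, s_max))   # unit-rate points, given count
  # invert Lambda numerically; assumes lambda(t) > 0, so Lambda is increasing
  vapply(s, function(si)
    uniroot(function(t) Lambda(t) - si, c(0, t_max))$root, numeric(1))
}

# Example: a linear intensity lambda(t) = 7 - 0.8t, so Lambda(t) = 7t - 0.4t^2.
times <- simulate_ipp(6.75, function(t) 7 * t - 0.4 * t^2)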


Transferring properties

We can therefore transfer properties of the homogeneous process to the inhomogeneous case.

1. N(t) has a Poisson distribution with mean Λ(t) = ∫_0^t λ(u) du (checked numerically below).

Reason: N(t) = M(s) ∼ Po(s) = Po(Λ(t)).

2. The numbers of points in disjoint intervals I_1, . . . , I_k are independent and Poisson distributed with means ∫_{I_i} λ(u) du, i = 1, . . . , k.

Reason: independence follows from translation of the corresponding property of the basic Poisson process, and the distributions follow as above.
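A quick Monte Carlo check of property 1, reusing the simulate_ipp sketch above (again our own illustration, with the same hypothetical intensity):

# The simulated counts should average Lambda(t_max); here Lambda(6.75) = 29.025.
counts <- replicate(1000,
                    length(simulate_ipp(6.75, function(t) 7 * t - 0.4 * t^2)))
mean(counts)   # close to 29.0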


Distribution of points

Given that the total number of points in [0, t] is N(t) = n, the positions of the points are independently distributed with pdf λ(v)/Λ(t), 0 ≤ v ≤ t.

Reason: conditional independence and identical distribution of positions in the N process follow from the same properties of those in the M process.

Let V denote the position of a point in the N process. Then the corresponding position for the M process is s(V), and in the M process positions are uniformly distributed over [0, s(t)].


Distribution of points cont.

Thus the distribution function of V is

P(V ≤ v) = P(s(V) ≤ s(v)) = s(v)/s(t) = Λ(v)/Λ(t),

and so the pdf is

dP(V ≤ v)/dv = λ(v)/Λ(t).


Fitting to data

The last property enables us to write down the likelihood for λ(·) based on observations over [0, t].

Suppose that we have observed N(t) = n points and that their positions are v_1, . . . , v_n.

Then

L = P(N(t) = n) × P(positions of the points | N(t) = n)
  = e^{−Λ(t)} Λ(t)^n/n! × ∏_{i=1}^{n} λ(v_i)/Λ(t)
  = e^{−Λ(t)} ∏_{i=1}^{n} λ(v_i) / n!.


Log likelihood

The log-likelihood is

l = −Λ(t) + ∑_{i=1}^{n} log λ(v_i) + constant.

If λ(t) is a constant λ, then Λ(t) reduces to λt and this becomes

l = −λt + n log λ + constant

as before.
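A minimal R version of this log-likelihood (a sketch; the name ipp_loglik and its argument conventions are ours, and lambda is assumed vectorised):

# Log-likelihood, up to an additive constant, of an inhomogeneous Poisson
# process with intensity lambda(.) observed on [0, t_max], with points at v.
ipp_loglik <- function(v, t_max, lambda, Lambda) {
  -Lambda(t_max) + sum(log(lambda(v)))
}

With lambda = function(t) rep(lam, length(t)) and Lambda = function(t) lam * t this returns −λt + n log λ, matching the homogeneous case.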


Notes

In most applications of inhomogeneous Poisson models the function λ(·) is specified in terms of a finite number of parameters.

Their maximum likelihood estimates are then the values maximizing l.

The asymptotic properties of likelihood inference carry over to such inhomogeneous Poisson processes under reasonable conditions.

Approximate standard errors may then be found from the information matrix, and tests may be based on twice the difference of log-likelihoods.


Example: Freezes of Lake Constance 1300–1974

Years between 1300 AD and 1974 AD when major freezes of Lake Constance occurred.

(Plot: the freeze years marked along a single Year axis running 1300–1900.)


Histogram

(Plot: histogram with Freeze date, 1300–2000, on the horizontal axis and Frequency, 0–6, on the vertical axis.)

Figure: Histogram of Lake Constance Freeze Dates


Uniform QQ plot

(Plot: sorted freeze dates, marked '+', on the horizontal axis against uniform quantiles on the vertical axis; both scales run 1300–1900.)

Figure: QQ plot of Lake Constance Freeze Dates
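A plot of this kind takes a few lines of R (a sketch: the vector freeze_years of freeze years is assumed, and the observation window is taken as 1300–1975 to match the 6.75-century period used later):

# Uniform QQ plot: under a constant rate the event times, given their number,
# are i.i.d. Uniform on the observation window.
a <- 1300; b <- 1975
n <- length(freeze_years)
q_unif <- a + (b - a) * ((1:n) - 0.5) / n   # uniform plotting positions
plot(sort(freeze_years), q_unif, pch = 3,
     xlab = "Freeze Dates", ylab = "Uniform quantiles")
abline(0, 1)                                # reference line for a constant rate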


Model

The plots do not appear to support a Poisson process model with a constant intensity.

Consider therefore an inhomogeneous Poisson process model.

To represent a changing rate in a simple form, we assume

λ(t) = α + βt

where α and β are constants.

For convenience in this problem, t measures years after 1300 in units of 100 years.


Log likelihood

The cumulative intensity function Λ(t) is

Λ(t) = ∫_0^t λ(u) du = αt + βt²/2,

so that the log-likelihood is

l = −αt_o − βt_o²/2 + ∑_{i=1}^{n} log(α + βv_i),

where t_o denotes the length of the observation period, 6.75 centuries, n denotes the number of events, and the v_i denote the times of occurrence of the events.


Likelihood equations

To find the maximum likelihood estimators α̂ and β̂, consider the likelihood equations:

−t_o + ∑_{i=1}^{n} 1/(α̂ + β̂v_i) = 0

−t_o²/2 + ∑_{i=1}^{n} v_i/(α̂ + β̂v_i) = 0.


MLEs

These equations do not have a simple explicit solution, but numerical maximization of l gives

α̂ = 7.015 (1.76)

β̂ = −0.81 (0.38)

where the values in brackets are estimated standard errors obtained by inversion of the observed information matrix J:

( Var(α̂)      Cov(α̂, β̂) )
( Cov(α̂, β̂)  Var(β̂)     )  =  J⁻¹,

where

J = − ( ∂²l/∂α²    ∂²l/∂α∂β )
      ( ∂²l/∂β∂α   ∂²l/∂β²  ).


Information matrix

We could calculate J by differentiating the log-likelihood again and substituting α̂ and β̂.

However, numerical differentiation is available as a by-product of the numerical maximization of l in R.
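A fit along these lines might look as follows (a sketch rather than the notes' actual code: freeze_times is assumed to hold the freeze times in centuries after 1300, and optim's numerically differentiated Hessian stands in for J):

# Negative log-likelihood for lambda(t) = alpha + beta*t on [0, t_o],
# using Lambda(t_o) = alpha*t_o + beta*t_o^2/2.
negloglik <- function(par, v, t_o) {
  rate <- par[1] + par[2] * v
  if (any(rate <= 0)) return(Inf)        # intensity must stay positive
  par[1] * t_o + par[2] * t_o^2 / 2 - sum(log(rate))
}

fit <- optim(c(4, 0), negloglik, v = freeze_times, t_o = 6.75, hessian = TRUE)
fit$par                            # alpha_hat, beta_hat
sqrt(diag(solve(fit$hessian)))     # estimated standard errors, from J^{-1}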


Extra calculations

Adding in the years to 2020 (with no more freezes) changes the estimates to

α̂ = 7.459 (1.74)

β̂ = −0.952 (0.34)

Slightly smaller standard errors; estimates give a slightly steeper line.


Changing rate?

The estimated rate λ̂(t) = 7.0 − 0.81t is decreasing, in agreement with the indications from the plots.

To assess the objective strength of evidence for a decreasing rate we can carry out a test of the hypothesis H_0: β = 0 against the alternative H_1 that there is no restriction on the value of β.

Use a generalized likelihood ratio test.


Generalized Likelihood Ratio Test

The test statistic is

W = −2(l(α̃, β̃) − l(α̂, β̂)).

Here l(α̂, β̂) is the maximum log-likelihood under H_1, and l(α̃, β̃) the maximum under the restriction imposed by H_0.

Under H_0, the distribution of W is approximately χ², with degrees of freedom equal to the number of parameters under H_1 minus the number of parameters under H_0; under H_1, W tends to be larger.

Thus large values of W discredit H_0, and the p-value corresponding to an observed value of W may be found from the χ² distribution.


Example: Lake Constance revisited

For the Lake Constance data the maximum likelihood estimators α̂ and β̂ above give the values of α and β that maximize l under H_1.

We only need to find the maximizing values under H_0, the restricted maximum likelihood estimators α̃ and β̃.

When H_0 is true, λ(t) = α, so the Poisson process is time-homogeneous.

Therefore the maximum likelihood estimator of α is

α̃ = n/t_o = 29/6.75 = 4.3 events/century

(and necessarily β̃ = 0).


Results

Substitution into l now gives

w = −2(13.275 − 15.249) = 3.948.

By comparison with the χ² distribution with 1 degree of freedom, the p-value is slightly less than 0.05.

Thus there is some evidence of a change in the rate of occurrence of freezing events, which from the sign of β̂ must be a reduction.

But the strength of the evidence is not overwhelming.

Extending the data to 2020 makes the p-value smaller: just under 0.01. So the evidence seems to be getting stronger.
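This calculation is easy to reproduce; a minimal sketch in Python, assuming SciPy is available (the log-likelihood values are the ones quoted above):

from scipy.stats import chi2

# maximized log-likelihoods: 13.275 under H0 (constant rate), 15.249 under H1
w = -2 * (13.275 - 15.249)    # GLRT statistic, = 3.948
# H1 has two parameters (alpha, beta) and H0 has one (alpha), so df = 1
p_value = chi2.sf(w, df=1)    # upper-tail probability, approximately 0.047
print(w, p_value)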


Spatial Poisson processes

Given an intensity function λ(·) ≥ 0, a general Poisson process on the line has the following properties:

for any interval I, the number of points N(I) of the process in I has a Poisson distribution with mean

∫I λ(u) du;

for disjoint intervals I1, I2, . . . , Ik, the random variables N(I1), N(I2), . . . , N(Ik) are independent.

N is a counting process: the number of points it counts in an interval is the sum of the numbers in subintervals.


Higher dimensions

A Poisson process in the plane or in space is defined as a counting process with these same properties, where the properties are merely re-phrased to make sense in higher dimensions.

Suppose that λ(·) is a real-valued non-negative function on R².

For each set B in the plane, let N(B) be a random variable taking non-negative integer values (interpreted as the number of points of the process in B).


Definition

If

N(B) has a Poisson distribution with mean ∫B λ(u) du;

when B1, B2, . . . , Bk are disjoint, the random variables N(B1), N(B2), . . . , N(Bk) are independent;

N has the additive property N(B1 ∪ B2 ∪ · · · ∪ Bk) = N(B1) + N(B2) + · · · + N(Bk) for disjoint Bi;

then N is called a spatial Poisson process, with intensity λ(·).


Homogeneous case

A homogeneous spatial Poisson process is the special case in which λ(·) is constant.
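A homogeneous process is also the easiest to simulate; a minimal sketch in Python with NumPy, for an arbitrary rate λ = 5 on a 10 × 4 rectangle (it uses the fact, made precise later in these slides, that given the total count the points fall uniformly on the region):

import numpy as np

rng = np.random.default_rng(1)
lam = 5.0           # constant intensity: expected points per unit area
a, b = 10.0, 4.0    # the region B is the rectangle [0, a] x [0, b]

n = rng.poisson(lam * a * b)     # N(B) ~ Po(lam * area(B))
x = rng.uniform(0, a, size=n)    # given N(B) = n, points fall uniformly on B
y = rng.uniform(0, b, size=n)
print(n, "points simulated")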


More dimensions

A Poisson process in a three- or more-dimensional Euclidean space S = Rᵈ, d = 3, . . . , is defined in the same way.

λ is a function of the spatial coordinates, and the B are sets in S.


Notation

To save writing, let us define, for sets B in the space of the points (Rᵈ, d = 2, 3, . . . ),

Λ(B) = ∫B λ(u) du.

A Poisson process may be defined on an arbitrary subset of the plane, for example the region of East Yorkshire in the leukaemia example, by simply restricting the definition above to that subset.


Nearest point

Let R denote the distance from the origin to the nearest point in a homogeneous planar Poisson process with intensity λ.

Then the probability density function of R is

hR(r) = 2λπr e^(−λπr²), r > 0

(the density of a Rayleigh distribution).

Reason: The number of points in a circle of radius r centred at the origin has a Poisson distribution with mean λπr².

If there are no points in this circle then R > r, and conversely. Thus P(R > r) = exp(−λπr²) and the result follows by differentiation.
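A quick Monte Carlo check of this formula, simulating on a square large enough that the nearest point is effectively always inside it (λ = 1 and the other choices are arbitrary):

import numpy as np

rng = np.random.default_rng(2)
lam, half_width, reps = 1.0, 10.0, 10_000

nearest = np.empty(reps)
for i in range(reps):
    # homogeneous Poisson process on the square [-10, 10]^2, origin at centre
    n = rng.poisson(lam * (2 * half_width) ** 2)
    pts = rng.uniform(-half_width, half_width, size=(n, 2))
    nearest[i] = np.sqrt((pts ** 2).sum(axis=1)).min()

r = 0.5
# empirical P(R > r) versus exp(-lam * pi * r^2); both should be near 0.456
print((nearest > r).mean(), np.exp(-lam * np.pi * r ** 2))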


First contact

The distribution of R is called the first contact distribution of the Poisson process.

It is in fact the distribution of the distance from any fixed point in the plane to the nearest point in the process, as can be seen by simply re-defining the origin to be at the fixed point.

The distribution of the distance from an arbitrary point of the process itself to its nearest neighbour may be found too, and for the homogeneous Poisson process this distribution turns out to be exactly the same as the first contact distribution.

In general the two distributions need not be equal, so a test for a Poisson process could be based on whether estimates of the two distributions from observed distances are similar or not.
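A rough sketch of such a comparison in Python, ignoring edge effects for simplicity (the pattern pts and the grid of reference locations are stand-ins; a real test would calibrate the discrepancy by simulation):

import numpy as np

def nearest_neighbour_dists(pts):
    # distance from each point of the pattern to its nearest other point
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

def first_contact_dists(pts, grid):
    # distance from each fixed reference location to the nearest point
    return np.linalg.norm(grid[:, None, :] - pts[None, :, :], axis=-1).min(axis=1)

rng = np.random.default_rng(3)
pts = rng.uniform(0, 1, size=(200, 2))    # stand-in observed pattern
gx, gy = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
grid = np.column_stack([gx.ravel(), gy.ravel()])

g = nearest_neighbour_dists(pts)
f = first_contact_dists(pts, grid)
rs = np.linspace(0, 0.2, 100)
G = (g[:, None] <= rs).mean(axis=0)    # empirical nearest-neighbour cdf
F = (f[:, None] <= rs).mean(axis=0)    # empirical first contact cdf
print(np.abs(G - F).max())             # large discrepancies suggest non-Poisson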


Thinning

Thinning of a Poisson process refers to the random deletion of some of the points.

A simple form of thinning is to remove or retain each point independently with fixed probabilities 1 − p and p, say.

If the original process has intensity function λ(·), then the point process resulting from such independent thinning is a Poisson process with intensity pλ(·).

(Proof in notes)
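Independent thinning is a one-line operation on a simulated pattern; a minimal sketch (the retention probability p = 0.3 is arbitrary):

import numpy as np

rng = np.random.default_rng(4)
lam, a, b, p = 5.0, 10.0, 4.0, 0.3

n = rng.poisson(lam * a * b)                    # homogeneous process on [0,a]x[0,b]
pts = rng.uniform(0, 1, size=(n, 2)) * [a, b]

keep = rng.random(n) < p       # retain each point independently with probability p
thinned = pts[keep]            # again Poisson, now with intensity p * lam
print(len(pts), len(thinned))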


Conditional distribution of points

Given the total number of points of a spatial Poisson process in a region B, the positions V of the points are independently distributed over B with probability density function

fV(v) = λ(v)/Λ(B), v ∈ B.


Simulation

If we know the intensity function, the conditional distribution property gives a way to simulate a Poisson process over any region.

First generate a number n from the Poisson distribution Po(Λ(B)), then simulate n independent values from the density fV.

With this ability we could implement a simulation test for a Poisson process using the comparison-of-distributions method suggested by the first contact distribution.

The approach is feasible even for processes on sets B with irregular shapes.
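A minimal sketch of this two-step recipe in Python, for an illustrative (not course-specified) intensity λ(x, y) = α e^(βx) on the unit square; here Λ(B) is available in closed form, and draws from fV are made by rejection sampling:

import numpy as np

rng = np.random.default_rng(5)
alpha, beta = 50.0, 1.0                    # illustrative parameter values

def lam(x, y):
    return alpha * np.exp(beta * x)        # intensity on B = [0, 1]^2

Lambda_B = alpha * np.expm1(beta) / beta   # integral of lam over B
n = rng.poisson(Lambda_B)                  # step 1: N(B) ~ Po(Lambda(B))

pts, lam_max = [], alpha * np.exp(beta)    # bound for rejection sampling
while len(pts) < n:                        # step 2: n draws from f_V = lam / Lambda(B)
    x, y = rng.uniform(0, 1, size=2)
    if rng.uniform(0, lam_max) < lam(x, y):
        pts.append((x, y))
pts = np.array(pts)
print(len(pts), "points simulated")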


Likelihood

The conditional distribution property also gives the likelihood for λ based on an observed pattern of points.

If n points are observed in a region B and their positions are vi, i = 1, . . . , n, then the likelihood function is

L = [e^(−Λ(B)) Λ(B)^n / n!] × ∏_{i=1}^n λ(vi)/Λ(B) = (1/n!) e^(−Λ(B)) ∏_{i=1}^n λ(vi),

and the log-likelihood is

l = −Λ(B) + ∑_{i=1}^n log λ(vi) + constant.
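Evaluating l for a parametric intensity is then direct; a minimal sketch for the same illustrative family λ(x, y) = α e^(βx) on the unit square (the observed positions v are stand-ins; maximization over α and β could then be done with a generic optimizer such as scipy.optimize.minimize):

import numpy as np

def log_lik(params, v):
    # l = -Lambda(B) + sum_i log lam(v_i), up to a constant,
    # for lam(x, y) = alpha * exp(beta * x) on B = [0, 1]^2
    alpha, beta = params
    if alpha <= 0:
        return -np.inf
    Lambda_B = alpha * (np.expm1(beta) / beta if beta != 0 else 1.0)
    return -Lambda_B + np.sum(np.log(alpha) + beta * v[:, 0])

rng = np.random.default_rng(6)
v = rng.uniform(0, 1, size=(40, 2))   # stand-in observed positions
print(log_lik((50.0, 1.0), v))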


Parametrization

The λ function is often specified in terms of a small number of parameters.

In that case, fitting of the model by maximization of l and subsequent inference go ahead along the same lines as before.


Marked Poisson processes

Several examples can be thought of as a sequence of time points at each of which another variable is observed.

A marked Poisson process is a simple model for this.

Given a Poisson process N – on the line, plane or in higher dimensions – with intensity λ(·), associate with each point Xi of the process a random variable Yi, called the mark at Xi.

Then the new process {N, Y1, . . . } is called a marked Poisson process.


Dependence

For some modelling problems it might be appropriate to take the Yi to be independent and identically distributed.

In others there may be interest in possible dependence between marks at different points, and in possible changes in the distribution of marks with the position of the point.


Example: Insurance risk

The arrival of claims at an insurance company and the sizes of the claims might be modelled as a marked Poisson process.

An initial assumption, to be checked, might be that marks (claim amounts) are independent and identically distributed.


Assets over time

The difference between premium income and claim payouts is the key to the financial viability of the company.

If the ith claim is made at time Xi and is of size Yi, and premiums bring income at a steady rate ρ net of running costs, then the assets A(t) of the insurance company at time t are

A(t) = A(0) + ρt − ∑_{i=1}^{N(t)} Yi,

where N(t) is the number of claims up to time t.

The probability that A(t) remains positive for a long time, and the question of how large the reserves A(0) need to be to make this probability large, are of great interest.
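Such questions are easy to explore by Monte Carlo; a minimal sketch estimating the probability that A(t) stays positive up to a horizon T, with arbitrary choices of claim rate, claim-size distribution, ρ and A(0):

import numpy as np

rng = np.random.default_rng(7)
lam, mean_claim, rho, a0, T, reps = 1.0, 1.0, 1.2, 5.0, 100.0, 10_000

survived = 0
for _ in range(reps):
    n = rng.poisson(lam * T)
    times = np.sort(rng.uniform(0, T, size=n))    # claim times, given N(T) = n
    claims = rng.exponential(mean_claim, size=n)  # iid exponential marks
    assets = a0 + rho * times - np.cumsum(claims) # A(t) just after each claim
    survived += (assets > 0).all()   # A(t) can only go negative at a claim
print(survived / reps)               # estimated survival probability on [0, T]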


Compound Poisson processes

Sums of the form

∑_{i=1}^{N(t)} Yi,

for a Poisson process N with marks Yi arise in many contexts.

They are called compound Poisson processes.
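When the marks are iid and independent of N, the mean of such a sum is E[N(t)] E[Y1] (Wald's identity, a standard fact not proved in these slides); a quick empirical check:

import numpy as np

rng = np.random.default_rng(8)
lam, t, mean_mark, reps = 2.0, 5.0, 3.0, 100_000

totals = np.array([rng.exponential(mean_mark, size=rng.poisson(lam * t)).sum()
                   for _ in range(reps)])
print(totals.mean(), lam * t * mean_mark)   # both should be close to 30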


Example: Earthquakes

A sequence of earthquakes could be modelled by attaching marks representing earthquake magnitude to the times of occurrence.

Questions about dependence between magnitudes close in time are highly relevant to predictability and the possibility of warning systems.

The same question arises too about the times themselves, and motivates further development of the Poisson models we have considered in this course.


Example: Floods

Floods could be modelled by a marked Poisson process, the mark for a flood occurrence being the magnitude of the flood.

Marks could include more information too, becoming multi-dimensional.

If further data were available, for example about weather conditions at the times of floods, or environmental conditions such as dryness/wetness of the ground in the period before the flood, then it too could be modelled as part of a multi-dimensional mark.


Example: Rainfall

A point process model for rainfall attempts to mimic the occurrence and heaviness of rain at a place in terms of the passage of rain cells over the place.

The arrivals of rain cells are modelled by a Poisson process, and the time a rain cell takes to pass over the place and the intensity of the rain it brings are attached as random marks.

Marks in this case are two-dimensional.


Example: Burbage floods

An initial model for the times and severities of the Burbage Brook flood events is based on a marked point process.

Dates of floods are assumed to come from a Poisson process, and the excess flood flows over 4 cumecs are modelled as conditionally independent marks with exponential distributions whose means 1/µ(t) may depend on time.


Special case

Another way to view a marked Poisson process is as a point process in a higher-dimensional space.

If the Poisson process is one-dimensional with points at Xi, i = 1, . . . , and the marks Yi are also one-dimensional, then the points (Xi, Yi) form a point process in two dimensions.

When the marks Yi are independent and identically distributed, this two-dimensional process is itself a Poisson process.


2D Poisson process

If the original Poisson process has intensity λ(x) and the mark probability density is k(y), then the intensity of the two-dimensional Poisson process (Xi, Yi) is µ(x, y) = λ(x) k(y).

Proof in notes.

The result generalizes to higher dimensions.
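A small empirical illustration of the factorization, with a homogeneous rate λ on [0, 1] and standard normal marks (all choices arbitrary): the count of pairs (Xi, Yi) in a box B should then be Poisson with mean equal to the integral of µ over B.

import numpy as np

rng = np.random.default_rng(9)
lam, reps = 20.0, 20_000

counts = np.empty(reps)
for i in range(reps):
    n = rng.poisson(lam)              # points Xi of N on [0, 1]
    x = rng.uniform(0, 1, size=n)
    y = rng.normal(0, 1, size=n)      # iid marks with density k = N(0, 1)
    counts[i] = np.sum((x < 0.5) & (y > 0))   # pairs in B = [0, 0.5) x (0, inf)

# integral of mu over B is lam * 0.5 * P(Y > 0) = 20 * 0.5 * 0.5 = 5
print(counts.mean(), counts.var())    # mean and variance both close to 5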
