
Auto Regressive, Integrated, Moving Average



Page 1: Auto Regressive, Integrated, Moving Average

Auto Regressive, Integrated, Moving Average

Box-Jenkins models

A stationary time series can be modelled on the basis of the serial correlations in it.

A non-stationary time series can be transformed into a stationary time series, modelled, and back-transformed to the original scale (e.g. for purposes of forecasting).

ARIMA models: the AR and MA parts can be modelled on a stationary series; the I (integrated) part has to do with the transformation.

Page 2: Auto Regressive, Integrated, Moving Average

AR-models (for stationary time series)

Consider the model

$Y_t = \delta + \phi \cdot Y_{t-1} + e_t$

with $\{e_t\}$ i.i.d. with zero mean and constant variance $\sigma^2$ (white noise), and where $\delta$ (delta) and $\phi$ (phi) are (unknown) parameters.

Autoregressive process of order 1: AR(1)

Set $\delta = 0$ for the sake of simplicity $\Rightarrow E(Y_t) = 0$

$\gamma_k = \mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(Y_t, Y_{t+k}) = E(Y_t \cdot Y_{t-k}) = E(Y_t \cdot Y_{t+k})$
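As an illustration (not part of the slides), a short Python sketch that simulates this AR(1) model; the values of delta, phi and sigma are assumed for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
delta, phi, sigma = 0.0, 0.7, 1.0   # assumed values; |phi| < 1 gives stationarity
n = 500

e = rng.normal(0.0, sigma, size=n)  # white noise: i.i.d., zero mean, variance sigma^2
y = np.zeros(n)
for t in range(1, n):
    y[t] = delta + phi * y[t - 1] + e[t]  # Y_t = delta + phi*Y_{t-1} + e_t
```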

Page 3: Auto Regressive, Integrated, Moving Average

Now:

$\gamma_0 = E(Y_t \cdot Y_t) = E((\phi Y_{t-1} + e_t)\, Y_t) = \phi\, E(Y_{t-1} Y_t) + E(e_t Y_t) = \phi \gamma_1 + E(e_t (\phi Y_{t-1} + e_t)) = \phi \gamma_1 + \phi\, E(e_t Y_{t-1}) + E(e_t e_t) = \phi \gamma_1 + 0 + \sigma^2$ (for $e_t$ is independent of $Y_{t-1}$)

$\gamma_1 = E(Y_{t-1} \cdot Y_t) = E(Y_{t-1} (\phi Y_{t-1} + e_t)) = \phi\, E(Y_{t-1} Y_{t-1}) + E(Y_{t-1} e_t) = \phi \gamma_0 + 0$ (for $e_t$ is independent of $Y_{t-1}$)

$\gamma_2 = E(Y_{t-2} \cdot Y_t) = E(Y_{t-2} (\phi Y_{t-1} + e_t)) = \phi\, E(Y_{t-2} Y_{t-1}) + E(Y_{t-2} e_t) = \phi \gamma_1 + 0$ (for $e_t$ is independent of $Y_{t-2}$)

Page 4: Auto Regressive, Integrated, Moving Average

$\gamma_0 = \phi \gamma_1 + \sigma^2$
$\gamma_1 = \phi \gamma_0$        Yule-Walker equations
$\gamma_2 = \phi \gamma_1$
…
$\gamma_k = \phi \gamma_{k-1} = \dots = \phi^k \gamma_0$

Substituting the second equation into the first: $\gamma_0 = \phi^2 \gamma_0 + \sigma^2 \;\Rightarrow\; \gamma_0 = \dfrac{\sigma^2}{1 - \phi^2}$

Page 5: Auto Regressive, Integrated, Moving Average

Note that for $\gamma_0$ to be positive and finite (which we require of a variance) the following must hold: $-1 < \phi < 1$.

This is in effect the condition for an AR(1)-process to be weakly stationary.

Now, note that

$\rho_k = \mathrm{Corr}(Y_t, Y_{t+k}) = \dfrac{\mathrm{Cov}(Y_t, Y_{t+k})}{\sqrt{\mathrm{Var}(Y_t)\,\mathrm{Var}(Y_{t+k})}} = \dfrac{\gamma_k}{\gamma_0} = \dfrac{\phi^k \gamma_0}{\gamma_0} = \phi^k$

Page 6: Auto Regressive, Integrated, Moving Average

Recall that $\rho_k$ is called the autocorrelation function (ACF),

”auto” because it gives correlations within the same time series.

For pairs of different time series one can define the cross correlation function which gives correlations at different lags between series.

By studying the ACF it might be possible to identify the approximate magnitude of $\phi$.
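A sketch of this idea (with an assumed $\phi$, not from the slides): simulate an AR(1) series and compare its sample ACF, here via statsmodels' acf, with the theoretical values $\rho_k = \phi^k$:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(1)
phi, n = 0.7, 2000                       # assumed AR(1) coefficient
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

sample_acf = acf(y, nlags=5)             # estimated rho_0 ... rho_5
theory = phi ** np.arange(6)             # theoretical rho_k = phi**k
for k in range(6):
    print(f"lag {k}: sample {sample_acf[k]:+.3f}   theory {theory[k]:+.3f}")
```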

Page 7: Auto Regressive, Integrated, Moving Average

Examples:

[Pages 7-9 contain example figures only; they are not included in this transcript.]

Page 10: Auto Regressive, Integrated, Moving Average

The general linear process

$Y_t = e_t + \psi_1 e_{t-1} + \psi_2 e_{t-2} + \dots$, where $\{e_t\}$ is white noise and $\sum_{i=1}^{\infty} \psi_i^2 < \infty$

AR(1) as a general linear process:

$Y_t = \phi Y_{t-1} + e_t = \phi(\phi Y_{t-2} + e_{t-1}) + e_t = \dots = e_t + \phi e_{t-1} + \phi^2 e_{t-2} + \phi^3 e_{t-3} + \dots$

i.e. a general linear process with $\psi_i = \phi^i$.

Page 11: Auto Regressive, Integrated, Moving Average

If $|\phi| < 1$, the representation as a linear process is valid.

$|\phi| < 1$ is at the same time the condition for stationarity of an AR(1)-process.

Second-order autoregressive process, AR(2):

$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$

$E(Y_t)$ is assumed to be zero; otherwise, replace $Y_t$ by $Y_t - \mu$, $Y_{t-1}$ by $Y_{t-1} - \mu$ and $Y_{t-2}$ by $Y_{t-2} - \mu$.

Page 12: Auto Regressive, Integrated, Moving Average

Characteristic equation

Write the AR(2) model as

$Y_t - \phi_1 Y_{t-1} - \phi_2 Y_{t-2} = e_t$

Let $B$ denote the "backshift" operator: $B Y_t = Y_{t-1}$, $B^2 Y_t = Y_{t-2}$, …, $B^p Y_t = Y_{t-p}$. Then

$(1 - \phi_1 B - \phi_2 B^2)\, Y_t = e_t$

$1 - \phi_1 x - \phi_2 x^2 = 0$ is called the characteristic equation of AR(2).

Page 13: Auto Regressive, Integrated, Moving Average

Stationarity of an AR(2)-process

The characteristic equation has two roots (second-order equation).

(Under certain conditions there is one (multiple) root.)

The roots may be complex-valued

If the absolute values of the roots both exceed 1 the process is stationary.

Absolute value > 1 ⇔ the roots lie outside the unit circle in the complex plane.
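A sketch of this check in Python (the coefficient values are assumed for illustration): find the roots of the characteristic equation with numpy and test whether they all lie outside the unit circle:

```python
import numpy as np

phi1, phi2 = 0.5, -0.3                 # assumed AR(2) coefficients
# Roots of 1 - phi1*x - phi2*x**2 = 0; np.roots wants highest power first.
roots = np.roots([-phi2, -phi1, 1.0])
print("roots  :", roots)
print("moduli :", np.abs(roots))
print("stationary:", bool(np.all(np.abs(roots) > 1.0)))
```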

Page 14: Auto Regressive, Integrated, Moving Average

The roots of $1 - \phi_1 x - \phi_2 x^2 = 0$ are

$x = \dfrac{-\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}}{2\phi_2}$

Both roots lie outside the unit circle if and only if

$\phi_1 + \phi_2 < 1, \quad \phi_2 - \phi_1 < 1, \quad |\phi_2| < 1$

This requires $(\phi_1, \phi_2)$ to lie within the blue triangle (figure not included in the transcript). Some of these pairs define complex roots.

Page 15: Auto Regressive, Integrated, Moving Average

Finding the autocorrelation function

Yule-Walker equations:

$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$

Multiply both sides by $Y_{t-k}$:

$Y_{t-k} Y_t = \phi_1 Y_{t-k} Y_{t-1} + \phi_2 Y_{t-k} Y_{t-2} + Y_{t-k} e_t$

Take expectations; since $e_t$ is independent of $Y_{t-k}$ for $k > 0$, $E(Y_{t-k} e_t) = 0$, giving

$\gamma_k = \phi_1 \gamma_{k-1} + \phi_2 \gamma_{k-2}$

Divide by $\gamma_0$:

$\rho_k = \phi_1 \rho_{k-1} + \phi_2 \rho_{k-2}$

Start with $\rho_0 = 1$ (and use $\rho_{-1} = \rho_1$):

$\rho_1 = \phi_1 + \phi_2 \rho_1 \;\Rightarrow\; \rho_1 = \dfrac{\phi_1}{1 - \phi_2}, \qquad \rho_2 = \phi_1 \rho_1 + \phi_2, \;\dots$

Page 16: Auto Regressive, Integrated, Moving Average

For any values of $\phi_1$ and $\phi_2$ within the stationarity region, the autocorrelations will decrease exponentially with $k$.

For complex roots of the characteristic equation, the autocorrelations will show a damped sine wave behaviour as $k$ increases.

See figures on page 74 in the textbook.
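To make the recursion concrete, a small sketch (coefficients assumed, chosen so the characteristic roots are complex) that evaluates the theoretical ACF of an AR(2) process from the Yule-Walker recursion; the printed values trace a damped sine wave:

```python
phi1, phi2 = 0.5, -0.3                     # assumed; complex roots for this pair

rho = [1.0, phi1 / (1.0 - phi2)]           # rho_0 and rho_1
for k in range(2, 15):
    rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])  # Yule-Walker recursion

for k, r in enumerate(rho):
    print(f"rho_{k} = {r:+.4f}")           # damped sine wave pattern
```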

Page 17: Auto Regressive, Integrated, Moving Average

The general autoregressive process, AR(p)

$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \dots + \phi_p Y_{t-p} + e_t$

Characteristic equation: $1 - \phi_1 x - \phi_2 x^2 - \dots - \phi_p x^p = 0$

Stationary if all roots exceed 1 in absolute value.

Yule-Walker equations:

$\rho_1 = \phi_1 + \phi_2 \rho_1 + \phi_3 \rho_2 + \dots + \phi_p \rho_{p-1}$
$\rho_2 = \phi_1 \rho_1 + \phi_2 + \phi_3 \rho_1 + \dots + \phi_p \rho_{p-2}$
…
$\rho_k = \phi_1 \rho_{k-1} + \phi_2 \rho_{k-2} + \dots + \phi_p \rho_{k-p}$

Exponentially decaying; in a damped sine wave fashion if there are complex roots.
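In practice the Yule-Walker equations can also be solved on the sample autocorrelations to estimate the $\phi$'s. A sketch using statsmodels' yule_walker on a series simulated from an assumed AR(2):

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker

rng = np.random.default_rng(2)
phi1, phi2, n = 0.5, -0.3, 5000            # assumed true coefficients
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + rng.normal()

phi_hat, sigma_hat = yule_walker(y, order=2, method="mle")
print("phi estimates :", phi_hat)          # should be close to (0.5, -0.3)
print("sigma estimate:", sigma_hat)        # should be close to 1
```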

Page 18: Auto Regressive, Integrated, Moving Average

Moving average processes, MA

MA(q):

$Y_t = e_t - \theta_1 e_{t-1} - \dots - \theta_q e_{t-q} = (1 - \theta_1 B - \dots - \theta_q B^q)\, e_t$

Always stationary.

MA(1):

$Y_t = e_t - \theta e_{t-1}$

$\gamma_0 = \mathrm{Var}(Y_t) = \mathrm{Var}(e_t) + \theta^2\, \mathrm{Var}(e_{t-1}) = (1 + \theta^2)\,\sigma_e^2$

$\gamma_1 = \mathrm{Cov}(Y_t, Y_{t-1}) = \mathrm{Cov}(e_t - \theta e_{t-1},\; e_{t-1} - \theta e_{t-2}) = -\theta\,\sigma_e^2$

$\rho_1 = \dfrac{-\theta}{1 + \theta^2}; \qquad \rho_k = 0 \text{ for } k > 1$

Page 19: Auto Regressive, Integrated, Moving Average

General pattern:

$Y_t = e_t - \theta_1 e_{t-1} - \dots - \theta_q e_{t-q}$

$\rho_k = \dfrac{-\theta_k + \theta_1\theta_{k+1} + \theta_2\theta_{k+2} + \dots + \theta_{q-k}\theta_q}{1 + \theta_1^2 + \theta_2^2 + \dots + \theta_q^2}$ for $k = 1, 2, \dots, q$; $\quad \rho_k = 0$ for $k > q$

The ACF "cuts off" after lag q.
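A sketch of the cutoff property ($\theta$ assumed, not from the slides): construct an MA(1) series directly from white noise and compare the sample ACF with $\rho_1 = -\theta/(1+\theta^2)$:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(3)
theta, n = 0.6, 5000                        # assumed MA(1) coefficient
e = rng.normal(size=n)
y = e[1:] - theta * e[:-1]                  # Y_t = e_t - theta*e_{t-1}

print("theoretical rho_1:", -theta / (1 + theta**2))
print("sample ACF       :", acf(y, nlags=4).round(3))  # near zero beyond lag 1
```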

Page 20: Auto Regressive, Integrated, Moving Average

Invertibility (of an MA-process)

$Y_t = e_t - \theta_1 e_{t-1} - \dots - \theta_q e_{t-q}$

For MA(1): $e_t = Y_t + \theta e_{t-1} = Y_t + \theta(Y_{t-1} + \theta e_{t-2}) = \dots$, so

$Y_t = -\theta Y_{t-1} - \theta^2 Y_{t-2} - \dots + e_t = \pi_1 Y_{t-1} + \pi_2 Y_{t-2} + \dots + e_t$

i.e. an AR(∞)-process, provided the resulting coefficients $\pi_1, \pi_2, \dots$ fulfil the conditions of stationarity for $Y_t$.

They do if the characteristic equation of the MA(q)-process

$1 - \theta_1 x - \dots - \theta_q x^q = 0$

has all its roots outside the unit circle (modulus > 1).

Page 21: Auto Regressive, Integrated, Moving Average

Autoregressive-moving average processes, ARMA(p,q)

$Y_t = \phi_1 Y_{t-1} + \dots + \phi_p Y_{t-p} + e_t - \theta_1 e_{t-1} - \dots - \theta_q e_{t-q}$

or, with the backshift operator,

$(1 - \phi_1 B - \dots - \phi_p B^p)\, Y_t = (1 - \theta_1 B - \dots - \theta_q B^q)\, e_t$

Stationary if $1 - \phi_1 x - \dots - \phi_p x^p = 0$ has all roots outside the unit circle.

Invertible if $1 - \theta_1 x - \dots - \theta_q x^q = 0$ has all roots outside the unit circle.

If stationary, $\rho_k = \phi_1 \rho_{k-1} + \dots + \phi_p \rho_{k-p}$ for $k > q$; specific equations are needed for $\rho_k$ with $k \le q$.
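A sketch with statsmodels' ArmaProcess (assumed $\phi_1$ and $\theta_1$): it checks stationarity and invertibility via the roots of the two polynomials and generates a sample realization. Note that statsmodels expects the lag-polynomial coefficients including the leading 1:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

phi1, theta1 = 0.7, 0.4                       # assumed ARMA(1,1) parameters
# Polynomials (1 - phi1*B) and (1 - theta1*B), to match the slides' signs:
arma = ArmaProcess(ar=np.array([1.0, -phi1]), ma=np.array([1.0, -theta1]))

print("stationary:", arma.isstationary)      # AR roots outside the unit circle?
print("invertible:", arma.isinvertible)      # MA roots outside the unit circle?
y = arma.generate_sample(nsample=500)        # one simulated realization
```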

Page 22: Auto Regressive, Integrated, Moving Average

Non-stationary processes

A simple grouping of non-stationary processes:

• Non-stationary in mean
• Non-stationary in variance
• Non-stationary in both mean and variance

Classical approach: Try to “make” the process stationary before modelling

Modern approach: Try to model the process in its original form

Page 23: Auto Regressive, Integrated, Moving Average

Classical approach

Non-stationary in mean

Example: Random walk

$Y_t = Y_{t-1} + e_t$

$W_t = Y_t - Y_{t-1} = e_t$ becomes stationary ("first-order differences")

$W_t$ can also be denoted $\nabla Y_t$ ("difference operator") or, using the backshift operator, $(1 - B)\, Y_t$.
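A minimal sketch of this idea: a random walk built from white noise, where first-order differencing (np.diff) recovers the stationary noise exactly:

```python
import numpy as np

rng = np.random.default_rng(4)
e = rng.normal(size=1000)      # white noise
y = np.cumsum(e)               # random walk: Y_t = Y_{t-1} + e_t
w = np.diff(y)                 # W_t = Y_t - Y_{t-1} = (1 - B)Y_t
print(np.allclose(w, e[1:]))   # True: differencing recovers the noise
```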

Page 24: Auto Regressive, Integrated, Moving Average

More generally…

If $W_t = \nabla Y_t = Y_t - Y_{t-1}$ satisfies the general linear process (i.e. $W_t = e_t + \psi_1 e_{t-1} + \psi_2 e_{t-2} + \dots$), we can try to model $\nabla Y_t$ as an ARMA(p,q)-process.

If $\nabla Y_t$ is still non-stationary we can try

$\nabla^2 Y_t = \nabla(\nabla Y_t) = \nabla(Y_t - Y_{t-1}) = (Y_t - Y_{t-1}) - (Y_{t-1} - Y_{t-2}) = Y_t - 2Y_{t-1} + Y_{t-2}$

etc.

First-order non-stationary in mean → use first-order differencing.
Second-order non-stationary in mean → use second-order differencing.
…

Page 25: Auto Regressive, Integrated, Moving Average

ARIMA(p,d,q)

$W_t = \nabla^d Y_t$ satisfies

$(1 - \phi_1 B - \dots - \phi_p B^p)\, W_t = (1 - \theta_1 B - \dots - \theta_q B^q)\, e_t$

Common: $d \le 2$, $p \le 3$, $q \le 3$.
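A sketch of fitting an ARIMA model with statsmodels (the order (1,1,0) matches the simulated series below and is an assumed illustration, not a general recommendation):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
w = np.zeros(300)
for t in range(1, 300):
    w[t] = 0.6 * w[t - 1] + rng.normal()  # stationary AR(1) differences
y = np.cumsum(w)                          # integrate once: ARIMA(1,1,0)

res = ARIMA(y, order=(1, 1, 0)).fit()
print(res.params)                         # AR coefficient near 0.6
print(res.forecast(steps=5))              # forecasts on the original scale
```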

Page 26: Auto Regressive, Integrated, Moving Average

Non-stationarity in variance

Classical approach: Use power transformations (Box-Cox)

$g(Y_t) = \begin{cases} \dfrac{Y_t^{\lambda} - 1}{\lambda}, & \lambda \neq 0 \\ \log Y_t, & \lambda = 0 \end{cases}$

Common order of application:
1. Square root
2. Fourth root
3. Log
4. Reciprocal (1/Y)

For non-stationarity both in mean and variance:
1. Power transformation
2. Differencing
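A sketch of the combined recipe with scipy's boxcox on synthetic data (note that boxcox picks $\lambda$ by maximum likelihood rather than from the list above):

```python
import numpy as np
from scipy.stats import boxcox

rng = np.random.default_rng(6)
t = np.arange(300)
# Positive series whose level and variance both grow over time:
y = np.exp(0.01 * t + 0.1 * np.cumsum(rng.normal(size=300)))

y_bc, lam = boxcox(y)     # 1) power transformation (lambda fitted by ML)
w = np.diff(y_bc)         # 2) first-order differencing
print("estimated lambda:", lam)
```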