# univariate time series.ppt


• 8/20/2019 univariate time series.ppt

1/83

Time Series Analysis


Definition

• A time series is a sequence of observations taken sequentially in time

• An intrinsic feature of a time series is that adjacent observations are typically dependent

• The nature of this dependence among observations of a time series is of considerable practical interest

• Time Series Analysis is concerned with techniques for the analysis of this dependence


Time Series Forecasting

• Examine the past behavior of a time series in order to infer something about its future behavior

• A sophisticated and widely used technique to forecast future demand

Examples
• Univariate time series: AR, MA, ARMA, ARIMA, ARIMA-GARCH

• Multivariate: VAR, Cointegration


Univariate Time-series Models

• The term refers to a time series that consists of single (scalar) observations recorded sequentially over equal time increments

• Univariate time-series analysis involves using historical data of the variable concerned to construct a model that describes the behavior of this variable (time series)

• This model can, subsequently, be used for forecasting purposes

• An appropriate technique for forecasting high-frequency time series where data on independent variables are either non-existent or difficult to identify


Famous forecasting quotes

• "I have seen the future and it is very much like the present, only longer." - Kehlog Albran, The Profit

– This nugget of pseudo-philosophy is actually a concise description of statistical forecasting. We search for statistical properties of a time series that are constant in time - levels, trends, seasonal patterns, correlations and autocorrelations, etc. We then predict that those properties will describe the future as well as the present.

• "Prediction is very difficult, especially if it's about the future." - Niels Bohr, Nobel laureate in Physics

– This quote serves as a warning of the importance of validating a forecasting model out-of-sample. It's often easy to find a model that fits the past data well - perhaps too well! - but quite another matter to find a model that correctly identifies those patterns in the past data that will continue to hold in the future


Time series data

• Secular Trend: long-run pattern

• Cyclical fluctuation: expansion and contraction phases around the long-run trend

• Seasonality: annual sales patterns tied to the calendar

• Irregular or random component


Ex-Post vs. Ex-Ante Forecasts

• How can we compare the forecast performance of our model?

• There are two ways:

Ex Ante: Forecast into the future, wait for the future to arrive, and then compare the actual to the predicted

Ex Post: Fit your model over a shortened sample

• Then forecast over a range of observed data
• Then compare actual and predicted


Ex-Post and Ex-Ante

Estimation & Forecast Periods

• Suppose you have data covering the period 1980:1-2001:4

[Timeline: 80.1 | 99.4 | 2001.4 | The Future - the span 80.1 to 99.4 is the Ex-Post Estimation Period, 99.4 to 2001.4 is the Ex-Post Forecast Period, and beyond 2001.4 lies the Ex-Ante Forecast Period]


Examining the In-Sample Fit

• One thing that can be done, once you have fit your model, is to examine the in-sample fit

That is, over the period of estimation, you can compare the actual to the fitted data

It can help to identify areas where your model is consistently under- or over-predicting, so you can take appropriate measures

Simply estimate the equation and look at the residuals


Model Performance

• RMSE = √((1/n)∑(fi - xi)²) - the differences between forecast and actual, squared and summed; the smaller the better

• MAE & MAPE: the smaller the better

• The Theil inequality coefficient always lies between zero and one, where zero indicates a perfect fit.

• Bias portion - should be zero: How far is the mean of the forecast from the mean of the actual series?


Model Performance

• Variance portion - should be zero: How far is the variance of the forecast from the variance of the actual series?
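The accuracy measures above can be sketched in a few lines of NumPy. This is an illustration, not code from the slides; the function name `forecast_metrics` is ours, and the Theil coefficient is computed with the common definition RMSE / (√mean(f²) + √mean(a²)).

```python
import numpy as np

def forecast_metrics(actual, forecast):
    """RMSE, MAE, MAPE (smaller is better for all three) and the Theil
    inequality coefficient U (0 means a perfect fit, bounded above by 1)."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    err = f - a
    rmse = np.sqrt(np.mean(err ** 2))                 # root mean squared error
    mae = np.mean(np.abs(err))                        # mean absolute error
    mape = 100 * np.mean(np.abs(err / a))             # mean absolute % error
    theil = rmse / (np.sqrt(np.mean(f ** 2)) + np.sqrt(np.mean(a ** 2)))
    return rmse, mae, mape, theil

rmse, mae, mape, theil = forecast_metrics([100, 110, 120], [102, 108, 123])
print(round(rmse, 3), round(mae, 3), round(mape, 2), round(theil, 4))
```

Note that MAPE is undefined when an actual value is zero, which is one reason RMSE and MAE are often preferred.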


Autocorrelation function (ACF)

The autocorrelation function (ACF) of a random process describes the correlation between the process at different points in time.

Let Yt be the value of the process at time t, where t may be an integer for a discrete-time process or a real number for a continuous-time process.

If Yt has mean μ and variance σ², then the definition of the ACF is

ρ(k) = E[(Yt - μ)(Yt+k - μ)] / σ²
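The sample analogue of this definition replaces expectations with averages. A minimal NumPy sketch (not from the slides; `sample_acf` is our own name):

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelation r_k = c_k / c_0, where c_k is the lag-k
    sample autocovariance about the sample mean."""
    x = np.asarray(x, float)
    xc = x - x.mean()
    c0 = np.dot(xc, xc) / len(x)
    return np.array([1.0] + [np.dot(xc[k:], xc[:-k]) / (len(x) * c0)
                             for k in range(1, nlags + 1)])

rng = np.random.default_rng(0)
wn = rng.standard_normal(500)   # white noise: ACF should be near 0 for k >= 1
print(np.round(sample_acf(wn, 3), 2))
```

For serially uncorrelated data the sample autocorrelations at lags 1, 2, 3 should all fall close to zero, inside the usual ±1.96/√n bands.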


ACF & PACF

• The partial autocorrelation at lag k is the regression coefficient on Yt-k when Yt is regressed on a constant, Yt-1, ..., Yt-k

• This is a partial correlation since it measures the correlation of values that are k periods apart after removing the correlation from the intervening lags
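That regression definition can be implemented directly with ordinary least squares. A sketch (ours, not from the slides; for an AR(1) with coefficient 0.7 the lag-1 partial autocorrelation should be near 0.7 and the lag-2 value near 0):

```python
import numpy as np

def pacf_at_lag(x, k):
    """Partial autocorrelation at lag k: the OLS coefficient on x[t-k] when
    x[t] is regressed on a constant and x[t-1], ..., x[t-k]."""
    x = np.asarray(x, float)
    y = x[k:]
    X = np.column_stack([np.ones(len(y))] +
                        [x[k - j:len(x) - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[-1]            # the last regressor is x[t-k]

rng = np.random.default_rng(1)
x = np.zeros(2000)
for t in range(1, 2000):       # simulate AR(1): x_t = 0.7 x_{t-1} + u_t
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
print(round(pacf_at_lag(x, 1), 2), round(pacf_at_lag(x, 2), 2))
```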


Stationary Time Series

• A stochastic process is said to be stationary if its mean and variance are constant over time and the value of the covariance between two time periods depends only on the distance or gap or lag between the two time periods and not on the actual time at which the covariance is computed

• In the time series literature, such a stochastic process is known as weakly stationary or covariance stationary

• In most practical situations, this type of stationarity often suffices

• A time series is strictly stationary if all the moments of its probability distribution, and not just the first two (mean & variance), are invariant over time


Stationary Time Series

• However, if the stationary process is normal, the weakly stationary process is also strictly stationary, as a normal stochastic process is fully specified by its first two moments, the mean & variance

• Let Yt be a stochastic time series with properties:

Mean: E(Yt) = μ
Variance: var(Yt) = E(Yt - μ)² = σ²
Covariance: γk = E[(Yt - μ)(Yt+k - μ)] - the autocovariance between Yt and Yt+k, i.e. between two Y values k periods apart

• If k = 0, we obtain γ0, which is simply the variance of Y

• If k = 1, γ1 is the covariance between two adjacent values of Y


Stationary Time Series

• Now, if we shift the origin from Yt to Yt+m, the mean, variance and autocovariance of Yt+m must be the same as those of Yt

• Thus, if a time series is stationary, its mean, variance and autocovariance remain the same no matter at what point we measure them, i.e. they are time invariant

• Such a time series tends to return to its mean - this is called mean reversion


Non-stationary Series

• A non-stationary time series will have a time-varying mean or variance or both

• For a non-stationary time series, we can study its behavior only for the time period under consideration

• Each set of time series data will therefore be for a particular episode

• So it is not possible to generalize it to other time periods

• Therefore, for the purpose of forecasting, non-stationary time series may be of little practical value


Forecasting

• Most statistical forecasting methods are based on the assumption that the time series can be rendered approximately stationary (i.e., "stationarized") through the use of mathematical transformations

• A stationarized series is relatively easy to predict: you simply predict that its statistical properties will be the same in the future as they have been in the past!


Forecasting

• The predictions for the stationarized series can then be "untransformed," by reversing whatever mathematical transformations were previously used, to obtain predictions for the original series

• The details are normally taken care of by software

• Thus, finding the sequence of transformations needed to stationarize a time series often provides important clues in the search for an appropriate forecasting model.


Random or White Noise Process

• We call a stochastic process purely random or a white noise process if it has zero mean, constant variance and is serially uncorrelated

• The error term entering the CLRM is assumed to be a white noise process: ut ~ iid N(0, σ²)

• The random walk model, non-stationary in nature, is observed in asset prices, stock prices or exchange rates (discussed later)
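The three defining properties - zero mean, constant variance, no serial correlation - are easy to verify on simulated data. A NumPy sketch (an illustration of ours, not from the slides):

```python
import numpy as np

# White noise: zero mean, constant variance, no serial correlation.
rng = np.random.default_rng(42)
u = rng.normal(loc=0.0, scale=2.0, size=10_000)   # sigma = 2, so variance = 4

lag1_corr = np.corrcoef(u[:-1], u[1:])[0, 1]      # sample lag-1 correlation
print(round(u.mean(), 3), round(u.var(), 3), round(lag1_corr, 3))
```

The sample mean should be near 0, the sample variance near σ² = 4, and the lag-1 correlation near 0.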


Trend ACF & PACF

The ACF shows a definite pattern: it decreases with the lags. This means there is a trend in the data. Since the pattern does not repeat, we can conclude that the data does not show any seasonality.


Seasonality


Trend & Seasonality ACF & PACF

The ACF plots clearly show a repetition in the pattern, indicating that the data are seasonal; there is periodicity at every seasonal lag


Estimation and Removal of Trend & Seasonality

• Classical Decomposition of a Time Series: Xt = mt + st + Zt

mt: trend component (deterministic, changes slowly with t); st: seasonal component (deterministic, period d); Zt: noise component (random, stationary).

• Aim: Extract the components mt and st, and hope that Zt will be stationary. Then focus on modeling Zt.

• We may need to do preliminary transformations if the noise or the amplitude of the seasonal fluctuations appear to change over time.
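A minimal sketch of this decomposition on synthetic monthly data (ours, not from the slides; the trend is estimated here by a least-squares line and the seasonal component by per-month averaging of the detrended series, one common variant of classical decomposition):

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(120)                                # 10 years of monthly data
season = np.tile([3, 1, -2, 0, 2, 4, 5, 3, 0, -3, -6, -7], 10)
x = 0.5 * t + season + rng.normal(0, 0.5, 120)    # X_t = m_t + s_t + Z_t

trend = np.polyval(np.polyfit(t, x, 1), t)        # m_t: fitted straight line
detr = x - trend
seas = np.array([detr[m::12].mean() for m in range(12)])  # s_t: month means
seas -= seas.mean()                               # normalize to sum to zero
resid = detr - np.tile(seas, 10)                  # Z_t: what is left over

print(round(resid.std(), 3))                      # should be near the noise sd
```

The residual standard deviation comes out close to the true noise level (0.5), showing that the trend and seasonal components were extracted cleanly.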


Time series data, Xt = mt + st + Zt

ACF, PACF, ADF tests

• Non-stationary series: de-trend and/or de-seasonalize to obtain a stationary series Zt; fit a model for Zt (AR, MA, ARMA); estimate the AR, MA, ARMA parameters; check that the residual series is white noise; forecast Xt (in-sample/out-of-sample)

• Stationary series, Xt = Zt: fit a model for Xt = Zt (AR, MA, ARMA); estimate the AR, MA, ARMA parameters; check that the residual series is white noise; forecast Xt = Zt (in-sample/out-of-sample)


Backward Shift Operator

• This operator B plays an important role in the mathematics of TSA

• BXt = Xt-1 and, in general, BsXt = Xt-s

• A polynomial in the lag operator takes the form θ(B) = 1 + θ1B + θ2B² + ... + θqB^q, where θ1, ..., θq are parameters

• The roots of such a polynomial are defined as the q values of B which satisfy the polynomial equation θ(B) = 0
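These roots can be found numerically. A sketch (ours, not from the slides) using `numpy.roots`; for an AR model written as (1 - φ1B - ... - φpB^p)Xt = ut, pass the negated φ coefficients, and stationarity requires every root to lie outside the unit circle:

```python
import numpy as np

def lag_poly_roots(c):
    """Roots of the lag polynomial 1 + c1*B + ... + cq*B^q.
    np.roots expects highest-degree coefficients first, so reverse c."""
    return np.roots(list(c[::-1]) + [1.0])

# AR(1) with phi = 0.5: 1 - 0.5B = 0 gives B = 2 (outside unit circle: stationary)
print(np.abs(lag_poly_roots([-0.5])))
# AR(1) with phi = 1: 1 - B = 0 gives B = 1 (a unit root: non-stationary)
print(np.abs(lag_poly_roots([-1.0])))
# AR(2) with phi1 = 0.75, phi2 = -0.125: roots 2 and 4, both outside: stationary
print(np.sort(np.abs(lag_poly_roots([-0.75, 0.125]))))
```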



Elimination of Trend

• Nonseasonal model with trend: Xt = mt + Zt, E(Zt) = 0

• Methods:

a. Moving Average Smoothing
b. Exponential Smoothing
c. Spectral Smoothing
d. Polynomial Fitting
e. Differencing k times to eliminate trend


Differencing k times to eliminate trend

• Define the backward shift operator B as follows: BXt = Xt-1

• We can remove trend by differencing, e.g. ∇Xt = Xt - Xt-1, and ∇²Xt = Xt - 2Xt-1 + Xt-2

• It can be shown that a polynomial trend of degree k will be reduced to a constant by differencing k times, that is, by applying the operator (1 - B)^k to Xt

• Given a sequence {xt}, we could therefore proceed by differencing repeatedly until the resulting series can plausibly be modeled as a realization of a stationary process.
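The degree-k claim is easy to demonstrate numerically (a sketch of ours, not from the slides): differencing a quadratic trend twice leaves a constant.

```python
import numpy as np

# Differencing k times reduces a degree-k polynomial trend to a constant:
# applying (1 - B) twice to t^2 leaves the constant second difference, 2.
t = np.arange(20, dtype=float)
quadratic = t ** 2

d1 = np.diff(quadratic)        # first difference: 2t + 1, still trending
d2 = np.diff(quadratic, n=2)   # second difference: constant
print(d2[:5])                  # -> [2. 2. 2. 2. 2.]
```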


Elimination of Seasonality

• Seasonal model without trend: Xt = st + Zt, E(Zt) = 0

a. Classical Decomposition

Regress the level variable (X) on seasonal dummy variables, with or without an intercept. Calculate the residuals.

Add these residuals to the mean value of X.

The resulting series is the deseasonalized time series.

b. Differencing at lag d to eliminate period d

Since st - st-d = 0, differencing at lag d will eliminate a seasonal component of period d.
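Seasonal differencing can be demonstrated in the same way (a sketch of ours, not from the slides): subtracting the value d periods back removes an exact period-d pattern, leaving only the drift contributed by any trend.

```python
import numpy as np

# Differencing at lag d removes a seasonal component of period d:
# s_t - s_{t-d} = 0 when s_t repeats exactly with period d.
s = np.tile([5.0, -1.0, -4.0, 0.0], 8)   # period-4 seasonal pattern
t = np.arange(len(s))
x = 0.3 * t + s                          # trend + seasonality

lag4 = x[4:] - x[:-4]                    # seasonal difference at lag 4
print(lag4[:6])                          # seasonal part gone; 0.3 * 4 = 1.2 left
```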


Elimination of Trend & Seasonality

• Elimination of both trend and seasonal components in a series can be achieved by using trend as well as seasonal differencing


Time series data, Xt = mt + st + Zt

ACF, PACF, ADF tests

• Non-stationary series: de-trend and/or de-seasonalize to obtain a stationary series; fit a model for the stationary series (AR, MA, ARMA); estimate the AR, MA, ARMA parameters; check that the residual series is white noise; forecast Xt after re-transformation (in-sample/out-of-sample)

• Stationary series, Xt = Zt: fit a model for Xt (AR, MA, ARMA); estimate the AR, MA, ARMA parameters; check that the residual series is white noise; forecast Xt (in-sample/out-of-sample)


Non-Seasonal & Seasonal AR, MA & ARMA Processes


Autoregressi\$e Process

• The AR(1) model specification is

Xt = m + φ(Xt-1 - m) + ut



Autoregressive Process

• AR(2) Process:

Xt = φ1Xt-1 + φ2Xt-2 + ut

• AR(p) Process:

Xt = φ1Xt-1 + φ2Xt-2 + ... + φpXt-p + ut

• Defining the AR polynomial

φ(B) = 1 - φ1B - ... - φpB^p

• We can write the AR(p) model concisely as:

φ(B)Xt = ut
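A quick simulation makes the AR(1) special case concrete (a sketch of ours, not from the slides): generate data from x_t = φ x_{t-1} + u_t and recover φ by least squares.

```python
import numpy as np

# Simulate a stationary AR(1), x_t = phi * x_{t-1} + u_t, then estimate phi.
rng = np.random.default_rng(3)
phi = 0.6
x = np.zeros(3000)
for t in range(1, len(x)):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# OLS slope of x_t on x_{t-1} (no intercept, since the process has zero mean)
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
print(round(phi_hat, 3))       # close to the true phi of 0.6
```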


Autoregressive Process

• It is sometimes difficult to distinguish between AR processes of different orders solely on the basis of correlograms

• A sharper discrimination is possible on the basis of the partial autocorrelation coefficients

• For an AR(p), the PACF vanishes for lags greater than p, while the ACF of an AR(p) decays exponentially


Moving Average Process

• In a pure MA process, a variable is expressed solely in terms of the current and previous white noise disturbances

MA(1) Process: Xt = ut + θ1ut-1

• MA(q) Process: Xt = ut + θ1ut-1 + ... + θqut-q, {ut} ~ WN(0, σ²)

• Defining the MA polynomial

θ(B) = 1 + θ1B + ... + θqB^q

We can write the MA(q) model concisely as:

Xt = θ(B)ut.
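The characteristic ACF cutoff of an MA(1) is easy to see by simulation (a sketch of ours, not from the slides): the theoretical lag-1 autocorrelation is θ/(1 + θ²) and all higher lags are zero.

```python
import numpy as np

# MA(1): x_t = u_t + theta * u_{t-1}; ACF is theta/(1+theta^2) at lag 1,
# and zero (in theory) for every lag beyond 1.
rng = np.random.default_rng(5)
theta = 0.8
u = rng.standard_normal(10_001)
x = u[1:] + theta * u[:-1]

def r(x, k):
    """Sample autocorrelation at lag k."""
    xc = x - x.mean()
    return np.dot(xc[k:], xc[:-k]) / np.dot(xc, xc)

print(round(theta / (1 + theta ** 2), 3))     # theoretical lag-1 ACF: 0.488
print(round(r(x, 1), 2), round(r(x, 2), 2))   # sample: near 0.49, then near 0
```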


Moving Average Process

• For parameter identifiability reasons, and in analogy with the concept of causality for AR processes, we require that all roots of θ(B) be greater than 1 in magnitude

• The resulting process is said to be invertible

• The PACF of an MA(q) decays exponentially

• The ACF vanishes for lags beyond q


The single negative spike at lag 1 in the ACF is an MA(1) signature


ARMA Process

• We can put an AR(p) and an MA(q) process together to form the more general ARMA(p,q) process:

Xt - φ1Xt-1 - ... - φpXt-p = ut + θ1ut-1 + ... + θqut-q,

where {ut} ~ WN(0, σ²).

• By definition, we require that {Xt} be stationary.

• Using the compact AR & MA polynomial notation, we can write the ARMA(p,q) as:

φ(B)Xt = θ(B)ut,  {ut} ~ WN(0, σ²)


ARMA Process

• For stationarity and invertibility, we require, as before, that all roots of φ(B) and θ(B) be greater than 1 in magnitude

• AR & MA are special cases: an AR(p) = ARMA(p,0), and an MA(q) = ARMA(0,q)

• ACF & PACF both decay exponentially


Sample ACF/PACF

• For an AR(p), the ACF decays geometrically, and the PACF is zero beyond lag p. The sample ACF/PACF should exhibit similar behavior, and significance at the 95% level can be assessed via the usual ±1.96/√n bounds

• For an MA(q), the PACF decays geometrically, and the ACF is zero beyond lag q. The sample ACF/PACF should exhibit similar behavior, and significance at the 95% level can also be assessed via the ±1.96/√n bounds

• For an ARMA(p,q), the ACF & PACF both decay exponentially.

• Examining the sample ACF/PACF therefore can serve only as a guide in determining possible maximum values for p & q, to be properly investigated via AICC.




Order Selection/Model Identification

• In real-life data, there is usually no underlying true model. The question then becomes "how to select an appropriate statistical model for a given data set?"

• A breakthrough was made in the early 1970s by the Japanese statistician Akaike.

• Using ideas from information theory, he discovered a way to measure how far a candidate model is from the "true" model.

• We should therefore minimize the distance from the truth, and select the ARMA(p,q) model that minimizes Akaike's Information Criterion (AIC):

AIC = -2 log L(φ̂, θ̂, σ̂²) + 2(p + q + 1)


Order Selection/Model Identification

• Here L denotes the likelihood evaluated at the MLEs of φ, θ, and σ², respectively. (Nowadays we actually use a bias-corrected version of AIC called AICC.)

• The first term in the AIC expression measures how well the model fits the data; the lower it is, the better the fit.

• The second term penalizes models with more parameters.

• Final model selection can then be based upon goodness-of-fit tests and model parsimony (simplicity).

• There are several other information criteria currently in use - BIC, FPE, HQC, MDL, etc. - but AIC and SBC are the most widely used
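A sketch of order selection by AIC on the AR side only (ours, not from the slides): fit AR(p) by conditional least squares for several p, score each with the Gaussian-approximation AIC ≈ n·log(σ̂²) + 2·(number of parameters), and pick the minimum. Since the data are simulated from an AR(2), the selected order should be at least 2.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 2000
x = np.zeros(n)
for t in range(2, n):                    # true model: stationary AR(2)
    x[t] = 1.2 * x[t - 1] - 0.5 * x[t - 2] + rng.standard_normal()

def ar_aic(x, p):
    """AIC of an AR(p) fitted by OLS: n*log(sigma_hat^2) + 2*(p+1)."""
    y = x[p:]
    X = np.column_stack([np.ones(len(y))] +
                        [x[p - j:len(x) - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ beta) ** 2)
    return len(y) * np.log(sigma2) + 2 * (p + 1)

aics = {p: ar_aic(x, p) for p in range(5)}
best_p = min(aics, key=aics.get)
print(best_p)
```

AIC can slightly overfit (it may pick p = 3 or 4 on some samples), which is why the slides recommend the bias-corrected AICC for final selection.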


Non-stationary Time Series
- Unit Root
- ARIMA


Random Walk Model

• Although our interest is in stationary time series, we often encounter non-stationary time series

• Classic example: RWM (stock price, exchange rate)

• Can be of two types
- Random walk without drift: Xt = Xt-1 + ut
- Random walk with drift: Xt = δ + Xt-1 + ut


Random walk without drift

• Let Xt = Xt-1 + ut

• X1 = X0 + u1; X2 = X1 + u2 = X0 + u1 + u2; and in general Xt = X0 + ∑ut

• E(Xt) = X0 and var(Xt) = tσ²

• The mean value of X is its initial value, which is constant, but as t increases its variance increases indefinitely, thus violating the stationarity condition

• In the RWM, random shocks persist: the impact of a particular shock does not die away

• The RWM is said to have infinite memory
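The var(Xt) = tσ² property can be checked by simulating many independent walks (a sketch of ours, not from the slides):

```python
import numpy as np

# Random walk without drift: X_t = X_{t-1} + u_t, so var(X_t) = t * sigma^2
# grows without bound - the hallmark of non-stationarity.
rng = np.random.default_rng(2)
paths = rng.standard_normal((5000, 400)).cumsum(axis=1)  # 5000 walks, 400 steps

v100 = paths[:, 99].var()     # cross-sectional variance at t = 100: about 100
v400 = paths[:, 399].var()    # cross-sectional variance at t = 400: about 400
print(round(v100, 1), round(v400, 1))
```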


Random walk with drift

• Xt = δ + Xt-1 + ut

• ΔXt = δ + ut

• Xt drifts upward or downward depending upon δ being positive or negative

• The RWM is an example of what is known as a unit root process


Unit Root Process

• Say Xt = ρXt-1 + ut, with -1 ≤ ρ ≤ 1

• This is an AR(1) process

• If ρ = 1 then we get the RWM without drift - a non-stationary process

• We call this the unit root problem

• The term refers to the root of the polynomial in the lag operator

• Thus the terms non-stationarity, random walk and unit root can be treated as synonymous


Difference Stationary (DS) Process

• If the trend of a time series is predictable and not variable, we call it a deterministic trend

• If the trend is not predictable, we call it a stochastic trend

• Say Xt = b1 + b2t + b3Xt-1 + ut, ut ~ WN

• If b1 = b2 = 0 and b3 = 1: the RWM without drift - non-stationary, but first-difference stationary


Trend Stationary Process

• If b1 ≠ 0, b2 ≠ 0, b3 = 0: Xt = b1 + b2t + ut

• This is called a TS process

• Though the mean is not constant, the variance is

• Once the values of b1 and b2 are known, the mean can be forecast perfectly

• Thus, if we subtract the mean of Xt from Xt, the resultant series will be stationary


Dickey-Fuller unit root tests

• Simple AR(1) model

xt = ρxt-1 + ut … (1)

• The null hypothesis of a unit root:

H0: ρ = 1, with |ρ| ≤ 1

• Subtracting xt-1 from both sides of eq. (1), we get

xt - xt-1 = ρxt-1 - xt-1 + ut
Δxt = (ρ - 1)xt-1 + ut
Δxt = δxt-1 + ut

• Here the null hypothesis of a unit root is

H0: δ = 0 against H1: δ < 0
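The simplest version of this test regression can be sketched directly (ours, not from the slides; note that the resulting statistic must be compared against Dickey-Fuller critical values - roughly -1.95 at 5% for the no-constant specification - not against normal-distribution values, and a full ADF test would add lagged differences and possibly a constant and trend):

```python
import numpy as np

def df_stat(x):
    """t-statistic on delta in  dx_t = delta * x_{t-1} + u_t  (no constant).
    Under the unit-root null delta = 0; large negative values reject it."""
    x = np.asarray(x, float)
    dx, lag = np.diff(x), x[:-1]
    delta = np.dot(lag, dx) / np.dot(lag, lag)        # OLS slope
    resid = dx - delta * lag
    se = np.sqrt(resid.var(ddof=1) / np.dot(lag, lag))
    return delta / se

rng = np.random.default_rng(9)
walk = rng.standard_normal(500).cumsum()              # true unit root
ar = np.zeros(500)
for t in range(1, 500):                               # stationary AR(1)
    ar[t] = 0.5 * ar[t - 1] + rng.standard_normal()
print(round(df_stat(walk), 1), round(df_stat(ar), 1))
```

The random walk produces a statistic near zero (do not reject the unit root), while the stationary AR(1) produces a large negative statistic (reject).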


Detection of Unit Root - ADF Tests

• The ADF test is conducted with a model of the form

ΔXt = α + βt + δXt-1 + ∑ γiΔXt-i + ut

• where Xt is the underlying variable at time t,

• and ut is the error term

• The lagged-difference terms are introduced in order to ensure that the errors are uncorrelated with the lag terms.

• For the above-specified model, the hypothesis of interest is:

H0: δ = 0


• To begin, double click on the series name to open the series window, and choose View/Unit Root Test

• Specify whether you wish to test for a unit root in the level, first difference, or second difference of the series

– You can choose to include a constant, a constant and linear trend, or neither

• EViews can automatically select the lag length, using AIC, SBC and other criteria


The null hypothesis of a unit root cannot be rejected

Other Unit Root Tests

• Phillips-Perron (1988) tests

• GLS-detrended Dickey-Fuller tests (Elliott, Rothenberg, and Stock, 1996)

• Kwiatkowski, Phillips, Schmidt, and Shin tests (KPSS, 1992)

• Elliott, Rothenberg, and Stock Point Optimal tests (ERS, 1996)

• Ng and Perron (NP, 2001) unit root tests



Integrated Stochastic Process

• The RWM is a specific case of a more general class of stochastic processes known as integrated processes

• The order of integration is the minimum number of times the series needs to be differenced to yield a stationary series

• The RWM is non-stationary, but its 1st difference is stationary: it is an I(1) series

• A stationary series is called an I(0) series

• The 1st difference of an I(0) series still yields an I(0) series


ARIMA Models

An integrated process Xt is designated as an ARIMA (p, d, q) if, taking differences of order d, a stationary process Wt of the type ARMA (p, q) is obtained.

The ARIMA (p, d, q) model is expressed by the function

Wt = φ1Wt-1 + φ2Wt-2 + ... + φpWt-p + ut - θ1ut-1 - θ2ut-2 - ... - θqut-q

or φ(B)(1 - B)^d Xt = θ(B) ut


Summary of ARMA/ARIMA modeling procedures

1. Perform preliminary transformations (if necessary) to stabilize variance over time

2. Detrend and deseasonalize the data (if necessary) to make the stationarity assumption look reasonable

Trend and seasonality are also characterized by ACFs that are slowly decaying and nearly periodic, respectively

The primary methods for achieving this are classical decomposition and differencing


Summary of ARMA/ARIMA modeling procedures

3. If the data look nonstationary without a well-defined trend or seasonality, an alternative to the above option is to difference successively & use ADF tests

4. Examine the sample ACF & PACF to get an idea of potential p & q values. For an AR(p)/MA(q), the sample PACF/ACF cuts off after lag p/q

5. Estimate the coefficients for the promising models


Summary of ARMA/ARIMA modeling procedures

6. From the fitted ML models above, choose the one with the smallest AICC

7. Inspection of the standard errors of the coefficients at the estimation stage may reveal that some of them are not significant. If so, subset models can be fitted by constraining these to be zero at a second iteration of ML estimation

8. Check the candidate models for goodness-of-fit by examining their residuals. This involves inspecting their ACF/PACF for departures from white noise, and carrying out the formal WN hypothesis tests


Seasonal Part of ARIMA model

• The seasonal part of an ARIMA model has the same structure as the non-seasonal part: it may have an AR factor, an MA factor, and/or an order of differencing

• In the seasonal part of the model, all of these factors operate across multiples of lag s (the number of periods in a season)

• A seasonal ARIMA model is classified as an ARIMA(P,D,Q) model, where P = number of seasonal autoregressive (SAR) terms, D = number of seasonal differences, Q = number of seasonal moving average (SMA) terms

• In identifying a seasonal model, the first step is to determine whether or not a seasonal difference is needed, in addition to or perhaps instead of a non-seasonal difference


Seasonal Part of ARIMA model

• The seasonal models ARIMA (P, D, Q)s, which are not stationary but homogeneous of degree D, can be expressed as

Xt = Φ1Xt-s + Φ2Xt-2s + ... + ΦPXt-Ps + δ + ut - Θ1ut-s - Θ2ut-2s - ...

or ΦP(B^s)(1 - B^s)^D Xt = δ + ΘQ(B^s) ut

• The signature of pure SAR or pure SMA behavior is similar to the signature of pure AR or pure MA behavior, except that the pattern appears across multiples of lag s in the ACF and PACF

• For example, a pure SAR(1) process has spikes in the ACF at lags s, 2s, 3s, etc., while the PACF cuts off after lag s


Seasonal Part of ARIMA model

• Conversely, a pure SMA(1) process has spikes in the PACF at lags s, 2s, 3s, etc., while the ACF cuts off after lag s

• An SAR signature usually occurs when the autocorrelation at the seasonal period is positive, whereas an SMA signature usually occurs when the seasonal autocorrelation is negative

General multiplicative seasonal models, ARIMA (p, d, q)(P, D, Q)s

An integrated process Xt is designated as an ARIMA (p, d, q) if, taking differences of order d, a stationary process Wt of the type ARMA (p, q) is obtained.

The ARIMA (p, d, q) model is expressed by the function

Wt = φ1Wt-1 + φ2Wt-2 + ... + φpWt-p + ut - θ1ut-1 - θ2ut-2 - ... - θqut-q

or φ(B)(1 - B)^d Xt = θ(B) ut

The seasonal models ARIMA (P, D, Q)s, which are not stationary but homogeneous of degree D, can be expressed as

Xt = Φ1Xt-s + Φ2Xt-2s + ... + ΦPXt-Ps + δ + ut - Θ1ut-s - Θ2ut-2s - ...  or ΦP(B^s)(1 - B^s)^D Xt = δ + ΘQ(B^s) ut

General multiplicative seasonal models, ARIMA (p, d, q)(P, D, Q)s, combine both parts:

ΦP(B^s) φp(B)(1 - B^s)^D (1 - B)^d Xt = ΘQ(B^s) θq(B) ut.


ARIMA Model Building

• Identification

This stage basically tries to identify an appropriate ARIMA model for the underlying stationary time series on the basis of the ACF and PACF

If the series is nonstationary, it is first transformed to covariance-stationary, and then one can easily identify the possible values of the regular part of the model, i.e. the autoregressive order p and moving average order q in a univariate ARMA model, along with the seasonal part


ARIMA Model Building

• Estimation

Point estimates of the coefficients can be obtained by the method of maximum likelihood

Associated standard errors are also provided, suggesting which coefficients could be dropped

• Diagnostic checking

One should also examine whether the residuals of the model appear to be a white noise process

• Forecasting

