
  • Serial Correlation
    CAO HAO THI

  • CONTENTS

    Serial correlation (autocorrelation, AR)
    Consequences of ignoring AR
    Testing for AR
    Estimation procedures

  • SERIAL CORRELATION?

    Serial correlation (or autocorrelation) is correlation between the residuals εt.
    Serial correlation = autocorrelation = autoregression (AR)

  • SERIAL CORRELATION?

    PRF: Yt = β1 + β2X2t + β3X3t + ... + βkXkt + εt

    AR(p): serial correlation of order p:
    εt = ρ1εt-1 + ρ2εt-2 + ... + ρpεt-p + νt

    This is the p-th-order autoregressive process of the residuals εt.

  • SERIAL CORRELATION?

    The error terms εt are white noise when:
    E(εt) = 0
    E(εt²) = σ² = const
    E(εt εt-s) = 0 for s ≠ 0

    AR(p): serial correlation of order p
    H0: ρ1 = ρ2 = ... = ρp = 0 : no presence of AR(p)

  • SERIAL CORRELATION?

    Assumption (no presence of AR):
    E(εt εt-p) = 0 for p ≠ 0

    Violation of the assumption:
    E(εt εt-p) ≠ 0 for some p ≠ 0 ⟹ presence of AR(p)
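The AR(1) special case of the process above is easy to simulate; a minimal numpy sketch (ρ = 0.7 and the sample size are illustrative choices, not from the slides), showing that the sample autocorrelation of the residual series recovers ρ:

```python
import numpy as np

# Simulate AR(1) errors: eps_t = rho * eps_{t-1} + nu_t, nu_t white noise.
rng = np.random.default_rng(0)
rho, n = 0.7, 5000            # illustrative values
nu = rng.normal(0, 1, n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = rho * eps[t - 1] + nu[t]

# Sample first-order autocorrelation of the simulated residuals
r1 = np.corrcoef(eps[1:], eps[:-1])[0, 1]
print(round(r1, 2))  # close to rho = 0.7
```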

  • CONSEQUENCES OF IGNORING AR

    1. Estimates, and forecasts based on them, are still unbiased and consistent, but they are inefficient. Estimates become inconsistent if the independent variables include a lagged dependent variable.

    2. The estimated variances and covariances of the parameters are biased and inconsistent, and hence the t- and F-tests are no longer valid.
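Consequence 2 can be seen in a small Monte Carlo sketch, assuming a trending regressor and AR(1) errors with ρ = 0.8 (all values illustrative): the true sampling spread of the slope estimate exceeds the standard error OLS reports, so t-statistics are overstated.

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho, reps = 100, 0.8, 2000          # illustrative settings
x = np.linspace(0, 1, n)               # trending regressor
X = np.column_stack([np.ones(n), x])

betas, reported_se = [], []
for _ in range(reps):
    nu = rng.normal(0, 1, n)
    eps = np.zeros(n)
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + nu[t]   # AR(1) errors
    y = 1.0 + 2.0 * x + eps
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)
    cov = s2 * np.linalg.inv(X.T @ X)       # usual (invalid here) OLS covariance
    betas.append(b[1])
    reported_se.append(np.sqrt(cov[1, 1]))

true_sd = np.std(betas)                # actual sampling spread of the slope
avg_ols_se = np.mean(reported_se)      # what OLS claims on average
print(round(true_sd, 3), round(avg_ols_se, 3))  # true_sd exceeds avg_ols_se
```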

  • TESTING FOR AR

    Graphing method: plot the residuals over time, or against their own lags, and look for systematic patterns.
    This technique is only suggestive of AR and is not a substitute for the formal tests.

  • PLOT FOR IDENTIFYING AR

    [Slides showed residual plots illustrating autocorrelation patterns]

  • TESTING FOR AR

    Durbin-Watson test
    Correlogram (Q-statistics) test
    The Lagrange Multiplier (LM) test

  • DURBIN-WATSON TEST

    Used only to test for AR(1):
    Yt = β1 + β2X2t + β3X3t + ... + βkXkt + εt
    AR(1): εt = ρ1εt-1 + νt

    Hypothesis:
    H0: ρ1 = 0 : no presence of AR(1)
    H1: ρ1 ≠ 0 : presence of AR(1)

  • DURBIN-WATSON TEST

    Test statistic: d = Σ(et - et-1)² / Σet², which lies in [0, 4]:

    0 < d < dL          : reject H0, positive autocorrelation (ρ > 0)
    dL < d < dU         : no conclusion
    dU < d < 4 - dU     : do not reject H0 (ρ = 0)
    4 - dU < d < 4 - dL : no conclusion
    4 - dL < d < 4      : reject H0, negative autocorrelation (ρ < 0)

  • DURBIN-WATSON TEST

    Note:
    In some cases the test is inconclusive.
    When the right-hand side of the model includes lagged dependent variables, the test is invalid.
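The d statistic is simple to compute by hand; a numpy sketch on simulated residuals (white noise vs. AR(1) with ρ = 0.9, both illustrative), using the approximation d ≈ 2(1 - ρ̂):

```python
import numpy as np

def durbin_watson(resid):
    """d = sum((e_t - e_{t-1})^2) / sum(e_t^2); near 2 means no AR(1),
    near 0 positive AR(1), near 4 negative AR(1)."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(2)
white = rng.normal(size=2000)        # no autocorrelation -> d near 2
ar = np.zeros(2000)                  # AR(1), rho = 0.9 -> d near 2(1-0.9) = 0.2
for t in range(1, 2000):
    ar[t] = 0.9 * ar[t - 1] + rng.normal()

d_white, d_ar = durbin_watson(white), durbin_watson(ar)
print(round(d_white, 2), round(d_ar, 2))
```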

  • CORRELOGRAM TEST

    ACk (autocorrelation at lag k):
    ACk = rk = Corr(εt, εt-k)

    PACk (partial autocorrelation at lag k), from autoregressions of the residuals:
    et = α1et-1 + νt           ⟹ α̂1 = PAC1
    et = α1et-1 + α2et-2 + νt  ⟹ α̂2 = PAC2

  • CORRELOGRAM TEST

    Hypotheses:
    H0: AC1 = AC2 = ... = ACp = 0 ⟹ no presence of AR(p)
    H1: at least one ACj ≠ 0 (j = 1, ..., p) ⟹ presence of AR(p)

    That is:
    AR(1): H0: AC1 = 0 (no AR(1));  H1: AC1 ≠ 0 (presence of AR(1))
    AR(2): H0: AC1 = AC2 = 0 (no AR(2));  H1: AC1 ≠ 0 or AC2 ≠ 0 (presence of AR(2))

  • CORRELOGRAM TEST

    Test statistic (LB: Ljung-Box):
    Q = n(n + 2) Σ[j=1..k] rj² / (n - j)
    Critical value: Q* = χ²(k - p - q), where
    k: examined lag
    p: order of AR
    q: order of moving average
    If the computed Q > Q* ⟹ reject H0

  • CORRELOGRAM TEST

    Using EViews: View/Residual Test/Correlogram - Q Statistics

    If the εt are not autocorrelated:
    AC and PAC at all lags will be close to 0 (within roughly ±2 standard-error bands).
    All Q-statistics will be insignificant (p-value > 5%) ⟹ no presence of AR.
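Outside EViews, the Ljung-Box Q can be computed directly from the residual autocorrelations; a numpy sketch (the simulated series and the tabulated 5% χ²(10) critical value of about 18.31 are illustrative):

```python
import numpy as np

def ljung_box_q(resid, k):
    """Ljung-Box Q = n(n+2) * sum_{j=1..k} r_j^2 / (n - j)."""
    resid = np.asarray(resid, dtype=float)
    n = len(resid)
    resid = resid - resid.mean()
    denom = resid @ resid
    q = 0.0
    for j in range(1, k + 1):
        r_j = (resid[j:] @ resid[:-j]) / denom  # autocorrelation at lag j
        q += r_j ** 2 / (n - j)
    return n * (n + 2) * q

rng = np.random.default_rng(3)
white = rng.normal(size=1000)          # no AR
ar = np.zeros(1000)                    # AR(1) with rho = 0.6
for t in range(1, 1000):
    ar[t] = 0.6 * ar[t - 1] + rng.normal()

q_white, q_ar = ljung_box_q(white, 10), ljung_box_q(ar, 10)
print(round(q_white, 1), round(q_ar, 1))  # q_ar far exceeds chi2(10) 5% value ~18.31
```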

  • LAGRANGE MULTIPLIER TEST

    Yt = β1 + β2X2t + β3X3t + ... + βkXkt + εt
    AR(p): serial correlation of order p:
    εt = ρ1εt-1 + ρ2εt-2 + ... + ρpεt-p + νt

    Hypotheses:
    H0: ρ1 = ρ2 = ... = ρp = 0 ⟹ no presence of AR(p)
    H1: at least one ρj ≠ 0 (j = 1, ..., p) ⟹ presence of AR(p)

  • LAGRANGE MULTIPLIER TEST

    Step 1: Estimate the equation
    Yt = β1 + β2X2t + β3X3t + ... + βkXkt + εt
    by OLS and save the residuals ε̂t.

    Step 2: Run the auxiliary regression
    ε̂t = α1 + α2X2t + α3X3t + ... + αkXkt + ρ1ε̂t-1 + ρ2ε̂t-2 + ... + ρpε̂t-p + νt
    and obtain R²aux.

  • LAGRANGE MULTIPLIER TEST

    Step 3: Test the hypothesis
    H0: ρ1 = ρ2 = ... = ρp = 0 ⟹ no presence of AR(p)
    H1: at least one ρj ≠ 0 (j = 1, ..., p) ⟹ presence of AR(p)

    Test statistic: χ²stat = (n - p)R²aux
    Critical value: χ²* = χ²(p, α)
    If χ²stat > χ²* (equivalently, p-value < α) ⟹ reject H0
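The three steps can be sketched in numpy; the data-generating process (ρ = 0.6, β = (1, 2)) and the lag order p = 2 are illustrative assumptions, and the 1% χ²(2) critical value 9.21 is tabulated:

```python
import numpy as np

def lm_test(y, X, p):
    """LM statistic (n - p) * R^2_aux from regressing OLS residuals on
    X plus p lagged residuals (Steps 1-3 above)."""
    n = len(y)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)       # Step 1: OLS residuals
    e = y - X @ b
    # Step 2: auxiliary regression on rows t = p..n-1
    lags = np.column_stack([e[p - j : n - j] for j in range(1, p + 1)])
    Z = np.column_stack([X[p:], lags])
    e2 = e[p:]
    g, *_ = np.linalg.lstsq(Z, e2, rcond=None)
    u = e2 - Z @ g
    r2 = 1 - (u @ u) / ((e2 - e2.mean()) @ (e2 - e2.mean()))
    return (n - p) * r2                              # Step 3: test statistic

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.6 * eps[t - 1] + rng.normal()
y_ar = 1 + 2 * x + eps                    # AR(1) errors
y_ok = 1 + 2 * x + rng.normal(size=n)     # white-noise errors

lm_ar, lm_ok = lm_test(y_ar, X, p=2), lm_test(y_ok, X, p=2)
print(round(lm_ar, 1), round(lm_ok, 1))   # lm_ar rejects H0 at chi2(2); lm_ok typically does not
```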

  • TREATMENT OF AR

    Changing the functional form
    Computing differences
    Estimation procedures:
    Cochrane-Orcutt iterative procedure (CORC) (Cochrane and Orcutt, 1949)
    Hildreth-Lu search procedure (HILU) (Hildreth and Lu, 1960)

  • CHANGING THE FUNCTIONAL FORM

    Serial correlation can be caused by misspecification of the functional form.

    No estimation procedure can guarantee the elimination of AR that is caused by an incorrect specification of the determinants rather than by the error terms.

  • COMPUTING DIFFERENCES

    Yt = β0 + β1Xt + εt  ⟹  ΔYt = β1ΔXt + Δεt

    where:
    ΔYt = Yt - Yt-1
    ΔXt = Xt - Xt-1
    (the intercept β0 drops out of the differenced equation)

    However, the solution of using first differences might not always be appropriate.
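A quick numpy illustration of the extreme case where differencing is exactly right, assuming random-walk errors (ρ = 1) and an illustrative true slope of 2:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
x = np.cumsum(rng.normal(size=n))        # random-walk regressor
eps = np.cumsum(rng.normal(size=n))      # random-walk errors (rho = 1)
y = 2.0 * x + eps

# First differences: delta_y = beta1 * delta_x + nu_t, nu_t white noise
dy, dx = np.diff(y), np.diff(x)
beta1, *_ = np.linalg.lstsq(dx[:, None], dy, rcond=None)
print(round(float(beta1[0]), 2))  # close to the true slope 2.0
```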

  • COCHRANE-ORCUTT PROCEDURE

    Yt = β1 + β2X2t + β3X3t + ... + βkXkt + εt
    ρYt-1 = ρβ1 + ρβ2X2,t-1 + ρβ3X3,t-1 + ... + ρβkXk,t-1 + ρεt-1

    Subtracting the second equation from the first:
    Yt - ρYt-1 = β1(1 - ρ) + β2[X2t - ρX2,t-1] + β3[X3t - ρX3,t-1] + ... + βk[Xkt - ρXk,t-1] + (εt - ρεt-1)

  • COCHRANE-ORCUTT PROCEDURE

    Yt = β1 + β2X2t + β3X3t + ... + βkXkt + εt   (1)

    Step 1: Estimate (1) by OLS and save the residuals ε̂t.
    Step 2: Regress ε̂t on ε̂t-1 to compute ρ̂.

  • COCHRANE-ORCUTT PROCEDURE

    Step 3: Compute the transformed variables
    Yt* = Yt - ρ̂Yt-1  and  Xjt* = Xjt - ρ̂Xj,t-1  (j = 2, ..., k)

    Step 4: Estimate
    Yt* = β1(1 - ρ̂) + β2X2t* + ... + βkXkt* + νt
    by OLS.

  • COCHRANE-ORCUTT PROCEDURE

    Step 5: Substitute the β̂k from Step 4 for the βk in (1) to obtain a new set of residuals ε̂t.

    Step 6: Get a new estimate of ρ̂ and compare it with the ρ̂ obtained in Step 2; repeat until successive estimates of ρ̂ converge.

    Note: this procedure can find only a local minimum of the error sum of squares.
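Steps 1-6 can be sketched as a short iteration in numpy; the data-generating process (ρ = 0.7, β = (1, 2)) and the function name `cochrane_orcutt` are illustrative, not from the slides:

```python
import numpy as np

def cochrane_orcutt(y, X, tol=1e-6, max_iter=50):
    """Iterate: re-estimate beta on quasi-differenced data, then update rho
    from lagged residuals, until rho stabilises (Steps 1-6)."""
    rho, b = 0.0, None
    for _ in range(max_iter):
        # Steps 3-4: quasi-difference (the first observation is dropped);
        # the intercept column of ones becomes the (1 - rho) column.
        y_star = y[1:] - rho * y[:-1]
        X_star = X[1:] - rho * X[:-1]
        b, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
        # Steps 5 and 2: residuals of the ORIGINAL equation, updated rho
        e = y - X @ b
        rho_new = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])
        if abs(rho_new - rho) < tol:
            break
        rho = rho_new
    return b, rho

rng = np.random.default_rng(6)
n = 2000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + eps

b, rho_hat = cochrane_orcutt(y, X)
print(round(rho_hat, 2), round(float(b[1]), 2))  # near 0.7 and 2.0
```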

  • HILDRETH-LU PROCEDURE

    Step 1: Choose a value of ρ (say ρ1). Using this value, transform the variables and estimate

    Yt - ρ1Yt-1 = β1(1 - ρ1) + β2[X2t - ρ1X2,t-1] + ... + βk[Xkt - ρ1Xk,t-1] + νt   (*)

    by OLS.

  • HILDRETH-LU PROCEDURE

    Step 2: From the estimates of Equation (*), derive the residuals ν̂t and the error sum of squares associated with them. Call it ESS(ρ1).

    Next, choose a different value of ρ (say ρ2) and repeat Steps 1 and 2.

  • HILDRETH-LU PROCEDURE

    Step 3: Vary ρ from -1 to +1 in some systematic way, obtaining a series of values ESS(ρ). Choose the ρ* for which ESS is a minimum. Equation (*) is then estimated with this final ρ* as the optimum solution.
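The grid search of Steps 1-3 is straightforward to sketch in numpy; the grid step, the simulated data (ρ = 0.7, β = (1, 2)), and the name `hildreth_lu` are illustrative assumptions:

```python
import numpy as np

def hildreth_lu(y, X, grid=None):
    """Grid-search rho over (-1, 1); keep the value minimising the ESS
    of the quasi-differenced regression (Steps 1-3)."""
    if grid is None:
        grid = np.arange(-0.99, 1.0, 0.01)   # illustrative grid step
    best = None
    for rho in grid:
        y_star = y[1:] - rho * y[:-1]        # quasi-difference Y
        X_star = X[1:] - rho * X[:-1]        # quasi-difference regressors
        b, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
        u = y_star - X_star @ b
        ess = u @ u                          # ESS(rho)
        if best is None or ess < best[0]:
            best = (ess, rho, b)
    return best[1], best[2]

rng = np.random.default_rng(7)
n = 2000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + eps

rho_star, b = hildreth_lu(y, X)
print(round(rho_star, 2), round(float(b[1]), 2))  # near 0.7 and 2.0
```

Unlike Cochrane-Orcutt, the grid search cannot get stuck at a local minimum finer than its step size, at the cost of one OLS fit per grid point.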