Lecture 2. Relaxing the Assumptions of CLRM



RELAXING THE ASSUMPTIONS OF CLRM

Dr. Obid A. Khakimov

Senior Lecturer,

Westminster International University in Tashkent


ASSUMPTIONS OF CLASSICAL LINEAR REGRESSION MODEL

The regression model is linear in the parameters.

The values of the independent variables are fixed in repeated sampling.

The conditional mean of the residuals is equal to zero.

For given Xs, there is no autocorrelation in the residuals.

The independent variables, the Xs, and the residuals of the regression are independent.


ASSUMPTIONS OF CLASSICAL LINEAR REGRESSION MODEL

The number of observations must be greater than the number of parameters.

There must be sufficient variability in the values of the variables.

The regression model must be correctly specified.

There is no exact linear relationship among the independent variables.

The residuals of the regression are normally distributed: $e_i \sim N(0, \sigma_e^2)$.


MULTICOLLINEARITY

Agenda:

The nature of multicollinearity.

Practical consequences.

Detection.

Remedial measures to alleviate the problem.


PERFECT VS. LESS THAN PERFECT

Perfect multicollinearity: an exact linear relation holds among the regressors,

$$\lambda_1 X_{1i} + \lambda_2 X_{2i} + \lambda_3 X_{3i} + \dots + \lambda_k X_{ki} = 0,$$

so that, for $\lambda_2 \neq 0$,

$$X_{2i} = -\frac{\lambda_1}{\lambda_2} X_{1i} - \frac{\lambda_3}{\lambda_2} X_{3i} - \dots - \frac{\lambda_k}{\lambda_2} X_{ki}.$$

Less than perfect multicollinearity: the linear relation holds only up to a stochastic error term $e_i$,

$$X_{2i} = -\frac{\lambda_1}{\lambda_2} X_{1i} - \frac{\lambda_3}{\lambda_2} X_{3i} - \dots - \frac{\lambda_k}{\lambda_2} X_{ki} + \frac{1}{\lambda_2} e_i.$$

Perfect multicollinearity is the case when two or more independent variables form an exact linear relationship.

Less than perfect multicollinearity is the case when two or more independent variables form an approximate, less than perfect, linear relationship.
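As a concrete illustration (a minimal numpy sketch, not from the lecture; the sample size, the coefficient 3.0, and the noise scale are all illustrative), perfect collinearity makes the design matrix rank-deficient, while a near-perfect relation leaves it technically full rank:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)

# Perfect collinearity: x2 is an exact linear function of x1.
x2_perfect = 3.0 * x1
# Less than perfect: the same relation plus a small random error e_i.
x2_near = 3.0 * x1 + rng.normal(scale=0.1, size=n)

X_perfect = np.column_stack([np.ones(n), x1, x2_perfect])
X_near = np.column_stack([np.ones(n), x1, x2_near])

# With perfect collinearity the design matrix loses rank, so the OLS
# normal equations have no unique solution.
print(np.linalg.matrix_rank(X_perfect))  # 2 (rank deficient)
print(np.linalg.matrix_rank(X_near))     # 3 (full rank)
print(np.corrcoef(x1, x2_near)[0, 1])    # close to, but below, 1
```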


    MULTIPLE REGRESSION MODEL

$$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$$

OLS chooses the estimates that minimize the residual sum of squares:

$$\min \sum \hat{u}_i^2 = \sum \left( Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_{2i} - \hat{\beta}_3 X_{3i} \right)^2, \qquad \hat{Y}_i = \hat{\beta}_1 + \hat{\beta}_2 X_{2i} + \hat{\beta}_3 X_{3i}$$

In deviation form (lower-case letters denote deviations from sample means):

$$\hat{\beta}_2 = \frac{(\sum y_i x_{2i})(\sum x_{3i}^2) - (\sum y_i x_{3i})(\sum x_{2i} x_{3i})}{(\sum x_{2i}^2)(\sum x_{3i}^2) - (\sum x_{2i} x_{3i})^2}$$

$$\hat{\beta}_3 = \frac{(\sum y_i x_{3i})(\sum x_{2i}^2) - (\sum y_i x_{2i})(\sum x_{2i} x_{3i})}{(\sum x_{2i}^2)(\sum x_{3i}^2) - (\sum x_{2i} x_{3i})^2}$$
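The two slope formulas can be checked directly. A minimal numpy sketch (not from the lecture; the simulated data and coefficient values are illustrative) computes $\hat{\beta}_2$ and $\hat{\beta}_3$ from the deviation-form expressions and compares them with a direct least-squares solve:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X2 = rng.normal(size=n)
X3 = 0.5 * X2 + rng.normal(size=n)           # moderately correlated regressors
Y = 1.0 + 2.0 * X2 - 1.5 * X3 + rng.normal(size=n)

# Deviations from sample means (the lower-case variables on the slide).
y, x2, x3 = Y - Y.mean(), X2 - X2.mean(), X3 - X3.mean()

den = np.sum(x2**2) * np.sum(x3**2) - np.sum(x2 * x3) ** 2
b2 = (np.sum(y * x2) * np.sum(x3**2) - np.sum(y * x3) * np.sum(x2 * x3)) / den
b3 = (np.sum(y * x3) * np.sum(x2**2) - np.sum(y * x2) * np.sum(x2 * x3)) / den
b1 = Y.mean() - b2 * X2.mean() - b3 * X3.mean()

# Cross-check against a direct least-squares solve on the raw data.
X = np.column_stack([np.ones(n), X2, X3])
print(np.linalg.lstsq(X, Y, rcond=None)[0])  # ~ [b1, b2, b3]
print(b1, b2, b3)
```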


MULTIPLE REGRESSION MODEL

Suppose the two regressors are perfectly collinear, $X_{3i} = \lambda X_{2i}$, so that in deviation form $x_{3i} = \lambda x_{2i}$. Substituting into the formula for $\hat{\beta}_2$,

$$\hat{\beta}_2 = \frac{(\sum y_i x_{2i})(\sum x_{3i}^2) - (\sum y_i x_{3i})(\sum x_{2i} x_{3i})}{(\sum x_{2i}^2)(\sum x_{3i}^2) - (\sum x_{2i} x_{3i})^2}$$

gives

$$\hat{\beta}_2 = \frac{(\sum y_i x_{2i})(\lambda^2 \sum x_{2i}^2) - (\lambda \sum y_i x_{2i})(\lambda \sum x_{2i}^2)}{(\sum x_{2i}^2)(\lambda^2 \sum x_{2i}^2) - \lambda^2 (\sum x_{2i}^2)^2} = \frac{0}{0},$$

an indeterminate expression: under perfect multicollinearity the individual regression coefficients cannot be estimated.
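A quick numerical check of this indeterminacy (a minimal numpy sketch, not from the lecture; the variable names and the value λ = 2 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
x2 = rng.normal(size=50)
x2 -= x2.mean()
lam = 2.0
x3 = lam * x2                      # perfect collinearity in deviation form
y = rng.normal(size=50)
y -= y.mean()

# Numerator and denominator of beta_2_hat from the slide's formula.
num = np.sum(y * x2) * np.sum(x3**2) - np.sum(y * x3) * np.sum(x2 * x3)
den = np.sum(x2**2) * np.sum(x3**2) - np.sum(x2 * x3) ** 2
print(num, den)  # both ~0 up to floating-point error: beta_2_hat = 0/0
```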


    OLS ESTIMATION

$$\operatorname{var}(\hat{\beta}_1) = \left[ \frac{1}{n} + \frac{\bar{X}_2^2 \sum x_{3i}^2 + \bar{X}_3^2 \sum x_{2i}^2 - 2 \bar{X}_2 \bar{X}_3 \sum x_{2i} x_{3i}}{\sum x_{2i}^2 \sum x_{3i}^2 - (\sum x_{2i} x_{3i})^2} \right] \sigma^2$$

$$\operatorname{var}(\hat{\beta}_2) = \frac{\sigma^2}{\sum x_{2i}^2 (1 - r_{2,3}^2)}, \qquad \operatorname{var}(\hat{\beta}_3) = \frac{\sigma^2}{\sum x_{3i}^2 (1 - r_{2,3}^2)}$$

$$\operatorname{cov}(\hat{\beta}_2, \hat{\beta}_3) = \frac{-r_{2,3}\, \sigma^2}{(1 - r_{2,3}^2) \sqrt{\sum x_{2i}^2} \sqrt{\sum x_{3i}^2}}$$

where $r_{2,3}$ is the correlation coefficient between $X_2$ and $X_3$.

As the degree of collinearity approaches one, the variances of the coefficients approach infinity.

Thus, the presence of high collinearity will inflate the variances and covariances of the OLS estimators.
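A short sketch of this variance blow-up (the values of $\sigma^2$ and $\sum x_{2i}^2$ below are illustrative, not from the lecture):

```python
import numpy as np

# var(beta_2_hat) from the slide: sigma^2 / (sum(x2^2) * (1 - r23^2)).
# sigma^2 and sum(x2^2) are held fixed; only the correlation varies.
sigma2, sum_x2_sq = 1.0, 100.0
for r in [0.0, 0.5, 0.9, 0.99, 0.999]:
    var_b2 = sigma2 / (sum_x2_sq * (1.0 - r**2))
    print(f"r23 = {r:5.3f} -> var(beta2_hat) = {var_b2:.4f}")
# The variance explodes as r23 approaches 1.
```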


    PRACTICAL CONSEQUENCES

The OLS estimators remain BLUE, but the large variances and covariances make precise estimation difficult.

Large variances produce wide confidence intervals, which distorts the acceptance or rejection of hypotheses.

The t-statistics of individual coefficients tend to be statistically insignificant.

Although the t-statistics are low, the R² may be very high.

The OLS estimators and their standard errors are very sensitive to small changes in the data.


    VARIANCE INFLATION FACTOR

$$\operatorname{var}(\hat{\beta}_2) = \frac{\sigma^2}{\sum x_{2i}^2 (1 - r_{2,3}^2)} = \frac{\sigma^2}{\sum x_{2i}^2} \cdot \frac{1}{1 - r_{2,3}^2} = \frac{\sigma^2}{\sum x_{2i}^2}\, \text{VIF}$$

where $\text{VIF} = 1/(1 - r_{2,3}^2)$ is the variance-inflation factor.

[Figure: VIF (y-axis, 0 to 120) plotted against the correlation coefficient (x-axis, 0 to 1.2); VIF stays near 1 for low correlations and rises steeply as the correlation approaches 1.]
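The curve in the figure can be reproduced directly from the definition $\text{VIF} = 1/(1 - r^2)$ (a minimal sketch, not from the lecture):

```python
import numpy as np

# VIF = 1 / (1 - r^2), tabulated over the range shown in the figure.
for r in np.arange(0.0, 1.0, 0.1):
    print(f"r = {r:.1f} -> VIF = {1.0 / (1.0 - r**2):7.2f}")
```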


IMPLICATIONS FOR K-VARIABLE MODELS

For the general $k$-variable model

$$Y_i = \beta_0 X_{0i} + \beta_1 X_{1i} + \beta_2 X_{2i} + \beta_3 X_{3i} + \dots + \beta_k X_{ki} + u_i,$$

$$\operatorname{var}(\hat{\beta}_j) = \frac{\sigma^2}{\sum x_{ji}^2 (1 - R_j^2)} = \frac{\sigma^2}{\sum x_{ji}^2}\, \text{VIF}_j$$

where $R_j^2$ is the $R^2$ of the auxiliary regression of $X_j$ on the remaining explanatory variables, and $\text{VIF}_j = 1/(1 - R_j^2)$.
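In practice $\text{VIF}_j$ can be computed with statsmodels, whose variance_inflation_factor implements exactly this $1/(1 - R_j^2)$ definition (a sketch with illustrative simulated data, not from the lecture):

```python
import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.3, size=n)   # highly correlated with x1
x3 = rng.normal(size=n)                          # unrelated regressor
X = np.column_stack([np.ones(n), x1, x2, x3])    # include the constant column

# VIF_j = 1 / (1 - R_j^2) for each slope regressor (skip the constant).
for j in range(1, X.shape[1]):
    print(f"VIF for column {j}: {variance_inflation_factor(X, j):.2f}")
```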


    CONFIDENCE INTERVALS AND T-STATISTICS

With collinearity, a 95% confidence interval is wider by the factor $\sqrt{\text{VIF}}$:

$$\hat{\beta}_k \pm 1.96\, se(\hat{\beta}_k) \sqrt{\text{VIF}}$$

and the t-statistic is correspondingly smaller:

$$t = \frac{\hat{\beta}_k - 0}{se(\hat{\beta}_k) \sqrt{\text{VIF}}}$$

For the joint hypothesis

$$H_0: \beta_2 = \beta_3 = \dots = \beta_k = 0,$$

$$F = \frac{ESS/(k-1)}{RSS/(n-k)} = \frac{R^2/(k-1)}{(1 - R^2)/(n-k)}$$

Ha: Not all slope coefficients are simultaneously zero.

Due to the low t-statistics, we cannot reject the null hypothesis for the individual coefficients.

Due to the high R², the F-value will be very high and rejection of H₀ will be easy.
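A small simulation illustrating this pattern, insignificant individual t-statistics alongside a highly significant joint F (a sketch using statsmodels; the data-generating values are illustrative, not from the lecture):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 50
x2 = rng.normal(size=n)
x3 = x2 + rng.normal(scale=0.05, size=n)   # nearly collinear with x2
y = 1.0 + 1.0 * x2 + 1.0 * x3 + rng.normal(size=n)

res = sm.OLS(y, sm.add_constant(np.column_stack([x2, x3]))).fit()
print(res.tvalues)               # t-stats on x2 and x3: typically insignificant
print(res.rsquared)              # yet R^2 is high
print(res.fvalue, res.f_pvalue)  # joint F: large, p-value near zero
```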


    DETECTION

Multicollinearity is a question of degree.

It is a feature of the sample, not of the population.

How to detect:

High R² but low t-statistics.

High correlation coefficients among the independent variables.

Auxiliary regressions.

High VIFs.

Eigenvalues and the condition index (see the sketch below).
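For the last item, a minimal sketch of the eigenvalue-based check (not from the lecture; a common rule of thumb treats a condition index above roughly 30 as a sign of severe multicollinearity):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(size=n)
x2 = 0.98 * x1 + rng.normal(scale=0.1, size=n)   # nearly collinear pair
X = np.column_stack([np.ones(n), x1, x2])

# Eigenvalues of X'X; condition index = sqrt(max eigenvalue / min eigenvalue).
eigvals = np.linalg.eigvalsh(X.T @ X)
cond_index = np.sqrt(eigvals.max() / eigvals.min())
print(cond_index)  # large here; values above ~30 suggest severe collinearity
```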


    AUXILIARY REGRESSION

Regress each explanatory variable $X_i$ on the remaining explanatory variables and obtain the coefficient of determination $R^2_{x_i \cdot x_2 x_3 \dots x_k}$. Then compute

$$F_i = \frac{R^2_{x_i \cdot x_2 x_3 \dots x_k} / (k - 2)}{\left(1 - R^2_{x_i \cdot x_2 x_3 \dots x_k}\right) / (n - k + 1)}$$

Run a regression in which one X is the dependent variable and the other Xs are the independent variables, and obtain its R².

H₀: the variable $X_i$ is not collinear.

Numerator df = k − 2; denominator df = n − k + 1,

where k is the number of explanatory variables including the intercept and n is the sample size.

If the F-statistic exceeds the critical F, the variable $X_i$ is collinear with the other regressors.

Rule of thumb: if the R² of an auxiliary regression is higher than the overall R², multicollinearity might be troublesome.
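A sketch of the auxiliary-regression test under these definitions (statsmodels and scipy, with illustrative simulated data; here k = 4 explanatory variables counting the intercept):

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(6)
n = 100
x2 = rng.normal(size=n)
x3 = 0.9 * x2 + rng.normal(scale=0.2, size=n)   # collinear with x2
x4 = rng.normal(size=n)

# Auxiliary regression: x3 on the remaining explanatory variables.
aux = sm.OLS(x3, sm.add_constant(np.column_stack([x2, x4]))).fit()
r2 = aux.rsquared

k = 4  # explanatory variables including the intercept (const, x2, x3, x4)
F = (r2 / (k - 2)) / ((1.0 - r2) / (n - k + 1))
F_crit = stats.f.ppf(0.95, k - 2, n - k + 1)
print(F, F_crit, F > F_crit)  # F >> F_crit: x3 is collinear
```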


WHAT TO DO?

Do nothing.

Combine cross-sectional and time-series data (pooling).

Transform the variables (differencing, ratio transformations; see the sketch below).

Obtain additional data or observations.
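As a sketch of the differencing transformation (assuming time-series data; the common-trend setup below is illustrative, not from the lecture), first differences can remove a shared trend that drives the collinearity:

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(100)
# Two regressors that share a common trend and are therefore highly collinear.
x2 = 0.5 * t + rng.normal(size=100)
x3 = 0.5 * t + rng.normal(size=100)

print(np.corrcoef(x2, x3)[0, 1])                    # near 1 in levels
print(np.corrcoef(np.diff(x2), np.diff(x3))[0, 1])  # near 0 after differencing
```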


    READING

Gujarati, D. (2003), Basic Econometrics, Ch. 10.