Multiple Regression
Chapter 18


18.1 Introduction

• In this chapter we extend the simple linear regression model, and allow for any number of independent variables.

• We expect to build a model that fits the data better than the simple linear regression model.


Introduction

• We shall use computer printout to
  – Assess the model
    • How well does it fit the data?
    • Is it useful?
    • Are any required conditions violated?
  – Employ the model
    • Interpreting the coefficients
    • Making predictions using the prediction equation
    • Estimating the expected value of the dependent variable


18.2 Model and Required Conditions

• We allow for k independent variables to potentially be related to the dependent variable:

y = β0 + β1x1 + β2x2 + … + βkxk + ε

where y is the dependent variable, x1, …, xk are the independent variables, β0, β1, …, βk are the coefficients, and ε is the random error variable.


Multiple Regression for k = 2, Graphical Demonstration - I

The simple linear regression model allows for one independent variable, "x":

y = β0 + β1x + ε

The multiple linear regression model allows for more than one independent variable:

y = β0 + β1x1 + β2x2 + ε

Note how the straight line becomes a plane.

[Figure: the regression plane y = β0 + β1x1 + β2x2 plotted over the x1 and x2 axes.]


Multiple Regression for k = 2, Graphical Demonstration - II

Note how a parabola becomes a parabolic surface:

y = b0 + b1x²  (a parabola)

y = b0 + b1x1² + b2x2  (a parabolic surface)

[Figure: the parabolic surface plotted over the x1 and x2 axes, with intercept b0.]


Required Conditions for the Error Variable

• The error ε is normally distributed.
• The mean is equal to zero and the standard deviation σε is constant for all values of y.
• The errors are independent.
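These conditions can be made concrete with a small simulation; this is a minimal sketch with made-up coefficients and sample size (not part of the chapter's examples), assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Two illustrative independent variables (values are arbitrary).
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 5, n)

# Errors satisfying the required conditions: normal, mean zero,
# constant standard deviation, and independent of one another.
sigma = 2.0
eps = rng.normal(0.0, sigma, n)

# The multiple regression model y = beta0 + beta1*x1 + beta2*x2 + eps.
beta0, beta1, beta2 = 5.0, 1.5, -0.8
y = beta0 + beta1 * x1 + beta2 * x2 + eps

# Least-squares estimates should land close to the true coefficients.
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated coefficients:", b)
```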


18.3 Estimating the Coefficients and Assessing the Model

• The procedure used to perform regression analysis:
  – Obtain the model coefficients and statistics using statistical software.
  – Diagnose violations of required conditions. Try to remedy problems when identified.
  – Assess the model fit using statistics obtained from the sample.
  – If the model assessment indicates a good fit to the data, use the model to interpret the coefficients and generate predictions.


Estimating the Coefficients and Assessing the Model, Example

• Example 18.1: Where to locate a new motor inn?
  – La Quinta Motor Inns is planning an expansion.
  – Management wishes to predict which sites are likely to be profitable.
  – Several areas where predictors of profitability can be identified are:
    • Competition
    • Market awareness
    • Demand generators
    • Demographics
    • Physical quality


Estimating the Coefficients and Assessing the Model, Example

Profitability (operating margin) is related to five groups of predictors:
  – Competition: Rooms, the number of hotel/motel rooms within 3 miles of the site.
  – Market awareness: Nearest, the distance to the nearest La Quinta inn.
  – Customers: Office space and College enrollment.
  – Community: Income, the median household income.
  – Physical: Disttwn, the distance to downtown.


Estimating the Coefficients and Assessing the Model, Example

• Data were collected from 100 randomly selected La Quinta inns, and the following suggested model was run:

Margin = β0 + β1Rooms + β2Nearest + β3Office + β4College + β5Income + β6Disttwn + ε

Xm18-01

Margin  Number  Nearest  Office Space  Enrollment  Income  Distance
55.5    3203    4.2      549           8           37      2.7
33.8    2810    2.8      496           17.5        35      14.4
49      2890    2.4      254           20          35      2.6
31.9    3422    3.3      434           15.5        38      12.1
57.4    2687    0.9      678           15.5        42      6.9
49      3759    2.9      635           19          33      10.8
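A minimal sketch of reproducing this fit in Python with statsmodels; the CSV file name, and the assumption that the Xm18-01 data were exported with the column names shown above, are mine rather than part of the example:

```python
import pandas as pd
import statsmodels.api as sm

# Assumes Xm18-01 was exported to CSV with the columns shown above.
df = pd.read_csv("Xm18-01.csv")

X = sm.add_constant(df[["Number", "Nearest", "Office Space",
                        "Enrollment", "Income", "Distance"]])
y = df["Margin"]

model = sm.OLS(y, X).fit()
print(model.summary())   # coefficient table, R-square, and the ANOVA F-test
```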


Regression Analysis, Excel Output

SUMMARY OUTPUT

Regression Statistics
Multiple R          0.7246
R Square            0.5251
Adjusted R Square   0.4944
Standard Error      5.51
Observations        100

ANOVA
            df  SS      MS     F      Significance F
Regression  6   3123.8  520.6  17.14  0.0000
Residual    93  2825.6  30.4
Total       99  5949.5

              Coefficients  Standard Error  t Stat  P-value
Intercept     38.14         6.99            5.45    0.0000
Number        -0.0076       0.0013          -6.07   0.0000
Nearest       1.65          0.63            2.60    0.0108
Office Space  0.020         0.0034          5.80    0.0000
Enrollment    0.21          0.13            1.59    0.1159
Income        0.41          0.14            2.96    0.0039
Distance      -0.23         0.18            -1.26   0.2107

This is the sample regression equation (sometimes called the prediction equation):

Margin = 38.14 - 0.0076Number + 1.65Nearest + 0.020Office Space + 0.21Enrollment + 0.41Income - 0.23Distance


Model Assessment

• The model is assessed using three tools:
  – The standard error of estimate
  – The coefficient of determination
  – The F-test of the analysis of variance

• The standard error of estimate also participates in building the other tools.


Standard Error of Estimate

• The standard deviation of the error is estimated by the standard error of estimate:

s_ε = √( SSE / (n - k - 1) )

• The magnitude of s_ε is judged by comparing it to ȳ, the mean of the dependent variable.


Standard Error of Estimate

• From the printout, s_ε = 5.51.
• Calculating the mean value of y we have ȳ = 45.739.
• It seems s_ε is not particularly small.
• Question: Can we conclude the model does not fit the data well?
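A quick sketch checking the printout's value from the ANOVA numbers above:

```python
import math

SSE = 2825.6   # residual sum of squares from the ANOVA table
n, k = 100, 6

s_eps = math.sqrt(SSE / (n - k - 1))
print(round(s_eps, 2))          # 5.51, matching the printout

y_bar = 45.739
print(round(s_eps / y_bar, 3))  # about 0.12, i.e. roughly 12% of the mean of y
```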


Coefficient of Determination

• The definition is

R² = 1 - SSE / Σ(yᵢ - ȳ)²

• From the printout, R² = 0.5251: 52.51% of the variation in operating margin is explained by the six independent variables; 47.49% remains unexplained.

• When adjusted for degrees of freedom,
Adjusted R² = 1 - [SSE/(n-k-1)] / [SS(Total)/(n-1)] = 0.4944, i.e. 49.44%.
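The same arithmetic as a short sketch, using the sums of squares from the ANOVA table:

```python
SSE = 2825.6       # residual sum of squares
SS_total = 5949.5  # total sum of squares
n, k = 100, 6

r2 = 1 - SSE / SS_total
adj_r2 = 1 - (SSE / (n - k - 1)) / (SS_total / (n - 1))
print(round(r2, 4), round(adj_r2, 4))   # 0.5251 and 0.4944, as in the printout
```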


Testing the Validity of the Model

• We pose the question: Is there at least one independent variable linearly related to the dependent variable?

• To answer the question we test the hypotheses

H0: β1 = β2 = … = βk = 0
H1: At least one βi is not equal to zero.

• If at least one βi is not equal to zero, the model has some validity.


Testing the Validity of the La Quinta Inns Regression Model

• The hypotheses are tested by an ANOVA procedure (the Excel output):

ANOVA
            df          SS            MS                    F                Significance F
Regression  k = 6       SSR = 3123.8  MSR = SSR/k = 520.6   MSR/MSE = 17.14  0.0000
Residual    n-k-1 = 93  SSE = 2825.6  MSE = SSE/(n-k-1) = 30.4
Total       n-1 = 99    5949.5


Testing the Validity of the La Quinta Inns Regression Model

[Variation in y] = SSR + SSE. A large F results from a large SSR; then much of the variation in y is explained by the regression model, the model is useful, and the null hypothesis should be rejected. Therefore, the rejection region is

F > F_{α, k, n-k-1}

where the test statistic is

F = (SSR / k) / (SSE / (n - k - 1)) = MSR / MSE
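A sketch of the F computation, assuming SciPy is available; the sums of squares are read off the ANOVA table:

```python
from scipy import stats

SSR, SSE = 3123.8, 2825.6
n, k = 100, 6

MSR = SSR / k              # mean square for regression
MSE = SSE / (n - k - 1)    # mean square for error
F = MSR / MSE
print(round(F, 2))         # 17.14, matching the printout

f_crit = stats.f.ppf(0.95, k, n - k - 1)  # roughly 2.2 (the slides use the table value 2.17)
p_value = stats.f.sf(F, k, n - k - 1)     # essentially 0
print(round(f_crit, 2), p_value)
```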


Testing the Validity of the La Quinta Inns Regression Model

F_{α, k, n-k-1} = F_{0.05, 6, 100-6-1} = 2.17
F = 17.14 > 2.17

Also, the p-value (Significance F) = 0.0000. Reject the null hypothesis.

ANOVA
            df  SS      MS     F      Significance F
Regression  6   3123.8  520.6  17.14  0.0000
Residual    93  2825.6  30.4
Total       99  5949.5

Conclusion: There is sufficient evidence to reject the null hypothesis in favor of the alternative hypothesis. At least one of the βi is not equal to zero; thus, at least one independent variable is linearly related to y. This linear regression model is valid.


Interpreting the Coefficients

• b0 = 38.14. This is the intercept, the value of y when all the variables take the value zero. Since the data ranges of the independent variables do not cover the value zero, do not interpret the intercept.

• b1 = -0.0076. In this model, for each additional room within 3 miles of the La Quinta inn, the operating margin decreases on average by 0.0076% (assuming the other variables are held constant).


Interpreting the Coefficients

• b2 = 1.65. In this model, for each additional mile between a La Quinta inn and its nearest competitor, the operating margin increases on average by 1.65% when the other variables are held constant.

• b3 = 0.020. For each additional 1,000 sq ft of office space, the operating margin increases on average by 0.02% when the other variables are held constant.

• b4 = 0.21. For each additional thousand students, the operating margin increases on average by 0.21% when the other variables are held constant.


Interpreting the Coefficients

• b5 = 0.41. For each additional $1,000 of median household income, the operating margin increases on average by 0.41% when the other variables remain constant.

• b6 = -0.23. For each additional mile to the downtown center, the operating margin decreases on average by 0.23% when the other variables are held constant.


Testing the Coefficients

• The hypotheses for each βi are

H0: βi = 0
H1: βi ≠ 0

• Test statistic: t = (bᵢ - βᵢ) / s_{bᵢ}, with d.f. = n - k - 1

• Excel printout:

              Coefficients  Standard Error  t Stat  P-value
Intercept     38.14         6.99            5.45    0.0000
Number        -0.0076       0.0013          -6.07   0.0000
Nearest       1.65          0.63            2.60    0.0108
Office Space  0.020         0.0034          5.80    0.0000
Enrollment    0.21          0.13            1.59    0.1159
Income        0.41          0.14            2.96    0.0039
Distance      -0.23         0.18            -1.26   0.2107
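A sketch recomputing the t statistics and two-tail p-values from the printed coefficients and standard errors (the results differ slightly from the printout because the printed numbers are rounded; assumes SciPy):

```python
from scipy import stats

n, k = 100, 6
df = n - k - 1

# Coefficients and standard errors copied from the Excel printout.
terms = {
    "Number":       (-0.0076, 0.0013),
    "Nearest":      (1.65,    0.63),
    "Office Space": (0.020,   0.0034),
    "Enrollment":   (0.21,    0.13),
    "Income":       (0.41,    0.14),
    "Distance":     (-0.23,   0.18),
}

for name, (b, se) in terms.items():
    t = b / se                      # test statistic for H0: beta_i = 0
    p = 2 * stats.t.sf(abs(t), df)  # two-tail p-value
    print(f"{name:12s} t = {t:6.2f}  p = {p:.4f}")
```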


Using the Linear Regression Equation

• The model can be used for making predictions by
  – producing a prediction interval estimate for a particular value of y, for given values of the xᵢ;
  – producing a confidence interval estimate for the expected value of y, for given values of the xᵢ.

• The model can be used to learn about the relationships between the independent variables xᵢ and the dependent variable y by interpreting the coefficients βᵢ.


La Quinta Inns, Predictions (Xm18-01)

• Predict the average operating margin of an inn at a site with the following characteristics:
  – 3815 rooms within 3 miles,
  – closest competitor 0.9 miles away,
  – 476,000 sq ft of office space,
  – 24,500 college students,
  – $35,000 median household income,
  – 11.2 miles to the downtown center.

Margin = 38.14 - 0.0076(3815) + 1.65(0.9) + 0.020(476) + 0.21(24.5) + 0.41(35) - 0.23(11.2) = 37.1%


La Quinta Inns, Predictions

• Interval estimates by Excel (Data Analysis Plus):

Prediction Interval: Margin
Predicted value                       37.1
Prediction Interval
  Lower limit                         25.4
  Upper limit                         48.8
Interval Estimate of Expected Value
  Lower limit                         33.0
  Upper limit                         41.2

It is predicted, with 95% confidence, that the operating margin will lie between 25.4% and 48.8%. It is estimated that the average operating margin of all sites that fit this category falls between 33% and 41.2%. The average inn would not be profitable (less than 50%).
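Continuing the earlier statsmodels sketch (the fitted "model" object and its column layout are assumed from that sketch), both intervals could be obtained like this:

```python
import pandas as pd

# Site characteristics, in the same column order used to fit the model.
site = pd.DataFrame({"const": [1.0], "Number": [3815], "Nearest": [0.9],
                     "Office Space": [476], "Enrollment": [24.5],
                     "Income": [35], "Distance": [11.2]})

pred = model.get_prediction(site)
frame = pred.summary_frame(alpha=0.05)            # 95% intervals

print(frame["mean"])                              # point prediction, about 37.1
print(frame[["obs_ci_lower", "obs_ci_upper"]])    # prediction interval for one inn
print(frame[["mean_ci_lower", "mean_ci_upper"]])  # interval for the expected margin
```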


Assessment and Interpretation: MBA Program Admission Policy

• The dean of a large university wants to raise the admission standards to the popular MBA program.

• She plans to develop a method that can predict an applicant’s performance in the program.

• She believes a student’s success can be predicted by:– Undergraduate GPA– Graduate Management Admission Test (GMAT) score– Number of years of work experience


MBA Program Admission Policy

• A random sample of students who completed the MBA program was selected. (See MBA.)
• Develop a plan to decide which applicants to admit.

MBA GPA  UnderGPA  GMAT  Work
8.43     10.89     584   9
6.58     10.38     483   7
8.15     10.39     484   4
8.88     10.73     646   6
...


MBA Program Admission Policy

• Solution: the model to estimate is

y = β0 + β1x1 + β2x2 + β3x3 + ε

where
y = MBA GPA
x1 = undergraduate GPA [UnderGPA]
x2 = GMAT score [GMAT]
x3 = years of work experience [Work]

• The estimated model:
MBA GPA = b0 + b1UnderGPA + b2GMAT + b3Work


MBA Program Admission Policy – Model Diagnostics

SUMMARY OUTPUT

Regression Statistics
Multiple R          0.6808
R Square            0.4635
Adjusted R Square   0.4446
Standard Error      0.788
Observations        89

ANOVA
            df  SS     MS     F      Significance F
Regression  3   45.60  15.20  24.48  0.0000
Residual    85  52.77  0.62
Total       88  98.37

           Coefficients  Standard Error  t Stat  P-value
Intercept  0.466         1.506           0.31    0.7576
UnderGPA   0.063         0.120           0.52    0.6017
GMAT       0.011         0.001           8.16    0.0000
Work       0.093         0.031           3.00    0.0036

We estimate the regression model, then we check the normality of the errors.

[Figure: histogram of the standardized residuals, binned from -2.5 to 2.5 and beyond.]


MBA Program Admission Policy – Model Diagnostics

We then check that the variance of the error variable is constant (the regression output is shown above).

[Figure: residuals plotted against predicted values; the residuals range from about -3 to 2 over predicted values of 6 to 10.]


MBA Program Admission Policy – Model Assessment

• The model is valid (p-value = 0.0000).
• 46.35% of the variation in MBA GPA is explained by the model.
• GMAT score and years of work experience are linearly related to MBA GPA.
• There is insufficient evidence of a linear relationship between undergraduate GPA and MBA GPA.


18.4 Regression Diagnostics - II

• The conditions required for the model assessment to apply must be checked.
  – Is the error variable normally distributed? Draw a histogram of the residuals.
  – Is the error variance constant? Plot the residuals versus ŷ.
  – Are the errors independent? Plot the residuals versus the time periods.
  – Can we identify outliers?
  – Is multicollinearity (intercorrelation) a problem?
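A minimal plotting sketch for the first three checks, assuming matplotlib is available and that "model" is a fitted statsmodels OLS result like the one from the earlier sketch:

```python
import matplotlib.pyplot as plt

resid = model.resid            # residuals from the fitted model
fitted = model.fittedvalues    # predicted y values

fig, axes = plt.subplots(1, 3, figsize=(12, 3.5))

axes[0].hist(resid, bins=12)             # normality check
axes[0].set_title("Histogram of residuals")

axes[1].scatter(fitted, resid)           # constant-variance check
axes[1].axhline(0)
axes[1].set_title("Residuals vs fitted y")

axes[2].plot(resid.values)               # independence check (time order)
axes[2].axhline(0)
axes[2].set_title("Residuals vs time order")

plt.tight_layout()
plt.show()
```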


Diagnostics: Multicollinearity

• Example 18.2: Predicting house price (Xm18-02)
  – A real estate agent believes that a house's selling price can be predicted using the house size, number of bedrooms, and lot size.
  – A random sample of 100 houses was drawn and the data recorded.
  – Analyze the relationship among the four variables.

Price   Bedrooms  H Size  Lot Size
124100  3         1290    3900
218300  4         2080    6600
117800  3         1250    3750
...


Diagnostics: Multicollinearity

• The proposed model is

PRICE = β0 + β1BEDROOMS + β2H-SIZE + β3LOTSIZE + ε

SUMMARY OUTPUT

Regression Statistics
Multiple R          0.7483
R Square            0.5600
Adjusted R Square   0.5462
Standard Error      25023
Observations        100

ANOVA
            df  SS            MS           F      Significance F
Regression  3   76501718347   25500572782  40.73  0.0000
Residual    96  60109046053   626135896
Total       99  136610764400

            Coefficients  Standard Error  t Stat  P-value
Intercept   37718         14177           2.66    0.0091
Bedrooms    2306          6994            0.33    0.7423
House Size  74.30         52.98           1.40    0.1640
Lot Size    -4.36         17.02           -0.26   0.7982

The model is valid, but no variable is significantly related to the selling price?!


Diagnostics: Multicollinearity

• Multicollinearity is found to be a problem:

          Price   Bedrooms  H Size  Lot Size
Price     1
Bedrooms  0.6454  1
H Size    0.7478  0.8465    1
Lot Size  0.7409  0.8374    0.9936  1

• Multicollinearity causes two kinds of difficulties:
  – The t statistics appear to be too small.
  – The coefficients cannot be interpreted as "slopes".
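A sketch of how the correlation matrix and variance inflation factors could be computed, assuming the Xm18-02 data sit in a CSV with the column names above (the file name is an assumption):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Assumes Xm18-02 was exported to a CSV with these column names.
df = pd.read_csv("Xm18-02.csv")
print(df[["Price", "Bedrooms", "H Size", "Lot Size"]].corr())

# Variance inflation factors for the three predictors (VIF > 10 is a
# common rule of thumb for serious multicollinearity).
X = sm.add_constant(df[["Bedrooms", "H Size", "Lot Size"]])
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, variance_inflation_factor(X.values, i))
```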


Remedying Violations of the Required Conditions

• Nonnormality or heteroscedasticity can be remedied using transformations on the y variable.

• The transformations can improve the linear relationship between the dependent variable and the independent variables.

• Many computer software systems allow us to make the transformations easily.


Reducing Nonnormality by Transformations

• A brief list of transformations:
  » y' = log y (for y > 0)
    • Use when σε increases with y, or
    • Use when the error distribution is positively skewed.
  » y' = y²
    • Use when σε² is proportional to E(y), or
    • Use when the error distribution is negatively skewed.
  » y' = y^(1/2) (for y > 0)
    • Use when σε² is proportional to E(y).
  » y' = 1/y
    • Use when σε² increases significantly when y increases beyond some critical value.
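A minimal sketch of applying the listed transformations before refitting; the y values are made up purely for illustration (y must be positive for the log and square-root forms):

```python
import numpy as np

# Made-up positive y values purely for illustration.
y = np.array([12.0, 15.5, 22.1, 35.0, 61.3])

y_log   = np.log(y)     # when sigma_eps increases with y, or positive skew
y_sq    = y ** 2        # when sigma_eps^2 is proportional to E(y), or negative skew
y_sqrt  = np.sqrt(y)    # when sigma_eps^2 is proportional to E(y)
y_recip = 1.0 / y       # when sigma_eps^2 rises sharply beyond some critical y

# The regression is then refit with the transformed y' in place of y.
```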


Durbin-Watson Test: Are the Errors Autocorrelated?

• This test detects first-order autocorrelation between consecutive residuals in a time series.
• If autocorrelation exists, the error variables are not independent.

The test statistic is

d = Σ_{i=2}^{n} (e_i - e_{i-1})² / Σ_{i=1}^{n} e_i²

where e_i is the residual at time i. The range of d is 0 ≤ d ≤ 4.
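A direct implementation sketch of this statistic; the residual values are made up to illustrate that similar consecutive residuals give a small d:

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson statistic: sum of squared successive differences
    of the residuals divided by the sum of squared residuals."""
    e = np.asarray(e, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Illustrative residuals that drift slowly (positively autocorrelated).
e = [1.2, 1.0, 0.7, 0.4, -0.1, -0.6, -0.9, -1.1]
print(durbin_watson(e))   # well below 2, signalling positive autocorrelation
```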


Positive First Order Autocorrelation

Positive first-order autocorrelation occurs when consecutive residuals tend to be similar. Then the value of d is small (less than 2).

[Figure: residuals plotted against time, with consecutive residuals close in value.]


Negative First Order Autocorrelation

Negative first-order autocorrelation occurs when consecutive residuals tend to differ markedly. Then the value of d is large (greater than 2).

[Figure: residuals plotted against time, alternating in sign from one period to the next.]


One-Tail Test for Positive First Order Autocorrelation

• If d < dL, there is enough evidence to show that positive first-order correlation exists.
• If d > dU, there is not enough evidence to show that positive first-order correlation exists.
• If d is between dL and dU, the test is inconclusive.

[Diagram: the d axis divided at dL and dU into three regions: "positive first order correlation exists", "inconclusive test", "positive first order correlation does not exist".]


One-Tail Test for Negative First Order Autocorrelation

• If d > 4 - dL, negative first-order correlation exists.
• If d < 4 - dU, negative first-order correlation does not exist.
• If d falls between 4 - dU and 4 - dL, the test is inconclusive.

[Diagram: the d axis divided at 4 - dU and 4 - dL into three regions: "negative first order correlation does not exist", "inconclusive test", "negative first order correlation exists".]


Two-Tail Test for First Order Autocorrelation

• If d < dL or d > 4 - dL, first-order autocorrelation exists.
• If d falls between dL and dU, or between 4 - dU and 4 - dL, the test is inconclusive.
• If d falls between dU and 4 - dU, there is no evidence for first-order autocorrelation.

[Diagram: the d axis from 0 to 4, marked at dL, dU, 2, 4 - dU, and 4 - dL; the outer regions indicate "first order correlation exists", the middle region "first order correlation does not exist", and the two bands in between "inconclusive test".]
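The two-tail decision rule as a small sketch; dL and dU come from a Durbin-Watson table for the given n and k (the values shown are those used in the ski-resort example that follows):

```python
def dw_two_tail(d, dL, dU):
    """Two-tail Durbin-Watson decision rule described above."""
    if d < dL or d > 4 - dL:
        return "first-order autocorrelation exists"
    if dL <= d <= dU or 4 - dU <= d <= 4 - dL:
        return "the test is inconclusive"
    return "no evidence of first-order autocorrelation"

# Values from the ski-resort example that follows: n = 20, k = 2.
print(dw_two_tail(0.5931, dL=1.10, dU=1.54))  # autocorrelation exists
```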


Testing the Existence of Autocorrelation, Example

• Example 18.3 (Xm18-03)
  – How does the weather affect the sales of lift tickets in a ski resort?
  – Data on the past 20 years' ticket sales, along with the total snowfall and the average temperature during Christmas week in each year, were collected.
  – The model hypothesized was

    TICKETS = β0 + β1SNOWFALL + β2TEMPERATURE + ε

  – Regression analysis yielded the following results:


The Regression Equation – Assessment (I)

Xm18-03

SUMMARY OUTPUT

Regression Statistics
Multiple R          0.3465
R Square            0.1200
Adjusted R Square   0.0165
Standard Error      1712
Observations        20

ANOVA
            df  SS        MS       F     Signif. F
Regression  2   6793798   3396899  1.16  0.3373
Residual    17  49807214  2929836
Total       19  56601012

           Coefficients  Standard Error  t Stat  P-value
Intercept  8308.0        903.73          9.19    0.0000
Snowfall   74.59         51.57           1.45    0.1663
Tempture   -8.75         19.70           -0.44   0.6625

The model seems to be very poor:
• R-square = 0.1200
• It is not valid (Signif. F = 0.3373)
• No variable is linearly related to Sales


Diagnostics: The Error Distribution

[Figure: histogram of the residuals, binned from -2.5 to 2.5 and beyond.]

The errors may be normally distributed.


Diagnostics: Heteroscedasticity

[Figure: residuals plotted against predicted y (roughly 7500 to 12500); the residuals range from about -4000 to 3000.]

It appears there is no problem of heteroscedasticity (the error variance seems to be constant).


Diagnostics: First Order Autocorrelation

[Figure: residuals plotted over time (periods 1 to 20); the residuals range from about -4000 to 3000.]

The errors are not independent!


Diagnostics: First Order Autocorrelation

Test for positive first-order autocorrelation:
n = 20, k = 2. From the Durbin-Watson table we have dL = 1.10 and dU = 1.54. The statistic is d = 0.5931.

Conclusion: Because d < dL, there is sufficient evidence to infer that positive first-order autocorrelation exists.

Using the computer (Excel):
Tools > Data Analysis > Regression (check the residual option and then OK)
Tools > Data Analysis Plus > Durbin-Watson Statistic > highlight the range of the residuals from the regression run > OK

The residuals:
-2793.99
-1723.23
-2342.03
-956.955
-1963.73
...

Durbin-Watson Statistic: d = 0.5931


The Modified Model: Time Included

The modified regression model (Xm18-03mod):

TICKETS = β0 + β1SNOWFALL + β2TEMPERATURE + β3TIME + ε

• All the required conditions are met for this model.
• The fit of this model is high: R² = 0.7410.
• The model is valid: Significance F = 0.0001.
• SNOWFALL and TIME are linearly related to ticket sales.
• TEMPERATURE is not linearly related to ticket sales.