
psyc3010 lecture 9 - University of Queensland (uqwloui1/stats/3010 for post/3010l9 for post.pdf, 21/04/2011)


1

psyc3010 lecture 9

mediation and moderation in multiple regression (MR)

last lecture: standard and hierarchical MR
next lecture: review of regression topics

2

last lecture → this lecture

last lecture
– detailed look at regression with predictors entered simultaneously (standard MR)
– introduction to regression with predictors entered sequentially (hierarchical MR)

this lecture
– applications of hierarchical multiple regression:
• mediation (conceptual intro)
• moderation

3

announcements

Week 10 lecture – review of regression topics
– Some cool extra information on indirect effects and structural equation modelling
– Practice questions and practice quiz will be posted online

Quiz 2 – Week 11
– assesses material taught in Lectures 6, 7, 8, and 9

Assignment 2 – due Week 12
– All files now on Blackboard
– Will learn all skills and concepts by end of this week
– Week 10 tutorials provide tips
– Week 11 tutorials provide consultation

4

topics for this lecture

mediation
– what it means
– steps in the analysis
– what to report

moderation
– what it means
– steps in the analysis (what to report is covered in tutorials)
– additional issues (e.g., predictor importance)
– advanced topics + resources

5

mediation analyses

6

what mediation is

so far, we've considered direct relationships between predictors and a criterion

e.g., more therapy → lower depression

sometimes, these relationships don't say much about underlying processes or mechanisms

why does therapy decrease depression?

a third variable, or mediator, may explain or account for the relationship between an IV and a DV, e.g.:

therapy → change in negative thoughts → lower depression
therapy → social support → lower depression

thus, the original predictor has an indirect relationship with the criterion

7

mediation: IV "causes" DV through mediator

[diagram: path A runs from the independent variable to the mediator; path C from the mediator to the dependent variable; path B directly from the independent variable to the dependent variable]

1) IV is related to ('causes') mediator (path A)
2) IV is related to DV (path B)
3) mediator is related to DV when the effect of the IV is controlled for (path C.B)
4) IV is no longer related to DV when the effect of the mediator is controlled for (path B.C)
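The four conditions above can be sketched numerically. This is a minimal illustration in Python/numpy on simulated data – the variable names and effect sizes are hypothetical, not the lecture's dataset:

```python
# Hedged sketch of the four mediation conditions using OLS via numpy.
# Simulated data: the DV depends on the IV only through the mediator,
# so we expect (close to) full mediation.
import numpy as np

rng = np.random.default_rng(0)
n = 200
iv = rng.normal(size=n)                   # e.g., amount of therapy
mediator = 0.6 * iv + rng.normal(size=n)  # e.g., change in negative thoughts
dv = 0.7 * mediator + rng.normal(size=n)  # e.g., reduction in depression

def coefs(y, *predictors):
    """OLS coefficients of y on the predictors (intercept last)."""
    X = np.column_stack(predictors + (np.ones(len(y)),))
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

path_a = coefs(mediator, iv)[0]                 # 1) IV -> mediator
path_b = coefs(dv, iv)[0]                       # 2) IV -> DV (total effect)
path_cb, path_bc = coefs(dv, mediator, iv)[:2]  # 3) and 4), each controlled

# full mediation: path_cb stays substantial, path_bc shrinks toward zero
print(path_a, path_b, path_cb, path_bc)
```

Because the simulated DV depends on the IV only via the mediator, path_bc comes out much smaller than path_b, which is the full-mediation pattern described above.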

8

testing and reporting mediation

Regression analysis #1 (standard MR): regress the mediator on the IV
– shows path A: IV and mediator are related
– report: significant R2 and coefficient (b or β)

9

partial vs. full mediation

Regression analysis #2 (HMR):
– Block 1: predict DV from IV
  shows path B: IV → DV
  report: significant R2 & coefficient
– Block 2: predict DV from IV + mediator
  shows path C: mediator → DV, controlling for IV
  report: R2 change + coefficients for IV and mediator

• if the coefficient for the mediator is significant, then condition 3 is met (path C.B is significant)
• the coefficient for the IV (path B.C) also matters:
  – if no longer significant → full mediation
  – if still significant → partial mediation

10

moderation analyses

Human salvation lies in the hands of the creatively maladjusted.

--Martin Luther King Jr.

11

interactions in MR

the relationship between a criterion and a predictor varies as a function of a second predictor
the second predictor is usually called a moderator
a moderator enhances or attenuates the relationship between criterion and predictor

examples:
* stress → lower health, moderated by social support
* dieting → lower weight, moderated by exercise

moderated regression achieves the same purpose as examination of interactions in factorial ANOVA: effect of IV X at different levels of moderator Z

12

standard regression

[diagram: predictor1 (X) and predictor2 (Z) each with a direct arrow to criterion (Y)]

investigation of independent relationships between predictors and criterion

13

moderated regression

[diagram: predictor (X) → criterion (Y), with moderator (Z) pointing at the X → Y arrow]

moderator Z has an 'effect' on the relationship between X and Y

moderator (Z) could set limits or "boundary conditions" on the X → Y relationship
e.g.: eating increases weight only when exercise level is low
      stress worsens health only when social support is low

14

moderated regression

[diagram: as above, with moderator (Z) influencing the X → Y relationship]

A moderated multiple regression model tests the direct effect of Z, plus the direct effect of X, and the XZ interaction

As in ANOVA, the tests are conceptually distinct – we cannot infer that an interaction is significant because the direct effects are, or vice versa

15

moderation versus mediation

moderation:
focus is on the direct X → Y relationship: Z adjusts it
• at low Z, the X → Y relationship is different compared to the X → Y relationship at high Z
• moderator does not explain the X → Y relationship (no "because")
• moderator often uncorrelated with IV
  e.g., family emergency → well-being, moderated by exercise

mediation:
focus is on the indirect relationship of X → Y via Z
• X causes Y because X causes Z, which in turn causes Y
• mediator is associated with IV (+ or - relationship)
  e.g., exercise → lower stress → higher well-being

16

linear model for moderation

additive effects:
Ŷ = b1X + b2Z + a

& non-additive (interactive) effects:
Ŷ = b1X + b2Z + b3XZ + a

analysis plan: test whether a model including the interaction term increases the variance accounted for in the DV, compared to a model with just the two IVs considered independently
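The analysis plan can be sketched as a comparison of two R² values. A hedged numpy illustration on simulated data (names and effect sizes are made up; the data are built to contain a real interaction):

```python
# Compare R^2 of the additive model (X, Z) with the model that adds the
# XZ product term. Simulated data with a genuine interaction effect.
import numpy as np

rng = np.random.default_rng(1)
n = 150
x = rng.normal(size=n)   # predictor
z = rng.normal(size=n)   # moderator
y = 0.3 * x + 0.3 * z - 0.4 * x * z + rng.normal(size=n)

def r_squared(y, *predictors):
    """R^2 from an OLS fit of y on the predictors plus an intercept."""
    X = np.column_stack(predictors + (np.ones(len(y)),))
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return 1 - resid.var() / y.var()

r2_additive = r_squared(y, x, z)
r2_interactive = r_squared(y, x, z, x * z)
r2_change = r2_interactive - r2_additive   # analogous to R^2-change in HMR
print(r2_change)
```

Because the simulated interaction is real, r2_change comes out clearly above zero; in the hierarchical analysis later in the lecture, the same increment is tested with an F-change statistic.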

17

graph of no interaction
[figure: regression plane, additive effects only]
the relationship (i.e., the slope) between X1 and Y is the same at all values of X2

18

graph of interaction
[figure: regression plane with interactive effects, varying b3]
the relationship (i.e., the slope) between X1 and Y varies over values of X2

19

questions in moderated regression

1. does the XZ interaction contribute significantly to the prediction of Y?
   in hierarchical regression:
   – enter direct effects in 1st block
   – enter interaction term in 2nd block
   – significant R2 change indicates a significant interaction

2. how do we interpret the effect Z has on the X → Y relationship?
   in ANOVA, we examine the simple effects of IV1 at different levels of IV2
   similarly, in moderated regression, we examine the simple slopes of X → Y lines at different values of Z

20

a hypothetical example

UQ wants to admit students who will do well: what indicator should be used as the admissions criterion?

school performance (OP) is a known positive predictor of university performance (GPA)

BUT recent work has shown that students with lower OPs often get high GPAs at university

how to explain this?

perhaps motivation to do well at uni influences (i.e., moderates) the relationship between OP and GPA

OP is reverse-scored for this example so that higher OP = better school performance

21

preliminary statistics

Correlations (N = 147):
              GPA     OP      Motivation
GPA           1       .319**  .338**
OP            .319**  1       .018
Motivation    .338**  .018    1

** Correlation is significant at the 0.01 level (2-tailed).

OP and motivation are both correlated with GPA (high validities) ☺

OP and motivation are not correlated with each other (low collinearity) ☺

22

standard multiple regression

Model Summary: R = .460, R Square = .212, Adjusted R Square = .201, Std. Error of the Estimate = .54438
Predictors: (Constant), Motivation, OP

Coefficients (DV: GPA):
             B      Std. Error  Beta   t       Sig.   Zero-order  Partial  Part
(Constant)   4.115  .231               17.831  .000
OP           .059   .014        .313   4.224   .000   .319        .332     .313
Motivation   .212   .047        .332   4.485   .000   .338        .350     .332

OP and motivation together account for 21% of the variance in GPA

their individual contributions are also substantial and significant

23

moderated multiple regression (MMR)

steps in testing for moderation:
1) calculate interaction term
2) test for significance of interaction
3) if interaction is significant, test simple slopes (similar to simple effects in ANOVA)
4) plot interaction on graph

24

MMR step 1: calculate interaction term

25

calculating the interaction term

• standard regression deals with the additive effects
• now we need to compute the interaction term, so that we can test for moderation:

Ŷ = b1X + b2Z + b3XZ + a

• annoyingly, SPSS does not do this for us – we have to manually create the interaction term as a new variable, and then add it into the regression equation
• to do this we could simply compute a new variable by multiplying X and Z together

however, this would make the resulting interaction term highly collinear with our other predictors

26

problem: multicollinearity between interaction term and predictors

participant  OP (X)  motivation (Z)  OP x MOT (XZ)  GPA (Y)
1            4       1               4              3.5
2            6       2               12             3.9
3            8       3               24             4.5
4            10      4               40             5.1
5            12      5               60             5.5

the interaction term formed by calculating the cross-products of the original predictors will always be correlated with those predictors

27

solution: mean-centering

what to do: subtract the mean of each predictor from each observation's score on that predictor

participant  cOP (X)  cMOT (Z)  cOP x cMOT (XZ)  GPA (Y)
1            -4       -2        8                3.5
2            -2       -1        2                3.9
3            0        0         0                4.5
4            2        1         2                5.1
5            4        2         8                5.5

an interaction term based on mean-centered predictors will not be (as) correlated with those predictors
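The centering effect is easy to demonstrate with simulated data. A hedged sketch in numpy – the scales are hypothetical (positive-valued, like the raw OP and motivation scores), not the lecture's dataset:

```python
# With positive-valued predictors, the raw product XZ is strongly
# correlated with its components; after mean-centering it largely isn't.
import numpy as np

rng = np.random.default_rng(2)
n = 147                           # same N as the lecture example, for flavour
op = rng.uniform(1, 25, size=n)   # positive scale, like raw OP
mot = rng.uniform(1, 7, size=n)   # positive scale, like raw motivation

c_op, c_mot = op - op.mean(), mot - mot.mean()

r_raw = np.corrcoef(op * mot, op)[0, 1]             # raw XZ vs. raw X
r_centered = np.corrcoef(c_op * c_mot, c_op)[0, 1]  # centered versions
print(r_raw, r_centered)   # r_raw is large; r_centered is near zero
```

This mirrors the pattern on the next slide: the uncentered interaction term correlates strongly with its components, while the centered one does not.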

28

without centering:
correlation of INT with OP = .524** (p = .000), and with MOT = .839** (p = .000), N = 147

with centering:
correlation of C_INT with C_OP = .014 (p = .865), and with C_MOT = .105 (p = .206), N = 147

** Correlation is significant at the 0.01 level (2-tailed).

29

why mean-centering is good

1) reduces multicollinearity (as shown on previous slides)

2) easier to interpret coefficients in the presence of an interaction:
– in MR, b1X tells us the relationship between X and Y when all other predictors (e.g., Z) are set to zero
– this is fine when there is no interaction: b1 is constant & will be the same whether Z = 0, or Z = 7, and so on
– BUT when there is an interaction: b1 is different at different values of Z, so why use Z = 0 to calculate it? (maybe it's an end-point, or not even on the scale!)
– by using centred variables, b1 represents the relationship between X and Y at the mean of Z (i.e., 0)
– this makes the coefficients for the direct effects more meaningful

30

MMR step 2: test for significance of interaction

31

hierarchical regression analysis: test of the interaction term

1st block: enter centered IVs as predictors of DV (additive effects)

2nd block: enter interaction term (previously calculated) to see if it accounts for additional variance in DV

32

preliminary statistics

Descriptive Statistics (N = 147):
        Minimum  Maximum  Mean    Std. Deviation
GPA     3.40     6.50     5.4754  .60891
C_OP    -12.92   5.08     .0004   3.24039
C_MOT   -1.56    2.44     .0000   .95284

centered predictors were computed by subtracting their mean from all values – hence their new mean is zero; standard deviations have not changed

• no need to center the criterion, because there is no collinearity problem
• can then interpret the solution in terms of the DV scale

33

preliminary statistics

Correlations (N = 147):
        GPA     C_OP    C_MOT   C_INT
GPA     1       .319**  .338**  -.204*
C_OP    .319**  1       .018    .014
C_MOT   .338**  .018    1       .105
C_INT   -.204*  .014    .105    1

** Correlation is significant at the 0.01 level (2-tailed).
* Correlation is significant at the 0.05 level (2-tailed).

none of the original correlations is influenced by centering

the interaction term is correlated with the criterion

34

moderated regression

Model Summary:
Model  R     R Square  Adj. R Sq  Std. Error  R Sq Change  F Change  df1  df2  Sig. F Change
1      .460  .212      .201       .54438      .212         19.335    2    144  .000
2      .521  .271      .256       .52520      .060         11.706    1    143  .001

Model 1 predictors: (Constant), C_MOT, C_OP
Model 2 predictors: (Constant), C_MOT, C_OP, C_INT

step 1: results are identical to standard MR with OP and motivation as predictors

step 2:
* R2 change = .06, so the interaction explains 6% of the variance in GPA over and above the additive effects (motivation and OP)
* Fch (1, 143) = 11.71, p = .001, so this increment in explained variance is significant

35

MMR step 3: test simple slopes

36

what are simple slopes again?

they help us interpret a significant interaction

simple effects in ANOVA: examine the effect of Factor A at different levels of Factor B
simple slopes in MMR: examine the IV–DV relationship at different levels of the moderator

predictors in MMR are continuous – they have no levels
okay, technically each point on a scale is a level, but doing so many follow-up tests is not good (Type 1 errors, parsimony, sanity...)

we select critical values of the moderator where it is interesting to examine the simple slopes of the association between the predictor and Y
we use logical grounds, usually +1 and -1 SD of the moderator ("high" and "low" levels of Z)

37

another look at simple slopes

[figure: regression plane with interactive effects, varying b3; axes X, Z, Y]

we examine the relationship between X and Y at high and low values of Z
(and typically the values chosen are ±1 SD of Z)

38

results from step 2 of HMR

Coefficients (DV: GPA):
Model 1      B      Std. Error  Beta   t        Sig.   Part
(Constant)   5.475  .045               121.949  .000
C_OP         .059   .014        .313   4.224    .000   .313
C_MOT        .212   .047        .332   4.485    .000   .332

Model 2      B      Std. Error  Beta   t        Sig.   Part
(Constant)   5.478  .043               126.442  .000
C_OP         .059   .013        .316   4.420    .000   .315
C_MOT        .229   .046        .358   4.982    .000   .356
C_INT        -.047  .014        -.246  -3.421   .001   -.244

step 2 provides partial regression coefficients that specify the direct and interactive effects in the regression equation:

Ŷ = 0.059C_OP + 0.229C_MOT - 0.047C_INT + 5.478

39

deriving equations for simple slopes

• we now have a single regression line predicting Ŷ from a combination of X, Z, and their interaction:

Ŷ = B1X + B2Z + B3XZ + c

• to further examine the effect of Z on the X → Y relationship, we want to create two regression equations:
– slope of Y regressed upon X at high levels of Z
– slope of Y regressed upon X at low levels of Z

• so we recast the original equation to allow us to derive these two new ones:

Ŷ = (B1 + B3Z)X + (B2Z + c)

more details on next slide

40

trust the algebra

Ŷ = B1X + B2Z + B3XZ + c

Ŷ = (B1 + B3Z)X + (B2Z + c)

effect of X at different levels of Z:
• B3 tells us how much B1 changes as a result of the interaction
• we can insert different values of Z to create our two new equations and calculate:
* slope of X at high Z (e.g., +1 SD)
* slope of X at low Z (e.g., -1 SD)
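The algebra can be checked directly with the step-2 coefficients from this lecture's output (B1 = .059 for C_OP, B3 = -.047 for C_INT, and SD of centered motivation = .95284). A few lines of Python:

```python
# Plugging Z = +/-1 SD of motivation into (B1 + B3*Z): the results match
# the C_OP slopes SPSS reports when motivation is re-centered.
b1 = 0.059        # B for C_OP at step 2
b3 = -0.047       # B for C_INT at step 2
sd_mot = 0.95284  # SD of centered motivation

slope_high = b1 + b3 * sd_mot    # simple slope of OP -> GPA at high motivation
slope_low = b1 + b3 * -sd_mot    # simple slope of OP -> GPA at low motivation
print(round(slope_high, 3), round(slope_low, 3))  # 0.014 0.104
```

These are exactly the C_OP coefficients that appear later when the moderator is re-centered at +1 SD (.014) and -1 SD (.104), so the re-centering procedure really is just this substitution done by SPSS.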

41

results from step 2 of HMR

[same coefficients output as slide 38; in Model 2: C_OP B = .059, C_MOT B = .229, C_INT B = -.047]

how can we make SPSS test the slope of OP when motivation = +1 SD?

remember that b1 is the slope for X → Y when Z = 0
since we have centered our predictors, 0 is the mean of Z
thus, the "normal" test of b1 is the slope for c_op → Y at the mean value of Z

logically, if we now centre Z at +1 SD, then b1 would be the slope for X → Y at high Z

42

tests of simple slopes for high & low moderator

first: create two new variables for the moderator, one for each level you are interested in (e.g., high and low)

formulae: add or subtract 1 SD (standard deviation)
ModABOVE = cmod – (SD)
ModBELOW = cmod – (– SD)

yes, it's counter-intuitive

be consistent in your labelling to help remember what each variable is
include cues to remind you it's centered at +1 or -1 SD

example 1: c_mot_hi = cmot – (.95) and c_mot_lo = cmot – (-.95)
example 2: highc_mot = cmot – (.95) and lowc_mot = cmot – (-.95)

43

tests of simple slopes for high & low moderator

second: form the cross-products of the new moderator variable with the other predictor IV, by computing IV*ModCV

again, be consistent in the labels of your variables so you remember what things are

e.g., can use the prefix ci_ to stand for centered interaction:

compute ci_opxmhi = c_op * c_mot_hi .
compute ci_opxmlo = c_op * c_mot_lo .
execute .

44

tests of simple slopes for high & low moderator

third: regress Y on IV, ModCV, & IV*ModCV for each of the 2 critical values (high and low)

examine the test of the IV in the final block, with ModCV and the interaction of IV & ModCV also in the equation

that is, test bIV for the simple slope of Y on IV at the conditional value of Mod (high or low)

45

test simple slope of OP at high motivation

center motivation at +1 SD and re-run the MMR, i.e.,
– subtract 0.9528 from C_MOT to give C_MOThi
– recalculate the interaction term (C_OP x C_MOThi) to give INThi
– enter C_OP and C_MOThi at step 1 and INThi at step 2

Coefficients (DV: GPA):
Model 1      B      Std. Error  Beta   t       Sig.   Part
(Constant)   5.678  .064               89.253  .000
C_OP         .059   .014        .313   4.224   .000   .313
MOTHIGH      .212   .047        .332   4.485   .000   .332

Model 2      B      Std. Error  Beta   t       Sig.   Part
(Constant)   5.696  .062               92.458  .000
C_OP         .014   .019        .076   .764    .446   .055
MOTHIGH      .229   .046        .358   4.982   .000   .356
C_INT_HI     -.047  .014        -.341  -3.421  .001   -.244

b1 is now the slope for X → Y at high Z

46

test simple slope of OP at low motivation

center motivation at -1 SD and re-run the MMR, i.e.,
– subtract -0.9528 from C_MOT to give C_MOTlo
– recalculate the interaction term (C_OP x C_MOTlo) to give INTlo
– enter C_OP and C_MOTlo at step 1 and INTlo at step 2

Coefficients (DV: GPA):
Model 1      B      Std. Error  Beta   t       Sig.   Part
(Constant)   5.273  .064               82.900  .000
C_OP         .059   .014        .313   4.224   .000   .313
MOTLOW       .212   .047        .332   4.485   .000   .332

Model 2      B      Std. Error  Beta   t       Sig.   Part
(Constant)   5.260  .061               85.551  .000
C_OP         .104   .019        .555   5.517   .000   .394
MOTLOW       .229   .046        .358   4.982   .000   .356
C_INT_LO     -.047  .014        -.346  -3.421  .001   -.244

b1 is now the slope for X → Y at low Z

47

MMR step 4: plot simple slopes on graph

48

plotting the simple slopes

the two equations we just derived tell us:
(1) the slope of the IV → DV line at the high level of the moderator
(2) the slope of the IV → DV line at the low level of the moderator

although we know the significance of the slopes, it's hard to interpret the IV–DV relationships using equations alone

so we plot both regression lines on a graph – to do this, we need to know two points on each line

49

(1) plotting simple slopes by hand

for the low moderator regression equation:
– use the constant and simple slope from the SPSS output (Step 2 of HMR) to create the regression equation:
  Ŷ = 0.104C_OP + 5.260
– calculate 2 values of Ŷ, by inserting two values of X (i.e., the IV OP)
– typically use high and low values of X: high = +1 SD, low = -1 SD
– indicate these two points on the graph
– draw the regression line between them

repeat for the high moderator regression equation

50

back to our example

OP (IV) descriptives: mean = 0 (centered), SD = 3.240

equation for low motivation (low level of moderator):
Ŷ = 0.104C_OP + 5.260 (from output on slide 46)
for low OP: Ŷ = 0.104 x -3.240 + 5.260 = 4.923
for high OP: Ŷ = 0.104 x 3.240 + 5.260 = 5.596

equation for high motivation (high level of moderator):
Ŷ = 0.014C_OP + 5.696 (from output on slide 45)
for low OP: Ŷ = 0.014 x -3.240 + 5.696 = 5.650
for high OP: Ŷ = 0.014 x 3.240 + 5.696 = 5.742
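The four predicted values above can be reproduced in a few lines (a sketch; the slopes and intercepts here are the rounded values from the output, so the last decimal can differ slightly from the deck's figures, which use unrounded coefficients):

```python
# Four points for the interaction plot: GPA predicted at +/-1 SD of OP,
# under the low- and high-motivation simple-slope equations.
sd_op = 3.240  # SD of centered OP

def predict(slope, intercept, x):
    return slope * x + intercept

low_mot = [predict(0.104, 5.260, x) for x in (-sd_op, sd_op)]
high_mot = [predict(0.014, 5.696, x) for x in (-sd_op, sd_op)]
print(low_mot, high_mot)  # matches the plotted values to ~2 decimals
```

These are the four points plotted on the next slide: two for the low-motivation line, two for the high-motivation line.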

51

[figure: plot of GPA (4.5–6.0) against OP (low to high), with separate lines for Low Mot and High Mot; the four calculated points are 4.9234 and 5.5962 (low motivation), and 5.6502 and 5.7422 (high motivation)]

here are the 4 points we calculated for high/low OP and high/low motivation
(the numbers don't go on the figure)

52

[figure: the same plot, with the regression lines drawn through the points]

partial regression coefficients (bs) for the regression of GPA on OP at high and low motivation

53

(2) plotting simple slopes with Excel

plotting simple slopes by hand is clearly cumbersome and slow
there is an Excel macro to do these calculations for us, and to give us the figure
you will learn this strategy in tutorials this week...
...which will be of enormous help when you are working on Assignment 2

5454

Good work eh!
now we have gone through all the steps in moderated multiple regression:
– we identified a significant interaction
– we decomposed the interaction by examining the simple slopes
– we plotted these simple slopes on a graph

so now we have an answer to our question

our analysis showed that the relationship between OP and GPA is significant at lower levels of motivation but not at higher levels
– high motivation attenuates or buffers against the effects of poor prior academic performance on current academic performance


55

additional miscellaneous issues

56

predictor importance
in the Zhigh and Zlow regressions we just ran, the output gives us sr² values for our simple slopes:
– for C_OP in the Zhigh solution, sr = 0.055 (part r in SPSS)
  OP accounts for 0.3% of variance at high levels of motivation
– for C_OP in the Zlow solution, sr = 0.394 (part r in SPSS)
  OP accounts for 15.5% of variance at low levels of motivation
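Squaring the semi-partial (part) correlations reproduces the variance percentages quoted above:

```python
# sr (part correlation) squared = proportion of unique variance explained
sr_high = 0.055  # C_OP simple slope at high motivation
sr_low = 0.394   # C_OP simple slope at low motivation

print(f"high motivation: {sr_high ** 2:.1%}")  # → 0.3%
print(f"low motivation:  {sr_low ** 2:.1%}")   # → 15.5%
```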

standardised partial regression coefficients might be useful for interpretation, to avoid scale dependence
– e.g., at low levels of motivation, a 1-unit increase in OP results in a .104-unit increase in GPA
– not very informative if you are reading this paper in Europe or North America


57

[figure: standardised GPA (−1.5 to 1.0) plotted against OP (low to high) for low and high motivation]
standardised partial regression coefficients (βs) for the regression of OP on GPA at high and low motivation

58

[figure: the same standardised plot, annotated]
at low levels of motivation, a 1 SD increase in OP predicts a .556 SD increase in GPA
a .556 SD increase is substantial regardless of the scale we have


59

why don’t we simply use the β values in the original SPSS output?
the β values for the simple slopes are usually reported and can be interpreted

BUT the interaction-term beta should not be interpreted
– SPSS first computes the cross-product of raw X and Z and then standardises that value
– we want the cross-product of standardised X and Z, not the standardised cross-product of X and Z

OK, but it is rare to use the unstandardised solution
– b’s are scale dependent, so sr² should be reported as well to give a sense of the ‘size’ of the effect (though no one does)

a common alternative is to report the interaction as a beta (but not interpret it) and the simple slopes as betas (which are interpretable)
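The distinction between the cross-product of standardised variables and the standardised cross-product can be checked numerically (a sketch with made-up data; `zscore` is just plain standardisation):

```python
import random

def zscore(v):
    # standardise a list of values: mean 0, SD 1
    m = sum(v) / len(v)
    sd = (sum((x - m) ** 2 for x in v) / len(v)) ** 0.5
    return [(x - m) / sd for x in v]

random.seed(1)
x = [random.gauss(10, 2) for _ in range(100)]  # raw X (arbitrary scale)
z = [random.gauss(50, 5) for _ in range(100)]  # raw Z (arbitrary scale)

# what we want: product of the standardised variables
want = [a * b for a, b in zip(zscore(x), zscore(z))]
# what standardising the raw cross-product gives instead
get = zscore([a * b for a, b in zip(x, z)])

# the two versions differ, which is why the interaction beta from
# the raw-variable solution cannot be read as a standardised effect
print(max(abs(a - b) for a, b in zip(want, get)))  # clearly non-zero
```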

60

advanced topics in MR

categorical predictors (SMR or HMR)
– use dummy coding or effect coding
– more complex if > 2 groups

3+ predictors in MMR
– see Aiken & West (1991)

categorical moderator variables
– see Aiken & West (1991)


61

readings
mediation and moderation in multiple regression (this lecture)

Field (3rd & 2nd eds): no new readings
Howell (all eds): Chapter 15

review of regression topics (next lecture)
Field (3rd ed): review Chapter 7
Field (2nd ed): review Chapter 5