The Reproducibility Crisis in Psychological Science: One Year Later


Jim Grange
www.jimgrange.wordpress.com

2011 – “A Year of Horrors”

http://bit.ly/2eNL05d

“Derailed”

“…the field of psychology currently uses methodological and statistical strategies that are too weak, too malleable, and offer too many opportunities for researchers to befuddle themselves and their peers”

Each case raised unique questions about how science is conducted in psychology, from how research is planned right through to how data are analysed and published.

All cases pertain to growing concern over the number of false positives in the literature.
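To make that concern concrete, here is a minimal back-of-the-envelope sketch (not from the talk; the prior, power, and alpha values below are illustrative assumptions) of how large the false-positive share of "significant" findings can be:

```python
# Illustrative assumptions only, not figures from the talk.
prior_true = 0.10   # assumed fraction of tested hypotheses that are actually true
power      = 0.50   # assumed average statistical power of studies
alpha      = 0.05   # conventional significance threshold

true_positives  = prior_true * power          # real effects that get detected
false_positives = (1 - prior_true) * alpha    # null effects crossing p < .05
ppv = true_positives / (true_positives + false_positives)

print(round(ppv, 2))   # → 0.53: barely half of "significant" results are real
```

Under these (assumed) conditions, nearly half of published significant results would be false positives, which is why replication rates matter.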

“How reproducible are psychology findings?”

Open Science Collaboration: 270+ researchers from across the globe

Performed close replications of 100 psychology studies

Only 36% of studies replicated!

A Year Later

Recommendations

1. Replicate, replicate, replicate…
2. Know your statistics
3. Open your science
4. Incorporate open science practices in teaching
5. Reward open science practices


VERIFICATION!!!

Devoting resources to verification is irrational if the original findings are valid, but rational if the original findings are invalid.

We have a professional responsibility to ensure the findings we are reporting are robust and replicable.


2. Know Your Statistics

False positives propagate because most researchers don’t understand the p-value (Cumming, 2012).

The p-value is:

a) the probability that the results are due to chance
b) the probability that the results are not due to chance
c) the probability of observing results as extreme (or more) as obtained if there is no effect in reality
d) the probability that the results would be replicated if the experiment was conducted a second time


True or False? The p-value tells us something about the size of an effect.

True or False? The p-value tells us something about the importance of an effect.

True or False? The p-value tells us something about the probability of our hypothesis.

Answer: (c) the probability of observing results as extreme (or more) as obtained if there is no effect in reality.
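That definition can be checked by simulation. The sketch below (illustrative, standard-library Python only; not part of the talk) draws two groups from the same distribution, so there is no effect in reality, and counts how often a two-sided z-test still gives p < .05. By the definition above, that should happen about 5% of the time:

```python
import math
import random

def two_sample_z_p(n, rng):
    """Two-sided p-value for a two-sample z-test (known sigma = 1),
    with BOTH groups drawn from the same distribution: no real effect."""
    a = [rng.gauss(0, 1) for _ in range(n)]
    b = [rng.gauss(0, 1) for _ in range(n)]
    z = (sum(a) / n - sum(b) / n) / math.sqrt(2 / n)
    # standard normal CDF via the error function
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

rng = random.Random(1)
sims = 10_000
false_positives = sum(two_sample_z_p(30, rng) < 0.05 for _ in range(sims))
print(false_positives / sims)   # ≈ 0.05, exactly as definition (c) predicts
```

Note what the 5% is: the rate of "significant" results when the null is true, not the probability that any given result is due to chance.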

2. Know Your Statistics

The p-value is p(D|H): the probability of the data, given the (null) hypothesis.

Is p(D|H) the same as p(H|D)?

p(Dead | Murdered) ≈ 1

p(Murdered | Dead) < .001
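The murder example is just Bayes' theorem at work. A minimal sketch (the base rates below are made-up, order-of-magnitude numbers for illustration, not statistics from the talk):

```python
# Hypothetical base rates, for illustration only.
p_murdered         = 0.00001  # assumed probability a person is murdered in a year
p_dead             = 0.01     # assumed probability a person dies in a year
p_dead_if_murdered = 1.0      # being murdered guarantees being dead

# Bayes' theorem: p(Murdered | Dead) = p(Dead | Murdered) * p(Murdered) / p(Dead)
p_murdered_if_dead = p_dead_if_murdered * p_murdered / p_dead
print(round(p_murdered_if_dead, 6))   # → 0.001, despite p(Dead | Murdered) = 1
```

The same asymmetry is why p(D|H), which a significance test gives us, says little by itself about p(H|D), the probability of the hypothesis given the data.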

3. Open Your Science

Science works by verification.

What percentage of researchers shared their data? 27% (!!)

Researchers should NOT keep their data private if they have published from it.

Nice Bonus (1): It’s going public, so I make sure the data & analysis are of high quality.

Nice Bonus (2): I can find my data easily if asked for it.

As of January 1, 2017, signatories (as reviewers and/or editors) make open practices a pre-condition for more comprehensive review.

Exploratory vs. Confirmatory research

“I appreciate your results were unexpected, but in order to tell a nicer story, you should re-write your introduction as if you expected these results”

HARKing: Hypothesising After the Results are Known

92% of psychology articles report confirmed hypotheses (Fanelli, 2010)

www.osf.io

4. Incorporate Open Science Practices in Teaching

Strong Incentives to Pursue New Ideas

- Publications
- Grant Income ($$)
- Employment
- Promotion
- Fame…?

5. Reward Open Science Practices

Good for Science:
- Truth seeking
- Rigour
- Quality
- Reproducibility

Good for Individuals/Institutions:
- Publishable
- Quantity
- Novelty
- Impact

“…the solution requires making the incentives for getting it right competitive with the incentives for getting it published” (Nosek et al., 2012)

Individual Reputations Are At Stake

University Reputations Are At Stake

(Utopian) Ideas for Institutions

- Doing research right takes longer
- Be tolerant of lower output (if doing it right)
- Limit the “Publish or Perish” mentality: “Rigour or Rot”? “Rigour or Perish”?
- Reward those doing it right

Universe A & B:

• Investigating embodiment of political extremism
• Participants (N = 1,979!) from the political left, right, and centre
• Moderates perceived shades of grey more accurately than left or right (p < .01)

Universe A

• Moderates perceived shades of grey more accurately than left or right (p<.01).


Universe B

• Moderates perceived shades of grey more accurately than left or right (p < .01)
• Surprised by the effect, so tries to replicate the result before publishing, using an even larger sample size than Study 1
• Replication fails to reproduce the effect: no publication
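If the shades-of-grey effect were in fact a fluke, a pre-publication replication like Universe B's would catch it almost every time. A rough simulation sketch (standard-library Python; the sample sizes and thresholds are illustrative assumptions, not the studies' actual designs):

```python
import math
import random

def null_study_p(n, rng):
    """Two-sided p-value of a one-sample z-test on n draws from N(0, 1):
    there is no real effect, so any 'discovery' is a false positive."""
    mean = sum(rng.gauss(0, 1) for _ in range(n)) / n
    z = mean * math.sqrt(n)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

rng = random.Random(7)
discoveries = replications = 0
for _ in range(50_000):
    if null_study_p(30, rng) < 0.01:        # Study 1: surprise finding at p < .01
        discoveries += 1
        if null_study_p(120, rng) < 0.05:   # Universe B's larger replication
            replications += 1

# Only ~5% of fluke discoveries survive the replication (Study 2's alpha),
# so Universe B's replicate-first habit filters out most false positives.
print(discoveries, replications / discoveries)
```

The catch, as the next slide asks, is that this filtering is rewarded in neither universe's job market.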

In which universe will this student most likely receive a lectureship position?

There is something wrong with hiring decisions if “getting it published” is rewarded more than “getting it right”.

(Utopian) Ideas for Hiring Committees

- Look for evidence of open science practice
- Have open science practice as a “desired” (or “essential”!) item on the job specification
- Judge publication quality rather than quantity

Recommendations (recap)

1. Replicate, replicate, replicate…
2. Know your statistics
3. Open your science
4. Incorporate open science practices in teaching
5. Reward open science practices

Thank You!

www.jimgrange.wordpress.com
