Evidence Informing Practice


Evidence Informing Practice
Robert Coe, ASCL Annual Conference, 21 March 2014

Outline
– What can research tell us about the likely impacts and costs of different strategies?
– How do we implement these strategies to …
  – Focus on what matters
  – Change classroom practice
  – Target areas of need
  – Produce demonstrable benefits


Improving Education: A triumph of hope over experience
http://www.cem.org/attachments/publications/ImprovingEducation2013.pdf

Evidence about the effectiveness of different strategies


Toolkit of Strategies to Improve Learning

The Sutton Trust-EEF Teaching and Learning Toolkit http://www.educationendowmentfoundation.org.uk/toolkit/

Impact vs cost
[Chart from the Toolkit: effect size (months gain) plotted against cost per pupil. Interventions plotted include feedback, meta-cognitive strategies, peer tutoring, early years intervention, one-to-one tuition, homework (secondary and primary), phonics, collaborative learning, small-group tuition, parental involvement, individualised learning, ICT, behaviour, social and emotional learning, mentoring, summer schools, after-school programmes, aspirations, performance pay, teaching assistants, smaller classes and ability grouping. Regions of the chart are labelled 'Most promising for raising attainment', 'May be worth it' and 'Small effects / high cost'.]
www.educationendowmentfoundation.org.uk/toolkit
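A note on the vertical axis: the Toolkit's 'months of gain' scale is derived from a standardised effect size, i.e. the difference between the mean outcomes of the intervention and comparison groups expressed in units of their pooled standard deviation (a standard formulation, not taken from the slides):

```latex
% Standardised mean difference ("effect size") underlying the months-of-gain scale
\[
  d \;=\; \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}{s_{\text{pooled}}},
  \qquad
  s_{\text{pooled}} \;=\; \sqrt{\frac{(n_1 - 1)\,s_1^{2} + (n_2 - 1)\,s_2^{2}}{n_1 + n_2 - 2}}
\]
```

Larger values of d correspond to more months of additional progress on the Toolkit's scale.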

Key messages
– Some things that are popular or widely thought to be effective are probably not worth doing
  – Ability grouping (setting); after-school clubs; teaching assistants; smaller classes; performance pay; raising aspirations
– Some things look 'promising'
  – Effective feedback; meta-cognitive and self-regulation strategies; peer tutoring/peer-assisted learning strategies; homework

Clear, simple advice:
– Choose from the top left
– Go back to school and do it


For every complex problem there is an answer that is clear, simple, and wrong

H.L. Mencken


Why not?
– We have been doing some of these things for a long time, but have generally not seen improvement
– Research evidence is problematic
  – Sometimes the existing evidence is thin
  – Research studies may not reflect real life
  – Context and 'support factors' may matter
– Implementation is problematic
  – We may think we are doing it, but are we doing it right?
  – We do not know how to get large groups of teachers and schools to implement these interventions in ways that are faithful, effective and sustainable

So what should we do?


Four steps to improvement

1. Think hard about learning
2. Invest in good professional development
3. Evaluate teaching quality
4. Evaluate impact of changes

1. Think hard about learning

[The Toolkit impact vs cost chart described above is shown again here for the following activity.]

1. Which strategies/interventions are very surprising (you really don't believe it)?
2. For which strategies/interventions can you explain why they do (or don't) improve attainment?
3. Which strategies/interventions do you want to know more about?


Poor Proxies for Learning
– Students are busy: lots of work is done (especially written work)
– Students are engaged, interested, motivated
– Students are getting attention: feedback, explanations
– Classroom is ordered, calm, under control
– Curriculum has been 'covered' (i.e. presented to students in some form)
– (At least some) students have supplied correct answers, even if they
  – Have not really understood them
  – Could not reproduce them independently
  – Will have forgotten it by next week (tomorrow?)
  – Already knew how to do this anyway


Do children learn better in the morning or afternoon?

Learning happens when people have to think hard.
A better proxy for learning?


Hard questions about your school
– How many minutes does an average pupil on an average day spend really thinking hard?
– Do you really want pupils to be 'stuck' in your lessons?
– If they knew the right answer but didn't know why, how many pupils would care?

2. Invest in effective CPD

How do we get students to learn hard things?
E.g. place value, persuasive writing, music composition, balancing chemical equations
• Explain what they should do
• Demonstrate it
• Get them to do it (with gradually reducing support)
• Provide feedback
• Get them to practise until it is secure
• Assess their skill/understanding

How do we get teachers to learn hard things?
E.g. using formative assessment, assertive discipline, how to teach algebra
• Explain what they should do

What CPD helps learners?
– Intense: at least 15 contact hours, preferably 50
– Sustained: over at least two terms
– Content focused: on teachers' knowledge of subject content & how students learn it
– Active: opportunities to try it out & discuss
– Supported: external feedback and networks to improve and sustain
– Evidence based: promotes strategies supported by robust evaluation evidence

Do you do this?

3. Evaluate teaching quality

Why monitor?
– Strong evidence of (potential) benefit from
  – Performance feedback (Coe, 2002)
  – Target setting (Locke & Latham, 2006)
  – Intelligent accountability (Wiliam, 2010)
– Individual teachers matter most
– Everyone can improve
– Teachers stop improving after 3–5 years
– Judging real quality/effectiveness is very hard
  – Multidimensional
  – Not easily visible
  – Confounded


Monitoring the quality of teaching
– Progress in assessments
  – Quality of assessment matters (cem.org/blog)
  – Regular, high-quality assessment across the curriculum (InCAS, INSIGHT)
– Classroom observation
  – Much harder than you think! (cem.org/blog)
  – Multiple observations and observers, trained and quality-assured
– Student ratings
  – Extremely valuable, if done properly (http://www.cem.org/latest/student-evaluation-of-teaching-can-it-raise-attainment-in-secondary-schools)
– Other
  – Parent ratings/feedback
  – Student work scrutiny
  – Colleague perceptions (360°)
  – Self-assessment
  – Pedagogical content knowledge
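To see why 'multiple observations and observers' matters, here is an illustrative sketch (not CEM's method; the single-observation reliability of 0.4 is an assumed figure): the Spearman-Brown formula shows how the reliability of an averaged rating rises as more independent observations are combined.

```python
# Illustrative sketch: reliability of the average of k lesson observations,
# via the Spearman-Brown prophecy formula. The 0.4 figure is assumed.
def spearman_brown(single_obs_reliability: float, k: int) -> float:
    """Reliability of the mean of k observations, each with the given reliability."""
    r = single_obs_reliability
    return k * r / (1 + (k - 1) * r)

for k in (1, 2, 4, 6):
    print(f"{k} observation(s): reliability ~ {spearman_brown(0.4, k):.2f}")
```

With these assumptions, a single observation tells you relatively little; averaging several trained, quality-assured observations gives a much more dependable picture.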


Teacher Assessment
– How do you know that it has captured understanding of key concepts?
  – vs a 'check-list' (e.g. ';' = L5, 3 tenses = L7)
– How do you know standards are comparable?
  – Across teachers, schools, subjects
  – Is progress good?
– How have you resolved the tensions that arise when teacher judgements are used to judge teachers?


Evidence-Based Lesson Observation
– Behaviour and organisation
  – Maximise time on task, engagement, rules & consequences
– Classroom climate
  – Respect, quality of interactions, failure OK, high expectations, growth mindset
– Learning
  – What made students think hard?
  – Quality of: exposition, demonstration, scaffolding, feedback, practice, assessment
  – What provided evidence of students' understanding?
  – How was this responded to? (Feedback)


Next generation of CEM systems …

– Assessments that are
  – Comprehensive, across the full range of curriculum areas, levels, ages, topics and educationally relevant abilities
  – Diagnostic, with evidence-based follow-up
  – Interpretable, calibrated against norms and criteria
  – High psychometric quality
– Feedback that is
  – Bespoke to the individual teacher, for their students and classes
  – Multi-component, incorporating learning gains, pupil ratings, peer feedback, self-evaluation, …
  – Diagnostic, with evidence-based follow-up
– Constant experimenting

4. Evaluate impact of changes

School 'improvement' often isn't
– School would have improved anyway
  – Volunteers/enthusiasts improve: misattributed to the intervention
  – Chance variation (especially if starting from a low point)
– Poor outcome measures
  – Perceptions of those who worked hard at it
  – No robust assessment of pupil learning
– Poor evaluation designs
  – Weak evaluations are more likely to show positive results
  – Improved intake mistaken for impact of the intervention
– Selective reporting
  – Dredging for anything positive (within a study)
  – Only success is publicised

(Coe, 2009, 2013)
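A quick simulation makes the 'chance variation' point concrete. This is a sketch with entirely made-up numbers (school scores, noise level and sample size are all assumptions): schools selected because they scored low in one year tend to score higher the next year with no intervention at all, purely through regression to the mean.

```python
# Sketch: regression to the mean can masquerade as school improvement.
# All numbers are invented for illustration.
import random

random.seed(1)
n_schools, noise = 1000, 5.0
true_quality = [random.gauss(50, 10) for _ in range(n_schools)]   # stable underlying quality
year1 = [q + random.gauss(0, noise) for q in true_quality]        # measured score, year 1
year2 = [q + random.gauss(0, noise) for q in true_quality]        # measured score, year 2

# Pick the schools in roughly the bottom 10% of year-1 scores (no intervention given).
cutoff = sorted(year1)[n_schools // 10]
low_starters = [i for i in range(n_schools) if year1[i] <= cutoff]

avg_gain = sum(year2[i] - year1[i] for i in low_starters) / len(low_starters)
print(f"Average year-on-year 'gain' for low-starting schools: {avg_gain:.1f} points")
```

Without a well-matched comparison group, a gain like this is easily misattributed to whatever initiative happened to be running at the time.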

Key elements of good evaluation
– Clear, well-defined, replicable intervention
– Good assessment of appropriate outcomes
– Well-matched comparison group
(See the EEF DIY Evaluation Guide)

What could you evaluate?
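As a rough sketch of what those elements look like in practice (hypothetical data throughout, not the EEF's own procedure): assess both the intervention group and a well-matched comparison group on a robust outcome measure, then express the difference as the standardised effect size given earlier.

```python
# Sketch of a simple intervention-vs-comparison analysis (hypothetical scores).
from statistics import mean, stdev

intervention = [54, 61, 58, 49, 65, 60, 57, 63, 52, 59]  # post-test scores, intervention group
comparison   = [51, 55, 50, 48, 60, 53, 54, 58, 47, 56]  # post-test scores, matched comparison group

def pooled_sd(a, b):
    """Pooled standard deviation of two groups."""
    na, nb = len(a), len(b)
    return (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)) ** 0.5

effect_size = (mean(intervention) - mean(comparison)) / pooled_sd(intervention, comparison)
print(f"Estimated effect size: {effect_size:.2f}")
```

A real evaluation would also need pre-tests or matching data, enough pupils for the estimate to be stable, and outcomes decided in advance to avoid the selective-reporting trap described above.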

Summary …
1. Think hard about learning
2. Invest in good CPD
3. Evaluate teaching quality
4. Evaluate impact of changes

Robert.Coe@cem.dur.ac.uk @ProfCoe

www.cem.org
