Evidence to action: Why TESTA works


Tansy Jessop | @tansyjtweets | @TESTAwin

#SLTCC2016, 24 June 2016

The plan

1. Brief overview of TESTA
2. Three change-related problems
3. Three findings and strategies
4. Do we need a paradigm shift?

Why assessment and feedback matter...

1) Assessment drives what students pay attention to, and defines the actual curriculum (Ramsden 1992).

2) Feedback is the single most important factor in learning (Hattie 2009; Black and Wiliam 1998).

The TESTA Methodology

75 programme audits

Programme team meeting

Based on assessment principles

• ‘Time-on-task’ (Gibbs 2004)

• Challenging and high expectations (Chickering and Gamson 1987)

• Internalising goals and standards (Sadler 1989; Nicol and McFarlane-Dick 2006)

• Prompt, detailed, specific, developmental, dialogic feedback (Gibbs 2004; Nicol 2010)

• Deep learning (Marton and Saljo 1976).

Sustained growth

TESTA…

“…is a way of thinking about assessment and feedback”

Graham Gibbs

TESTA shifts in perspective from…

• from ‘my’ module to ‘our’ programme

• from a teacher focus on module delivery to the student experience of the whole programme

• from individualistic modular design to coherent team design

• from the NSS to enhancement strategies

TESTA addresses three problems

Problem 1: Something’s going awry, but I’m not sure why

Problem 2: Curriculum design problems
Problem 3: Evidence to action problem

Problem 1: ‘Not sure why’ problem

Problem 2: Curriculum design problem

Does IKEA 101 work for complex learning?

Content vs concepts?

The best approach from the student’s perspective is to focus on concepts. I’m sorry to break it to you, but your students are not going to remember 90 per cent – possibly 99 per cent – of what you teach them unless it’s conceptual…. when broad, over-arching connections are made, education occurs. Most details are only a necessary means to that end.

http://www.timeshighereducation.co.uk/features/a-students-lecture-to-rofessors/2013238.fullarticle#.U3orx_f9xWc.twitter

‘A student’s lecture to her professor’

Problem 3: Evidence to action gap

http://www.liberalarts.wabash.edu/study-overview/

Flawed assumptions…

• The main problem is a lack of high-quality data

• Logic will prevail

• Systems will change

Proving is different from improving

“It is incredibly difficult to translate assessment evidence into improvements in student learning”

“It’s far less risky and complicated to analyze data than it is to act”

(Blaich & Wise, 2011)

Paradigm (what it looks like):

• Technical rational: focus on data and tools

• Relational: focus on people

• Emancipatory: focus on systems and structures

TESTA themes and impacts

1. Variations in assessment patterns
2. High summative: low formative
3. Disconnected feedback

Defining the terms

• Summative assessment carries a grade which counts toward the degree classification.

• Formative assessment does not count towards the degree (whether pass/fail or graded), elicits comments, and is required of all students.

1. Huge variations

• What is striking for you about this data?

• How does it compare with your context?

• Does variation matter?

Assessment features across a 3-year UG degree (n=75)

Characteristic: Range
Summative assessments: 12–227
Formative assessments: 0–116
Varieties of assessment: 5–21
Proportion of examinations: 0%–87%
Time to return marks and feedback: 10–42 days
Volume of oral feedback: 37–1,800 minutes
Volume of written feedback: 936–22,000 words

Patterns over three-year UK degrees (n=75)

Characteristic: Low / Medium / High
Volume of summative assessment: below 33 / 40–48 / more than 48
Volume of formative-only tasks: below 1 / 5–19 / more than 19
% of tasks by examination: below 11% / 22–31% / more than 31%
Variety of assessment methods: below 8 / 11–15 / more than 15
Written feedback in words: less than 3,800 / 6,000–7,600 / more than 7,600

Actions based on evidence

a) Reduction in summative
b) Increase in formative
c) Streamlined varieties
d) More, less or different feedback, depending…
e) Used evidence to inform a team approach to curriculum design
f) Every time a coconut with each feature

Theme 2: High summative: low formative

• Summative ‘pedagogies of control’

• Circa 2 per module in UK

• A formative-to-summative ratio of 1:8

• Formative weakly understood and practised

What students say…

• A lot of people don’t do wider reading. You just focus on your essay question.

• In Weeks 9 to 12 there is hardly anyone in our lectures. I'd rather use those two hours of lectures to get the assignment done.

• It’s been non-stop assignments, and I’m now free of assignments until the exams – I’ve had to rush every piece of work I’ve done.

What students say about formative

• If there weren’t loads of other assessments, I’d do it.

• If there are no actual consequences of not doing it, most students are going to sit in the bar.

• It’s good to know you’re being graded because you take it more seriously.

• The lecturers do formative assessment but we don’t get any feedback on it.

Assessment Arms Race

Actions based on evidence

1. Rebalance summative and formative
2. It’s a programme decision
3. Formative in the public domain
4. Link formative and summative
5. Require formative to mark summative
6. Authentic assessment tasks work best…

Case Study 1

• Entire Business School (WBS)

• Reduction from an average of 2 x summative and zero formative per module…

• …to 1 x summative and 3 x formative

• All working to a similar script

• Systematic shift, experimentation, less risky together

Case Study 2: Blogging as formative

• Modular approach

• Students read, write, and think more

• Teach less, learn more

• Dialogic, creative, reflective

• Personalises learning

• Develops ‘self-authorship’

• Authentic digital task

Theme 3: Disconnected feedback

Take five

• Choose a quote that strikes you.

• What is the key issue?

• What strategies might address this issue?

What students say…

The feedback is generally focused on the module.

It’s difficult because your assignments are so detached from the next one you do for that subject. They don’t relate to each other.

Because it’s at the end of the module, it doesn’t feed into our future work.

I read it and think “Well, that’s fine but I’ve already handed it in now and got the mark. It’s too late”.

Students say the feedback relationship is broken…

Because they have to mark so many that our essay becomes lost in the sea that they have to mark.

It was like ‘Who’s Holly?’ It’s that relationship where you’re just a student.

Here they say ‘Oh yes, I don’t know who you are. Got too many to remember, don’t really care, I’ll mark you on your assignment’.

Actions based on evidence

• Conversation: who starts the dialogue?

• Iterative cycles of reflection across modules

• Quick generic feedback: the ‘Sherlock’ factor

• Feedback synthesis tasks

• Reflecting on improvement in relation to past performance

• Technology: audio, screencast and blogging

• From feedback as ‘telling’…

• …to feedback as asking questions

It’s about educational paradigms…

Transmission Model

Social Constructivist Model

Impacts at Winchester

• Upward trajectory in NSS assessment and feedback (A&F) scores on TESTA programmes – a ‘Top 4’ university

• TESTA ‘effect’: people talk about formative

• Team approach to designing curricula

• Design cycle for periodic review includes TESTA

• Further research: JISC Fastech Project, 2011–14

• Linked REACT student engagement project, 2015–17

References

Blaich, C. and Wise, K. (2011) From Gathering to Using Assessment Results: Lessons from the Wabash National Study. Occasional Paper #8. University of Illinois: National Institute for Learning Outcomes Assessment.

Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: The challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712. doi: 10.1080/02602938.2012.691462.

Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3–31.

Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout: High-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education, 40(4), pp. 528–541. doi: 10.1080/02602938.2014.931927.

Hughes, G. (2014) Ipsative Assessment. Basingstoke: Palgrave Macmillan.

Jessop, T. and Maleckar, B. (2014) ‘The influence of disciplinary assessment patterns on student learning: A comparative study’, Studies in Higher Education. Published online 27 August 2014. http://www.tandfonline.com/doi/abs/10.1080/03075079.2014.943170

Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: A large-scale study of students’ learning in response to different assessment patterns’, Assessment & Evaluation in Higher Education, 39(1), pp. 73–88.

Nicol, D. (2010) ‘From monologue to dialogue: Improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501–517.

O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment standards: A nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205–217.

Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, 18(2), pp. 119–144. doi: 10.1007/bf00117714.

Williams, J. and Kane, D. (2009) ‘Assessment and feedback: Institutional experiences of student feedback, 1996 to 2007’, Higher Education Quarterly, 63(3), pp. 264–286.
