An evidence-informed approach to enhancing programme-wide assessment: TESTA to FASTECH
Dr Tansy Jessop & Yaz El Hakim, University of Winchester
Professor Paul Hyland, Bath Spa University
JISC Online Annual: 22 November 2011


Page 1: TESTA to FASTECH (November 2011)

An evidence-informed approach to enhancing programme-wide assessment

TESTA to FASTECH

Dr Tansy Jessop & Yaz El Hakim, University of Winchester
Professor Paul Hyland, Bath Spa University

JISC Online Annual: 22 November 2011

Page 2: TESTA to FASTECH (November 2011)

Pre-Conference Activities

Pre-reading:

1) Gibbs & Simpson (2004) Conditions under which assessment supports student learning. http://www2.glos.ac.uk/offload/tli/lets/lathe/issue1/articles/simpson.pdf

2) Gibbs, G. & Dunbar-Goddet, H. (2007) The effects of programme assessment environments on student learning. http://www.heacademy.ac.uk/assets/documents/teachingandresearch/gibbs_0506.pdf

3) Jessop, T., Smith, C. & El Hakim, Y. (2011) Programme-wide assessment: doing ‘more with less’ from the TESTA NTFS project. HEA Assessment & Feedback Briefing Paper. http://www.heacademy.ac.uk/assets/documents/assessment/2011_Winchester_SS_Briefing_Report.pdf

Page 3: TESTA to FASTECH (November 2011)

Pre-conference questions

1) What conditions do you see as most important in student learning? (Paper 1)

2) What is your response to the idea of institutional and programme ‘assessment environments’ which influence assessment and feedback patterns? (Paper 2)

3) What are the main challenges and benefits of addressing assessment patterns across a whole programme? (Paper 3)

Page 4: TESTA to FASTECH (November 2011)

TESTA ‘Cathedrals Group’ Universities

Page 5: TESTA to FASTECH (November 2011)

Why TESTA has been compelling

1) The research methodology
2) It is conceptually grounded in assessment and feedback literature
3) It’s about improving student learning
4) It is programmatic in focus
5) The change process is dialogic & developmental

Page 6: TESTA to FASTECH (November 2011)

Presentation Overview

1) The Research Methodology (Tansy)
2) Case study as a compelling narrative (Tansy)
3) Trends in assessment & feedback (Tansy)

Q&A

4) The student effort narrative (Yaz)
5) The bewildered student narrative (Yaz)
6) Systems-failure on feedback narrative (Yaz)

Q&A

7) A way forward: FASTECH (Paul)

Page 7: TESTA to FASTECH (November 2011)

Two Paradigms

Transmission
• Expert to novice
• Planned, packaged & ‘delivered’
• Feedback given by experts
• Feedback received by novices
• One-way traffic
• Very little dialogue
• Emphasis on measurement
• Competition
Metaphor = mechanical system

Social constructivist model
• Participatory, democratic
• Messy and process-oriented
• Peer review
• Self-evaluation
• Social process
• Dialogue
• Emphasis on learning
• Collaboration
Metaphor = the journey

Page 8: TESTA to FASTECH (November 2011)

1) Research Methodology

• triangulates data from three sources
• presented in a case study
• complex, ambiguous, textured
• open to discussion - not the ‘final word’
• ‘before’ and ‘after’ data

Page 9: TESTA to FASTECH (November 2011)

Programme Audit

• How much summative assessment
• How much formative assessment (required, formal, with feedback)
• How many varieties of assessment
• Proportion of exams to coursework
• Word count of written feedback
• How much ‘formal’ oral feedback
• Criteria, learning outcomes, course documents

Page 10: TESTA to FASTECH (November 2011)

Assessment Experience Questionnaire (version 3.3)

• 28 questions
• 5-point Likert scale where 5 = strongly agree
• 9 scales and one overall satisfaction question
• Scales link to conditions of learning
• Examples:
  – quantity and distribution of effort
  – use of feedback
  – quantity and quality of feedback
  – clear goals and standards

Page 11: TESTA to FASTECH (November 2011)

Focus groups

• What kinds of assessment
• How assessment influences your study behaviour
• Whether you know what quality work looks like
• What feedback is like and how you use it

Page 12: TESTA to FASTECH (November 2011)

Research Methodology

• Assessment Experience Questionnaire (AEQ, n = 1200+)
• Focus groups (n = 50, with 301 students)
• Programme audit (n = 22)

→ Programme team meeting → Case study

Page 13: TESTA to FASTECH (November 2011)

2) The cases are surprising, complex, puzzling

Here is one case from the TESTA data…

Page 14: TESTA to FASTECH (November 2011)

Case Study 1

• Lots of coursework (47 tasks)
• Very varied forms (15 types of assessment)
• Very few exams (1 in every 10)
• Masses of written feedback on assignments (15,412 words)
• Learning outcomes and criteria clearly specified

…looks like a ‘model’ assessment environment

Page 15: TESTA to FASTECH (November 2011)

But students:
• Don’t put in a lot of effort, and distribute their effort across few topics
• Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it
• Don’t think it is at all clear what the goals and standards are

…what is going on?

Page 16: TESTA to FASTECH (November 2011)

Your best guesses

A. Variety of assessment confuses students
B. Assessment is ‘bunched’ at certain times
C. The feedback is too late to be of any use
D. Teachers don’t share a common standard
E. Other

• Select your response from the buttons (A B C D E) at the bottom-right of the list of participants

• Type any additional comments into the text-chat

Page 17: TESTA to FASTECH (November 2011)

What is going on?

• Teachers work hard, students less so
• Feedback is too late to be useful
• Teachers have varied standards
• Students see feedback as ‘modular’
• Variety confuses students
• Formative tasks are assigned low priority
• Summative assessment drives effort

Page 18: TESTA to FASTECH (November 2011)

3) Trends in assessment and feedback

• High summative assessment, low formative
• High variety (average 11; range 7-17)
• Written feedback (average 7,153 words; range 2,869-15,412)
• Low oral feedback (average 6 hours)
• Watertight documents, tacit standards
• Huge institutional and programme variations:
  o formative:summative ratios (134:1 cf 1:10)
  o oral feedback (37 minutes to 30 hours)

Page 19: TESTA to FASTECH (November 2011)

Q&A

Page 20: TESTA to FASTECH (November 2011)

4) The effort narrative. TESTA data shows that:

• an average of 12 summative assessments per year
• 24 teaching weeks, so one every two weeks
• summative tasks end-loaded & bunched
• leading to patchy effort
• and surface learning
• with an average of three formative tasks a year…

Page 21: TESTA to FASTECH (November 2011)

The more you write the better you become at it… and if we’ve only written 40 pieces over three years that’s not a lot.

So you could have a great time doing nothing until like a month before Christmas and you’d suddenly panic. I prefer steady deadlines, there’s a gradual move forward, rather than bam!

In the second year, I kept getting such good marks I thought “If I’m getting this much without putting in much effort that means I could do so much better if I actually did do the hours” but it just goes up and down really.

Page 22: TESTA to FASTECH (November 2011)

TESTA plus HEPI quiz

Which one is false?
A) 1 in 3 UK students study for 20 hours or less a week
B) Students on only 1 out of 7 TESTA programmes agreed that they were working hard
C) Students work hardest when there is a high volume of formative assessment and oral feedback
D) Students work hardest when there is a high volume of summative assessment and written feedback
E) 1 in 3 UK students undertake > 6 hours of paid work a week

Select your response from the buttons (A B C D E) at the bottom-right of the list of participants

Page 23: TESTA to FASTECH (November 2011)

Chat box

What ideas might encourage students to put in effort regularly on degree programmes?

• Type your responses in the text chat

Page 24: TESTA to FASTECH (November 2011)

Strategies to encourage student effort

Choose your top strategy to encourage effort:
A) Raise expectations in first year
B) Require more formative assessment
C) Link formative and summative tasks
D) Use more peer and self-assessment
E) Design small, frequent assessed tasks

Select your response from the buttons (A B C D E) at the bottom-right of the list of participants

Page 25: TESTA to FASTECH (November 2011)

Technologies that may help…

What technologies might work to spur on regular and distributed effort?

Type your responses in the text chat

Page 26: TESTA to FASTECH (November 2011)

5) The baffled student narrative

o The language of written criteria is difficult to understand
o Feedback does not always refer to criteria
o Students feel that marking standards vary and are subjective and arbitrary
o Students sometimes use criteria instrumentally

Page 27: TESTA to FASTECH (November 2011)

I’m not a marker so I can’t really think like them... I don’t have any idea of why it got that mark.

They have different criteria, build up their own criteria. Some of them will mark more interested in how you word things.

You know who are going to give crap marks and who are going to give decent marks.  

Page 28: TESTA to FASTECH (November 2011)

Chat Box

What strategies might help students to internalise goals and standards?

• Type your responses in the text chat

Page 29: TESTA to FASTECH (November 2011)

Strategies to help students know what ‘good’ is

Which strategy do you think helps most?
A) Showing students models of good work
B) Peer marking workshops
C) Lots of formative tasks with feedback
D) Plenty of interactive dialogue about standards
E) Self-assessment activities

Select your response from the buttons (A B C D E) at the bottom-right of the list of participants

Page 30: TESTA to FASTECH (November 2011)

6) System-wide features make it difficult for students to use feedback and act on it

o Feedback often arrives after a module, or after submission of the next task
o Tasks are not sequenced or connected across modules, leading to lack of feed-forward
o Students sometimes receive grades electronically before their feedback becomes available on parchment in a dusty office
o Technology has led to some depersonalised cut-and-pasting

Page 31: TESTA to FASTECH (November 2011)

It’s rare that you’ll get it in time to help you on that same module.

You know that twenty other people have got the same sort of comment.

I look on the Internet and say ‘Right, that’s my mark. I don’t need to know too much about why I got it’.

I only apply feedback to that module because I have this fear that if I transfer it to other modules it’s not going to transfer smoothly.

You can’t carry forward most of the comments because you might have an essay first and your next assignment might be a poster.

Page 32: TESTA to FASTECH (November 2011)

Changes through TESTA

• Structural
• Thematic
• Pedagogic
• Module

Page 33: TESTA to FASTECH (November 2011)

Types of changes

1. Reduced summative assessment
2. Increased formative assessment
3. Streamlined variety
4. Raised expectations of student workload
5. Sequenced and linked tasks across modules
6. Practice-based changes

Page 34: TESTA to FASTECH (November 2011)

www.testa.ac.uk

Page 35: TESTA to FASTECH (November 2011)

Q&A

Page 36: TESTA to FASTECH (November 2011)

FASTECH: Feedback and Assessment for Students with Technology

What is FASTECH?

• R&D project (3 yrs): ‘R’ primarily with TESTA tools; ‘D’ in disciplines and universities
• Approach: teaching teams with students interpret ‘R’ data to determine goals of ‘D’
• Activities: to address QA and QE issues, optimize sector engagement (fastech.ac.uk)
• Outputs: R&D findings, experiences & guides by teachers, students, others…

Pragmatic Principles?

• Fast: using readily-available technologies; quick to learn, easy to use…
• Efficient: after a start-up period; saves time & effort (less paper), productivity…
• Effective: brings significant learning benefit to students, pedagogic impact…

Page 37: TESTA to FASTECH (November 2011)

FASTECH: a Pedagogical Goal

Student baggage and blocks:
• ideas about roles of students & teachers
• all can be strategic!
• …

Teacher baggage and blocks:
• ideas about role of assessment
• unsure about value of feedback
• assessment & marking conflated
• criteria & standards
• …

The goal: … ability to manage own learning …

In each assessment culture, this entails using technologies that help promote transparency & student participation in all processes from design and management to feedback and revision (validity, reliability & fairness are not enough):

• a reshaping of teacher & student responsibilities
• processes that enhance and create new peer-learning activities & collaborations (in/out of class); self & peer assessment; recording, sharing & review of students’ progress and achievements …
• teacher revision of pedagogies, based upon records of student progress & achievement in learning
• attuning of assessment to address individual & distinctive needs & aspirations …

Page 38: TESTA to FASTECH (November 2011)

Finally, for an excellent overview of technologies and pedagogies:

JISC, Effective Assessment in a Digital Age. Bristol: HEFCE, 2010.
Available at: www.jisc.ac.uk/digiassess (esp. pp. 14-15, 54-55)

For resources associated with this publication: www.jisc.ac.uk/assessresource

Please contact us for more info about TESTA and FASTECH:
[email protected]
[email protected]
[email protected]

Websites: www.testa.ac.uk & www.fastech.ac.uk (from January 2012)

Thank You

Page 39: TESTA to FASTECH (November 2011)

DISCUSSION: to be continued in the conference discussion forum

How do you think using technology in assessment & feedback will improve students’ learning?

Page 40: TESTA to FASTECH (November 2011)

References

Black, P. & Wiliam, D. (1998) ‘Assessment and Classroom Learning’. Assessment in Education: Principles, Policy and Practice. 5(1): 7-74.
Bloxham, S. & Boyd, P. (2007) Planning a programme assessment strategy. Chapter 11 (157-175) in Developing Effective Assessment in Higher Education. Berkshire: Open University Press.
Boud, D. (2000) Sustainable Assessment: Rethinking assessment for the learning society. Studies in Continuing Education. 22(2): 151-167.
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education. 1(1): 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2007) The effects of programme assessment environments on student learning. Higher Education Academy. http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/gibbs_0506.pdf
Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment & Evaluation in Higher Education. 34(4): 481-489.

Page 41: TESTA to FASTECH (November 2011)

Jessop, T., El Hakim, Y. & Gibbs, G. (2011) TESTA: Research inspiring change. Educational Developments. 12(4). In press.
Jessop, T., McNab, N. & Gubby, L. (2012, forthcoming) Mind the gap: An analysis of how quality assurance procedures influence programme assessment patterns. Active Learning in Higher Education. 13(3).
Knight, P.T. & Yorke, M. (2003) Assessment, Learning and Employability. Maidenhead: Open University Press.
Nicol, D.J. & Macfarlane-Dick, D. (2006) Formative Assessment and Self-Regulated Learning: A Model and Seven Principles of Good Feedback Practice. Studies in Higher Education. 31(2): 199-218.
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education. 35(5): 501-517.
Sambell, K. (2011) Rethinking Feedback in Higher Education. Higher Education Academy ESCalate Subject Centre Publication.