The challenges of Assessment and Feedback: findings from an HEA project
Denise Whitelock
[email protected]


This project was undertaken by IET (Denise Whitelock) and colleagues from the University of Southampton, and is just producing its final report. Its aim was to produce a synthesis of evidence-based research which throws light on the progress made in the practice of Assessment and Feedback in HE. This presentation highlights findings with respect to authentic assessment, e-portfolios, peer assessment, feedback for language learning and Advice for Action.


Page 1: The challenges of Assessment and Feedback: findings from an HEA project

The challenges of Assessment and Feedback: findings from an HEA project

Denise Whitelock

[email protected]

Page 2: The challenges of Assessment and Feedback: findings from an HEA project

Outline

• e-Assessment Challenge

• Authentic assessment, e-portfolios

• Peer assessment

• MCQs and self-assessment

• Feedback

• Advice for Action

Page 3: The challenges of Assessment and Feedback: findings from an HEA project

Project purpose, in conjunction with Southampton University

• Consult the academic community on useful references

• Seminar series

• Survey

• Advisors

• Invited contributors

• Prioritise evidence-based references

• Synthesise main points

• For readers:

• Academics using technology enhancement for assessment and feedback

• Learning technologists

• Managers of academic departments

Page 4: The challenges of Assessment and Feedback: findings from an HEA project

The e-Assessment Challenge

• Constructivist Learning – Push

• Institutional reliability and accountability – Pull


Page 5: The challenges of Assessment and Feedback: findings from an HEA project

www.storiesabout.com/

[email protected]

Page 6: The challenges of Assessment and Feedback: findings from an HEA project

Characteristic – Descriptor

Authentic – Involving real-world knowledge and skills

Personalised – Tailored to the knowledge, skills and interests of each student

Negotiated – Agreed between the learner and the teacher

Engaging – Involving the personal interests of the students

Recognise existing skills – Willing to accredit the student’s existing work

Deep – Assessing deep knowledge, not memorisation

Problem oriented – Original tasks requiring genuine problem-solving skills

Collaboratively produced – Produced in partnership with fellow students

Peer and self assessed – Involving self-reflection and peer review

Tool supported – Encouraging the use of ICT

Elliott’s characteristics of Assessment 2.0 activities

Page 7: The challenges of Assessment and Feedback: findings from an HEA project

Authentic assessments: e-portfolios

Electronic NVQ portfolio cover contents page, OCR IT Practitioner, EAIHFE, Robert Wilsdon

Page 8: The challenges of Assessment and Feedback: findings from an HEA project

Candidate Assessment Records section, OCR IT Practitioner, EAIHFE, Robert Wilsdon

Page 9: The challenges of Assessment and Feedback: findings from an HEA project

Building e-portfolios on a chef’s course

Evidence of food preparation skill for e-portfolio, Modern Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill

Page 10: The challenges of Assessment and Feedback: findings from an HEA project

Sharing e-portfolios: The Netfolio concept

• Social constructivism

• Connecting e-portfolios (Barbera, 2009)

• Share and build upon a joint body of evidence

• Trialled with 31 PhD students at a virtual university

• A control group was used; the Netfolio group obtained higher grades

• Greater visibility of revision process and peer assessment in the Netfolio system

Page 11: The challenges of Assessment and Feedback: findings from an HEA project

Peer Assessment and the WebPA Tool

• Loughborough University (Loddington et al., 2009)

• Self assess and peer assess with given criteria

• Group mark awarded by tutor

• Students rated:

• More timely feedback

• Reflection

• Fair rewards for hard work

• Staff rated:

• Time savings

• Administrative gains

• Automatic calculation of individual marks from peer ratings (sketched below)

• Students have faith in the administrative system
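The "automatic calculation" staff valued is, in outline, a peer-moderation of the tutor's group mark. Below is a minimal sketch of that style of calculation in Python; the function name, the normalisation and the cap at 100 are illustrative assumptions, not WebPA's actual code.

```python
# Sketch of a WebPA-style calculation: each assessor's ratings are
# normalised so that every submission distributes a total weight of 1.0
# across the group; the sums then scale the tutor's group mark.
def individual_marks(group_mark, ratings):
    """ratings: {assessor: {member: raw score}} from self and peer review."""
    members = list(ratings)
    weight = {m: 0.0 for m in members}
    for scores in ratings.values():
        given = sum(scores.values()) or 1.0   # guard against an empty submission
        for member, score in scores.items():
            weight[member] += score / given   # each assessor contributes 1.0 overall
    # With full submissions the weights average 1.0 across the group,
    # so each acts as a multiplier on the group mark.
    return {m: round(min(100.0, group_mark * weight[m]), 1) for m in members}

ratings = {
    "ana": {"ana": 3, "ben": 4, "cli": 3},
    "ben": {"ana": 4, "ben": 3, "cli": 3},
    "cli": {"ana": 4, "ben": 4, "cli": 2},
}
print(individual_marks(65, ratings))  # {'ana': 71.5, 'ben': 71.5, 'cli': 52.0}
```

Normalising per assessor is what makes a scheme like this robust to raters who are systematically generous or harsh: only the relative distribution of each student's ratings counts.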

Page 12: The challenges of Assessment and Feedback: findings from an HEA project

MCQs: Variation on a theme (1)

The first question is an example of a COLA assessment used at Reid Kerr College, Paisley. It is a multiple-response question used in one of their modules.

The second question was developed using Questionmark Perception at the University of Dundee. It is part of a set of formative assessments for medical students.

Page 13: The challenges of Assessment and Feedback: findings from an HEA project

MCQs: Variation on a theme (2)

Example of LAPT Certainty-Based Marking, UK cabinet ministers demo exercise showing feedback, University College London, Tony Gardner-Medwin (marking scheme sketched below)

Drug Chart Errors and Omissions, Medicines Administration Assessment, Chesterfield Royal Hospital
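For context, published descriptions of Gardner-Medwin's certainty-based marking give a tariff along the following lines; treat the table as an assumption drawn from those descriptions rather than a quotation from LAPT itself.

```python
# Certainty-based marking: the student declares how sure they are
# (1 = low, 3 = high). Confidence is rewarded when right and
# penalised when wrong, so honest self-assessment is the best policy.
CBM_MARKS = {  # certainty -> (mark if correct, mark if wrong)
    1: (1, 0),
    2: (2, -2),
    3: (3, -6),
}

def cbm_mark(correct: bool, certainty: int) -> int:
    right, wrong = CBM_MARKS[certainty]
    return right if correct else wrong

assert cbm_mark(True, 3) == 3    # confident and right: full reward
assert cbm_mark(False, 3) == -6  # confident and wrong: heavy penalty
assert cbm_mark(False, 1) == 0   # hedged and wrong: no penalty
```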

Page 14: The challenges of Assessment and Feedback: findings from an HEA project

Scaffolding and High Stakes assessment

• Maths for Science

• Tutorless course

• Competency-led

• No incentive to cheat

• Web-based home exam

• Invigilation technologies

Page 15: The challenges of Assessment and Feedback: findings from an HEA project

Self-diagnosis

• Basic IT skills, first year med students (Sieber, 2009)

• Competency based testing

• Repeating tests for revision

• Enables remedial intervention

Page 16: The challenges of Assessment and Feedback: findings from an HEA project

Students want more support with assessment

• More Feedback

• Quicker Feedback

• Full Feedback

• User friendly Feedback

• And ... the National Student Survey

Page 17: The challenges of Assessment and Feedback: findings from an HEA project

Problems with Feedback

• Students ignore feedback

• Students look at the mark only

• “It tells me the correct solution but not what’s wrong with mine”

• Feedback needs decoding

• Feedback is often not timely

Page 18: The challenges of Assessment and Feedback: findings from an HEA project

Gains from Interactivity with Feedback: Formative Assessment

• Mean effect size on standardised tests of between 0.4 and 0.7 (Black & Wiliam, 1998); effect size in the sense defined below

• Particularly effective for students who have not done well at school: http://kn.open.ac.uk/document.cfm?docid=10817

• Can keep students to timescale and motivate them

• How can we support our students to become more reflective learners and enter a digital discourse?
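For readers less used to the metric: "effect size" here is the standardised mean difference between intervention and control groups (Cohen's d), so 0.4 to 0.7 means roughly half a pooled standard deviation of improvement:

```latex
d = \frac{\bar{x}_{\mathrm{intervention}} - \bar{x}_{\mathrm{control}}}{s_{\mathrm{pooled}}}
```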

Page 19: The challenges of Assessment and Feedback: findings from an HEA project

Mobile Technologies and Assessment

• MCQs on PDAs (Valdivia & Nussbaum, 2009)

• Polls and instant surveys (Simpson & Oliver, 2007)

• Electronic voting systems (EVS) (Draper, 2009)

Page 20: The challenges of Assessment and Feedback: findings from an HEA project

Collaborative formative assessment with Global Warming

DMW, Institute of Educational Technology, September 1997

Page 21: The challenges of Assessment and Feedback: findings from an HEA project

Global Warming

Page 22: The challenges of Assessment and Feedback: findings from an HEA project

Global Warming: Simlink Presentation

Page 23: The challenges of Assessment and Feedback: findings from an HEA project

Next: ‘Yoked’ apps via BuddySpace

Student A

Student B (‘yoked’, but without full screen sharing required!)

Page 24: The challenges of Assessment and Feedback: findings from an HEA project

Free Text Entry and Feedback

• LISC for languages

• Open Comment

• IAT for Science

Page 25: The challenges of Assessment and Feedback: findings from an HEA project

LISC: Ali Fowler

• Kent University ab-initio Spanish module

• Large student numbers

• Skills-based course

• Provision of sufficient formative assessment meant unmanageable marking loads

• Impossible to provide immediate feedback, leading to fossilisation of errors

Page 26: The challenges of Assessment and Feedback: findings from an HEA project

The LISC solution: developed by Ali Fowler

• A CALL system designed to enable students to:

• Independently practise sentence translation

• Receive immediate (and robust) feedback on all errors

• Attend immediately to the feedback (before fossilisation can occur)

Page 27: The challenges of Assessment and Feedback: findings from an HEA project

How is the final mark arrived at in the LISC System?

• The two submissions are unequally weighted

• Best to give more weight to the first attempt

• since this ensures that students give careful consideration to the construction of their first answer

• but can improve their mark by refining the answer

• The marks ratio can vary (depending on assessment/feedback type)

• The more information given in the feedback, the lower the weight the second mark should carry (see the sketch below)
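As a concrete reading of that rule, here is a minimal sketch; the 70/30 default and the function name are illustrative assumptions, not LISC's actual parameters.

```python
# Weighted mean of two submission marks, with the first attempt weighted
# higher so students construct their first answer carefully but can
# still profit from refining it after feedback.
def final_mark(first, second, first_weight=0.7):
    """first, second: marks out of 100 for the two submissions."""
    if not 0.5 < first_weight < 1.0:
        raise ValueError("the first attempt should carry the greater weight")
    return first_weight * first + (1 - first_weight) * second

print(final_mark(60, 90))                     # 69.0
print(final_mark(60, 90, first_weight=0.85))  # 64.5 - richer feedback, lower second weight
```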

Page 28: The challenges of Assessment and Feedback: findings from an HEA project

Heuristics for the final mark

• If the ratio is skewed too far in favour of the first attempt…

• students are less inclined to try hard to correct imperfect answers

• If the ratio is skewed too far in favour of the second attempt…

• students exhibit less care over the construction of their initial answer

Page 29: The challenges of Assessment and Feedback: findings from an HEA project

Open Comment addresses the problem of free text entry

• Automated formative assessment tool

• Free text entry for students

• Automated feedback and guidance

• Open questions, divergent assessment

• No marks awarded

• For use by Arts Faculty

Page 30: The challenges of Assessment and Feedback: findings from an HEA project

IAT (Jordan & Mitchell, 2009)

• Marking engine – Web service

• Authoring tool for marking rules for each question (illustrated below)

• Model answers but free text entry by student

• Human and computer marking indistinguishable at the 1% significance level for two-thirds of questions

• Problems remain with question writing

• Human marking is itself inconsistent (Conole & Warburton, 2005)
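IAT's engine uses computational-linguistics templates rather than keywords, so the following is only a toy sketch of the authoring idea (a per-question rule checked against a student's free-text answer); the names and the matching test are invented for illustration.

```python
# Toy "marking rule": the rule fires when, for every required slot,
# at least one accepted word or synonym appears in the answer.
import re

def matches_rule(answer, required):
    """required: list of sets of acceptable words, one set per slot."""
    words = set(re.findall(r"[a-z']+", answer.lower()))
    return all(slot & words for slot in required)

rule = [{"evaporates", "evaporation"}, {"heat", "heated", "warmth"}]
print(matches_rule("The water evaporates when heated", rule))  # True
print(matches_rule("The water disappears", rule))              # False
```

Even this toy version shows why question writing is hard: every defensible synonym and phrasing has to be anticipated by the rule author.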

Page 31: The challenges of Assessment and Feedback: findings from an HEA project

Models of feedback which are open to test

How would you instruct a robot to mark as you do?

Page 32: The challenges of Assessment and Feedback: findings from an HEA project

Stages of analysis by computer of students’ free text entry for Open Comment: advice with respect to content (stylised socio-emotional support example)

• STAGE 1a: DETECT ERRORS, e.g. incorrect dates and facts (incorrect inferences and causality are dealt with below)

• Example feedback: “Instead of concentrating on X, think about Y in order to answer this question”

• Recognise effort (Dweck) and encourage the student to have another go: “You have done well to start answering this question but perhaps you misunderstood it. Instead of thinking about X, which did not…, consider Y”

Page 33: The challenges of Assessment and Feedback: findings from an HEA project

Computer analysis continued

• STAGE 2a: REVEAL FIRST OMISSION (consider the role of Z in your answer). Praise what is correct and point out what is missing: “Good, but now consider the role X plays in your answer”

• STAGE 2b: REVEAL SECOND OMISSION (consider the role of P in your answer). Praise what is correct and point out what is missing: “Yes, but also consider P. Would it have produced the same result if P is neglected?”

Page 34: The challenges of Assessment and Feedback: findings from an HEA project

Final stages of analysis

• STAGE 3: REQUEST CLARIFICATION OF KEY POINT 1

• STAGE 4: REQUEST FURTHER ANALYSIS OF KEY POINT 1 (Stages 3 and 4 repeated with all the key points)

• STAGE 5: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS MISSING

• STAGE 6: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS NOT COMPLETE

• STAGE 7: CHECK THE CAUSALITY

• STAGE 8: REQUEST THAT ALL THE CAUSAL FACTORS ARE WEIGHTED

(A sketch of this staged sequence as code follows.)
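Read as a program, the stages form an ordered pipeline over the student's answer. Here is a minimal sketch under that reading; the stage checks, names and trigger strings are invented for illustration, not Open Comment's actual rules.

```python
# Each stage inspects the answer and may emit advice; stages run in
# order and the collected advice forms the feedback message.
from typing import Callable, List, Optional

Stage = Callable[[str], Optional[str]]  # answer -> feedback or None

def run_stages(answer: str, stages: List[Stage]) -> List[str]:
    return [fb for stage in stages if (fb := stage(answer)) is not None]

def detect_errors(answer):
    if "1914" in answer:  # hypothetical wrong date for this question
        return "You have made a good start, but check that date."
    return None

def reveal_first_omission(answer):
    if "trade" not in answer.lower():
        return "Good, but now consider the role trade plays in your answer."
    return None

print(run_stages("It began in 1914 because of rivalry.",
                 [detect_errors, reveal_first_omission]))
```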

Page 35: The challenges of Assessment and Feedback: findings from an HEA project

McFeSPA system

Supports teaching assistants to mark and give feedback on undergraduate computer programming assignments

Support tool for semi-automated marking and scaffolding of feedback

Findings suggest the feedback model would be helpful in training tutors, similar to the Open Comment findings

Page 36: The challenges of Assessment and Feedback: findings from an HEA project

Feedback: Advice for Action

• Students must decode feedback and then act on it (Boud, 2000)

• Students must have the opportunity to act on feedback (Sadler, 1989)

• Gauging efficacy through student action

• Deep and strategic study approaches are more effective in processing e-feedback (Strang, 2010)

Page 37: The challenges of Assessment and Feedback: findings from an HEA project

Audio Feedback (Middleton & Nortcliffe, 2010)

1. Timely and meaningful

2. Manageable for tutors to produce and the learner to use

3. Clear in purpose, adequately introduced and pedagogically embedded

4. Technically reliable and not adversely determined by technical constraints or difficulties

5. Targeted at specific students, groups or cohorts, addressing their needs with relevant points in a structured way

6. Produced within the context of local assessment strategies and in combination, if appropriate, with other feedback methods using each medium to good effect

7. Brief, engaging and clearly presented, with emphasis on key points that demand a specified response from the learner

8. Of adequate technical quality to avoid technical interference in the listener’s experience

9. Encouraging, promoting self-esteem

10. Formative, challenging and motivational

Page 38: The challenges of Assessment and Feedback: findings from an HEA project

Characteristic – Descriptor

Authentic – Involving real-world knowledge and skills

Personalised – Tailored to the knowledge, skills and interests of each student

Negotiated – Agreed between the learner and the teacher

Engaging – Involving the personal interests of the students

Recognise existing skills – Willing to accredit the student’s existing work

Deep – Assessing deep knowledge, not memorisation

Problem oriented – Original tasks requiring genuine problem-solving skills

Collaboratively produced – Produced in partnership with fellow students

Peer and self assessed – Involving self-reflection and peer review

Tool supported – Encouraging the use of ICT

Elliott’s characteristics of Assessment 2.0 activities

Advice for Action

Page 39: The challenges of Assessment and Feedback: findings from an HEA project

Creating teaching and learning dialogues: towards guided learning supported by technology

• Learning to judge

• Providing reassurance

• Providing a variety of signposted routes to achieve learning goals

Page 40: The challenges of Assessment and Feedback: findings from an HEA project

Key Messages

Effective, regular online testing can encourage student learning and improve students’ performance in tests (JISC, 2008)

Automated marking can be more reliable than human markers and there is no medium effect between paper and computerized exams (Lee and Weerakoon, 2001)

The success of assessment and feedback with technology-enhancement lies with the pedagogy rather than the technology itself; technology is an enabler (Draper, 2009)

Page 41: The challenges of Assessment and Feedback: findings from an HEA project

Key Messages 2

Technology-enhanced assessment is not restricted to simple questions with clear-cut right and wrong answers; much more sophisticated questions are being used as well (Whitelock & Watt, 2008)

The design of appropriate and constructive feedback plays a vital role in the success of assessment, especially assessment for learning. (Beaumont, O’Doherty & Shannon, 2008)

Page 42: The challenges of Assessment and Feedback: findings from an HEA project

Key Messages 3

Staff development is essential to the process (Warburton, 2009)

Prepare students to take technology-enhanced assessments by having them practise with similar assessments using the same equipment and methods (Shephard et al., 2006)

The reports generated by many technology-enhanced assessment systems, including commercial ones, are very helpful in checking the reliability and validity of each test item and of the test as a whole (McKenna and Bull, 2000)

Page 43: The challenges of Assessment and Feedback: findings from an HEA project

References

Beaumont, C., O’Doherty, M. and Shannon, L. (2008). Staff and student perceptions of feedback quality in the context of widening participation. Higher Education Academy. Retrieved May 2010 from: http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/Beaumont_Final_Report.pdf

Draper, S. (2009). Catalytic assessment: understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology, 40(2), 285-293.

JISC, HE Academy, and ALT (2008). Exploring Tangible Benefits of e-Learning. Retrieved May 2010 from http://www.jiscinfonet.ac.uk/publications/info/tangible-benefits-publication

Lee, G. and Weerakoon, P. (2001). The role of computer-aided assessment in health professional education: a comparison of student performance in computer-based and paper-and-pen multiple-choice tests. Medical Teacher, 23(2), 152-157.

McKenna, C. and Bull, J. (2000). Quality assurance of computer-assisted assessment: practical and strategic issues. Quality Assurance in Education. 8(1), 24-31.

Page 44: The challenges of Assessment and Feedback: findings from an HEA project

References 2

Middleton, A. and Nortcliffe, A. (2010). ‘Audio feedback design: principles and emerging practice’. In D. Whitelock and P. Brna (eds) Special Issue ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’ Int. J. Continuing Engineering Education and Life-Long Learning, 20(2), 208-223.

Shephard, K., Warburton, B., Maier, P. and Warren, A. (2006). Development and evaluation of computer-assisted assessment in higher education in relation to BS7988. Assessment & Evaluation in Higher Education, 31(5), 583-595.

Strang, K.D. (2010). ‘Measuring self-regulated e-feedback, study approach and academic outcome of multicultural university students’. In D. Whitelock and P. Brna (eds) Special Issue ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’ Int. J. Continuing Engineering Education and Life-Long Learning, 20(2), 239-255.

Warburton, B. (2009). Quick win or slow burn: modelling UK HE CAA uptake. Assessment & Evaluation in Higher Education, 34(3), 257-272.

Whitelock, D. and Watt, S. (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, 33(3), 153-156.

Page 45: The challenges of Assessment and Feedback: findings from an HEA project

Three Assessment Special Issues

Brna, P. & Whitelock, D. (Eds.) (2010). Special Issue of International Journal of Continuing Engineering Education and Life-Long Learning, ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’ Volume 20, No. 2

Whitelock, D. (Ed.) (2009). Special Issue on e-Assessment: Developing new dialogues for the digital age. British Journal of Educational Technology, Volume 40, No. 2

Whitelock, D. and Watt, S. (Eds.) (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, Vol. 33, No. 3