Can assessment literacy be enhanced and does it lead to improved student performance? A case study of year one Business and Management students at Middlesex University Business School, by Simon Roberts, Karim Qizalbash and Ana Marinica

Assessment literacy


Page 1: Assessment literacy

Can assessment literacy be enhanced and does it

lead to improved student performance?

A case study of year one Business and Management

students at Middlesex University Business School

by Simon Roberts

Karim Qizalbash

Ana Marinica

Page 2: Assessment literacy

• Introduction
• Context for the study
• Course Design
• Evaluation Through an Assessment Literacy lens
• Assessment
• Assessment Literacy
• Why is assessment literacy important?
• Activity
• A Business School Case Study


Session Overview

Page 3: Assessment literacy

• Business School initiative to address lower-than-expected recruitment against target

• MBS0111 – Preparing for Business

• 12-week programme – 12 hours/week
  - 6 hours of Business Studies
  - 6 hours of Academic Skills development

• The team: Business School academics from LWO, LDU and a GTA

Context For The Study


Page 4: Assessment literacy

• Rationale: to mirror loosely what would come in year 1

• Workshops in lecture–seminar style

• Introduction to assessment likely to be experienced in year 1 with the addition of a reflection

• Introduction to both group work and individual assignments

• Support provided in parallel to be outlined by Karim

Course Design


Page 5: Assessment literacy

Evaluation Through an Assessment Literacy lens

• Focus on preparing students for University studies

• Introducing them to different forms of assessment

• Providing them with experience of a variety of assessments

• Providing them with support and formative feedback

• Alignment of the assessments to what would follow?


Page 6: Assessment literacy

Assessment

• Brown and Knight (1994, p.7) state that ‘assessment defines what the student regards as important, how they spend their time […], see themselves as students and then as graduates’.

• A key issue in assessment is that students often do not understand what makes one piece of work better than another, and do not understand what is being asked of them, particularly in terms of standards and criteria.


(O’Donovan et al., 2001)

Page 7: Assessment literacy

Assessment for learning

• engaging, meaningful assessment tasks; in line with the learning outcomes of the module

• students developing as learners – effective attributes and skills to self-assess and evaluate their own learning

• informal feedback (e.g. in-class group discussions, peer-review)

• formal feedback (a range of forms of feedback, used at a number of stages)

• opportunities to learn and practice assessments

• formative and summative – appropriate balancing of these two types of assessment


(Sambell et al, 2013)

Page 8: Assessment literacy

Assessment Literacy

‘Literacy’ as a term:

• traditionally utilised in the context of skills
• at times referred to in terms of attributes such as confidence, competence and fluency
• can also be seen as a threshold to further learning, deeper understanding and engagement


(Price et al, 2012)

Page 9: Assessment literacy

• refers to a student understanding and becoming fluent in assessment terms

• encompasses an appreciation of an assessment’s relationship to learning, understanding of assessment and feedback practices, as well as terminology used, the type, meaning and level of assessment criteria and standards

• equips one with an appreciation of the purpose and process of assessment, which enables one to engage deeply with assessment standards, choose which skill or area of knowledge to apply, and appreciate the appropriateness of each to a specific task


Assessment Literacy cont’d

(Price et al, 2012)

Page 10: Assessment literacy

Assessment Literacy cont’d

• assessment literates know the difference between sound and unsound assessments. They are not intimidated by the sometimes mysterious and always daunting technical world of assessment.

• when assessment-literate students undertake an assessment or task, they will already be familiar with the appropriate assessment standards; they do not discover the standards through doing the task


(Stiggins, 1995)

(Price et al, 2012)

Page 11: Assessment literacy


Page 12: Assessment literacy

• the transition period has a major impact on student retention
• in the UK, for instance, about two-thirds of withdrawals happen during or at the end of the first year

• a student population with diverse entry qualifications, abilities and learning experiences

• traditional expectations towards students have not changed: they are to manage their learning and acquire academic literacy independently

• under the pressure of league tables, students in secondary schools ‘tend to be “spoon-fed” for longer, and are less equipped with “self-learning skills” ’

• the lack of preparation for and understanding of the type of learning that is required makes it difficult for students to adjust to university life

(Drew, 2001; Gamache, 2002)

Why is Assessment Literacy important and why is it especially important for first year undergraduates?

(Yorke, 2001)

(Wingate, 2007)

(National Audit Office, 2002, p. 15)

Page 13: Assessment literacy

Activity

• Take 2 minutes to make notes with regard to the task below

• Share any key points with the audience

In what ways do you believe your current practice already incorporates elements to facilitate students’ development of assessment literacy?


Page 14: Assessment literacy


• improvement in students’ assessed performance

• the more students know what is expected of them in assessments, the more effectively they will be able to meet those requirements

• students are keen to know how to improve, because assessment is a dominant influence on their learning

Why Assessment Literacy?

(Price et al, 2010)

Page 15: Assessment literacy

Approaches to developing student understanding of assessment standards


Page 16: Assessment literacy

• My background – ELT & Applied Linguistics

• My initial aims for the Enhanced students – to be the bearer of knowledge, while strengthening their academic skills and abilities.

• The Cohort – 2 × ~45 students, 80% young males:

- reluctance

- resentment

• The 1st week.

• Some student comments:

“why do I have to do this course!”

“what’s the point of this?”

“how is this going to help me!”

Aims reworked – make everything relevant to the students’ assessment.


Page 17: Assessment literacy

Types of Assessment & My Intervention 1

1. Essay (Report) please refer to the marking criteria - ESSAY

1st draft – feedback via 1-2-1 tutorial and annotated essay

• Some areas covered:
  - Academic writing conventions, e.g. referencing/citations, thesis statements & paraphrasing
  - Criticality
  - Discursive vs descriptive writing

• Instruction given by means of:
  - general teacher-led instruction in seminar and/or lecture format
  - individual / group tasks
  - peer review

= “Dominant Logic” Explicit Model + The Social Constructivist Model


Page 18: Assessment literacy

Types of Assessment & My Intervention 2

2. Presentation please refer to the marking criteria - PRESENTATION

Practice presentation - feedback – final presentation

• Some areas covered:

- Formulaic language

- Formality / Style / Register

• Instruction given by means of:

- Individual / group exercises

- general teacher-led instruction in seminar and/or lecture format

= “Dominant Logic” Explicit Model


Page 19: Assessment literacy

The Cultivated Community of Practice Model

Not desirable!
– too early in the UG course for this approach
– even in some cases of using Social Constructivist Model-type tasks, students would imply that I was unprepared for the class or was being “lazy” by getting the students to do the assessing

• The benefits of Assessment Literacy may not, however, be immediately obvious to authority-dependent students who are motivated by certainty and believe it is the duty of infallible assessors to ‘correct’ student work

• Indeed, assessment-illiterate students may simply consider less authority-dependent assessment processes (such as peer review and peer assessment) ingenious ways for lazy tutors to shirk their marking responsibilities.


(Baxter Magolda, 1992; King & Kitchener, 2004, in Price et al, 2012)

(Price et al, 2012)

Page 20: Assessment literacy

• Quantitative
  • Comparison of means, Enhanced vs. Jan start cohort:
    – Grades: individual assessments and overall grades
    – Self-efficacy scores
  • Regressions of grades on self-efficacy at end of year 1
• Qualitative
  • Focus groups (data yet to be analysed)

Methods of evaluation
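The quantitative strand above can be sketched in a few lines. This is a minimal illustration with invented numbers (not the study's data): it compares cohort mean grades and computes the "variance explained" (R²) statistic that the findings report, via a simple least-squares regression.

```python
import numpy as np

# Illustrative sketch only: all figures below are invented,
# not taken from the study.

def r_squared(x, y):
    """Share of the variance in y explained by a least-squares
    regression of y on x (the 'variance explained' statistic)."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1 - np.sum(residuals**2) / np.sum((y - y.mean())**2)

# Comparison of means between two invented cohorts
enhanced = np.array([48.0, 55.0, 52.0, 60.0])
jan_start = np.array([58.0, 68.0, 50.0, 66.0])
mean_gap = jan_start.mean() - enhanced.mean()

# Invented "understanding of requirements" scores (1-5) paired
# with essay grades (%) for a combined sample
understanding = np.array([2.0, 3.0, 3.0, 4.0, 4.0, 5.0, 2.0, 5.0])
essay_grade = np.array([48.0, 55.0, 52.0, 60.0, 58.0, 68.0, 50.0, 66.0])

print(f"mean gap: {mean_gap:.2f}")
print(f"variance explained: {r_squared(understanding, essay_grade):.0%}")
```

A full analysis would add significance tests for the mean comparisons (e.g. a two-sample t-test); this sketch shows only the descriptive quantities the slides refer to.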


Page 21: Assessment literacy

Findings

• Grades of Enhanced students were slightly lower across all assessments, and many of these differences were statistically significant, with the exception of all assessments in HRM1004

• From a cross tabulation no differences were found with regard to the pass/fail ratio across the two groups with the exception of MSO

• No statistically significant differences were found with regard to ‘Understanding the requirements of different assessments’; in fact, scores of the Enhanced students were higher than those of January starts in all instances, but not statistically significantly so

• With regard to confidence levels, there were statistically significant differences only in the case of report writing, and again those of the Enhanced students were higher


Page 22: Assessment literacy

Findings cont’d

From the regression analysis not many models were found; however:

• Understanding the requirements of writing an essay explained 13% of the variance of the HRM essay grade, and 30% of the variance within the Enhanced students sample; no relationship was found for the January start students

• For all students the confidence in doing online tests explained 6.6% of the variance of the HRM online test grade

• Understanding the requirement of online tests explains 8.3% of the variance for enhanced students’ grades; no model for Jan start students

• Confidence in doing online tests explains 4% of the variance in MSO Test1 grade for the all students sample; 10% of the variance in MSO Test 1 grade for the enhanced students sample

• Confidence in doing online tests predicts 4% of MSO test 2 grade for the all students sample

• Interestingly, understanding the requirement of online tests predicts 11% of Accounting and Finance test grade for the Jan start sample

No other models found


Page 23: Assessment literacy

Discussion of findings

• The groups were compared due to convenience and an assumed difference in entry points (MBS 180-220 pts and January Start 240+ pts; this could not be verified yet)

• Results indicate that January start students appear to be outperforming their Enhanced cohort counterparts, although not significantly in the case of HRM1004

Why?

• Enhanced students had perceived higher clarity of the requirements of each assessment and were slightly more confident than their January start counterparts.

• The regression models indicate that there was a relationship between awareness and performance of the Enhanced students, especially in the case of the HRM1004 Essay

• 30% of the variance explained

Page 24: Assessment literacy

Conclusions and lessons learnt

• We don’t know whether this was down to the intervention or not; therefore pre- and post-intervention measures are required

• What is students’ understanding of the requirements of different forms of assessment?

• Better matching and alignment of assessments could lead to better outcomes; is assessment literacy tied to subject discipline or is it transferable?

• What can be done differently?

• the team is now made up of academics from across all the departments, allowing provision of subject-based criteria for assessments

• Is there a longer term impact, how do year two grades compare across the two cohorts?

• Can staff be developed to ensure clarity of criteria in a relatively consistent manner across and within subject disciplines?


Page 25: Assessment literacy

References

Brown, S. and Knight, P. (1994). Assessing learners in higher education. London: Kogan Page.

Drew, S. (2001). Student Perceptions of What Helps Them Learn and Develop in Higher Education. Teaching in Higher Education. 6(3), pp. 309–331.

Stiggins, R. (1995). Assessment literacy for the 21st century. Phi Delta Kappan.

Price, M., Rust, C., O’Donovan, B., Handley, K. and Bryant, R. (2012). Assessment Literacy: The Foundation for Improving Student Learning. Oxford: Oxford Brookes University.

The Higher Education Academy. (2012). A Marked Improvement Transforming Assessment in Higher Education. [online]. Available at: https://www.heacademy.ac.uk/sites/default/files/A_Marked_Improvement.pdf. [Accessed 1 July 2015].

Yorke, M. (2001). Formative Assessment and Its Relevance in Retention. Higher Education Research and Development. 20 (2), pp. 115–123.

Wingate, U. (2007). ‘A Framework for Transition: Supporting “Learning to Learn” in Higher Education’. Higher Education Quarterly. 61(3), pp. 391–405.
