
Piloting CAA: All Aboard

Gavin Sim & Phil Holifield

2

Overview

Introduction
CAA at UCLAN
Key Challenges
Staff Uptake
Framework
Staff Development
Students
Other Stakeholders
Conclusions and Discussions

3

Introduction

Teaching and Learning strategy incorporated e-learning – mainly content development

First summative CAA test delivered in summer 2003 using WebCT

TRIADS and Questionmark evaluated for pilot study

Questionmark adopted – felt easier for staff to develop their own questions

4

Introduction

Technical infrastructure analysed; concerns:
Scalability – expansion over time
Connectivity – internal and external colleges
Bandwidth – 10 Mbps available, multimedia

Purchased a dedicated server to host Questionmark, Internet Information Server & SQL Server

Integration with other systems was a concern, but will be addressed later

Piloting within the Department of Computing

5

Key Challenges

Encouraging Staff Uptake
Staff Development
Stakeholder Acceptance – e.g. management
CAA's perceived ability to test a range of cognitive skills
Practical Issues – labs

6

Methodology

Staff questionnaire: n=34, response rate 64%; views in relation to CAA, support and training

Framework developed based on Bloom's Taxonomy; 6 staff, 8 modules

Student questionnaire: n=86, response rate 94%; acceptance of the technique, question styles, language used, usability

7

Staff Uptake

Computing encompasses a range of subjects, from the technical (networking) to the subjective (HCI)

CAA may more readily lend itself to assessment in specific disciplines

Questionnaire revealed only five members of staff had used CAA, with 3 actively using it

To encourage uptake, CAA is being incorporated into the department's strategy: formative CAA for all level 1 modules, with summative use optional

8

Staff Uptake

Five staff now using CAA within the department

Questionnaire revealed 91% would use CAA for formative assessment and 56% for summative

The difference could be attributed to the level at which the lecturer teaches

Appropriateness of CAA for Summative Assessment

[Chart: percentage of staff by level (1–4): 78.26%, 69.57%, 21.74%, 17.39%; axes: Levels vs Percentages]

9

Framework

Analysing the structure of a module to identify how CAA could be incorporated

[Diagram: Bloom's Taxonomy, Learning Outcomes, Syllabus, Other Assessment Format, CAA]

10

Framework

Number of Learning Outcomes at each level of Bloom's Taxonomy

Modules: Level 1 – CO1652, CO1802, CO1804; Level 2 – CO2751, CO2752, CO2601; Level 3 – CO3707; Level 4 – CO4707

Knowledge: 1, 1
Comprehension: 3, 1, 1, 1, 1
Application: 3, 2, 6, 2
Analysis: 1, 3, 2
Synthesis: 1, 1, 1, 2, 1
Evaluation: 1, 2, 1, 3

11

Framework

The number of Learning Outcomes per module varies from 3 to 8

Level 1 modules sit at the lower cognitive levels; Level 2 module CO2601 (Technical Solutions and Business) requires students to demonstrate ability similar to that found on CO3707

The next step is to identify the elements of the syllabus and their relationship to the Learning Outcomes

This prevents unrelated content being integrated into the exam

12

Framework
Example for CO3707: identify the parts of the syllabus that relate to the learning outcomes

Learning outcomes: A, B, C, D
1. Consideration of primary users: X X X X
2. Introduction to Multimedia: X X
3. Introduction to human systems: X X X X
4. Multimedia Technology: X X X
9. Importance of evaluation and choice of metrics: X
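To make the mapping concrete, a minimal illustrative sketch in Python (not the authors' tool; the outcome letters attached to each topic below are assumed for illustration, not taken from the slide) of keeping only syllabus elements linked to at least one learning outcome:

# Illustrative only: topic-to-outcome links are assumed, not the real CO3707 data.
syllabus_map = {
    "Consideration of primary users": {"A", "B", "C", "D"},
    "Introduction to Multimedia": {"A", "B"},
    "Importance of evaluation and choice of metrics": {"D"},
    "Unrelated background topic": set(),   # hypothetical element with no linked outcome
}

# Only syllabus elements linked to at least one learning outcome feed the exam,
# preventing unrelated content from being integrated into the test.
examinable = [topic for topic, outcomes in syllabus_map.items() if outcomes]
print(examinable)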

13

Framework
Number of syllabus elements at each level of Bloom's Taxonomy

Modules: Level 1 – CO1652, CO1802, CO1804; Level 2 – CO2751, CO2752, CO2601; Level 3 – CO3707; Level 4 – CO4707

Knowledge: 2, 1
Comprehension: 2, 1, 2, 2, 2, 4
Application: 2, 3, 2, 2, 2, 3
Analysis: 1, 3, 3
Synthesis: 3, 2, 1
Evaluation: 5, 6, 1, 9

14

Framework

The framework will be used on an MSc Web Development module

The module is assessed entirely by coursework; a formative test will run in the first semester

This enables students to gain early feedback and the lecturer to obtain an early indication of their progress

The framework shows how staff can integrate CAA into modules, but further development is necessary

15

Staff Development

Staff were asked ‘Would you be prepared to input the questions into the software yourself?’ – 80% said yes

This may not reflect the attitude of staff in other departments

Staff Support for CAA

[Chart: percentage of staff by issue (Software, Developing Questions, Invigilation, Question Design, Organise Labs); values shown: 86.96%, 73.91%, 60.87%, 78.26%, 43.48%; axes: Issue vs Percentage]

16

Staff Development

74% of lecturers need support in question design

The LDU organised staff development in CAA: ‘An introduction to Computer Assisted Assessment’

CIF bid for funding to pay a developer to work with staff to develop multimedia questions

81% felt more time is required to write questions; question banks and experience should reduce this

61% of lecturers felt help with invigilation is essential

17

Staff Development

Informal focus groups to discuss problems and share experiences
How to accommodate students with special needs
Invigilation issues
Risk issues, e.g. server failure

Without this, students' experience may differ from module to module

18

Students

Attitude was measured through a series of questionnaires

Students were asked ‘Would you find this format of assessment an acceptable replacement for part of your final exam?’

5-point Likert scale: Strongly Disagree=0, Strongly Agree=4

Mean=2.9, SD=0.9, 99% confidence interval ±0.26, indicating a reasonable level of support
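As a rough check on the quoted figure, a minimal sketch of the interval calculation, assuming roughly n=86 respondents and a normal approximation (neither is stated on the slide):

import math

n, sd, z99 = 86, 0.9, 2.576              # 2.576 = two-tailed 99% critical value
half_width = z99 * sd / math.sqrt(n)     # critical value times the standard error of the mean
print(round(half_width, 2))              # ~0.25, consistent with the reported ±0.26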

19

Students

Research into computer anxiety and CAA (Liu et al. 2001; Zakrzewski & Steven 2000)

Concern as students had no prior experience of Questionmark

‘This format of assessment is more stressful than a paper-based test’: Mean=0.99, SD=0.987, confidence interval ±0.28

Comments: ‘I prefer completing a test in this way as it is less intimidating’; ‘As a computer geek I feel more at ease in front of a computer.’ (final exam)

20

Students

‘Did you have any difficulties accessing the test?’ 14% Yes

The majority of problems involved copying the password from the email, which included white space

The software could trim white space

Authentication could be achieved through an LDAP process
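A minimal sketch of both suggestions, assuming the Python ldap3 library and a hypothetical institutional directory; this is not how Questionmark itself authenticates:

from ldap3 import Server, Connection

def authenticate(username: str, password: str) -> bool:
    # Trim the white space picked up when the password is copied from the email
    username, password = username.strip(), password.strip()
    # Bind against the directory; the server address and DN layout are hypothetical
    server = Server("ldap.example.ac.uk")
    conn = Connection(
        server,
        user=f"uid={username},ou=students,dc=example,dc=ac,dc=uk",
        password=password,
    )
    return conn.bind()   # True if the credentials are accepted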

21

Students

Questionmark was used with question-by-question delivery

Standard templates were used; the suitability of a number of templates is questioned, e.g. scrolling, navigation

One idea is to have a template bank

22

Students

Series of questions relating to the interface

Question – Mean – Standard Deviation
The test was easy to use – 3.13 – 0.838
It is easy to read the characters on the screen – 3.18 – 0.917
The screen layout is clear – 3.06 – 0.843
The screen layout is consistent – 3.15 – 0.823
The navigation was clear – 2.77 – 0.992
I always know where I am in the software – 2.95 – 0.851
The button location is consistent – 3.21 – 0.709
The order of the navigation buttons is logical – 2.95 – 0.881
The button names are meaningful – 3.01 – 0.845
The on-screen navigation is easily distinguished from the questions – 3.13 – 0.858

23

Students

81 students completed the questionnaire; 30 provided qualitative feedback

A facility to go directly back to a previous question was requested (11 times)

The ‘Proceed’ button was felt to be inappropriately placed near the main navigation

These features will be incorporated into a forthcoming test and further analysis will be conducted

24

Other Stakeholders

Information System Services and management informed through a steering committee

Responsibility to report the findings of the evaluation for institution-wide deployment

Without the support of management, additional resources will not be made available

25

Conclusions and Discussions

Scepticism about the appropriateness of CAA at levels 3 and 4 for summative assessment

The framework showed how it may be incorporated, but further research is required

Adopting CAA into the department's strategy increased uptake, but staff development is necessary

Students responded positively to the experience; the logging-in process could be improved

A comparison of WebCT and Questionmark is planned

26

Questions