
Slide 1

Identifying students ‘at risk’: early warning at macro and micro levels Barry McCluckie & Lorna Love

Slide 2

Introduction

Early warning for Science advisers: Using attendance data at the macro level

The project & the process used

ROC Analysis- trigger level

Outcomes

Other and future work + stats

Slide 3

Process

Early warning for Science advisers: using attendance data at the macro level

Can we use 1st-semester science attendance data to predict which Level 1 science students will withdraw or have a poor 1st year? (Poor 1st year: GPA ≤ 9 or fewer than 120 credits.)

Data types gathered: lab, tutorial, workshop, field trip & class test attendance

Data for 19 courses collected by 14 individuals, using Excel/Access.

Data collected in week 6 (for teaching weeks 1 to 5).

Reformatted and made suitable for input into an Access database.

Data is then ‘cleaned’ by comparing expected to actual attendance.

Queries are run in the database to select ‘at risk’ students in week 7, and the data is made available to advisers at the end of that week via the University’s ‘Single Sign On’.
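A minimal sketch of that selection step in Python, assuming attendance is held as one record per student per scheduled session with a 0/1 attended flag (the field names and layout are illustrative, not the actual Access schema):

    # Minimal sketch of the week-7 'at risk' selection, assuming one
    # (matric, course, session, attended) record per scheduled session
    # for teaching weeks 1-5. Field names are illustrative, not the
    # real Access schema; 75% is the trigger level chosen later.
    from collections import defaultdict

    def flag_at_risk(records, threshold=0.75):
        """Return matric numbers whose attendance rate falls below threshold."""
        expected = defaultdict(int)
        attended = defaultdict(int)
        for matric, course, session, present in records:
            expected[matric] += 1        # every scheduled session counts
            attended[matric] += present  # actual attendance, post-cleaning
        return {m for m in expected if attended[m] / expected[m] < threshold}

    records = [
        ("50701589", "BIOLOGY1A", "L2", 1),
        ("50701589", "BIOLOGY1A", "L3", 1),
        ("50701589", "BIOLOGY1A", "L4", 0),
        ("50701589", "BIOLOGY1A", "L5", 0),
    ]
    print(flag_at_risk(records))  # {'50701589'}: 50% attendance < 75%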

Slide 4

ROC curve analysis 1

Receiver Operating Characteristic curve

Real-world decisions are made in the presence of uncertainty!

What is a ROC curve? A graphical representation of the trade-off between the True Positive Rate (TPR) and the False Positive Rate (FPR) as the discrimination threshold is varied.

Discrimination threshold: The tipping point between a prediction being classified as ‘Positive’ or ‘Negative’.

As the discrimination threshold is varied, the benefit and cost values change. This is what we want to know about!
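As a sketch of how those rates are computed, the snippet below sweeps the threshold over attendance rates, taking ‘positive’ from the deck’s definition of a poor 1st year (GPA ≤ 9 or fewer than 120 credits); the student tuples are invented for illustration:

    # Sketch: trace ROC points by varying the attendance threshold.
    # 'Positive' = poor 1st year (GPA <= 9 or < 120 credits); the
    # (attendance_rate, gpa, credits) tuples below are invented.
    students = [
        (0.95, 14, 120), (0.90, 12, 120), (0.60, 8, 80),
        (0.55, 9, 100),  (0.85, 10, 120), (0.40, 6, 40),
    ]
    positives = [a for a, g, c in students if g <= 9 or c < 120]
    negatives = [a for a, g, c in students if g > 9 and c >= 120]

    for threshold in (0.50, 0.75, 0.95):
        tpr = sum(a < threshold for a in positives) / len(positives)  # sensitivity
        fpr = sum(a < threshold for a in negatives) / len(negatives)  # 1 - specificity
        print(f"threshold {threshold:.2f}: TPR={tpr:.2f}, FPR={fpr:.2f}")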

Slide 5

ROC curve analysis 2

[Figure: contingency table showing the distribution for actually negative cases (stayed on / good performance) and the distribution for actually positive cases (withdraw / poor performance), with one possible discrimination threshold marked.]
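To make the table concrete: taking ‘positive’ to mean withdrawal only (a simplification; the project’s definition also covers poor performance), the 2007-08 figures quoted later in the deck (20 of roughly 77 withdrawals flagged, 98 flagged in all, 1229 students) give:

                           Actually positive   Actually negative
    Flagged ‘at risk’      TP = 20             FP = 78
    Not flagged            FN = 57             TN = 1074

    TPR (sensitivity) = 20 / 77 ≈ 0.26
    FPR (1 - specificity) = 78 / 1152 ≈ 0.07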

Slide 6

ROC curve analysis 3

Slide 7

ROC curve analysis 4

[ROC curve figure: sensitivity plotted against 1 - specificity, with points labelled by attendance thresholds from 50% to 95% in 5% steps.]

Slide 8

ROC curve analysis 5

The discrimination threshold selected was a 75% attendance level, in order to maintain an adviser-to-student ratio of 2 (a sketch of this choice follows below).

1229 first-year students (07-08), with 98 (8%) deemed ‘at risk’.

Information was made available to advisers via the University’s ‘Single Sign On’.
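A sketch of that threshold choice, reading ‘ratio equal to 2’ as at most two flagged students per adviser (an assumption); the cohort distribution is invented, but the 63 advisers match the 07-08 stats slide:

    # Sketch: pick the highest attendance threshold whose flagged count
    # keeps the load at or below ~2 flagged students per adviser.
    def choose_threshold(rates, advisers, max_per_adviser=2):
        best = None
        for t in [x / 100 for x in range(50, 100, 5)]:  # 50%, 55%, ..., 95%
            flagged = sum(r < t for r in rates)
            if flagged <= advisers * max_per_adviser:
                best = t  # keep the highest feasible threshold
        return best

    # hypothetical cohort of 1229 attendance rates (invented distribution)
    rates = [0.55] * 40 + [0.70] * 58 + [0.78] * 60 + [0.95] * 1071
    print(choose_threshold(rates, advisers=63))  # 0.75 for this invented cohort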

Example

Matric    Title  FirstName  Surname  Email                  Telephone    Course      L2 L3 L4 L5 T2 T3 T4 CT
50701589  MR     JOHN       KING     [email protected]  01416384321  BIOLOGY1A   1  1  0  0
                                                                         CHEMISTRY1  0  0  0
                                                                         GEOGRAPHY1  0  1  1  1

Slide 9

Outcomes

Results for 2007-08 students: 20 students who withdrew were correctly identified (26% of all withdrawals by summer ’08).

78 ‘at risk’ students continued until the end of the academic year. (80% of those at risk)

Of these, 49 had a negative outcome and 29 a positive outcome (summer ’08).

21 of the positive students spoke to advisers, while 25 of the negative students spoke to an adviser.

In all, 69 ‘at risk’ students (20 withdrawals + 49 negative outcomes) withdrew or had a negative outcome (70% of those identified as ‘at risk’, summer ’08).

37 of the 98 are now in 2nd year.

Slide 10

Other & Future work

All 1st semester attendance data was made available to Advisers of Studies for their students via the ‘Single Sign On’.

Early Warning System expanded to include the Arts Faculty and most of the LBSS Faculty (08-09, 1st-year courses only).

Improve the trigger: regression/survival analysis using data from Bi-Query, comparing ROC curves (see the sketch after this list).

Improve the efficiency of attendance-gathering tools. More use of barcode scanning and databases? More centralised data gathering as the university moves to a new student records system?
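One way to compare a regression-based trigger with the raw attendance trigger is by the area under each ROC curve, sketched here with scikit-learn; all data is invented, and the real inputs would come from Bi-Query:

    # Sketch: compare the plain attendance trigger with a logistic-
    # regression score by AUC (a one-number summary of the ROC curve).
    # All data below is invented for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 500
    attendance = rng.uniform(0.3, 1.0, n)
    prior_quals = rng.normal(0.0, 1.0, n)  # e.g. standardised entry grades
    # invented ground truth: poor outcomes more likely with low attendance
    p = 1 / (1 + np.exp(6 * attendance + prior_quals - 4))
    y = rng.random(n) < p

    X = np.column_stack([attendance, prior_quals])
    score = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]

    # in practice the comparison should use a held-out year, not training data
    print("attendance-only AUC:", roc_auc_score(y, 1 - attendance))
    print("regression AUC:     ", roc_auc_score(y, score))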

Slide 11

STATS

Year   Courses  Advisers  Students
07-08  19       63        1229
08-09  85       140       2707
09-10  100+?    200+?     3800+? (80% of L1)

Slide 12

Early warning at a micro level

Early warning: using diagnostic data at the micro level

Project background

- Study Support Co-ordinator for FIMS (Faculty of Information and Mathematical Sciences), Oct. ’08

- assist struggling students in Mathematics, Computing Science, Statistics

Evolution of fine-grained ‘at risk’ student data

Use of data collected

Slide 13

Dept. of Computing Science

Context

Computing Science: two Level 1 modules - the long, thin courses CS1P and CS1Q

- Level 1: approx. 170 students, with a wide variety of prior experience

- a problem with students succeeding in CS1P

Voluntary Peer Assisted Learning (PAL) for CS1P - early, diagnostic class test

- more than a ‘pass/fail’: categories of strugglers

- ‘fundamentals’, ‘hand execution’, ‘problem solving’

- PAL options: structured peer revision, supported problem-solving sessions
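As a sketch, the diagnostic categories could drive the PAL offer directly; the category names come from the slide, while the specific routing is an illustrative assumption:

    # Sketch: route each struggler category to a PAL option. Category
    # names are from the slide; the mapping itself is an assumption.
    PAL_OPTIONS = {
        "fundamentals":    "structured peer revision",
        "hand execution":  "structured peer revision",
        "problem solving": "supported problem solving sessions",
    }

    def pal_offer(diagnosis):
        """Map a student's weak categories from the class test to PAL options."""
        return sorted({PAL_OPTIONS[c] for c in diagnosis})

    print(pal_offer(["fundamentals", "problem solving"]))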

Slide 14

Computing Science Tutors

Diagnostic class test data

Purpose of data

- PAL offered to all CS1P students, targeted specifically at struggling students

- purpose: to inform the PAL co-ordinator and the Level 1 facilitators

‘At risk’ student list: this information was shared informally with the tutors

- surprising result ...

Slide 15

Mismatch of concern

Surprising result

Conflict: My ‘at risk’ list ≠ the tutor’s ‘at risk’ list

- some students I was concerned about were thought to be ‘doing ok’

- some students that the tutor was concerned about passed the class test

- the problem is that ‘35% in the class test’ does not tell the whole story

- the tutor knows more about the student than their mark

Examples of categories of progress from the tutor’s perspective

- “not seen student enough to be able to comment”

- “trying hard, making progress, will get there with continued effort”

- “done a lot of programming before but not willing to engage with new concepts”

- “turns up late and unprepared”

Slide 16

The new ‘at risk’ data

Rich collection of data: recorded the tutors’ comments and also added in attendance data.

Now the student data has information from 3 sources (a sketch of combining them follows the list below):
1: defined via diagnostic class test(s)
2: defined by tutor (and PAL facilitator)
3: defined via attendance data

What do we do with this new ‘at risk’ list?

- PAL Co-ordinator and Facilitators

- Head of Year

- Course director or Lecturer(s)

- Adviser of studies (Science or non-Science)

- Tutor

- Study Support Co-ordinator
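A sketch of merging the three sources into one record per student, so any of the recipients above sees the class-test category, the tutor’s comment and the attendance rate together (the data structures and values are assumptions for illustration):

    # Sketch: merge the three 'at risk' sources into one record per
    # student. Matric numbers and all values below are invented.
    test_flags = {"s101": "fundamentals", "s102": "problem solving"}
    tutor_flags = {"s102": "turns up late and unprepared",
                   "s103": "not willing to engage with new concepts"}
    attendance = {"s101": 0.60, "s102": 0.85, "s103": 0.50}

    low_attenders = {s for s, a in attendance.items() if a < 0.75}
    at_risk = {}
    for s in set(test_flags) | set(tutor_flags) | low_attenders:
        at_risk[s] = {
            "class_test": test_flags.get(s),
            "tutor": tutor_flags.get(s),
            "attendance": attendance.get(s),
        }
    for s in sorted(at_risk):
        print(s, at_risk[s])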

Slide 17

Progress ‘chats’ with students

Trial using Study Support Co-Ordinator to intervene

Contacting advisers

- sent an email to the adviser of each student of concern (roughly 60 of 170)

- emailed details of concerns and the additional tailored support offered

- Websurf contains no indication of CS progress until end of year

Texting and meeting students

- asked them to come for a brief, informal chat

- spoke with 15 students, mostly a positive experience

- the aim was to direct students to tailored support, but the chat alone proved useful

New approach

- the Study Support Co-ordinator has time

- the Study Support Co-ordinator is not involved in the presentation of lectures or assessment

- the Study Support Co-ordinator is approachable, a subject specialist, and a (relatively!) recent graduate of the Science Faculty

Slide 18

Evaluation

Work in Progress...

Is micro-level early warning worth it?

1. What time and effort are involved? (Estimate: on average, 1 day per week on top of significant ‘set-up’ costs.)

2. Did it correctly identify the students who withdrew from the course or failed?

3. Did it identify students not flagged up by the macro warning system?

4. Did it offer correct progression advice based on the diagnosis?

5. Did the information sent to the advisers improve the uptake of PAL?

6. Did chats with the Study Support Co-ordinator improve the uptake of PAL, study skills and motivation?

Ultimately, did this increase retention within Level 1 Computing Science?

Slide 19

Plans for Next Year

Other modules

- Level 1 and Level 2 Computing Science

- Level 1 and Level 2 Mathematics

Formalise process of data collection, sharing and contacting students

- last year’s process was ad hoc and informal

- timeline with triggers for collecting data from tests, tutors, facilitators and for contacting students and advisers

- devise useful, informative discrete ‘categories of concern’ from tutors

Slide 20

Other departments

Recommendations

Time and effort: creating diagnostic, formative assessment is easier for some subjects than others

- once created, the resulting data is easy to manage

- speaking to tutors is time-consuming; categorisation and automation are essential

- estimated spend: 1 day per week monitoring and tracking approx. 60 of 170 students

Use of data

- who deals with the ‘at risk’ students?

- exit from course or ‘back on track’?

- to ‘get back on track’, more than a chat is required: a support mechanism

- no point in identifying students if there is no plan of action