
Webinar Series

This session is part of e-Assessment Scotland 2013, 30 Aug 2013:

Learning analytics: a bottom-up approach to enhancing and evaluating students' online learning
Josie Fisher, Fredy-Roberto Valenzuela and Sue Whale (University of New England, Australia)

Your hosts

Professor Geoff Crisp, Dean Learning and Teaching, RMIT University
geoffrey.crisp[@]rmit.edu.au

Dr Mathew Hillier, Teaching and Educational Development Institute, University of Queensland
mathew.hillier[@]uq.edu.au

Learning Analytics: A Bottom-up Approach to Enhancing and Evaluating Students’ Online Learning

Josie Fisher, Fredy-Roberto Valenzuela and Sue Whale

Structure of the Presentation

1. Literature Review
2. Overview of the Research Projects
3. Stage One of the Project
4. Stage Two of the Project
5. Final Conclusions and Remarks

1.a. Understanding What Analytics Means

Analytics refers to the application of business intelligence principles and tools to academia

Academic analytics takes an institution-level focus

Learning analytics focuses on the learning process including the relationship between learner, content, institution and educator

1.b. Literature

• With increased enrolment numbers and study options, the diversity of the student population has also increased

• Students who successfully complete a subject are more likely to re-enrol

• Correlation between learners' engagement in effective learning practices and their academic success (Campbell et al., 2010)

Literature (Cont’)

• Perception of psychological presence of teaching staff, peers and the institution has a significant impact on student learning, satisfaction and motivation.

• Effectiveness of coaching – supported by Bettinger and Baker (2011) and Nelson et al. (2009)

2. Overview of the Research Projects Undertaken

3. Stage One of the Project

Case study - used learning analytics to inform strategies aimed at enhancing student outcomes at the subject level

Although we took a subject-level focus, institutional benefits were anticipated if these strategies were successful

Subject information

• Compulsory professional ethics subject in Graduate Certificate and Masters

• Full fee
• Offered online through LMS
• Typical enrolment of around 140
• Two years’ relevant work experience required

Student demographics

• 70% aged between 30 and 49
• Gender balance even
• Most working full-time
• All studying part-time
• 73% had no previous university studies
• High % ESL

Most of these student characteristics can contribute to the likelihood of attrition and reduced success rates (Gabb, Milne & Cao, 2006), and this provided the motivation for developing an intervention program

Research questions

1. What inferences can be drawn from an analysis of students’ outcomes in the first two (online) assessment tasks that could inform subsequent offerings of the subject?

2. To what extent is participation in the LMS a predictor of overall achievement in the subject?

Methods

Historical data from 2011 were analysed in two ways:

1. Participation and achievement in the first two assessment tasks and final results were compared

2. Content access and usage patterns of students were analysed for the five highest and lowest achievers
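A rough Python/pandas sketch of the first of these comparisons. The file name and column names (student_id, completed_activity, test_mark, final_mark) are assumptions for illustration, not the study's actual data export:

```python
import pandas as pd

# Hypothetical export of 2011 subject results; column names are assumptions,
# not the actual LMS/gradebook fields used in the study.
results = pd.read_csv("subject_results_2011.csv")
# columns: student_id, completed_activity (bool), test_mark, final_mark

# Compare pass rates and average marks for students who did / did not
# complete the optional Activity (a final mark of 50+ counted as a pass).
results["passed"] = results["final_mark"] >= 50
summary = results.groupby("completed_activity").agg(
    pass_rate=("passed", "mean"),
    avg_test_mark=("test_mark", "mean"),
    avg_final_mark=("final_mark", "mean"),
)
print(summary.round(2))
```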

Analysis of student outcomes

• Students who completed the Activity were more likely to successfully complete the subject (91% compared to 72%)

• Students who completed the Activity achieved an average of 79% in the Test compared to 73% for those who did not

• Students who completed the Activity achieved an average final mark of 66% compared to 58% for those who did not

Cont.

• Of the 9 students who failed the Test, only 3 successfully completed the subject with marks of 50, 52, 52

• The 30 students who achieved Test marks less than 65% achieved an average final mark of 54.5%

Inferences

• Students who completed the optional Activity were more likely to successfully complete the subject than those who didn’t

• Students who completed the Activity achieved better marks in the Test than those who did not complete it

• Test results were an indicator of future success in the subject

Proposed interventions with students who do not complete the Activity

• Provide private discussion space for these students to discuss content related to the activity

• Personal contact from lecturer to identify issues which may be alleviated

Proposed interventions with students who fail the Test

• Host synchronous online forum/s to reiterate subject matter covered in the Test

• Create private discussion space for students where questions can be asked and further explanations provided by lecturer

• Provide additional resources to enhance understanding of key concepts

Extent to which participation in the LMS predicts overall success

Sample of 5 students whose final marks were 80%+

• These students accessed learning materials regularly and in sequence

• Four accessed the weekly podcasts
• They all completed the optional Activity
• They accessed the topic forums regularly

Cont.

Sample of 5 students whose final marks were 40-49%

• These students demonstrated random access to the LMS

• Three students did not access the weekly podcasts at all while the other two only accessed two of the 12 podcasts

• Some did not access the LMS until after week three
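A rough sketch of how such access patterns could be summarised from an LMS click log for comparison across students; the column names (student_id, accessed_at, resource_type) and the teaching-period start date are assumptions:

```python
import pandas as pd

# Hypothetical LMS access log: one row per content access, with a timestamp.
log = pd.read_csv("lms_access_log_2011.csv", parse_dates=["accessed_at"])
start = pd.Timestamp("2011-02-28")  # assumed first day of the teaching period

# Convert timestamps to teaching-period week numbers (week 1, 2, ...).
log["week"] = (log["accessed_at"] - start).dt.days // 7 + 1

# Per student: how many distinct weeks they accessed the LMS, the first week
# they appeared, and how many podcast accesses were logged.
pattern = log.groupby("student_id").agg(
    weeks_active=("week", "nunique"),
    first_week=("week", "min"),
    podcast_accesses=("resource_type", lambda s: (s == "podcast").sum()),
)
print(pattern.sort_values("weeks_active", ascending=False))
```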

Inferences

Students were more likely to succeed if they:

• Accessed all materials in sequence and regularly

• Accessed additional resources (video clips and podcasts)

• Accessed and participated in forums
• Completed the optional Activity

Proposed interventions

• Monitor consistency and regularity of access and highlight potential risks of not maintaining study pattern

• Contact students who have not accessed materials by end of week two

• Emphasise the importance of podcasts

4. Stage Two: The OLT project

• Explored use of simple real-time learning analytics by individual educators as part of their learning and teaching activities in order to inform interventions which were implemented within the teaching period

• Offered the opportunity to explore the potential impact of these interventions on student engagement and satisfaction across a number of subjects offered by distance mode (online) in UNEBS.

Aim

Critical evaluation of the use of LA and associated interventions to increase student engagement and satisfaction across subjects offered by distance mode (online).

Design

• Implemented by individual educators

• Using student data readily available in the LMS

Approach

Analysis of behaviours

• Utilised results of case study to establish patterns of behaviour and resulting success and satisfaction for students.

• Behaviours identified as limiting success of students:
– Limited access to LMS in early weeks of teaching period
– Poor results in early assessment
– Limited access to materials relating to major assessment tasks
– Inconsistent access across teaching period

These ‘triggers’ were used to inform interventions which were implemented and evaluated during the teaching period – providing opportunities to explore the potential impact on students’ learning experience.
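As a rough illustration only (not the project's actual tooling), such triggers could be expressed as simple flags over a per-student summary; every column name and threshold below is an assumption:

```python
import pandas as pd

# Assumed per-student summary built from LMS logs and early assessment marks.
students = pd.read_csv("student_summary.csv")
# columns: student_id, first_week, weeks_active, early_test_mark,
#          assessment_page_views

triggers = pd.DataFrame({"student_id": students["student_id"]})
triggers["late_start"] = students["first_week"] > 2               # little access in early weeks
triggers["poor_early_result"] = students["early_test_mark"] < 65  # weak early assessment
triggers["low_assessment_access"] = students["assessment_page_views"] < 3
triggers["inconsistent_access"] = students["weeks_active"] < 8    # of a 12-week period

# Any raised flag makes the student a candidate for a personal contact.
flagged = triggers[triggers.drop(columns="student_id").any(axis=1)]
print(flagged)
```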

Process and results

• Timing of each intervention was based on individual behaviours of students

• Tailored to utilise a range of contact methods – phone calls, email and broad messaging (personalised contact was the preferred method)

Intervention 1

Limited access to LMS and learning materials in first 2-3 weeks of teaching period

• 43 students contacted personally through phone calls
• Questions asked:
– Have you looked at unit materials?
– Have you experienced problems with access or working through materials?
– Do you have the prescribed text?
– If an indication was given that the student was considering withdrawing – why? Could we assist with getting started and continuing?

Intervention 1 (continued)

• Those unable to be contacted by phone were approached via personal email

• Email responses followed up within 24 hours

Student feedback

‘thank you very much for your email…thank you for your support’

‘thank you very much for getting my focus back’

‘sincere thanks for your email – I appreciate your interest’ (Student comments, 2013)

Intervention 2

Poor results or non-participation in early assessment items was an obvious indicator of students who may be struggling with subject content. Students were contacted via email:

‘Dear XXX,
The Online Test marked the ‘official’ end of Module One. The content, however, will be applied throughout Module Two, so an understanding of the main theories and concepts introduced in Topics 1.1 – 1.5 will be required. If you did not do as well as you had hoped in the Online Test, I suggest you carefully review the questions you answered wrongly. If you have difficulty in understanding where you went wrong, or you have any other questions, please contact me.’

Intervention 2 (continued)

This intervention focused on early assessment tasks and those which were not weighted heavily. Since performance in these early assessment tasks provides a strong indicator of success in the remainder of the unit, ensuring students engaged with the materials and understood the key concepts was vital.

Intervention 3

The third intervention was aimed at students who had not accessed the LMS for more than 7 days prior to the due date of the major assessment task. These (36) students were contacted personally by email, and a number of them subsequently submitted the assignment.
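A minimal sketch of the inactivity check behind this intervention, assuming an illustrative due date and a hypothetical access-log layout (student_id, accessed_at):

```python
import pandas as pd

# Assumed inputs: the major assessment due date and a log of LMS accesses.
due_date = pd.Timestamp("2013-05-10")  # illustrative due date only
log = pd.read_csv("lms_access_log.csv", parse_dates=["accessed_at"])

# Last recorded access per student; flag anyone silent for more than 7 days
# before the assignment falls due.
last_access = log.groupby("student_id")["accessed_at"].max()
inactive = last_access[last_access < due_date - pd.Timedelta(days=7)]

print(f"{len(inactive)} students to contact personally by email")
print(inactive.sort_values())
```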

Online Survey (35% response rate)

Subject | Initial prompts were perceived as positive by students | Prompts encouraged engagement with materials | Prompts assisted in preparation for assessment | Prompts enhanced learning experience / increased satisfaction
MM110 | 5 | 4.75 | 5 | 5
GSB731 | 5 | 5 | 5 | 5
GSB751 | 4.1 | 4.0 | 4.4 | 4.1

Table 2: Targeted Students Survey Results (out of 5)

Online Survey (continued)

Students’ responses:
• Prompts encouraged them to engage with materials
• Considered prompts assisted in assignment preparation
• Prompts enhanced learning experience

Online Survey (continued)

Test group: 57 students not targeted by the interventions were compared with 16 students who were targeted

Results

Question | Students Targeted by the Interventions | Students NOT Targeted by the Interventions
Overall Learning Experience | 4.3 | 4.0

Table 3: Results of End of Trimester Survey (out of 5)

5. Final Conclusions and Remarks

• Tracking students’ activities and the timely implementation of interventions has the potential to influence students’ behaviours and improve chances of success, and hence to enhance students’ online learning experience

• Behaviours were identified which may impact student results in online learning

• Consideration given to timing and format of interventions – emphasis on personal contact

Final Conclusions and Remarks (cont’)

• Student responses revealed that interventions were highly appreciated as they improved learning experience

• Challenges – workload

• Next steps – Interviews

• Future Research – Ethics vs Analytics