How can DIT academic staff use Webcourses data and reporting to make better informed decisions around student learning?

M Sc Applied eLearning - WIP Presentation



Page 1: M Sc Applied eLearning - WIP Presentation

How can DIT academic staff use Webcourses data and reporting to make better informed decisions

around student learning?


Page 2: M Sc Applied eLearning - WIP Presentation

» The research can be seen as an evaluation of the data analysis tools within Blackboard Learn. - What information can we glean from these analytical features?

» Explore how engagement with course modules compares with module grades

» This study will focus on the following features

– Module Reports (Real data)

– Performance Dashboard

– Retention Center


Page 3: M Sc Applied eLearning - WIP Presentation

Instructor view in Blackboard. Test module created for the purposes of the workshop

Page 4: M Sc Applied eLearning - WIP Presentation


•Educause Learning Initiative (Oblinger, Brown)

•Papers submitted to the annual LAK – Learning Analytics and Knowledge Conference 2011/2012/2013

•The Educause Annual Conference 2012/2013; waiting on 2013 (120 days)

•Other key players – Dawson, Siemens, Oblinger, Brown, Elsa, Campbell, McWilliam

•ECAR 2013 – Discusses students' lukewarm attitudes towards learning analytics

•Previous data mining projects within LMS

•Purdue University – Very much the flagship project for learning analytics (Campbell)

•The Indicators Project – Conducts research into the analysis of data within an LMS (Beer, Clark, Jones). Click activity? Critical of some of these studies, as it is debatable whether click activity is an indicator of meaningful learning activity.

Literature Review

Page 5: M Sc Applied eLearning - WIP Presentation


Previous studies on LMS data

The project identified a positive correlation between student participation in online discussion forums and final academic performance (Macfadyen & Dawson, 2010). Active site engagement with the LMS can serve as an effective predictor of course outcomes (Smith, Lange, & Huston, 2012, p. 60; Dawson, McWilliam, & Tan, 2008, p. 227).

Other Readings

2013 ECAR Report (Dahlstrom, Walker, & Dziuban, 2013)

7 in 10 HEIs see learning analytics as a major priority, but only 10% of HEIs collect the system-generated data needed for analytics

Discusses Ethical and Privacy Issues

Highlighted some considerations when I was submitting to REC:

Discusses students' lukewarm attitude to learning analytics

Openness and transparency – adhere to good ethical guidelines/information privacy guidelines

Personalised outreach, not impersonalised digital profiling

ECAR 2012/2013 – LMS listed in the top three preferred methods of communication, along with face-to-face interaction and email (Dahlstrom, Walker, & Dziuban, 2013)

ECAR 2013 discusses students' lukewarm attitudes towards learning analytics (Dahlstrom, Walker, & Dziuban, 2013)

LAK 11 – Raises deep and complex privacy issues / perception of a digital big brother (Brown, 2013; Ferguson, 2012; LAK 11 Educause, 2011; Prinsloo & Slade, 2013)

Page 6: M Sc Applied eLearning - WIP Presentation


Other Lit Review Findings

Lit review highlights numerous studies involving mining of LMS data using third-party software outside the LMS, such as SAS, SPSS, Business Objects, Oracle, Student Explorer, etc. This can be extremely difficult.

Very few studies focus on the inbuilt reporting features of an LMS; there is a clear lack of research in this area.

Commercial systems' reportage of data is "basic and not intuitive". "The current visualisation mechanisms available in Blackboard 8.0 and Blackboard Vista are limited in scope and difficult for teachers to readily interpret and action." (Dawson & McWilliam, 2008)

Other research studies have indicated possibilities for course redesign

Page 7: M Sc Applied eLearning - WIP Presentation

» Snowball sampling / referral sampling technique to identify staff participants (next step: circulate information sheets and consent forms) (Nov)

Mixed Method

» Quantitative analysis conducted at the end of Semester one on 4-5 modules via module reporting. Engagement within course modules will be compared with assessment results. All data de-identified. (Jan)

» Resource – Workshop for staff demonstrating the following features (Feb):

Module Reports

Performance Dashboard

Retention Center

» Interview staff participants in March 2014 (Qualitative)
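The planned quantitative step — comparing module engagement with assessment results — could be sketched as follows. This is a hypothetical illustration only, not the study's actual analysis: the data values are invented, and it assumes a module report exports a per-student activity count such as course accesses.

```python
# Hypothetical sketch: correlate de-identified LMS engagement counts
# (e.g. course accesses from a Webcourses module report) with module grades.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example data: course accesses per student vs. module grade (%)
accesses = [12, 45, 30, 8, 60, 25]
grades = [48, 72, 65, 40, 81, 58]

r = pearson(accesses, grades)
print(f"Pearson r = {r:.2f}")  # a value near +1 would suggest engagement tracks grades
```

A real analysis would also need significance testing and a larger sample; as the literature above cautions, raw click counts are a debatable proxy for learning activity.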


Page 8: M Sc Applied eLearning - WIP Presentation


Dummy data provided during workshop to demonstrate analysis features

John Campbell identifies these factors within the LMS as highly predictive of student success (Feldstein, 2013)

Page 9: M Sc Applied eLearning - WIP Presentation


Module Reports (Based on 4-5 DIT Course Modules) Performance Dashboard

Page 10: M Sc Applied eLearning - WIP Presentation

Target Journals

» Journal of Information Technology Education

» International Journal of Technology, Knowledge and Society

» MERLOT Journal of Online Learning and Teaching

Target Conference

» LAK2015 (5th Learning Analytics and Knowledge Conference 2015)


Page 11: M Sc Applied eLearning - WIP Presentation

Eportfolio

» Currently migrating e-Portfolio from Mahara to Yola
