Designing Systemic Learning Analytics at the Open University. Belinda Tynan, Pro-Vice-Chancellor Learning & Teaching, The Open University, UK; Simon Buckingham Shum, Knowledge Media Institute, The Open University, UK. Replay from today's webinar in the SoLAR online open course Strategy & Policy for Systemic Learning Analytics. Thanks to the Australian Office for Learning and Teaching for sponsoring this, and to George Siemens for convening. Abstract: The OU has been analysing student data and feeding this back to faculties since its doors opened 40 years ago. However, the emergence of learning analytics technologies opens new possibilities for engaging in more effective sensemaking of richer learner data, and more timely interventions. We will introduce the framework we are developing to orchestrate the rollout of a systemic organisational analytics infrastructure (both human and technical), and discuss some of the issues that arise. We will also describe how strategic research efforts will key into this design, should they prove effective.
Designing Systemic Learning Analytics at the Open University
Simon Buckingham Shum Knowledge Media Institute The Open University, UK
Strategy & Policy for Systemic Learning Analytics SoLAR Open Course, 11th Oct 2013 https://learn.canvas.net/courses/182/wiki/designing-systemic-analytics-at-the-open-university
Belinda Tynan Pro-Vice-Chancellor Learning & Teaching The Open University, UK
overview
Belinda: developing an institutional strategy — framework and implementation
Simon: research perspectives — 3 metaphors for systemic analytics
Discussion
Strategy for Systemic Deployment of Analytics at the Open University
Belinda Tynan, Pro-Vice-Chancellor, Learning and Teaching Kevin Mayles, Senior Manager, Learning and Teaching
Analytics for student success vision
A clear vision has been developed to galvanise effort across the institution on the focused use of analytics to drive student success
Vision: To use and apply information strategically (through specified indicators) to retain students and progress them to complete their study goals.
Mission: This needs to be achieved at:
• a macro level, to aggregate information about the student learning experience at an institutional level to inform strategic priorities that will improve student retention and progression
• a micro level, to use analytics to drive short-, medium- and long-term interventions
What drives student success?
We have an emerging picture of the factors affecting student success, based on existing statistical analyses, literature and "institutional knowledge", and our current use of associated indicators.
Framework adapted from Woodley et al. (2001), Student Progress in Distance Education: Kember's Model Re-visited.
Entry characteristics: • Language ability • Previous education / OU study • Ethnic group • Socio-economic background • Disability • Age • Study goal / motivation • Gender
Academic compatibility: • Advice on course choice • Subject studied • Preparation for study • Learning design • Quality of study materials • Workload • Module assessment strategy
Social and academic integration: • Early contact • Early engagement • Study calendar / scheduling • Tutor support • Peer support & belonging • Study habits • Employer support • Family support
External factors: • Personal life events • Financial issues
→ Student Success
Results from a review of existing evidence on the drivers of student success are giving us a mixed picture:
Indicators used in existing analytics: there are a number of indicators with supporting evidence that we currently use in our analysis models.
Clear evidence of impact but currently not used in analytics: we have a number of factors for which there is clear evidence of the impact on success, but which are not being used in current analytics models due to lack of data or insufficient investigation.
Unclear evidence base: there are a number of factors that the OU believes, or literature suggests, have an impact on student success, but where we have no clear evidence at this time, due either to lack of data availability or insufficient investigation.
Indicators with evidence of no impact: there are a number of indicators with supporting evidence suggesting they have minimal impact on success.
Developing institutional capabilities and strengths: 3-year strategic roadmap. The OU is developing its capabilities in 10 key areas that build the underpinning strengths required for the effective deployment of analytics.
We need to ensure we have the right architecture and processes for collecting the right data and making it accessible for analytics: we need a 'big data' mind-set.
The university needs world-class capability in data science to continually mine the data and build rapid prototypes of simple tools, and a clear pipeline for the outputs to be mainstreamed into operations.
Benefits will be realised through existing business processes impacting on students directly and through enhancement of the student learning experience: we will develop an 'analytics mind-set' in these areas.
Framework for harnessing analytics for student success through driving interventions
Analytics will be applied throughout the cycle of the student learning experience
Analytics applied | Example business processes | Example datasets used
For action | Optimise student allocation to tutor groups; Development of learning systems; Assessment strategy and scheduling | Student pass/fail predictions; Study behaviour profiles; Pass rates modelling
In action | Early contact with 'at risk' students; Module presentation issue flagging | Student 'at risk' predictive indicators; Helpdesk contact records
On action | Annual module and programme review; Learning design | Module performance KPIs; Learning design profiles
("In action, on action" from Donald Schön, The Reflective Practitioner.)
Applying ‘in action’ analytics to drive student success?
We will use analytics to put key information relating to student success in the hands of those in a position to take action
Associate Lecturer. Tutor Group List: students' study history; feedback from previous tutors; predicted probability of passing; 'at risk' factors. Actions: plan early contact with most at-risk students; monitor engagement prior to first TMA; refer issues to SST quickly.
Student Support Team. Weekly Alert Dashboard: weekly update of students' predicted probability of passing/progressing; list of most 'at risk' students this week. Actions: target resources at most at-risk students; call or email students on the at-risk list to offer support; notify ALs of any issues arising in their groups.
Faculty Academics. Module performance report: predicted pass rate vs target pass rate, updated during presentation; analysis of online learning activity usage / engagement patterns. Actions: identify any issues with the module whilst in presentation and take action to rectify; evaluate the use of learning assets to inform future production.
Senior Management. Student success measures: indicators derived from statistical modelling that underpin student progression measures. Actions: monitor student progression forecasts against target and identify corrective action; target resources at specific 'pinch points' in the student journey.
Evaluating impact and driving action
The basis for evaluation needs to link interventions to measurable outcomes of student success.
Drivers of student success → Interventions (for action / in action / on action) → STUDENT SUCCESS, overseen by Governance and Management, feeding back to improve institutional capabilities and processes.
Are we seeing the expected outcomes of our interventions? Are we doing the right things as guided by the evidence?
Analytics will drive action to increase student success
Vision: To use and apply information strategically (through specified indicators) to retain students and progress them to complete their study goals.
Success outcomes and leading indicators: Recruit → Retain → Progress → Complete.
Measures of our operational performance and interventions: student support activities; learning & teaching activities.
Drivers of student success: evidence of the drivers of student success guides what we do and what we measure.
Dashboards / Reports / Tools: indicators and measures fed into dashboards and reports at relevant levels (Institutional Dashboard, PVCs, Deans, Programme Directors, Module Teams, Student Support Teams).
ACTION / Intervention: evaluation of the outcomes from interventions increases our evidence base of what drives student success.
questions/comments?
3 metaphors for systemic analytics
1. the aquarium
2. from exoskeleton to nervous system
3. resilience through biodiversity
metaphor 1
the aquarium
systems strategy: research the key variables for a healthy ecosystem, and evolve predictive models as rapidly as possible to detect variance
Aquarium science enables aquarium analytics which monitor the health of the ecosystem
fish : aquarium science :: learners : learning science + instructional design
It’s all about knowing what to watch
Purdue University Signals: exemplar 'healthy ecosystem' variables
Key variables identified (a toy composite score is sketched below):
• ACT or SAT score
• Overall grade-point average
• CMS usage composite
• CMS assessment composite
• CMS assignment composite
• CMS calendar composite
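To make the 'composite' idea concrete, here is a minimal sketch of how such variables might feed a traffic-light signal. This is not Purdue's actual algorithm (which is unpublished); the weights, scaling and thresholds are all invented:

```python
# Illustrative Signals-style composite risk score; NOT Purdue's actual
# (unpublished) algorithm. All weights and thresholds are assumptions.

WEIGHTS = {
    "aptitude": 0.25,        # e.g. ACT/SAT percentile, scaled to 0..1
    "gpa": 0.25,             # overall grade-point average / 4.0
    "cms_usage": 0.20,       # CMS usage composite, scaled to 0..1
    "cms_assessment": 0.15,
    "cms_assignment": 0.10,
    "cms_calendar": 0.05,
}

def traffic_light(indicators: dict) -> str:
    """Weighted composite of success indicators -> red/amber/green.
    Each indicator is pre-scaled to 0..1, where 1 = strong evidence
    of success, so a higher composite means lower risk."""
    score = sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)
    if score < 0.4:
        return "red"      # intervene now
    if score < 0.7:
        return "amber"    # monitor closely
    return "green"

print(traffic_light({
    "aptitude": 0.55, "gpa": 0.6, "cms_usage": 0.2,
    "cms_assessment": 0.3, "cms_assignment": 0.4, "cms_calendar": 0.1,
}))  # -> "amber" (composite = 0.4175)
```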
Hmmm… no learning sciences, no learning design underpinning these predictive models of student success: models based on a mix of institutional know-how about student success, and analysing behavioural data.
the opportunity for the learning sciences to combine with your university's collective intelligence
Predictive modelling of student outcomes
Can we combine datasets, and use machine learning to build models to identify 'signature' patterns associated with different kinds of students?
Candidate datasets: Registration pattern; CRM contact; VLE interaction; Assignment grades; Demographics; Library interaction; OpenLearn interaction; FutureLearn interaction; App X interaction; OU track record
Predictive modelling of student outcomes
Does VLE data carry information that provides more precise early identification of failing students than is currently possible? Simple example (just 3 demographic attributes and VLE; a minimal sketch follows):
Input: demographic data (new/continuing student, sex, previous education) and VLE interactions without qualifying the type (any click counts).
Goal: evaluate the probability that the student does not submit TMA1, or submits and scores lower than 40.
Method: Naïve Bayes network (e.g. see Bishop, 2009).
Candidate datasets: Registration pattern; CRM contact; VLE interaction; Assignment grades; Demographics; Library interaction; OpenLearn interaction; FutureLearn interaction; App X interaction; OU track record
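To make the named method concrete, here is a minimal sketch using scikit-learn's CategoricalNB as a stand-in for the Bayes network; the toy data, feature encoding and query student are assumptions, not the OU's actual model:

```python
# Minimal Naive Bayes sketch of the TMA1-failure prediction idea.
# Toy data and encodings are illustrative, not the OU's real model.
import numpy as np
from sklearn.naive_bayes import CategoricalNB

# Features (integer-encoded categories):
#   new_or_continuing: 0=new, 1=continuing
#   sex:               0=female, 1=male
#   prev_education:    0=below A-level, 1=A-level, 2=degree
#   vle_click_band:    0 clicks=0, 1-20=1, 21-100=2, 101-800=3
X = np.array([
    [0, 0, 0, 0], [0, 1, 0, 1], [1, 0, 1, 3], [1, 1, 2, 3],
    [0, 0, 1, 2], [1, 1, 0, 0], [0, 1, 2, 3], [1, 0, 0, 1],
])
# Target: 1 = failed at TMA1 (did not submit, or scored < 40)
y = np.array([1, 1, 0, 0, 0, 1, 0, 1])

model = CategoricalNB()
model.fit(X, y)

# P(fail at TMA1) for a new female student, A-level entry, 1-20 clicks
p_fail = model.predict_proba([[0, 0, 1, 1]])[0, 1]
print(f"Predicted probability of failing TMA1: {p_fail:.2f}")
```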
Back to the OU's analytics framework
Predictive modelling within the framework
VLE user trace data / student demographics / academic achievement
Strategic internal funding to advance a promising technique from an externally funded (JISC) project, and embed within OU student support processes: A.L. Wolff and Z. Zdrahal (2012). Improving Retention by Identifying and Supporting “At-risk” Students. EDUCAUSE Review Online, July-August 2012. http://www.educause.edu/ero/article/improving-retention-identifying-and-supporting-risk-students
Zdenek Zdrahal Lead, KMi Predictive Modelling Team
http://kmi.open.ac.uk/people/member/zdenek-zdrahal
Develop and validate predictive models of student success (module completion), in order to trigger more timely alerts.
Requirements to mainstream the models in the VLE.
Prototype Student Support Team dashboards.
New tool for Student Support Teams, to enable timely intervention, e.g. real-time traffic lights on at-risk students.
Modules are accompanied by machine-readable metadata that increases the power of machine learning when it comes to data analysis.
Problem specification: learning to dynamically identify an at-risk student. Timeline: we know (history to date) → we are here → we predict (outcomes ahead).
e.g. the Retain project: does VLE activity add value to predictive models?
Bayes network nodes: Sex, Education, New/Continuing and VLE activity, predicting TMA1.
Model the probability of failing at TMA1, which is known to be a key predictor of final outcome: failing either by not submitting TMA1, or by submitting with a score < 40.
Student profile 1 (demographic profile 1: X, Y, Z)
Without VLE data: probability of failing at TMA1 = 18.5%
With VLE data, a higher-fidelity story:
Clicks | Probability | No. of students
0 | 64% | 4
1-20 | 44% | 3
21-100 | 26% | 5
101-800 | 6.3% | 14
Student profile 2 (demographic profile 2: X, Y, Z)
Without VLE data: probability of failing at TMA1 = 7.7%
With VLE data, a higher-fidelity story (the arithmetic behind such tables is sketched below):
Clicks | Probability | No. of students
0 | 39% | 35
1-20 | 22% | 74
21-100 | 11.2% | 178
101-800 | 2.4% | 461
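A minimal sketch of the arithmetic behind these tables, assuming per-student records of pre-TMA1 clicks and TMA1 outcome (the records below are invented):

```python
# Sketch of how a click-band conditional-probability table is built
# from per-student records. The records below are invented.
from collections import Counter

BANDS = [(0, 0, "0"), (1, 20, "1-20"), (21, 100, "21-100"), (101, 800, "101-800")]

def band(clicks: int) -> str:
    for lo, hi, label in BANDS:
        if lo <= clicks <= hi:
            return label
    return ">800"

# (vle_clicks_before_TMA1, failed_at_TMA1) pairs: hypothetical
records = [(0, True), (0, False), (5, True), (14, False),
           (40, False), (67, True), (150, False), (320, False),
           (95, False), (2, False), (510, False), (0, True)]

totals, fails = Counter(), Counter()
for clicks, failed in records:
    b = band(clicks)
    totals[b] += 1
    fails[b] += failed          # bool counts as 0/1

for _, _, label in BANDS:
    if totals[label]:
        p = fails[label] / totals[label]
        print(f"{label:>8}: P(fail|band) = {p:.0%}  (n={totals[label]})")
```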
Potential to augment student support teams with predictive modelling, once validated. Example alert: "7 of your students have a fail trajectory, BUT prioritize Nigel, then Sue, then Ian, because: has not engaged with VLE; at least one TMA below 40; has not submitted 5 TMAs."
Query over: VLE interaction / Assignment grades / Demographics / OU track record.
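A toy sketch of the triage logic such an alert implies; the rule ordering mirrors the slide, but the student records and the dominance of one rule over another are invented for illustration:

```python
# Toy triage: rank at-risk students by severity of warning signs.
# Rule ordering follows the slide; records and weights are invented.

students = [
    {"name": "Nigel", "vle_clicks": 0,   "min_tma": 55, "missed_tmas": 0},
    {"name": "Sue",   "vle_clicks": 80,  "min_tma": 32, "missed_tmas": 0},
    {"name": "Ian",   "vle_clicks": 120, "min_tma": 61, "missed_tmas": 5},
]

def severity(s: dict) -> tuple:
    """Sort key: earlier elements dominate later ones, mirroring the
    slide (no VLE engagement > a TMA below 40 > missed TMAs)."""
    return (s["vle_clicks"] == 0, s["min_tma"] < 40, s["missed_tmas"])

for s in sorted(students, key=severity, reverse=True):
    print(s["name"])
# -> Nigel, Sue, Ian
```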
Why do I need a variable ML approach? Can't I just use one method (off the shelf)?
New data sources keep being added: Registration pattern; CRM interactions; Library interaction; FutureLearn interaction.
Train and learn as new data is added, using variable methods.
Methods successfully tested, to be further developed (see the comparison sketch below):
• Induction of decision trees (ID3, C4.5 from the Weka toolkit)
• Support Vector Machine (from Weka)
• Bayes network (Microsoft Infer.NET; SamIam, UCLA)
• Naïve Bayes (see the example and demo cases)
• Linear regression
• Logistic regression
• GUHA (General Unary Hypotheses Automaton)
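Why variable methods matter, in sketch form: different classifiers can win on different snapshots of the data, so candidates are cross-validated as new data accrues. scikit-learn models stand in here for the Weka/Infer.NET tools named above, and the dataset is synthetic:

```python
# Compare several off-the-shelf classifiers on one snapshot of data,
# as a stand-in for the Weka/Infer.NET toolchain named above.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400
X = rng.random((n, 4))                      # toy features (clicks, grades, ...)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.2, n) < 0.7).astype(int)  # toy 'at risk'

candidates = {
    "decision tree": DecisionTreeClassifier(max_depth=4),
    "SVM": SVC(),
    "naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(),
}
for name, model in candidates.items():
    acc = cross_val_score(model, X, y, cv=5).mean()   # 5-fold accuracy
    print(f"{name:>20}: {acc:.3f}")
```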
metaphor 2
from exoskeleton to nervous system
systems strategy: embed faster feedback loops, and build sensemaking capacity at all levels
Evolving the OU from a digital exoskeleton to a nervous system?
Ed Dumbill: http://strata.oreilly.com/2012/08/digital-nervous-system-big-data.html
Towards multilevel systemic analytics for student success: designing better feedback loops at all levels of learning (students + staff).
Macro Level Analytics (the OU's collective intelligence): VC Executive; Faculties; Module Teams; Student Support Teams; Associate Lecturers; Researchers (interpretation/intervention).
Micro Level Analytics (student interaction traces): Data Wranglers; Researchers.
Underpinned by: Visual Analytics Design; Quality Data, Integrated; Analytics Competency Team; Organisational Capacity Building; Analytics Research; Computational Platforms.
metaphor 3
build resilience
systems strategy: MOOCs can be viewed as a system-level 'shock' to the HigherEd ecology (a 'regime shift'?); build resilience by expanding our diversity and capacity to sense the dynamic environment
MOOCs are an innovation and research platform — analytics will be critical
http://www.ted.com/talks/daphne_koller_what_we_re_learning_from_online_education.html
http://people.kmi.open.ac.uk/sbs/2013/01/emerging-mooc-data-analytics-ecosystem
http://www.slideshare.net/abelardo_pardo/pushing-the-mooc-envelope-with-learning-analytics
http://www.moocresearch.com/research-initiative/about#Cost,%20Performance%20Metrics%20and%20Learner%20Analytics
FLx: experimental analytics ecosystem
Partners share ideas, workflows, analytics, and visualizations, collaborating around common interests.
Workflow for social network analytics in NodeXL: are learners forming effective peer relationships?
Import data into OpenRefine
Reshape using template
Export data to CSV
Process in NodeXL and generate network
Martin Hawksey http://mashe.hawksey.info/2013/02/lak13-recipes-in-capturing-and-analyzing-data-using-sna-on-canvas-discussions-with-nodexl-for-when-its-not-a-snapp
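NodeXL itself is an Excel template; purely as a rough Python analogue of the final step, one might build the reply network and probe for peer relationships like this (the forum data and the 'peripheral' heuristic are invented):

```python
# Rough Python analogue of the NodeXL step: build a reply network and
# ask whether peer relationships are forming. Data below is invented;
# in practice it would come from the Canvas discussion export.
import networkx as nx

replies = [  # (replier, original_poster) pairs, hypothetical forum data
    ("ann", "bob"), ("bob", "ann"), ("cat", "ann"),
    ("dan", "cat"), ("eve", "bob"), ("cat", "bob"),
]
G = nx.DiGraph(replies)

print("learners:", G.number_of_nodes(), "reply edges:", G.number_of_edges())
print("reciprocity:", nx.reciprocity(G))                # share of two-way exchanges
print("components:", nx.number_weakly_connected_components(G))
print("peripheral:", [n for n, d in G.degree() if d <= 1])  # weakly tied learners
```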
Workflow for quantifying reflection in forum posts: what elements of reflection are evident?
Convert discussion threads to comma-separated file format
Annotate text segments using custom components for UIMA
Reason over annotations with Drools
Convert results to CSV
Inspect and analyse data with R
Ullmann, T. D., Wild, F., & Scott, P. (2012). Comparing Automatically Detected Reflective Texts with Human Judgements. 2nd Workshop on Awareness and Reflection in Technology-Enhanced Learning, at the 7th European Conference on Technology-Enhanced Learning, Saarbruecken, Germany. http://ceur-ws.org/Vol-931/paper8.pdf
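The UIMA components are custom classifiers; as a deliberately crude stand-in that shows the shape of the annotation step, a rule-based pass over posts might flag reflective cues like this (the cue lists are illustrative and far simpler than the detectors evaluated in the paper):

```python
# Crude rule-based stand-in for the custom UIMA reflection annotators.
# Cue phrase lists are illustrative only.
import re

CUES = {
    "self-reference": r"\b(I|my|myself)\b",
    "reflective verb": r"\b(realised|realized|learned|noticed|wondered)\b",
    "future intention": r"\b(next time|from now on|I will try)\b",
}

def annotate(post: str) -> dict:
    """Return which reflection elements a post shows evidence of."""
    return {label: bool(re.search(pat, post, re.IGNORECASE))
            for label, pat in CUES.items()}

post = ("Looking back at this week, I realised my notes were too thin. "
        "Next time I will try summarising each section as I read.")
print(annotate(post))
# {'self-reference': True, 'reflective verb': True, 'future intention': True}
```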
Workflow for academic writing analytics: to what extent does student writing display the hallmarks of scholarly argument?
Extract submitted essay drafts from Course XYZ
Convert to text files for XIP
Analyse using rhetorical parser
Render in custom dashboard
Annotate onto source text
Simsek D, Buckingham Shum S, Sándor Á, De Liddo A and Ferguson R. (2013) XIP Dashboard: Visual Analytics from Automated Rhetorical Parsing of Scientific Metadiscourse. 1st International Workshop on Discourse-Centric Learning Analytics, at 3rd International Conference on Learning Analytics & Knowledge. Leuven, BE (Apr. 8-12, 2013). Open Access Eprint: http://oro.open.ac.uk/37391
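XIP (the Xerox Incremental Parser) is proprietary, so this is only a toy of the aggregation feeding the dashboard: sentences are tagged with rhetorical-move labels and counted per essay. The cue patterns are placeholders for the real parser's output:

```python
# Toy aggregation for the dashboard step. The sentence labels here
# would come from XIP's rhetorical parser; this matcher is a placeholder.
from collections import Counter
import re

MOVES = {   # metadiscourse cues, loosely after Simsek et al. (2013)
    "contrast": r"\b(however|in contrast|on the other hand)\b",
    "novelty": r"\b(novel|new insight|for the first time)\b",
    "summary": r"\b(in summary|to conclude|overall)\b",
}

def essay_profile(text: str) -> Counter:
    """Count rhetorical moves per essay, for a dashboard bar chart."""
    counts = Counter()
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for move, pat in MOVES.items():
            if re.search(pat, sentence, re.IGNORECASE):
                counts[move] += 1
    return counts

draft = ("Prior work treats X as fixed. However, our data suggest otherwise. "
         "In summary, the evidence points to a new insight about X.")
print(essay_profile(draft))   # Counter({'contrast': 1, 'summary': 1, 'novelty': 1})
```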
Thank you… Q&A
Belinda Tynan http://www.open.ac.uk/about/main/admin-and-governance/executive-team/pro-vice-chancellor-learning-and-teaching
Simon Buckingham Shum http://simon.buckinghamshum.net / @sbskmi