Multimodal Learning Analytics
Xavier Ochoa, Escuela Superior Politécnica del Litoral


Page 1: Multimodal Learning Analytics

Multimodal Learning Analytics

Xavier Ochoa, Escuela Superior Politécnica del Litoral

Page 2: Multimodal Learning Analytics

http://www.slideshare.net/xaoch

Download this presentation

Page 3: Multimodal Learning Analytics

(Multimodal) Learning Analytics

Page 4: Multimodal Learning Analytics

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.

Page 6: Multimodal Learning Analytics

Strong focus on online data

Based on the published papers, it should be called Online Learning Analytics

Page 7: Multimodal Learning Analytics

Streetlight effect

Page 8: Multimodal Learning Analytics

Where is learning happening?

Page 9: Multimodal Learning Analytics
Page 10: Multimodal Learning Analytics
Page 11: Multimodal Learning Analytics
Page 12: Multimodal Learning Analytics
Page 13: Multimodal Learning Analytics
Page 14: Multimodal Learning Analytics
Page 15: Multimodal Learning Analytics

Why Multimodal Learning Analytics?

We should be looking where it is useful to look, not where it is easy

Page 16: Multimodal Learning Analytics

There is learning outside the LMS

But it is very messy!

Page 17: Multimodal Learning Analytics

Who is learning?

Page 18: Multimodal Learning Analytics

Who is learning?

Page 19: Multimodal Learning Analytics

Who is learning?

Page 20: Multimodal Learning Analytics

Who is learning? – Traditional way

Page 21: Multimodal Learning Analytics

But there are better ways to assess learning

At least theoretically

Page 22: Multimodal Learning Analytics

Who is learning? – Educational Research

Page 23: Multimodal Learning Analytics
Page 24: Multimodal Learning Analytics

How can we approach the problem from a Learning Analytics perspective?

Measure, collect, analyze and report to understand and optimize

Page 25: Multimodal Learning Analytics

We need to capture learning traces from the real world

Look ma, no log files!

Page 26: Multimodal Learning Analytics

In the real world, humans communicate (and leave traces) in several modalities

What you say is as important as how you say it

Page 27: Multimodal Learning Analytics

We need to analyze the traces with varying degrees of sophistication

And we have to do it automatically, as humans are not scalable

Page 28: Multimodal Learning Analytics

We need to provide feedback in the real world

Often in a multimodal way too

Page 29: Multimodal Learning Analytics

But…

Page 30: Multimodal Learning Analytics

Which modes are important to understand the learning process?

We do not know yet…

Page 31: Multimodal Learning Analytics

Possibilities
• What we see
• What we hear
• How we move
• How we write
• How we blink
• Our pulse
• Brain activity?
• Our hormones?

Page 32: Multimodal Learning Analytics

What are the relevant features of those signals?

We do not know yet…

Page 33: Multimodal Learning Analytics

Are our current analysis tools good enough?

We do not know yet…

Page 34: Multimodal Learning Analytics

How to present the information (and uncertainty) in a way that is actually useful?

We do not know yet…

Page 35: Multimodal Learning Analytics

It is an open (but very dark) field

One feels like an explorer

Page 36: Multimodal Learning Analytics

This particular flavor of Learning Analytics is what we call Multimodal Learning Analytics

Page 37: Multimodal Learning Analytics

Multimodal Learning Analytics is related to:
• Behaviorism
• Cognitive Science
• Multimodal Interaction (HCI)
• Educational Research (the old-school kind)
• Computer Vision
• Natural Language Processing
• Biosignal Processing
• And as many fields as modes you can think of...

Page 38: Multimodal Learning Analytics

Examples

Page 39: Multimodal Learning Analytics
Page 40: Multimodal Learning Analytics

Math Data Corpus

Page 41: Multimodal Learning Analytics

How to (easily) obtain multimodal features?

What is already there?

Page 42: Multimodal Learning Analytics

Three Approaches
• Literature-based features
• Common-sense-based features
• “Why not?”-based features

Page 43: Multimodal Learning Analytics

All approaches proved useful

Proof that we are in an early stage

Page 44: Multimodal Learning Analytics

Video: Calculator Use (NTCU)

Page 45: Multimodal Learning Analytics

Video: Calculator Use (NTCU)
• Idea: the calculator user is the one solving the problem
• Procedure:
  • Obtain a picture of the calculator
  • Track the position and angle of that image in the video using SURF + FLANN + rigid object transformation (OpenCV)
  • Determine to which student the calculator is pointing in each frame
A minimal sketch of this pipeline is shown below.
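The sketch below assumes opencv-contrib-python with the non-free SURF module enabled; the template path, thresholds, and the mapping from calculator pose to students are illustrative placeholders, not the study's actual code:

```python
import cv2
import numpy as np

# Reference photo of the calculator and its SURF descriptors.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
flann = cv2.FlannBasedMatcher({"algorithm": 1, "trees": 5}, {"checks": 50})
template = cv2.imread("calculator.png", cv2.IMREAD_GRAYSCALE)
kp_t, des_t = surf.detectAndCompute(template, None)

def locate_calculator(frame_gray):
    """Return (center, angle_deg) of the calculator in this frame, or None."""
    kp_f, des_f = surf.detectAndCompute(frame_gray, None)
    if des_f is None:
        return None
    matches = flann.knnMatch(des_t, des_f, k=2)
    # Lowe's ratio test keeps only distinctive matches.
    good = [p[0] for p in matches if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]
    if len(good) < 10:
        return None
    src = np.float32([kp_t[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_f[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # Partial affine = rigid transform (rotation + translation + uniform scale).
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None
    h, w = template.shape
    center = M @ np.array([w / 2.0, h / 2.0, 1.0])
    angle = float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))
    return center, angle
```

The per-frame center and angle can then be assigned to whichever student region the calculator lies in or points toward.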

Page 46: Multimodal Learning Analytics

Video: Total Movement (TM)

Page 47: Multimodal Learning Analytics

Video: Total Movement (TM)
• Idea: the most active student is the leader/expert?
• Procedure:
  • Subtract the current frame from the previous frame
  • Codebook algorithm to eliminate noise movement
  • Add up the number of remaining pixels
A rough sketch of this feature is shown below.
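The sketch below assumes a standard OpenCV video pipeline; since OpenCV's Python API does not expose the original codebook algorithm, a MOG2 background subtractor stands in for the noise-removal step:

```python
import cv2

def total_movement(video_path):
    """Sum of 'moving' pixels over the whole recording (the TM feature)."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    prev, total = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        fg = subtractor.apply(frame)              # background model removes static pixels
        if prev is not None:
            diff = cv2.absdiff(gray, prev)        # frame-to-frame difference
            _, diff = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            moving = cv2.bitwise_and(diff, fg)    # keep pixels flagged by both steps
            total += int(cv2.countNonZero(moving))
        prev = gray
    cap.release()
    return total
```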

Page 48: Multimodal Learning Analytics

Video: Distance from the center of the table (DHT)

Page 49: Multimodal Learning Analytics

Video: Distance from the center of the table (DHT)
• Idea: if the head is near the table (over the paper), the student is working on the problem
• Procedure:
  • Identify an image of each head
  • Use the TLD algorithm to track the heads
  • Determine the distance from each head to the center of the table
A sketch of this tracking step is shown below.
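The sketch below assumes the head region is initialised manually and tracked with the TLD tracker from opencv-contrib (exposed under cv2.legacy in recent builds); the initial box and table center are placeholders:

```python
import cv2
import numpy as np

def head_table_distances(video_path, init_box, table_center):
    """Per-frame distance (pixels) between a tracked head and the table center."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    tracker = cv2.legacy.TrackerTLD_create()
    tracker.init(frame, init_box)                 # init_box = (x, y, w, h) around the head
    distances = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, box = tracker.update(frame)
        if not found:
            distances.append(None)                # head lost in this frame
            continue
        x, y, w, h = box
        head = np.array([x + w / 2.0, y + h / 2.0])
        distances.append(float(np.linalg.norm(head - np.array(table_center))))
    cap.release()
    return distances
```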

Page 50: Multimodal Learning Analytics

Audio: Processing

Page 51: Multimodal Learning Analytics

Audio: Features
• Number of Interventions (NOI)
• Total Speech Duration (TSD)
• Times Numbers were Mentioned (TNM)
• Times Math Terms were Mentioned (TMTM)
• Times Commands were Pronounced (TCP)
A sketch of these counts is shown below.
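The sketch below assumes the audio has already been diarised and transcribed into (speaker, start, end, text) segments; the keyword lists are illustrative, not the ones used in the study:

```python
import re
from collections import defaultdict

MATH_TERMS = {"sum", "minus", "times", "divided", "equals", "fraction"}   # illustrative
COMMANDS = {"write", "add", "multiply", "check", "erase"}                 # illustrative

def audio_features(segments):
    """Per-speaker NOI, TSD, TNM, TMTM, TCP from transcribed speech segments."""
    feats = defaultdict(lambda: {"NOI": 0, "TSD": 0.0, "TNM": 0, "TMTM": 0, "TCP": 0})
    for speaker, start, end, text in segments:
        f = feats[speaker]
        f["NOI"] += 1                                   # each segment counts as one intervention
        f["TSD"] += end - start                         # total speech duration in seconds
        words = re.findall(r"[\w']+", text.lower())
        f["TNM"] += sum(w.isdigit() for w in words)     # numbers written as digits in the transcript
        f["TMTM"] += sum(w in MATH_TERMS for w in words)
        f["TCP"] += sum(w in COMMANDS for w in words)
    return dict(feats)
```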

Page 52: Multimodal Learning Analytics

Digital Pen: Basic Features

Page 53: Multimodal Learning Analytics

Digital Pen: Basic Features
• Total Number of Strokes (TNS)
• Average Number of Points (ANP)
• Average Stroke Path Length (ASPL)
• Average Stroke Displacement (ASD)
• Average Stroke Pressure (ASP)
A sketch of these computations is shown below.
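The sketch below assumes each stroke arrives from the pen SDK as a list of (x, y, pressure) samples:

```python
import math

def pen_features(strokes):
    """Basic stroke-level features from a list of strokes, each a list of (x, y, pressure)."""
    def path_length(stroke):
        return sum(math.dist(p[:2], q[:2]) for p, q in zip(stroke, stroke[1:]))

    n = len(strokes)
    total_points = sum(len(s) for s in strokes)
    return {
        "TNS": n,                                                          # total number of strokes
        "ANP": total_points / n,                                           # average points per stroke
        "ASPL": sum(path_length(s) for s in strokes) / n,                  # average stroke path length
        "ASD": sum(math.dist(s[0][:2], s[-1][:2]) for s in strokes) / n,   # average start-to-end displacement
        "ASP": sum(p[2] for s in strokes for p in s) / total_points,       # average pressure
    }
```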

Page 54: Multimodal Learning Analytics

Digital Pen: Shape Recognition

Stronium – Sketch Recognition Libraries

Page 55: Multimodal Learning Analytics

Digital Pen: Shape Recognition
• Number of Lines (NOL)
• Number of Rectangles (NOR)
• Number of Circles (NOC)
• Number of Ellipses (NOE)
• Number of Arrows (NOA)
• Number of Figures (NOF)

Page 56: Multimodal Learning Analytics

Analysis at Problem Level: Solving the Problem Correctly
• Logistic regression to model whether a student solved the problem correctly
• The resulting model was significantly reliable
• 60.9% of the students solving the problem were identified
• 71.8% of incorrectly solved problems were identified
A sketch of this kind of analysis is shown below.
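The sketch below uses scikit-learn; the CSV file, column names, and cross-validation setup are assumptions for illustration, not the study's actual pipeline:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# One row per student x problem; file and column names are placeholders.
df = pd.read_csv("problem_level_features.csv")
features = ["NTCU", "TM", "DHT", "NOI", "TNM", "TNS", "ASPL"]   # a subset of the features above
X, y = df[features], df["solved_correctly"]

model = LogisticRegression(max_iter=1000)
print(cross_val_score(model, X, y, cv=5).mean())   # rough estimate of how reliable the model is
```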

Page 57: Multimodal Learning Analytics

Analysis at problem level

Page 58: Multimodal Learning Analytics

Analysis at Group Level: Expertise Estimation

• Features were fed to a classification tree algorithm
• Several variables had high discriminating power between experts and non-experts
• The best discrimination resulted in 80% expert prediction and 90% non-expert prediction
A sketch of this kind of analysis is shown below.
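The sketch below is a classification-tree analysis in scikit-learn; the file and column names are placeholders:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# One row per student; 'is_expert' is the label. File and column names are placeholders.
df = pd.read_csv("group_level_features.csv")
X, y = df.drop(columns=["is_expert"]), df["is_expert"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, y)
# Feature importances hint at which variables best discriminate experts from non-experts.
for name, importance in sorted(zip(X.columns, tree.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {importance:.2f}")
```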

Page 59: Multimodal Learning Analytics

Analysis at Group Level: Expertise Estimation

Page 60: Multimodal Learning Analytics

Expert Estimation over Problems

Plateau reached after just 4 problems

Page 61: Multimodal Learning Analytics

Main conclusion: simple features could identify expertise
• Faster writer (Digital Pen)
• Percentage of calculator use (Video)
• Times numbers were mentioned (Audio)

Page 62: Multimodal Learning Analytics
Page 63: Multimodal Learning Analytics

Oral Presentation Quality Corpus

Page 64: Multimodal Learning Analytics

Video Features

• 66 facial features were extracted using the Luxand software, including both eyes and the nose tip, to estimate the presenter's gaze (a rough geometric sketch of this idea follows).
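The sketch below shows how eye and nose-tip landmarks can yield a simple gaze/head-orientation proxy, independent of the detector (Luxand in the study) that produced the landmark coordinates:

```python
import numpy as np

def head_yaw_proxy(left_eye, right_eye, nose_tip):
    """Rough indication of where the presenter is facing: negative = left, positive = right."""
    left_eye, right_eye, nose_tip = map(np.asarray, (left_eye, right_eye, nose_tip))
    eye_mid = (left_eye + right_eye) / 2.0
    eye_span = np.linalg.norm(right_eye - left_eye)
    # Horizontal offset of the nose tip from the eye midpoint, normalised by eye span:
    # roughly zero when facing the camera/audience, growing as the head turns away.
    return float((nose_tip[0] - eye_mid[0]) / eye_span)
```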

Page 65: Multimodal Learning Analytics

Kinect Features
• Identify common postures

Page 66: Multimodal Learning Analytics

Kinect Features
• Identify common postures

Page 67: Multimodal Learning Analytics

Kinect FeaturesLaban’s theory helps to describe human movement using non-verbal characteristics:

Spatial aspects of movement

Temporal aspects of movement

Fluency, smoothness, impulsivity

Energy and power

Overall activity
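The sketch below computes Laban-inspired descriptors from Kinect joint trajectories, assuming an array of shape (frames, joints, 3) in metres and a known frame rate; the exact descriptors used in the study may differ:

```python
import numpy as np

def laban_features(joints, fps=30):
    """Simple movement descriptors from a (frames, joints, 3) array of joint positions."""
    vel = np.diff(joints, axis=0) * fps                # joint velocities
    acc = np.diff(vel, axis=0) * fps                   # accelerations
    jerk = np.diff(acc, axis=0) * fps                  # jerk ~ (lack of) smoothness
    speed = np.linalg.norm(vel, axis=2)                # per-frame, per-joint speed
    return {
        "spatial_extent": float(np.mean(np.ptp(joints, axis=0))),     # how much space each joint covers
        "temporal_mean_speed": float(speed.mean()),                   # temporal aspect
        "fluency_mean_jerk": float(np.linalg.norm(jerk, axis=2).mean()),
        "energy_proxy": float((speed ** 2).mean()),                   # kinetic-energy-like term
        "overall_activity": float(speed.sum()),
    }
```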

Page 68: Multimodal Learning Analytics

Evaluation Methodology

Automatically extracted features (Video, Kinect) were compared against human-coded criteria (Body and Posture, Language, Eye Contact).

Page 69: Multimodal Learning Analytics

Results: less than 50% accuracy

What we were measuring was not what humans were measuring

Page 70: Multimodal Learning Analytics

What is next in MLA?

Page 71: Multimodal Learning Analytics

Mode integration framework for MLA

Currently pioneered by Marcelo Worsley

Page 72: Multimodal Learning Analytics

Developing Multimodal Measuring Devices

Our Fitbits

Page 73: Multimodal Learning Analytics
Page 74: Multimodal Learning Analytics

Record different learning settings

And share them with the community

Page 75: Multimodal Learning Analytics

Conclusions

Page 76: Multimodal Learning Analytics

Multimodal Learning Analytics is not a subset of Learning Analytics

Current Learning Analytics is a subset of MLA

Page 77: Multimodal Learning Analytics

Some problems are easy, some hard

But we do not know until we try to solve them

Page 78: Multimodal Learning Analytics

There is a lot of exploring to do

And we need explorers

Page 79: Multimodal Learning Analytics

Gracias / Thank you
Questions?

Xavier Ochoa
[email protected]
http://ariadne.cti.espol.edu.ec/xavier
Twitter: @xaoch

http://www.sigmla.org