
Page 1

Evaluation Methods for Mobile Learning

Mike Sharples
Learning Sciences Research Institute

University of Nottingham

www.nottingham.ac.uk/lsri/msh

Page 2

What is mobile learning?

• Learning with portable technology
– Focus on the technology
– Could be in a fixed location, such as a classroom

• Learning across contexts
– Focus on the learner
– Could use portable or fixed technology
– How people learn across locations and transitions

• Learning in a mobile world
– Focus on the mobile society
– How to understand people and technology in constant mobility
– How to design learning for the mobile society

Page 3

Can mobile learning be effective?

• We think so!
– Classroom response systems (Draper, Dufresne, Roschelle)
– Group learning with wireless mobiles and phones (Nussbaum et al., Dillenbourg)
– Classroom handheld simulation games (Colella, Virus Game)
– Mobile guides (Tate Modern, Caerus, Mobile Bristol)
– Connecting learning in formal and informal settings (Butterfly Watching, MyArtSpace)

• Lack of convincing studies of mobile learning
– Attitude surveys and interviews: “they say they enjoy it”
– Observations: “they look like they are learning”
– With a few exceptions (e.g. Nussbaum et al.)

Page 4

Issues in evaluating mobile learning

• It may be mobile
– Tracking activity across locations

• It may be distributed
– Multiple participants in different locations

• It may be informal
– How can we distinguish learning from other activities?

• It may be extended
– How can we evaluate long-term learning?

• It may involve a variety of personal and institutional technologies
– Mobile and fixed phones, desktop machines, laptops, public information systems

• There may be specific ethical problems
– How can and should we monitor everyday activity?

Page 5

What do you want to know?

• Usability
– Well-tested methods:
• Expert evaluations (e.g. Heuristic evaluation and Cognitive Walkthrough)
• Lab-based comparisons

• Usefulness
– Hard: depends on the educational aims and context

• Field-based interviews, observations and walk-throughs
– Ethnographic analysis
– Critical incident studies (including focus group replay)

• Learning outcome measures (see the sketch after this list)
– Control group
– Pre-test, intervention, post-test, delayed post-test

• Logbooks and diaries
– Logbooks of activity
– Diary and diary-interview method, used successfully by Vavoula and others for intensive study of everyday learning over time
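To make the outcome-measures bullet concrete, here is a minimal sketch of a pre-test/post-test comparison against a control group, using average normalized gain. The scores, group names and the choice of normalized gain are illustrative assumptions, not data or analysis from the studies cited in these slides.

```python
# Minimal sketch: pre-test / post-test comparison with a control group.
# Scores are hypothetical; normalized gain is one common outcome measure.
from statistics import mean

def normalized_gain(pre, post, max_score=100):
    """Average normalized gain: (post - pre) / (max_score - pre)."""
    return mean((b - a) / (max_score - a) for a, b in zip(pre, post))

# Hypothetical intervention and control groups (same test, out of 100).
intervention_pre, intervention_post = [40, 55, 62], [70, 80, 78]
control_pre, control_post = [42, 50, 60], [55, 58, 66]

print("intervention gain:", round(normalized_gain(intervention_pre, intervention_post), 2))
print("control gain:", round(normalized_gain(control_pre, control_post), 2))
```

A delayed post-test would repeat the same comparison with scores collected some weeks after the intervention.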

Page 6

Some evaluation methods (contd.)

• Usefulness (contd.)
– Other feedback methods
• Telephone probes
• Snap polls
• Interviews
• Focus groups
– Automatic logging (see the sketch after this list)
• Recording where, when and how a mobile device is used
• Quantitative analysis of students’ learning actions (Trinder et al., 2005)
– Learning outcome measures
• Control group
• Pre-test, intervention, post-test, delayed post-test

• Attitude
– Attitude surveys
• General attitude surveys are of little use: almost all innovations are rated between 3.5 and 4.5 on a 5-point Likert scale
• Specific questions can indicate issues (e.g. interface problems)
– Microsoft Desirability Toolkit
• Users indicate their attitudes through choice of cards
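As a companion to the automatic-logging bullet above, here is a minimal sketch of the kind of quantitative analysis such logs enable: counting where, when and how a device was used from timestamped records. The record format and values are illustrative assumptions, not the instrument used by Trinder et al. (2005).

```python
# Minimal sketch: quantitative analysis of an automatic usage log.
# Each hypothetical record is (ISO timestamp, location tag, application).
from collections import Counter
from datetime import datetime

log = [
    ("2005-03-01T09:10", "department", "email"),
    ("2005-03-01T13:45", "home", "web"),
    ("2005-03-02T08:20", "travelling", "timetable"),
    ("2005-03-02T09:05", "department", "email"),
]

where = Counter(loc for _, loc, _ in log)   # where the device is used
how = Counter(app for _, _, app in log)     # how (which application)
when = Counter(datetime.fromisoformat(ts).hour for ts, _, _ in log)  # when

print("where:", where.most_common())
print("how:", how.most_common())
print("when (hour of day):", sorted(when.items()))
```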

Page 7

Case studies

• Student Learning Organiser
– Long-term learning

• MyArtSpace
– Learning across contexts

• PI: Personal Inquiry
– Ethics

Page 8

Interactive Logbook project

Corlett, D., Sharples, M., Chan, T. & Bull, S. (2005) Evaluation of a Mobile Learning Organiser for University Students. Journal of Computer Assisted Learning, 21, pp. 162-170.

• 17 MSc students, University of Birmingham
• Academic year 2002-3
• Loaned iPAQ with wireless LAN for personal use
• Learning organiser
– Time manager
– Course manager
– Communications
– Concept mapper
• Standard tools
– Email
– Instant messenger
– Web browsing
• Free to download further software from the web

Page 9

Evaluation methods

• Questionnaires
– administered at 1, 4, 16 weeks, and 10 months
• Focus groups, following each of the questionnaires
• Logbooks
– Students kept logbooks for six weeks, covering:
• Students’ attitudes towards the learning organiser
• Patterns of usage of the various applications (including any they had downloaded themselves)
• Patterns of usage of the technology, particularly with respect to wireless connectivity
• Ease-of-use issues
• Issues relating to institutional support for mobile learning devices
• Videoed interactions
– To compare the concept map tools, three students were videoed carrying out an exercise, which they commented on after reviewing the video

Page 10

Data

• Usability
– Size, memory, battery life, speed, software usability, integration

• Usefulness
– of PDAs
– of the Learning Organiser
– of the concept mapping tools

• Patterns of use
– Locations
– Changes over time

Page 11

Frequency of use

[Bar chart: percentage of participants at 4 weeks, 16 weeks and 10 months, by frequency of use: many times a day, at least once a day, at least twice a week, less than twice a week.]

Page 12

Use of PDA in specific locations
Rank order for coursework (rank for other activities in brackets)

Location                 4 weeks   16 weeks   10 months
Home                     1= (1)    2 (1)      2 (1)
Department               1= (2)    1 (2)      1 (3)
University (elsewhere)   3 (4)     4 (4)      3 (4)
Travelling               4 (3)     3 (3)      4 (2)

Page 13

Perceived usefulness of tools (“useful” or “very useful”)

Tool                      4 weeks    16 weeks   10 months
Timetable                 59% (10)   64% (9)    82% (14)
Web browser               65% (11)   64% (9)    71% (12)
Instant messaging         59% (10)   50% (7)    71% (12)
Email                     76% (13)   79% (11)   65% (11)
Course materials          59% (10)   43% (6)    41% (7)
Supplementary materials   53% (9)    43% (6)    24% (4)
Concept mapper            35% (5)    14% (2)    0% (0)
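A small arithmetic note on this table: the respondent totals per wave are not stated, but they can be recovered from the percentage/count pairs (roughly 17 at 4 weeks, 14 at 16 weeks, 17 at 10 months for most tools). The minimal sketch below re-derives the reported percentages from the counts; the totals are my inference from the table, not figures given on the slide, and a few items (e.g. the concept mapper at 4 weeks, where 35% pairs with a count of 5) appear to have had fewer respondents.

```python
# Minimal sketch: re-deriving the reported percentages from the raw counts.
# Counts come from the table above; the per-wave totals are inferred.
counts = {  # tool: "useful/very useful" counts at (4 weeks, 16 weeks, 10 months)
    "Timetable": (10, 9, 14),
    "Web browser": (11, 9, 12),
    "Email": (13, 11, 11),
}
totals = (17, 14, 17)  # inferred: e.g. 10/17 = 59%, 9/14 = 64%, 14/17 = 82%

for tool, ns in counts.items():
    row = ", ".join(f"{round(100 * n / t)}% ({n})" for n, t in zip(ns, totals))
    print(f"{tool}: {row}")
```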

Page 14

Perceived impact on activities
Number of students naming tool as having greatest impact

Learning                      Personal organisation          Entertainment
Course materials (6)          Timetable and deadlines (6)    Media player (7)
Browser (3)                   Calendar (5)                   Games (3)
Timetable and deadlines (2)   Writing/note taking (2)        Messenger (2)
Writing/note taking (1)       Email (2)                      Browser (1)
Calendar (1)                  Task manager (1)               Writing/note taking (1)
Reader (1)

Page 15

Results

• Some usability problems
– Especially battery life

• Most use of calendar, timetable and communications
• PDA-optimised content was well used
• Importance of connectivity
• No clear demand for a specific “student learning organiser”
• Concept mapping tools were not widely used
• Not generally used while travelling
• Ownership is important
• Need for institutional support

Page 16

MyArtSpace

• Service on mobile phones for enquiry-led museum learning

• Aim to make school museum visits more engaging and educational

• Students create their own interpretation of a museum visit which they explore back in the classroom

• Learning through structured enquiry and exploration

• Museum test sites
– Urbis (Manchester)
– The D-Day Museum (Portsmouth)
– The Study Gallery of Modern Art (Poole)

• About 3000 children during 2006

Page 17

How it works

• In class before the visit, the teacher sets an inquiry topic
• At the museum, children are loaned multimedia phones
• Exhibits in the museum have two-letter codes printed beside them
• Children can use the phone to:
– Type the code to ‘collect’ an object and see a presentation about it
– Record sounds
– Take photos
– Make notes
– See who else has ‘collected’ the object
• All the information collected or created is sent automatically to a personal website showing a list of the items (see the sketch after this list)
• The website provides a record of the child’s interpretation of the visit
• In class after the visit, the children share the collected and recorded items and make them into presentations
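A minimal sketch of the collect-by-code flow just described, to make the interaction concrete. The exhibit codes, data structures and in-memory “website” are illustrative assumptions, not the actual MyArtSpace implementation.

```python
# Minimal sketch: 'collect' an exhibit by its two-letter code and send the
# record to the student's personal web space (modelled here as a list).
EXHIBITS = {
    "dd": "Landing craft (D-Day Museum)",  # hypothetical code and label
    "ur": "City model (Urbis)",
}

personal_website = []  # stands in for the server-side personal collection

def collect(code, student):
    """Look up a two-letter code, store the item, report co-collectors."""
    item = EXHIBITS.get(code.lower())
    if item is None:
        return f"Unknown code: {code}"
    others = [r["student"] for r in personal_website
              if r["code"] == code.lower() and r["student"] != student]
    personal_website.append({"student": student, "code": code.lower(), "item": item})
    return f"Collected '{item}'. Also collected by: {others if others else 'no one yet'}"

print(collect("DD", "student_a"))
print(collect("dd", "student_b"))  # the second student sees who else collected it
```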

Page 18

Lifecycle evaluation

• Micro level: usability issues
– technology usability
– individual and group activities

• Meso level: educational issues
– learning experience as a whole
– classroom–museum–home continuity
– critical incidents: learning breakthroughs and breakdowns

• Macro level: organisational issues
– effect on the educational practice for school museum visits
– emergence of new practices
– take-up and sustainability

Page 19

Evaluation

At each level:

• Step 1 – what was supposed to happen
– pre-interviews with stakeholders (teachers, students, museum educators)
– documents provided to support the visits

• Step 2 – what actually happened
– observer logs
– post-visit focus groups
– analysis of video diaries

• Step 3 – differences between 1 and 2
– reflective interviews with stakeholders
– critical incident analysis

Page 20

Three levels, in three stages, throughout the project

[Diagram: the three levels (micro, meso, macro) mapped against three stages (design, implement, deploy). Annotations: technology must be robust enough to support a full user trial; the service must be deployed long enough to assess impact.]

Page 21

Summary of results

• The technology worked
– Photos, information on exhibits, notes, automatic sending to the website
• Minor usability problems
• Students liked the ‘cool’ technology
• Students enjoyed the experience more than their previous museum visit
• The students indicated that the phones made the visit more interactive
• Teachers were pleased that students engaged with the inquiry learning task

Page 22

Usability issues

+ Appropriate form factor
+ Device is a mobile phone, not a typical handheld museum guide
+ Collecting and creating items was an easy and natural process

– Mobile phone connection
– Text annotations
– Integration of website with commercial software, e.g. PowerPoint

Page 23

Educational issues

+ Supports curriculum topics in literacy and media studies
+ Encourages meaningful and enjoyable pre- and post-visit lessons
+ Encourages children to make active choices in what is normally a passive experience

– Teacher preparation
– Need for the teacher to understand the experience and run an appropriate pre-visit lesson
– Where to impose constraints
– Structure and restrict the collecting activity, or learn from organising the material back in the classroom
– Support for collaborative learning
– “X has also collected” wasn’t successful

Page 24

Organisational issues

+ Museum appeal
+ attracting secondary schools to the museum
+ Student engagement
+ Students spent longer on a MyArtSpace visit (90 mins compared to 20 mins)
+ Museum accessibility
+ Ability to engage with museum content after the visit

– Problems of museum staff engagement
– Burden on museum staff
– Business model
– Maintenance of phones
– Data charges
– Competition with other museum media

Page 25

PI: Personal Inquiry

• 3-year project between Nottingham and the Open University

• Support for inquiry science learning between formal and informal settings, at Key Stage 3

• School for introducing and framing issues, and planning inquiries

• Outside, home and science centres for semi-structured investigations

Page 26

PI Ethics, general issues

• Participatory design
– all participants will be willing volunteers
– kept fully informed of the purpose
– active participants in the design and evaluation

• Permissions
– from the children, teachers and parents
– Studies in the home will be with the signed informed consent of all target children and their parents
– Other children in the family will be asked for their assent
– Project staff subject to enhanced CRB checks
– Researchers will not go unaccompanied into homes

• Confidentiality
– All data will be anonymised (one possible approach is sketched below)
– Participants and their schools will not be identified in publications or presentations (unless they wish to be)
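To illustrate the anonymisation bullet above, here is a minimal sketch of one common approach: replacing identifying fields with stable, non-reversible pseudonyms before analysis or publication. The hashing scheme and field names are illustrative assumptions, not the project's actual procedure.

```python
# Minimal sketch: pseudonymise identifying fields with a salted hash so the
# same participant always maps to the same code, but the code cannot be
# reversed to recover the name. Names and salt here are hypothetical.
import hashlib

def pseudonym(value, salt="project-salt"):
    """Derive a stable, non-reversible code from an identifying value."""
    return "P" + hashlib.sha256((salt + value).encode()).hexdigest()[:6]

record = {"participant": "Jane Example", "school": "Example School", "score": 7}
record["participant"] = pseudonym(record["participant"])
record["school"] = pseudonym(record["school"])
print(record)  # identifying fields now appear as short codes, e.g. 'P3f9a1'
```

In practice the salt would be kept secret, and any mapping table retained for follow-up would be stored separately from the research data.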

Page 27

PI Ethics, specific issues

• Monitoring
– Children will be using the technology as part of their curriculum work, so teachers should be able to monitor the online activities as they occur and to inspect all the collected data
– Children will be fully informed about how their learning activities outside the classroom may be monitored by teachers and researchers
– Children will be able to decide where and when to collect data
– The system will not continuously monitor movement and activity, but will only log actions and data explicitly entered by the children (see the sketch below)
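A minimal sketch of that logging policy: the logger accepts only explicit user actions and rejects passive tracking events. The action names and structure are illustrative assumptions, not the project's implementation.

```python
# Minimal sketch: log only actions the child explicitly performs; refuse
# passive tracking events such as continuous location pings.
from datetime import datetime, timezone

EXPLICIT_ACTIONS = {"note_added", "photo_taken", "measurement_entered"}  # hypothetical
activity_log = []

def log_action(action, payload):
    """Record an explicitly entered action; reject anything passive."""
    if action not in EXPLICIT_ACTIONS:
        raise ValueError(f"'{action}' is not an explicit user action; not logged")
    activity_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "action": action,
        "payload": payload,
    })

log_action("note_added", "pond temperature 14C")
# log_action("gps_ping", (52.95, -1.19))  # would raise: movement is not logged
```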

• Ownership of data, privacy, and copyright
– All data collected will be subject to the provisions of the Data Protection Act 1998, in particular Section 33 of the Act relating to data collected for the purposes of research
– Material captured or created by the children will be subject to normal standards of copyright and fair use, and inappropriate material will be deleted
– Authors of teaching materials and field data will retain copyright and moral rights of authorship over their material
– A condition of participation will be that the project has rights to publish the material for academic and educational purposes (either crediting the authors or anonymising the material where appropriate and by agreement)

Page 28

Summary of methods

• Interactive Logbook
– Usability
• Videoed interactions with comparative systems and reflective discussion
– Usefulness
• Questionnaires, focus groups, user logbooks
– Attitude
• Questionnaires

• MyArtSpace
– Usability
• Heuristic evaluation
– Usefulness
• Structured interviews with stakeholders
• Videotaped observations and notes, critical incident analysis
• Focus group interviews with learners to discuss incidents
– Attitude
• Interviews with stakeholders

• PI: Personal Inquiry
– Still to be determined, but will include: stakeholder panels, videotaped observations and critical incident analysis, and comparative tests of learning process and outcomes for selected tasks