Page 1

Supporting further and higher education

Pedagogic Evaluation

Helen Beetham
Consultant in Pedagogy

JISC e-learning programme

Page 2

Activities for this session

• Discuss what is meant by ‘pedagogic evaluation’

• Identify project aims and rephrase as evaluation questions

• Identify stakeholders in the evaluation

• Consider appropriate means of data collection and analysis

• Establish contact with ‘peer’ projects and begin sharing ideas/expertise

Page 3

Lab testing

• Does it work?
– Functionality test
– Compatibility test
– Destruction test
– …
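Lab-testing questions like these translate naturally into automated checks. Below is a minimal sketch in Python: render_page and MAX_TITLE_LEN are illustrative stand-ins for the e-tool under test, not names from any JISC project.

```python
# Minimal sketch of automated lab tests for a hypothetical e-tool.
# render_page and MAX_TITLE_LEN are invented stand-ins, not a real
# project's API.
import pytest

MAX_TITLE_LEN = 256

def render_page(title: str) -> str:
    """Stand-in for the system under test."""
    if len(title) > MAX_TITLE_LEN:
        raise ValueError("title too long")
    return f"<html><body><h1>{title}</h1></body></html>"

def test_functionality():
    # Functionality test: the core feature produces the expected output.
    assert "<h1>welcome</h1>" in render_page("welcome")

def test_destruction():
    # Destruction test: deliberately abusive input fails safely
    # (a controlled error) rather than crashing the application.
    with pytest.raises(ValueError):
        render_page("x" * (MAX_TITLE_LEN + 1))
```

Run with pytest; a compatibility test would repeat the same checks across the target browsers or platforms.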

Page 4

Usability testing

• Can other people make it work?
– Are the menus clearly designed?
– Is there a logical page structure?
– ...

Page 5

Pedagogic evaluation

• Does anyone care if it works?
– i.e. is it useful in learning and teaching contexts?
– How is it useful?
– Who needs software anyway?
– …

Page 6

Relating the three phases of evaluation

• Software may progress from lab testing through usability testing to contextual evaluation…

• … or (e.g. in RAP) through many iterative cycles, with users involved at each stage

• Increasing authenticity of context
– Evaluation moves from simple, lab-based to complex, authentic contexts of use

• Different questions are asked, different kinds of data are collected, and different issues arise
– Complex, authentic contexts rarely provide yes/no answers to development questions
– Causative factors may be difficult to untangle
– Findings may be highly context-related (so several different contexts are better for evaluation than one)

Page 7

Three approaches to learning and teaching

• There are basically three ways of understanding how people learn
– Associative
– Constructive
• individual/cognitivist
• social constructivist
– Situative

• These lead to different pedagogic strategies and approaches

• Any of these approaches may be appropriate
– depending on the priority outcomes and the needs of learners

Page 8

Associative approach

• In learning
– Routines of organised activity
– Progression through component concepts or skills
– Clear goals and feedback
– Individualised pathways matched to prior performance

• In teaching
– Analysis into component units
– Progressive sequences of component-to-composite skills or concepts
– Clear instructional approach for each unit
– Highly focused objectives

• In assessment
– Accurate reproduction of knowledge or skill
– Component performance
– Clear criteria: rapid, reliable feedback

Page 9

Constructive approach (cognitivist)

• In learning
– Active construction and integration of concepts
– Ill-structured problems
– Opportunities for reflection
– Ownership of the task

• In teaching
– Interactive environments and appropriate challenges
– Encourage experimentation and the discovery of principles
– Coach and model skills
– Include meta-cognitive outcomes

• In assessment
– Conceptual understanding (applied knowledge and skills)
– Extended performance
– Processes as well as outcomes
– Crediting varieties of excellence
– Developing self-evaluation and autonomy in learning

Page 10

Constructive approach (social)

• In learning
– Conceptual development through collaborative activity
– Ill-structured problems
– Opportunities for discussion and reflection
– Shared ownership of the task

• In teaching
– Collaborative environments and appropriate challenges
– Encourage experimentation, discussion and collaboration
– Coach and model skills, including social skills
– Learning outcomes may be collectively negotiated

• In assessment
– Conceptual understanding (applied knowledge and skills)
– Extended performance
– Process and participation as well as outcomes
– Crediting varieties of excellence
– Developing peer-evaluation and shared responsibility

Page 11

Situative approach

• In learning
– Participation in social practices of enquiry and learning
– Acquiring skills in contexts of use
– Developing identity as a learner
– Developing learning and professional relationships

• In teaching
– Creating safe environments for participation
– Supporting development of identities
– Facilitating learning dialogues and relationships
– Elaborating authentic opportunities for learning

• In assessment
– Crediting participation
– Extended performance, including variety of contexts
– Authenticity of practice (values, beliefs, competencies)
– Involving peers

Page 12

3 ways of learning about (educational) software

• Does it work?
– Routines of organised activity
– Analysis into component units
– Component performance
– Highly focused objectives

• Can other people make it work?
– Ill-structured problems
– Opportunities for reflection
– Ownership of the task
– Extended performance
– Processes as well as outcomes

• Is it useful in authentic (educational) contexts?
– Creating supportive environments for use
– Supporting development of users’ skills
– Facilitating dialogues and relationships
– Elaborating authentic opportunities
– Extended performance, including variety of contexts
– Authenticity of practice

Page 13

Evaluation is learning!

• Evaluation for development, not accountability
• Sharing lessons (including failures)
• Sharing concepts and approaches
• Moving software into more authentic contexts of use
– in order to find out how it is useful, and how it should be supported and embedded for effective use

• Using the outcomes of evaluation to inform development and take-up
– Learning across peer projects
– Learning across different strands of the e-learning programme (and beyond)
– Learning about your own software
• how to have effective dialogues with users
• range and/or specificity of application
• usability implications?

Page 14

Principles of evaluation

• Ask the right questions
• Involve the right people
• Collect useful and reliable data
• Analyse and draw conclusions appropriately

Page 15

1. Asking the right questions

• How does the use of this e-tool support effective learning, teaching and assessment (LTA)?

• What LTA activities?
– Be specific and pragmatic
– Understand how the e-tool fits with existing LTA practice
– But expect it to alter practice, sometimes unpredictably

• Which users?
– Range of user needs, roles and preferences
– Consider stakeholders who are not direct users

• What counts as ‘effective’?
– Enhanced outcomes for learners? Enhanced experience of learning?
– Enhanced experience for teachers/support staff? Greater organisational efficiency?
– Consider what claims you made in your bid, your big vision

• Effective in what LTA contexts?
– Does the e-tool support a particular pedagogic approach?
– Does it require a particular organisational context?
– Consider pragmatics of interoperability, sustainability and re-use
– Are you aiming for specificity or breadth of application?

Page 16

Example: Interactive Logbook project

• Identify how the IL supports learners with respect to ‘access, communication, planning & recording’
– How is access to learning resources improved?
– How is communication for learning improved?
– Does the IL provide a useful tool for planning and recording learning in the pilot programme? In what ways?
– Does the IL support planning and recording outside of the pilot programme, and does it provide a durable basis for future planning and recording? In what ways?
– How does it compare with other systems offering similar benefits?

• Identify how the IL supports teachers or programme developers and organisations respectively, and understand how best ‘to implement and embed the Interactive Logbook… as distributed e-learning becomes more mainstream’.
– What skills do learners and tutors need to make effective use of the IL? What features of the programme support integration and use of the IL (assessment strategies, support available, mode of access)?
– What technical systems and support are needed for the IL to be integrated effectively? What features of the organisation support effective use of the IL by learners and teachers?

Page 17

Over to you (1)

• Evaluation should be interesting, so:
– What do you really want to find out about your software?
– What is the most important lesson your project could pass on to others?
– Don’t set out to prove what we already know!

• Look back at your project aims – what claims are you making for impact on LTA?
– Translate these aims/claims into questions. Is there evidence of this impact? How does it happen?
– Good claims are achievable but also challenging
– Good questions are tractable but also interesting

• Do your original aims fit with what interests you now?
– Prioritise the issues that seem important now, with the benefit of insights from the development process
– But use this as an opportunity to revisit and review

• What are other projects investigating?
– Do you have any questions in common in your peer group? Or questions that complement each other?
– What could you usefully share of the evaluation process?

Page 18

2. Involving the right people

• Who are your users?
– What activities will they carry out with the system?
– What functions of the system are important to them?
– What roles do they play in those activities?
– What are the important differences between them?

• These differences are significant for:
– sampling (e.g. dyslexic learners for V-MAP)
– walk-throughs and use cases (user testing and evaluation design)
– real groups of learners and teachers (pedagogic evaluation)

• Who are your other stakeholders?
– Non-users whose work or learning may be impacted by use of the system in context, e.g. administrators
– Other ‘interested’ parties, e.g. institutional managers, project funders, researchers/developers, potential users…

Page 19

Over to you (2)

• Who are your important stakeholder groups?
– Distinguish your user group in ways that are significant for your evaluation questions (e.g. learners with or without an existing e-portfolio)
– Consider non-users as stakeholders and as potential sources of information

• Share your outcomes with your peer group.
– What types of user do you need to include? (e.g. model users you have developed for walk-throughs)
– How will you identify real user groups for evaluation?
– How will you ensure all your significant types are included?
• (NB including different types of learner is much easier than finding a ‘statistically representative’ sample: for this the proportions of different kinds of learner must be the same as in the target population)

Page 20

3. Collecting useful data

• Data collected should be directly relevant to the questions!

• Data should show triangulation:
– using a variety of methods (e.g. focus group and questionnaire)
– from a range of stakeholders (e.g. learners, teaching staff)
– over a span of time (e.g. ‘before’ and ‘after’)

• Quantitative data = How much? How often? How many?
– Also providing yes/no answers to simple questions
– Generalising from instances to rules
– Converting opinions into data for analysis (Likert scales; see the sketch after this list)

• Qualitative data = explanatory, narrative
– What happened? What was it like for you? Why?
– Identifying themes and providing local evidence
– Preserving the voices of participants

• How authentic is the context? If authentic, how will you support embedding and use of the software?

• Feasibility and costs of data collection?
• Skills and costs of analysis?
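Converting Likert responses into numbers is the simplest quantitative move mentioned above. A minimal sketch, assuming an invented five-point scale and invented responses:

```python
# Sketch: turning Likert-scale opinions into data for analysis.
# Scale labels and responses are invented for illustration.
from statistics import mean, median

LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]
scores = [LIKERT[r] for r in responses]

# Summary statistics that could feed a chart or a report table.
print(f"n={len(scores)}  mean={mean(scores):.2f}  median={median(scores)}")
```

The usual caveat applies: Likert data is ordinal, so the median is often a safer summary than the mean.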

Page 21

Over to you (3)

• Use the matrix to plan what data you will collect
– Data should be designed to answer specific questions (left column) and should be collected from specific stakeholder groups (top row)
– Add details if possible, e.g. when (time) and how (method) this data could be collected
– You need not fill all the boxes, but try to have something in each row and column
– You can merge boxes! (a sketch of the matrix follows this list)
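One way to hold the matrix is as a simple nested mapping: evaluation questions down the left, stakeholder groups across the top, and a collection plan (method and timing) in each cell. The questions, stakeholders and plans below are invented examples, not taken from any project bid.

```python
# Sketch of the evaluation planning matrix as a data structure.
# Rows are evaluation questions; columns are stakeholder groups;
# each cell holds a data-collection plan (method and timing).
matrix = {
    "Is access to learning resources improved?": {
        "learners": "questionnaire, weeks 1 and 10",
        "tutors": "focus group at end of pilot",
    },
    "What support does embedding require?": {
        "tutors": "interviews, mid-pilot",
        "IT staff": "review of support logs",
    },
}

# Print the matrix row by row; empty cells are simply absent.
for question, cells in matrix.items():
    for stakeholder, plan in cells.items():
        print(f"{question} | {stakeholder}: {plan}")
```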

Page 22

Final discussion: analysing and drawing conclusions

• Basic choices for data analysis:
– Quantitative analysis – statistical data that may be presented as pie charts, graphs etc.; Likert scales
– Qualitative analysis within a given analytical framework – comparison, correlation, explanation
– Qualitative analysis without a given analytical framework – case histories, narratives, themes

• Outcomes need to be useful to different audiences
– Your project, and other development projects
– Implementers and users of your software

• How will we draw conclusions across the different projects?
– Peer review groups
– Links to other projects in the e-learning programme
• Pedagogy strand – refer to previous workshop
• DeL regional pilots
• ELF reference models (also standards community)
– Sharing scenarios, roles and walk-throughs (model users)?
– Mapping activities to a common framework (model uses)?

Page 23

Learner Differences (‘model users’)

• Refer to hand-out
– But note that in many situations these differences will not be educationally significant
– And in your context of use, there may be other important differences to consider (see your ‘stakeholder’ activity)

• Could we develop a databank of ‘scenarios’ and ‘roles’?
– See Peter Rees-Jones’ work on scenarios for e-portfolios

Page 24

Outline of a learning activity (diagram)

• learner(s) – identities: preferences, needs, motivations; competences: skills, knowledge, abilities; roles: approaches and modes of participating
• other(s) – other people involved and the specific role they play in the interaction, e.g. support, mediate, challenge, guide
• learning environment – tools, resources, artefacts; affordances of the physical and virtual environment for learning
• learning activity – specific interaction of learner(s) with other(s), using specific tools and resources, oriented towards specific outcomes
• learning outcome(s) – new knowledge, skills and abilities on the part of the learner(s); artefacts of the activity process

Page 25

Developmental processes

(Diagram, extending the learning activity outline with three developmental cycles:)

• do / reflect – learning outcomes are captured for reflection, planning and review
• share / respond – outcomes can also be shared for formal assessment, informal feedback and peer review
• adapt / differentiate – the environment (tools, resources) can be adapted to meet the needs of learners, or provides a range of options for differentiation

Page 26

Next steps

• Appoint evaluators (if not already in place)

• Finalise evaluation plan
– Based on phase 2 bid
– Using a pro-forma (optional)

• Identify opportunities to liaise with other projects, e.g. to share
– Evaluation questions and approaches
– Actual data (comparative analysis?)
– Process of analysis and drawing conclusions

• I will be in touch to discuss these (or contact me at any time)

[email protected]