
1

Standards, quality assurance, best practice and benchmarking in e-learning

Professor Paul Bacsich
Matic Media Ltd, and Middlesex University, UK

2

The Menu

• Standards (technical)
• Quality Assurance
  – Standards (content)
  – Standards (pedagogy and process)
• Best Practice
  – Excellence?
• Benchmarking
• Conclusions

3

Standards (technical)

• UK follows mainly IMS
• Agency called CETIS set up by JISC to advise universities and colleges on IMS
• A few mega-universities (OU, Ufi, etc.) are direct members of IMS
• IMS Learning Design gaining influence
• Also e-portfolios

4

Standards (content)

• Quality Assurance Agency has set up “subject benchmarks”

• More about generalised competences than detailed syllabi

• See www.qaa.ac.uk/academicinfrastructure/benchmark/

5

Standards (pedagogy and process)

• Quality Assurance Agency (QAA)
• “Code of practice for the assurance of academic quality and standards in higher education”
• See www.qaa.ac.uk/academicinfrastructure/codeOfPractice/
• Not much on pedagogy – this is left to the discretion of the professor

6

QAA in e-learning

• Little has been done specifically on e-learning – but see…
• “Collaborative provision and flexible and distributed learning (including e-learning)”
• Recent – September 2004
• Some feel it says too little, others do not want to be restricted

7

Digression on Pedagogy

• Higher Education Academy
• “works with universities and colleges, discipline groups, individual staff and organisations to help them deliver the best possible learning experience for all students”
• Runs Subject Centres for each subject
• Beginning to advise on e-learning

8

Best practice in e-learning

• Not much studied in the UK yet
• OU a major source of advice
• UKeU set up to crystallise best practice into an operational business
• It failed – but its legacy may help
  – Committee for Academic Quality
• US much more active – see e.g. “Quality on the Line” (IHEP, 2000)

9

In the UK, universities compete – and now in e-learning

• Universities want to judge how well they are doing in e-learning

• And funding agencies also want to know
• But universities don’t want to tell if they are doing badly! Not the public, not the funding agencies.

• And universities (like people) are not good at judging themselves.

10

Benchmarking

• Like Activity Based Costing, it has been around for many years

• Unlike ABC – but like BPR (Business Process Re-engineering), quality, excellence, etc. – no one is now sure what it means…

11

Back to Basics (Xerox)

A process of self-evaluation and self-improvement through the systematic and collaborative comparison of practice [process] and performance [metrics, KPIs] with competitors [or comparators], in order to identify own strengths and weaknesses, and learn how to adapt and improve as conditions change.

12

Benchmarking Dichotomies

• Implicit vs. Explicit
• Independent vs. Collaborative [clubs]
• Internal vs. External
• Vertical vs. Horizontal
• Inputs or Processes vs. Outputs
• Metric vs. Qualitative

(After Jackson)

13

Focus of my work

• Focussed purely on e-learning
• But not tied to any particular style (e.g. DL)
• Oriented to institutions past the “a few projects” stage
• Suitable for desk research as well as invasive studies
• Suitable for single- and multi-institution studies

14

Benchmarking (in Universities)

• There are several reports that will tell you how to do benchmarking in general:
  – Higher Education Academy (UK)
  – Learning and Skills Development Agency (UK)
  – Department of Education, Training and Youth Affairs (Australia)

15

Benchmarking (in Universities)

• And some agencies can help:
  – European Benchmarking Programme on University Management (ESMU, Brussels)
  – English Universities Benchmarking Club

16

Benchmarking in e-Learning

• There are very few reports:
  – National Learning Network (UK) – not for universities, but for colleges
  – E-Learning Maturity Model (NZ) – brand new!

17

Quality/Best Practice in e-Learning

• There are a few reports (US):
  – APQC/SHEEO Study 1998 (US)
  – IHEP “Quality on the Line” 2000 (US)
• And several projects (EU):
  – BENVIC
  – SEEQUEL
  – Swiss Virtual Campus @ Lugano: MINE

18

Excellence (?) in e-Learning

• New project: E-xcellence (EADTU and others)
• Outside e-learning, several projects:
  – Consortium for Excellence in Higher Education (UK)

19

Benchmarking e-learning

A “synthesis”

20

Processes or Outputs?

• Outputs first (can be done by desk research)

• Processes later (best done in clubs or invasive studies)

• Inputs not of interest to students – but of course of interest to funders

21

Metrics or Bureaucratic

• Use a 6-point scale – see the sketch after this list
  – 5 from Likert plus 1 more for “excellence”
• Backed up by metrics where possible
• Also contextualised by narrative
• Remember the problems of judging “best practice”; judging “better practice” is easier
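To make the scale concrete, here is a minimal sketch in Python of how a single criterion score might be recorded: a level from 1 to 6 (five Likert points plus one for excellence), an optional supporting metric, and a contextualising narrative. The class name, fields and example values are all hypothetical, not taken from any published scheme.

```python
# A minimal sketch of one benchmark criterion scored on the 6-point scale
# (levels 1-5 as on a Likert scale, plus level 6 reserved for "excellence").
# All names and values here are hypothetical illustrations.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CriterionScore:
    criterion: str                  # e.g. "Training"
    level: int                      # 1..6; 6 denotes excellence
    metric: Optional[float] = None  # supporting metric, where one exists
    narrative: str = ""             # contextualising narrative

    def __post_init__(self) -> None:
        if not 1 <= self.level <= 6:
            raise ValueError("level must be between 1 and 6")

# Example usage: a level-4 score backed by a metric and a narrative.
score = CriterionScore(
    criterion="Training",
    level=4,
    metric=0.78,  # e.g. the fraction of staff who attended VLE training
    narrative="University-wide programme, monitored and incentivised.",
)
```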

22

Other Decisions

• Explicit (otherwise you are not trying)

• Independent or collaborative

• Internal or external

• Horizontal: focus on processes across the whole institution; do not be seduced into individual projects

23

How Many Benchmarks?

• It is like ABC: how many activities?
• Answer: Not 5, not 500.
• Better answer: Well under 100 – see the sketch after this list
  – Composite some criteria together
  – Remove any not specific to e-learning
  – Be careful about any which are not provably critical success factors
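As an illustration of those three pruning rules, the Python sketch below composites related criteria, drops any that are not e-learning-specific, and keeps only presumed critical success factors. Every criterion name and flag is invented for the example.

```python
# A hedged sketch of the pruning rules above; the criteria and their flags
# are invented for illustration, not taken from any published criteria set.

candidates = [
    # (name, e-learning-specific?, provably a critical success factor?)
    ("VLE reliability",        True,  True),
    ("Staff training in VLE",  True,  True),
    ("Staff training support", True,  True),
    ("Library opening hours",  False, True),   # not e-learning-specific: drop
    ("Web page colour scheme", True,  False),  # not provably a CSF: drop
]

# Rules 2 and 3: keep only e-learning-specific critical success factors.
kept = [name for name, specific, csf in candidates if specific and csf]

# Rule 1: composite closely related criteria into a single benchmark.
related = [n for n in kept if n.startswith("Staff training")]
if len(related) > 1:
    kept = [n for n in kept if n not in related] + ["Staff training (composite)"]

print(kept)  # the aim: a list well under 100 criteria
```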

24

How Many do Others Have?

• LSDA (UK) has 14

• IHEP (US) has 24

• APQC/SHEEO (US) had 14

• (Breaking news) EMM (NZ) has 43

25

Pick and Mix System

• 25 criteria (liable to grow to around 30)
• 6 levels, backed up by qualitative and numeric information (see the comparison sketch below)
• Student-oriented
• Focussed on critical success factors
• Requires no long training course to understand, if you know about e-learning
• Methodology-agnostic
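To show what a desk-research comparison might look like under such a scheme, the sketch below tabulates two institutions’ levels on a handful of criteria side by side. The institution names, criteria and levels are hypothetical.

```python
# A hedged sketch of a multi-institution comparison: each institution holds
# a level (1-6) per criterion. All names and levels here are hypothetical.

scores = {
    "University A": {"Adoption": 3, "Training": 4, "Accessibility": 2},
    "University B": {"Adoption": 4, "Training": 2, "Accessibility": 5},
}
criteria = ["Adoption", "Training", "Accessibility"]

# Print a simple side-by-side table of levels.
print(f"{'Criterion':<15}" + "".join(f"{name:>14}" for name in scores))
for c in criteria:
    print(f"{c:<15}" + "".join(f"{scores[name][c]:>14}" for name in scores))
```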

26

“Adoption phase” (Rogers)

1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking it up (or retiring or leaving)
6. First wave embedded, second wave under way (e.g. m-learning after e-learning)

27

“Training”

1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and departments
3. U-wide training programme but little monitoring of attendance or encouragement to go
4. U-wide training programme, monitored and incentivised
5. All staff trained in VLE use, training appropriate to job type – and retrained when needed
6. Staff increasingly keep themselves up to date in a “just in time, just for me” fashion except in situations of discontinuous change

(see the sketch below for these level descriptors encoded as data)
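One natural way to operationalise a scale like the one above is to encode its level descriptors as data, so a self-assessment tool can map a chosen level back to its description. The sketch below does this for the “Training” scale; the structure and abbreviated wording are my own, not part of any published toolkit.

```python
# A hedged sketch: the "Training" criterion's six level descriptors held as
# data (wording abbreviated from the slide above; the structure is invented).

TRAINING_LEVELS = {
    1: "No systematic training for e-learning",
    2: "Some systematic training in some projects and departments",
    3: "University-wide programme, little monitoring or encouragement",
    4: "University-wide programme, monitored and incentivised",
    5: "All staff trained in VLE use, appropriate to job type, retrained as needed",
    6: "Staff keep themselves up to date 'just in time, just for me'",
}

def describe(level: int) -> str:
    """Return the descriptor for a given benchmark level on this criterion."""
    return TRAINING_LEVELS[level]

print(describe(4))  # -> "University-wide programme, monitored and incentivised"
```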

28

What’s next?

29

Next Steps

• Correlate with “quality” and “excellence” projects in the EU
• Publish a review report on UK Committee for Academic Quality (in e-Learning) – August
• Review underpinning methodologies (CMM etc.)
• Literature search outside Europe, US and Commonwealth
• Series of workshops
  – at ALT-C 2005, Manchester, September
  – at ACODE, Australia, November
  – at Online Educa, Berlin, December

30

Thank you for listening
Any questions?

Professor Paul Bacsich

Global Campus, Middlesex University

p.bacsich@mdx.ac.uk
www.cs.mdx.ac.uk/staff/profiles/p_bacsich.html
