
Evaluating the E-Learning packages on the University of Leeds Clinical Psychology Doctoral Training Programme: Pedagogy and Instructional Design

Jamie Barrow

Commissioned by Clare Dowzer, Leeds DClinPsychol programme


Introduction

Factors important for learning

When designing educational resources it is important to include factors that promote successful learning, referred to as Pedagogy and Instructional Design components. Pedagogical factors support the active nature of the learning process and are designed to impart knowledge (Mehanna, 2004), with a focus on learning content, user needs, and learning outcomes (Lim & Lee, 2007). Instructional Design (ID) relates to experiences of learning and the features of learning environments which promote knowledge and skill development (Merrill, Drake, Lacy, & Pratt, 1996).

Pedagogical factors can be understood through a number of theoretical constructs, such as Behaviourism, where learning occurs via a transmission of knowledge; Cognitivism, the creation of cognitive change and processing; and Constructivism, where learning occurs through interaction (Lim & Lee, 2007).

Bransford, Brown, and Cocking (2000) identify four factors as especially important for the learning process: attention, motivation, emotions, and the experiences of the learner. The authors describe how learners must be engaged with the information and motivated to acquire knowledge, and how successful learning environments must capture this by providing opportunities for interaction and "doing", and by giving feedback on performance so that learners can develop and refine their understanding.

Tavangarian, Leypold, Nölting, Röser, and Voigt (2004) discuss learning from a constructivist standpoint, where learners assimilate knowledge by building on previous knowledge and experiences. They describe "learning objects" metaphorically as Lego© blocks which connect with one another to develop overall understanding. The size of these learning objects must be considered to facilitate learning, as they cannot be too big or unconnected to previous ideas, as this will hinder overall understanding. Mödritscher (2006) supports the idea that learner characteristics influence successful learning, identifying prior knowledge, the individual's cognitive and learning style, and intellectual capabilities as important considerations.

E-Learning

E-Learning is the use of digital technology to present information. E-learning resources benefit both teaching institutions and learners, providing a method through which to share information and resources quickly, on a large scale, and in a relatively cost-effective way (Gordon, 2014). They also give teachers and learners flexibility regarding when a resource is accessed and completed (Gordon, 2014; Yelland, Tsembas, & Hall, 2008).

Daskalakis and Tselios (2013) report that users are generally motivated to complete online learning and that this trend has been increasing as technological support becomes better able to facilitate interactive and multimedia features within the learning materials. This allows a flexible method of study, where users can access information remotely, at any time, and at their own pace.

Ruiz, Mintzer, and Leipzig (2006) discuss using a 'blended learning approach', where e-learning resources are used alongside classroom methods to support learning. They identify benefits of e-learning such as cost savings, increased access and availability of material for learners, and student satisfaction.

Yelland et al. (2008) warn that simply transferring paper-based learning resources to an online format is insufficient. Effective e-learning should motivate and engage users via techniques such as story-telling, animation, and building on prior knowledge (Bransford et al., 2000; Mödritscher, 2006). The e-learning system should have an intuitive design, where users can navigate, find, and access information without distraction from poor design or function (Daskalakis & Tselios, 2013), allowing them to maintain attention and focus on the content.

Engaging the user via objective setting, interaction, varied presentation of information, and the provision of feedback and recognition can support learning outcomes (Mehanna, 2004). Paechter, Maier, and Macher (2010) explored the opinions of 2,196 university students using e-learning as part of their degrees, which varied in subject and level (Bachelors, Masters, PhD). The authors report that achievement goals were considered most important for learning. They therefore suggest that the content of e-learning courses be designed to influence motivation through the provision of clear learning objectives and opportunities to practise and apply knowledge, with feedback to measure progress throughout.

Martín-Rodríguez, Fernández-Molina, Montero-Alonso, and González-Gómez (2015) report that course design and content, including interaction with the course and tutor support, were the most relevant features for student satisfaction. Interaction with course instructors can support learning outcomes and satisfaction with the course, so instructors do not become less valuable in e-learning and should be involved when developing resources (Mehanna, 2004; Paechter et al., 2010). Whilst students value IT as a convenient resource which supports their learning (Smith & Borreson Caruso, 2010), technology is not a substitute for face-to-face interaction with tutors (Borreson Caruso & Salaway, 2007).

Interaction between students has been identified as an important feature of e-learning environments; however, the evidence regarding its use is mixed (Ituma, 2011). The author reported limited support for the 'chat' component, which was rarely used by students. It is therefore necessary to explore user feedback to discover which resources students value or consider inessential.

A criticism of e-learning research and evaluation is that it often explores use and satisfaction from the developer/teacher perspective rather than the learner experience (Ruiz et al., 2006). Ruiz et al. (2006) promote the need to engage users in evaluation to ensure that the materials are also meeting the needs and aims of students. Islam (2013) identifies the value of teacher/learner collaboration in the development and use of such resources, as they should meet the needs of both user groups.

Use of E-Learning on the Doctorate in Clinical Psychology (DClinPsy) Training Course

The DClinPsy staff team decided to create e-learning resources as a method of providing trainees with information that can be accessed easily and flexibly, to support trainee learning.

Some packages aimed to provide a base level of information or to refresh student knowledge prior to face-to-face teaching sessions; for example, "Introduction to Statistics". This was the first package to be developed, based on trainee feedback that a refresher would be useful prior to the teaching sessions, as individual experience of using statistics in pre-training careers varied.

Others were developed to deliver information that was not scheduled as a teaching session but was potentially beneficial for trainees, such as the Literature Searching package. Not all packages must be completed as a course requirement; some were simply identified as useful for trainees to access as they conduct academic assignments, dictated by individual learning goals and needs.


An anticipated benefit was that trainees would be able to spend as much or as little time completing e-learning as necessary for the individual, and that packages could be accessed repeatedly as required. By providing refresher or base-level knowledge prior to a face-to-face teaching session, it was hoped that trainees would be supported to make the best use of the teaching time in a blended learning format (Ruiz et al., 2006).

The packages were developed by the learning technologist in collaboration with course tutors or other subject experts.

How E-Learning has been evaluated by others

Due to the growing use of e-learning as a mode of information delivery, it is important to assess whether e-learning packages capture the elements which facilitate learning as effectively as face-to-face teaching methods.

The literature around evaluating e-learning is expansive and varied in regard to what e-learning looks like, ranging from single-session delivery to full virtual learning environments. Due to this variation there are many different models of evaluation to choose from, each with a slightly different focus depending on the aims of the researcher.

A literature review by Dorobat (2014) identifies and discusses a number of evaluative models before suggesting a model that draws together the important features of previous evaluative frameworks: a comprehensive framework for "E-Learning System Success" (ELSS), a simplified version of which can be found in Figure 1. Dorobat identified six areas which should be considered when exploring successful e-learning and proposed that these can impact and influence one another. Whilst the ELSS provides a useful framework of pedagogy and ID features from which to begin assessing e-learning, it does not provide specific components to assess when conducting an evaluation.

Figure 1: Dorobat's proposed simplified ELSS model (2014). [The diagram shows six interlinked areas: Perceived Control, Social Factors, Benefits, Quality, Usefulness & Satisfaction, and User Attitude.]

Oztekin, Kong, and Uysal (2010) proposed a checklist called "UseLearn", made up of 36 criteria to evaluate the quality and usability of e-learning systems. They discuss the importance of e-learning being easy for learners to use without system error. Attention has been identified as an important component of successful learning, so users should be supported to access information and remain focused on learning content and activities without distraction from functionality issues within the learning platform (Daskalakis & Tselios, 2013; Islam, 2013). The UseLearn checklist provides useful items from which to assess the functionality of ID features of a learning environment; however, it lacks factors assessing pedagogical features.

Holsapple and Lee-Post (2006) describe system features that support successful e-learning conditions and propose that assessment should include evaluation of the system design, delivery, and user satisfaction. The authors acknowledge that this is an iterative process: evaluation should include identifying barriers to learning, developing and implementing improvements, and re-evaluation. This service evaluation project (SEP) is the first step in evaluating the e-learning packages on the DClinPsy training programme by assessing the ID and pedagogical components, including user feedback where available.

There are a variety of rubrics and checklists that can be used to evaluate e-learning. The content of such tools varies depending on the aims of both the information provider and the learner; however, a number of common themes in supporting knowledge acquisition were apparent. These ideas from the literature have been used to inform the present evaluation.

Commissioning and Project Aims

The purpose of this SEP is to investigate how well the current packages include pedagogical and instructional design elements to promote successful learning. From this we hope to make recommendations as to how existing packages can be refined and improved, which may also influence how upcoming e-learning packages are developed.

The project was commissioned by Dr Clare Dowzer, Learning Technologist and Research Coordinator at the University of Leeds, and Dr Gary Latchford, Joint Programme Director of the University of Leeds DClinPsy training programme. Over the last three years seven e-learning resources have been developed:

1. History Taking and Record Keeping (HTRK)
2. Single Case Design (SCD)
3. Neuropsychology
4. Presenting Data
5. Introduction to Statistics
6. Psychometrics
7. Literature Searching and Reviewing (LSR)


These were created by adding an element of interactivity to existing PowerPoint presentations, in order to transfer these teaching materials from interactive, face-to-face teaching into an online resource. The packages range from one which replaces a single lecture (Presenting Data) to more comprehensive learning which may take several hours to complete (Neuropsychology). The service evaluation project was commissioned to assess these packages from a pedagogical and instructional design perspective, using a checklist tool based on the literature and available feedback from trainees who have accessed the e-learning packages. It is expected that this evaluation will identify the strengths and limitations of the e-learning and make recommendations as to how these might be improved for future users.

Research Aims:

1. To evaluate the pedagogical and instructional design properties of the e-learning packages and make recommendations for improvement.
2. To assess user satisfaction through the analysis of evaluation questionnaires.

Method

Design

A quantitative approach was used to assess the presence of pedagogical and instructional design features across the seven e-learning packages. A mixed-methods approach was considered to explore user experiences; however, due to the size restriction of this project, it was thought this would be more thoroughly explored as a separate project which could include teacher and learner perspectives.

Developing a Measure


The initial literature review did not identify a measure which would meet the evaluation requirements of this project, due to the length or scope of the available tools. Therefore, a brief content analysis identified recurring elements in measures used to evaluate various e-learning packages. The ELSS (Dorobat, 2014), UseLearn (Oztekin et al., 2010), Holsapple and Lee-Post (2006), and Bransford et al. (2000) were used to inform and devise a checklist (Appendix 1) which could be applied to each of the e-learning packages. This was done in consultation with the project commissioner to ensure it would meet the project requirements.

Checklist items were grouped under the following subheadings/criteria to inform compliance ratings:

Learning Techniques
   To assess: Was information presented to support learning and application of knowledge?
   Informed by: ELSS (Dorobat, 2014); UseLearn (Oztekin et al., 2010); Bransford et al. (2000)

Accessibility
   To assess: Does the e-learning allow users to access information easily?
   Informed by: ELSS (Dorobat, 2014); UseLearn (Oztekin et al., 2010)

System Quality
   To assess: Does the e-learning platform function sufficiently to support learning without distraction (e.g. from poor design or technological error)?
   Informed by: ELSS (Dorobat, 2014); Holsapple and Lee-Post (2006); UseLearn (Oztekin et al., 2010)

Usefulness & Satisfaction
   To assess: Does it meet user requirements/expectations?
   Informed by: ELSS (Dorobat, 2014); Bransford et al. (2000); Holsapple and Lee-Post (2006)

Social Factors
   To assess: Are users able to interact with others to promote learning?
   Informed by: ELSS (Dorobat, 2014)

Across these five categories there were 21 items, some of which were broken down into sub-checklist items to support decision making. User feedback was used to inform the compliance rating on five items, which are identified on the checklist (e.g. "Is the training pitched at the correct level?").

Ethical considerations

Review by an ethics panel was not necessary for this project as no participants were recruited. The analysis required access to the learning packages via the University of Leeds Minerva platform and to routinely gathered, anonymous user evaluation data.

Procedure

The checklist was applied to each of the seven e-learning packages. Each item was rated as "Present", "Partial" [partially present], "Absent", or "Not Applicable/No Data". The researcher and project commissioner completed and compared ratings on the HTRK package to establish agreement on the rating criteria and to improve the reliability of the ratings.
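To make the agreement check concrete, the sketch below shows one way simple percentage agreement between two raters could be computed. This is an illustration only: the project reports no formal agreement statistic, and the function and example ratings here are hypothetical.

```python
# Minimal sketch: simple percentage agreement between two raters.
# The example ratings are hypothetical, not the actual HTRK ratings.

def percent_agreement(rater_a, rater_b):
    """Return the percentage of checklist items rated identically."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must rate the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

researcher = ["Present", "Partial", "Present", "Absent"]
commissioner = ["Present", "Partial", "Partial", "Absent"]
print(f"{percent_agreement(researcher, commissioner):.1f}% agreement")  # 75.0% agreement
```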

Data analysis

Quantitative data analysis using descriptive statistics was used to calculate compliance for each package. This score was expressed as the percentage of items rated Present, Partially Present, Absent, or N/A/No Data in each of the pedagogical/instructional design categories, together with an overall compliance score.
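As a worked illustration of this calculation, the following sketch computes the percentage of items in each rating band, per category and overall, including the combined 'Present or Partial' figure used in the Results. The ratings shown are hypothetical; only the category names, the 21-item structure, and the four rating bands come from the checklist described above.

```python
from collections import Counter

BANDS = ("Present", "Partial", "Absent", "N/A")

# Hypothetical ratings for one package: five categories, 21 items in total,
# mirroring the checklist structure; the values themselves are illustrative.
ratings = {
    "Learning Techniques": ["Present"] * 5 + ["Partial"],
    "System Quality": ["Present", "Partial", "Partial", "Present", "Absent"],
    "Accessibility": ["Present", "Partial", "Absent", "N/A", "Present"],
    "Usefulness & Satisfaction": ["Present", "Partial", "N/A", "N/A"],
    "Social Factors": ["Partial"],
}

def compliance(items):
    """Percentage of items in each rating band, plus combined Present/Partial."""
    counts = Counter(items)  # Counter returns 0 for bands that never occur
    pct = {band: 100 * counts[band] / len(items) for band in BANDS}
    pct["Present or Partial"] = pct["Present"] + pct["Partial"]
    return pct

# Per-category compliance, then an overall score pooling all 21 items.
for category, items in ratings.items():
    print(category, compliance(items))
all_items = [rating for items in ratings.values() for rating in items]
print("Overall", compliance(all_items))
```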

User feedback was available for four of the seven packages: SCD, HTRK, Neuropsychology, and Presenting Data. The routinely collected DClinPsy e-learning evaluation questionnaire (Appendix 2) also asked users three qualitative questions regarding Objectives, Assessment, and Suggestions. There were not enough qualitative data from the completed trainee feedback forms to conduct a robust thematic analysis as described by Braun and Clarke (2006). Available feedback has instead been considered during the rating of some checklist components and used to generate preliminary themes to inform the recommendations.

Results

The results shall be considered in terms of overall compliance with the checklist, followed by a breakdown of how this was informed by each pedagogical and instructional design category.

Overall Compliance

A breakdown of compliance across all packages can be seen in Figure 2. Overall compliance with the checklist varied across the packages, with an average 'Present' rating of 50% (range = 36.36-63.64%). When considering both present and partially present items, overall compliance is high across the packages (mean = 81.17%; range = 68.18-90.91%). There were very few items rated as 'Absent' (mean = 5.2%; range = 0-13.64%); however, the results are influenced by the number of items rated as N/A or for which data were unavailable (mean = 10.39%; range = 4.55-22.73%). The most compliant package was SCD, followed by HTRK; the lowest rated was Presenting Data (Figure 2).


Figure 2: Overall Checklist Compliance across the E-Learning packages

Learning Techniques

The pedagogical and instructional design category "Learning Techniques" had the highest level of 'present' rated compliance across the e-learning packages (mean = 90.47%; range = 83.33-100%). As shown in Figure 3, Neuropsychology, SCD, and Presenting Data all achieved 100% 'present' ratings; HTRK achieved 83.33% (100% when present and partial were combined). For the three remaining packages there were no available data from the user feedback component. There were no 'absent' ratings for any package in this category. The strong compliance suggests that the e-learning packages present the information well, using techniques which can promote successful learning.


Figure 3: Compliance to “Learning Techniques” Category

System Quality

'Present' ratings across the e-learning packages for the System Quality category were much lower (Figure 4), with a mean of 40% (range = 20-80%). There was a high level of partially achieved components (mean = 40%), indicating that some features are included but require improvement. Some examples include:

"Are files easy to upload/download or view?" – links were present but no longer functioning.

"Error prevention" – interactions with animations did not always function, and users were not always informed why they could not move on (e.g. if a task was incomplete or incorrect).


"Is the course well organised?" – users could not navigate to a specific page of information if they wanted to recap it, and not all packages provided information about where the user was in the training or offered multiple routes to the same information.

When considering both present and partially achieved items as compliant, the mean compliance increases to 80% (range = 60-100%).

Figure 4: Compliance to “System Quality” Category

The results indicate high levels of variation in the successful implementation of components which support the e-learning environment and its performance, and show that these can be improved across the e-learning packages.

Accessibility

"Accessibility" achieved a lower compliance rating across the e-learning packages, with a mean 'present' rating of 40% (range = 20-60%). When combining 'present' and 'partial', this increased to a mean of 77.14% compliance (range = 40-100%). As shown in Figure 5, SCD achieved the highest level of compliance, whereas Presenting Data had the lowest. Presenting Data also had a higher level of 'absent' features in this category (40%) than in any other checklist category.

Figure 5: Compliance to “Accessibility” Category

These results suggest that accessibility varied across the e-learning packages (e.g. function across internet browsers or mobile devices). No package achieved a 'present' rating for the ability to adjust the display settings to user preference (e.g. colour, size), which could be an important feature regarding access for those with different learning needs.

Usefulness and Satisfaction


Of all the categories, "Usefulness and Satisfaction" had the highest level of No Data or N/A ratings (mean = 39.28%; range = 25-50%) due to the limited user feedback available to inform the ratings. The "Motivation to learn" item was not assessed by the feedback form and so could not be rated on any package. Only three packages prompted the learner to complete a feedback form about the experience (Neuropsychology, Psychometrics, HTRK); however, all packages can be evaluated via the Minerva platform, so this was rated as 'partial' for those which did not provide a specific prompt to users. As shown in Figure 6, the highest rated package was HTRK, achieving a 'present' score of 75%. The lowest was Presenting Data, which did not achieve any 'present' ratings. Overall, 'present' ratings were rather low (mean = 39.29%; range = 0-75%). When combining present and partial ratings, the average compliance increased to 60.71% (range = 50-75%).

Figure 6: Compliance to “Usefulness and Satisfaction” Category


These results demonstrate the need to increase completion of feedback measures, as it is unclear how satisfied users are with the e-learning packages.

Social Factors

None of the packages had a fully compliant 'Social Factors' rating. All were rated as partially present due to the wider facility on the University of Leeds "Minerva" platform to begin forum-style discussions. This was not rated as present because no discussions were currently listed, nor were there opportunities for interaction and feedback from staff members or tutors (such as a Frequently Asked Questions thread). It was not always clear who to contact for further information or support; however, some packages credited those who contributed to the course content or design.

Qualitative feedback

The user feedback questionnaire includes free-text boxes where users are able to provide qualitative information relating to three areas: Objectives, Assessment, and Suggestions. These questions can be seen in Table 1, along with preliminary themes and quotes to illustrate them.

Table 1: Qualitative analysis of evaluation data

Objectives: What different or extra objectives do you think should be included in the course?
   Preliminary themes: Unsure/no suggestions; concern about increased time to complete.
   Illustrative quote: "I'm not really sure, they are a good length and [to include extra objectives] would also add time."

Assessment: Is there any way that the questions or quiz could be improved?
   Preliminary themes: Value of feedback and explanation; consolidates learning.
   Illustrative quotes: "to give feedback if you get it wrong, rather than having to go through it a few times until getting the question right."; "I thought the quiz was a good way to consolidate learning".

Suggestions: Do you have any suggestions about additional e-learning resources?
   Preliminary themes: To support other teaching (clinical, academic); social interaction; varied presentation of information.
   Illustrative quotes: "E-learning on different therapy models or [clinical] presentations"; "Maybe one for quantitative and qualitative methods to support in making choices for thesis/SEP?"; "Maybe more links to videos that can talk through ideas."; "Maybe you could have an area where you could leave questions and get answers have discussions. I'm not sure if that is in the remit of e-learning, but the more interactive the better!"

Overall, the feedback suggests that users value the e-learning and recognise its utility in supporting learning, even suggesting other topics where e-learning could be useful. Users do appear to be cautious about adding further aims or topics due to the additional demand on trainee time to complete these. It is also suggested that users are unsure as to the "remit" of e-learning and what can be offered within this.

The checklist includes both a "Justification" and an "Other Comments" section, where the researcher noted observations regarding the e-learning to inform the rating of each component. The completed checklist was shared with the commissioner to consider alongside the ratings, allowing easy identification of where errors were observed so that these can be improved. The following are examples of this type of feedback:

Error prevention: Can tasks be completed easily?
   "If stuck at a question which requires a correct answer, user cannot navigate around to find the information or move on" – Literature Searching and Reviewing.

Is the course accessible across different devices/media/platforms?
   "Worked on iPad in Safari but not Chrome, which asked users to download a different app. On iPad Safari when more information was displayed on one screen the scroll feature did not work" – Neuropsychology.

Discussion

Research has warned that when developing e-resources for learning it is not sufficient to simply transfer paper-based resources to an online environment (Yelland et al., 2008). The process of knowledge acquisition requires learners to be supported and involved in the active learning process through the inclusion of pedagogical and ID factors (Mehanna, 2004). It is important that the online environment is supportive and engaging, and provides learners with the opportunity to apply their knowledge without distraction (Daskalakis & Tselios, 2013).

Aim 1: This project aimed to explore the pedagogical and ID features of the e-learning packages developed for use on the DClinPsy training programme, using a bespoke evaluative checklist.

Overall compliance with the checklist was high across the e-learning packages. There was variation in how strongly 'present' rated factors were included; however, when combining 'present' and 'partial' items, no package achieved a score lower than 65%. This suggests that the e-learning resources include sufficient pedagogical and ID components, at least to some extent, to promote successful learning, although this can be improved.

Of the checklist categories, "Learning Techniques" was the most highly rated, with no score below 80% 'present'. This category explored features such as the opportunity to use knowledge and receive feedback (Mehanna, 2004), the explanation of concepts and suitably pitched information (Tavangarian et al., 2004), and varied presentation of the information. The results suggest that each of these pedagogical features is sufficiently well included to promote learning.

System Quality explored the ID features of the e-learning packages, and the findings suggest variability in how well these were implemented. This category assessed how well users were able to navigate the system without disruption from design or technological errors (Daskalakis & Tselios, 2013). Error prevention, organisation, and function were assessed, and errors were observed within the packages. There were some difficulties completing tasks or navigating the system which reduced compliance to 'partial', as features were present but not always functioning as expected (e.g. navigating menus).

The Accessibility category assessed how well the packages allowed users to access resources and adjust these to user preference. All packages were accessible through web browsers but varied in how well they could be accessed across different devices (e.g. tablet computers). Most did not require the user to download additional software; however, users were advised to use Google Chrome for best functionality. Most adapted well to different screen sizes, although some (e.g. Neuropsychology) did not adjust to a smaller display, meaning the user had to scroll to view the information. No package allowed the user to configure the colour or font of the display.

Aim 2: The project aimed to explore user satisfaction using data from evaluative feedback forms. This objective was not fully met due to limited evaluative data from trainees. All categories were influenced by the limited amount of user feedback; however, the 'Usefulness and Satisfaction' category had the highest amount of missing data as a result, and so the findings should be considered with caution.

The available feedback was used to inform the ratings; however, these should be interpreted with caution as they cannot be generalised to the wider trainee group. For the packages where feedback was available, satisfaction was high; more data should be gathered to explore this component.

It would be useful to explore the trainee perspective on the "Social Factors" category, as no package had a 'present' rating for this. Not all packages provided details for contacting the course tutor; however, the wider VLE system "Minerva" provides a wiki space for trainees to use freely. This means the capacity to support a discussion forum is already in place, yet it would seem that this feature is scarcely used by trainees. Ituma (2011) suggests that learners perhaps use other methods of communication, which could be the case with trainees.

Within the qualitative feedback there was a suggestion for an area where learners can post questions and receive answers (perhaps from the appropriate tutor/moderator or from other trainees). It may be beneficial to explore wider trainee views on this and whether it would be a valued resource which trainees think they would use. It could be useful to have this open to all trainees so that past discussions can be viewed by current trainees, reducing the burden on moderators/tutors of answering similarly themed questions each year.

Recommendations:

• Identified errors in current packages to be fixed/improved.
• E-learning packages to present information in a variety of methods to engage the user.
• Users should receive feedback on performance in a timely manner.
   o Incorrect quiz responses should prompt a tip or direct the user to the correct information.
• E-learning resources should be checked annually by the course tutor or learning technologist to ensure that links and documents are still accessible.
• User perspectives of the e-learning packages should be further explored.
   o All packages should have a direct link prompting users to complete the evaluative feedback form.
   o The feedback form should include a question regarding how useful trainees found the e-learning.
   o A further service evaluation project could explore user expectations and experience of the e-learning systems (including the demand for a 'chat/forum' or frequently asked questions message board).
   o Course staff perspectives, as information disseminators, could also be explored regarding how successful they think the inclusion of an e-learning component has been (Paechter et al., 2010).
   o Both staff and trainee perspectives could be gathered regarding potential topics for additional resources.

Critical Evaluation

The project resulted in the development of a bespoke checklist for evaluating the DClinPsy e-learning packages. This was devised based on the literature and on consideration of what was relevant to the aims of the commissioner. However, this means the checklist may not be applicable to other e-learning evaluations, as components may be missing or not relevant to other projects. As the scope of e-learning on the DClinPsy course changes and the way information is presented develops, the checklist may need updating to reflect this and to capture other pedagogical and instructional design techniques the university wishes to employ.

A full quantitative analysis of user feedback was not possible due to the limited number of completed feedback forms: some packages had no feedback, and the others had between one and eight respondents. This contributed to the high levels of missing data for some packages and reduces the reliability of comparisons of compliance across packages.

Only three packages prompted the user to complete a feedback form evaluating the e-learning experience. This may have contributed to the low level of user feedback, as users may not have been aware of the form or may not have thought it necessary. It is therefore a recommendation from this evaluation that a prompt and link be provided at the end of every package to increase the data available for future evaluation.

Qualitative feedback was limited due to the lack of evaluative feedback and because there were only three qualitative questions on the evaluation form. Sending trainees a questionnaire to assess their attitudes toward e-learning and their experience of its use was considered; however, this was not possible within the timescale and size of this project. It may be worthwhile to qualitatively explore trainee perspectives on the use of e-learning on the DClinPsy course via a further research project.

Conclusions

Pedagogical and ID features support learners to successfully develop knowledge and understanding. The e-learning packages developed for use on the DClinPsy training programme were evaluated using a bespoke checklist assessing the inclusion of pedagogical and ID features identified as relevant to the aims and scope of the packages. Overall 'present' and 'partial' compliance was high, and the 'Learning Techniques' category was well implemented, suggesting good inclusion of pedagogical and ID features. Errors and areas for improvement were identified within the packages which require development to better support learning and user needs. Areas with missing data would benefit from exploration of the user perspective, to assess which features users find helpful for learning (e.g. a discussion forum).

This project used a largely quantitative approach, and it may be useful to supplement these findings with a further qualitative exploration. More evaluation of the user perspective is needed to assess the usability and effectiveness of the training; whilst we can assess the presence of pedagogical and ID features, the main priority of e-learning is that it meets the learning requirements of the trainees.


References

Borreson Caruso, J., & Salaway, G. (2007). The ECAR Study of Undergraduate Students and Information Technology, 2007. Retrieved from http://www.csplacement.com/downloads/ECAR-ITSkliisstudy.pdf

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How People Learn: Brain, Mind, Experience and School (Expanded ed.). Washington, D.C.: National Research Council.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101.

Daskalakis, S., & Tselios, N. (2013). Evaluating e-Learning Initiatives: A Literature Review on Methods and Research Frameworks. In Web-Based and Blended Educational Tools and Innovations. IGI Global.

Dorobat, I. (2014). Models for Measuring E-Learning Success in Universities: A Literature Review. Informatica Economică, 18(3), 77-90. doi:10.12948/issn14531305/18.3.2014.07

Gordon, N. (2014). Flexible Pedagogies: Preparing for the future. Higher Education Academy.

Holsapple, C. W., & Lee-Post, A. (2006). Defining, Assessing, and Promoting E-Learning Success: An Information Systems Perspective. Decision Sciences Journal of Innovative Education, 4(1), 67-85.

Islam, A. K. M. N. (2013). Investigating e-learning system usage outcomes in the university context. Computers & Education, 69, 387-399. doi:10.1016/j.compedu.2013.07.037

Ituma, A. (2011). An evaluation of students’ perceptions and engagement with e-learning components in a campus based university. Active Learning in Higher Education, 12(1), 57-68. doi:10.1177/1469787410387722

Lim, C. J., & Lee, S. (2007). Pedagogical Usability Checklist for ESL/EFL E-learning Websites. Journal of Convergence Information Technology, 2(3), 67-76.

Martín-Rodríguez, Ó., Fernández-Molina, J. C., Montero-Alonso, M. Á., & González-Gómez, F. (2015). The main components of satisfaction with e-learning. Technology, Pedagogy and Education, 24(2), 267-277. doi:10.1080/1475939X.2014.888370

Mehanna, W. N. (2004). e-Pedagogy: the pedagogies of e-learning. ALT-J, Research in Learning Technology, 12(3), 279-293. doi:10.3402/rlt.v12i3.11259

Merrill, D. M., Drake, L., Lacy, M. J., & Pratt, J. (1996). Reclaiming Instructional Design. Educational Technology, 36(5), 5-7.

Mödritscher, F. (2006). The Impact of an E-Learning Strategy on Pedagogical Aspects. Retrieved 30 September 2019, from https://pdfs.semanticscholar.org/b6c1/9ca093d5187491be0afb4b5527e800e17abe.pdf

Oztekin, A., Kong, Z. J., & Uysal, O. (2010). UseLearn: A novel checklist and usability evaluation method for eLearning systems by criticality metric analysis. International Journal of Industrial Ergonomics, 40, 455-469. doi:10.1016/j.ergon.2010.04.001

Paechter, M., Maier, B., & Macher, D. (2010). Students’ expectations of, and experiences in e-learning: Their relation to learning achievements and course satisfaction. Computers & Education, 54, 222-229. doi:10.1016/j.compedu.2009.08.005

Ruiz, J. G., Mintzer, M. J., & Leipzig, R. M. (2006). The Impact of E-Learning in Medical Education. Academic Medicine, 81(3), 207-212.

Smith, S. D., & Borreson Caruso, J. (2010). The ECAR Study of Undergraduate Students and Information Technology, 2010. Retrieved from http://anitacrawley.net/Resources/Reports/ECAR%20study%20highlights.pdf

Tavangarian, D., Leypold, M. E., Nölting, K., Röser, M., & Voigt, D. (2004). Is e-Learning the Solution for Individual Learning? Electronic Journal of e-learning, 2(2), 273-280.


Yelland, N., Tsembas, S., & Hall, L. (2008). E-learning: issues of pedagogy and practice in the information age. In P. Kell, W. Vialle, D. Konza, & G. Vogl (Eds.), Learning and the learner: exploring learning for new times. University of Wollongong.


Appendices

Appendix 1: Bespoke Checklist used to evaluate the e-learning packages:


Appendix 2: DClinPsy E-Learning Feedback Evaluation Form
