
“Critical Success Elements for the Design and Implementation

of Organisational E-learning”

Kristal Teresa Reynolds

BBus(HRM) QUT, BBus(HonsI) QUT

School of Management

QUT Business School, Queensland University of Technology

Submitted for the award of Master of Business (Research)

2012


Keywords

e-learning, organisational e-learning, learning management system, workplace

learning, adult learning, learning and development, training, organisations, training

evaluation, information systems, information systems evaluation, e-learning

evaluation, success, e-learning success

Short Abstract

Organisations are engaging in e-learning as a mechanism for delivering

flexible learning to meet the needs of individuals and organisations. In light of the

increasing use and organisational investment in e-learning, the need for methods to

evaluate the success of its design and implementation seems more important than

ever. To date, developing a standard for the evaluation of e-learning appears to have

eluded both academics and practitioners.

The currently accepted evaluation methods for e-learning are traditional

learning and development models, such as Kirkpatrick’s model (1976). Due to the

technical nature of e-learning, it is important to broaden the scope and consider other

evaluation models or techniques, such as the DeLone and McLean Information

Systems (IS) Success Model, that may be applicable to the e-learning domain. Research into the

use of e-learning courses has largely avoided considering the applicability of

information systems research. Given this observation, it is reasonable to conclude

that e-learning implementation decisions and practice could be overlooking useful or

additional viewpoints.

This research investigated how existing evaluation models apply in the

context of organisational e-learning, and resulted in an Organisational E-learning


Success Framework, which identifies the critical elements for success in an e-

learning environment.

In particular this thesis highlights the critical importance of three e-learning

system creation elements: system quality, information quality, and support quality.

These elements were explored in depth and the nature of each element is described in

detail. In addition, two further elements were identified as factors integral to the

success of an e-learning system: learner preferences and change management.

Overall, this research has demonstrated the need for a holistic approach to e-

learning evaluation. Furthermore, it has shown that both traditional

training evaluation approaches and the D&M IS Success Model are appropriate to

the organisational e-learning context, and when combined can provide this holistic

approach. Practically, this thesis has highlighted the need for organisations to consider

evaluation at all stages of e-learning from design through to implementation.

Acknowledgement

The author is grateful to the Cooperative Research Centre (CRC) for Rail

Innovation (established and supported under the Australian Government’s CRC

program) for the funding of this research. Project No. P4.110, “E-learning for rail”.


Table of Contents

List of Tables
Table of Figures
List of Abbreviations
Statement of Original Authorship
Acknowledgements
Chapter One—Introduction
    Contribution to Knowledge
    Definition of E-learning
    Research Problem
    Overview of Methodology
    Thesis Structure
    Chapter Summary
Chapter Two—Literature Review
    Chapter Overview
    Introduction
    Workplace Learning Context
        Workplace E-learning
        Definition of e-learning
        Benefits of e-learning
        Drawbacks and limitations of e-learning
    Adult Learning Applied to E-learning
    Training Evaluation
        Traditional learning and development evaluation
            Level one—reaction
            Level two—learning
            Level three—behaviour
            Level four—results
        Information systems evaluation
            System quality
            Information quality
            Service quality
            Use, intention to use and user satisfaction
            Net benefits
    Research Framework
        Framework elements of focus
            System quality
            Information quality
            Service quality
    Chapter Summary
Chapter Three—Methodology
    Chapter Overview
    The Research Question
    Overview of Qualitative Research Approach
        Unit of analysis
        Case study research design
            Data collection methods
            Sampling
            Piloting the interviews
    Data Analysis
    Quality of the approach
    The Research Setting—Tracks
        Levels of e-learning
        Access to Tracks
    Ethical Considerations
    Boundaries and Limitations
    Chapter Summary
Chapter 4—Findings and Discussion
    Chapter Overview
    Contextual Overview
        E-learning at Tracks
        E-learning team
        E-learning courses at Tracks
        Current evaluation at Tracks
    Themes Presented by Research Questions
        Research question 1: How does system quality apply in the context of organisational e-learning and what is the nature of this factor?
            Structure
            Ease of use
            Functionality
            Legitimacy
            Long-term knowledge resource
            Flexibility
            Accessibility
        Research question 2: How does information quality apply in the context of organisational e-learning and what is the nature of this factor?
            Format
            Nature of content
            Relevance
            Ease of understanding
            Interaction
            Alignment
            Content accuracy
        Research question 3: How does service quality apply in the context of organisational e-learning and what is the nature of this factor?
            Types
            Expectations
        New factors which emerged during interviews
            Learner preferences
                Preference for face-to-face training
                Hands-on learning
                Presentation of information
                Individual differences
            Change Management
                General process
                Vendor process
                Content development process
                Evaluation processes
                E-learning champion
    Chapter Summary
Chapter 5—Discussion and Conclusion
    Chapter Overview
    Review of research framework
    Discussion of Findings and Theoretical Implications
        System quality
        Information quality
        Support quality
    New Factors which Emerged During Analysis
        Learner preferences
        Change management
    Contributions to Practice
    Research Limitations
    Directions for Future Research
    Thesis Summary
References
Appendix 1—Interview protocol
Appendix 2—Questions for pilot and revised for research
Appendix 3—Coding classification
Appendix 4—Participant information sheet and consent form
Appendix 5—Example narratives relating to descriptions of system quality
Appendix 6—Example narratives relating to descriptions of information quality
Appendix 7—Example narratives relating to descriptions of support quality


List of Tables

Table 1 Common definitions of e-learning
Table 2 Common definitions of e-learning
Table 3 Summary of selected works on interaction and isolation in e-learning
Table 4 Assumptions of the andragogical model and their implications for e-learning
Table 5 Mature learner learning principles and implications for e-learning
Table 6 Limitations of Kirkpatrick's (1976) model of evaluation
Table 7 Outcomes used in training evaluation (Noe & Winkler, 2009, p. 201)
Table 8 Educational success metrics
Table 9 Crucial assumptions in qualitative research
Table 10 Summary of access timeline
Table 11 Overview of participant attributes
Table 12 System Quality—Comparison of current study results to existing studies
Table 13 Information Quality—Comparison of current study results to existing studies
Table 14 Support Quality—Comparison of current study results to existing studies
Table 15 Workplace learning change principles (Rylatt, 2000, p. 6)


Table of Figures

Figure 1: Chapter Two Structure
Figure 2: Literature review fields
Figure 3: Learning in the workplace (Wang et al., 2010)
Figure 4: Transfer of training model (Holton, 1996)
Figure 5: DeLone and McLean’s (2004) updated D&M IS Success Model
Figure 6: E-learning success research framework
Figure 7: Chapter Three Structure
Figure 8: Elements of system quality
Figure 9: Elements of information quality
Figure 10: Elements of support quality
Figure 11: Elements of learner preferences
Figure 12: Elements of change management
Figure 13: Chapter Five Structure
Figure 14: E-learning success research framework
Figure 15: The Organisational E-learning Success Framework

List of Abbreviations

D&M DeLone and McLean

IS Information system/s

L&D Learning and development

LMS Learning management system

ROI Return on investment


Statement of Original Authorship

“The work contained in this thesis has not been previously submitted to meet

requirements for an award at this or any other higher education institution. To the

best of my knowledge and belief, the thesis contains no material previously

published or written by another person except where due reference is made.”

Signature: QUT Verified Signature

Date: 12th November 2012


Acknowledgements

“I am the designer of my own catastrophe”

Anonymous

At times, writing a thesis feels like a catastrophe waiting to happen. I would like to

take this opportunity to thank a few people who have made the process of writing

this thesis somewhat easier during the past year and helped me overcome any

problems I had (which at times felt like minor catastrophes!)...

My principal supervisor, Dr. Karen Becker – It seems like a long time since

we met in your ‘Intro to HR’ lecture. For some reason you took me under your wing,

mentored me, and believed in me, even when I didn’t believe in myself. I’m very

grateful for your guidance, friendship, many coffee breaks, and chats about life in

general. I can’t thank you enough.

I was lucky enough to have two associate supervisors:

Associate Professor Cameron Newton – for your quantitative slant on this

‘wordy’ thesis. I know at times this was stretching your comfort zone (and mine!)

and I appreciate your insights and support.

Dr. Kieren Jamieson – I feel privileged to be the first student you have

supervised. You are genuinely one of the kindest people I know, and I always looked

forward to your emails because I knew they would be so upbeat and supportive.

The support and encouragement of family and friends has been indispensable. Thank

you to all of you (you know who you are). There are a number of people in particular

that I would like to acknowledge...


My parents – for your continual love and support. I know academia is a

different world to you, but thank you for always trying to understand what I was

doing. You can now have your un-stressed ‘nice’ daughter back.

Tom – A special mention for your contribution of ‘reading my abstract’, and

being a self-nominated fourth supervisor (because three clearly wasn’t enough!). On

a serious note, thank you for always being interested in what I was doing, your

encouragement and support. I look forward to having my weekends back to spend

with you again.

Sarah – for being the distraction that I often needed. The afternoon Freddo

Frog breaks were essential to my sanity, and you were always a willing accomplice!

Finally, I acknowledge and thank the interviewees of the case organisation. Without

your honest insights I wouldn’t have a thesis. A particular mention to my industry

contact... You were instrumental in getting me access to the organisation and

overcoming any hurdles along the way. Thank you, and I hope you find this thesis

useful.


Chapter One—Introduction

“Since evaluation of e-learning is necessary to demonstrate its worth, the

need for better and more widely used evaluation models is critical to the

future of e-learning.” (Moller et al., 2008, p. 71)

It is widely recognised that employees make a critical contribution to the

continued viability of organisations (Delahaye, 2011); however, the nature of the

economy and the types of jobs we undertake have changed and will continue to

change. In addition to evolving industries, the way work is done has also changed

drastically. Technology-based work practices are increasing and organisations will

continue to rely upon new technologies (Bondarouk & Ruël, 2010; Welsh, Wanberg,

Brown, & Simmering, 2003). The training and education of adults will play a pivotal

role in responding to the challenges that organisations face when evolving to meet

these changes (Department of Education Science and Training, 2003).

Organisations are engaging in e-learning as a mechanism for delivering

flexible learning to meet the needs of individuals and organisations (Australian

Flexible Learning Framework, 2011; Waight & Stewart, 2005a). Organisations have

seen the potential benefits and are increasingly investing significant resources, time

and money in complex technological innovations such as e-learning. The aim is to

improve delivery of learning and development (L&D) initiatives and, ultimately,

organisational outcomes. Ozkan, Koseler and Baykal (2009, p. 112) summarise the

potential impact of e-learning on organisations:

“Since e-learning has several advantages in terms of cost reduction,

simplified training programs, flexibility, and convenience; it is poised to


become an integral component of information dissemination, and emerges as

the new paradigm of modern education.”

Although e-learning in organisations continues to be adopted at increasing

rates, the literature focuses mainly on school-based environments (Chen, 2010). In

addition, while online learning and adult learners in academic settings have been

explored in recent years and a body of literature exists related to associated issues

(Johnson & Aragon, 2003; Lee, Owens, & Benson, 2002), literature regarding adult

learning and e-learning in organisational settings is yet to be extensively developed

(Waight & Stewart, 2005a).

In light of the increasing use and organisational investment in e-learning, the

need for methods to evaluate the success of its design and implementation seems

more important than ever. To date, developing a standard for the evaluation of e-

learning appears to have eluded both academics and practitioners. Derouin et al.

(2005, p.929) concluded that “overall, it is difficult to conclude that e-learning is

more, less, or equally effective at the learning level than traditional classroom-based

training”. This research investigates the critical factors for e-learning design and

evaluation as it applies in an organisational context. Furthermore, this study aims to

investigate how existing evaluation models apply in the context of e-learning, and

furthermore provide some guidance on this topic to practitioners and academics

alike.

Contribution to Knowledge

A review of training evaluation practices in organisations by Twitchell et al.

(2000) found that evaluation methods have largely remained static over the last 40


years despite the changes in delivery methods and new technologies. The choice of

evaluation criteria is a critical decision when evaluating the effectiveness of L&D

(Arthur, Bennett, Edens, & Bell, 2003). The integration of new technologies into the

learning process presents new complications to the already challenging nature of

evaluation (Galloway, 2005). The obvious and currently accepted evaluation

methods for e-learning are traditional L&D models, such as Kirkpatrick’s model

(1976). Due to the technical nature of e-learning, it is important to broaden the scope

and consider other evaluation models or techniques that may be applicable to the e-

learning domain.

A review of the literature reveals that research in the e-learning domain has

largely avoided the applicability of information systems (IS) research. Given this

observation, it is reasonable to conclude that e-learning implementation decisions

and practice (such as design, use, delivery, and success measures) could be

overlooking useful or additional considerations. McFarlan (1987) predicted over 20

years ago that researchers from different disciplines would need to form partnerships

in order to research IS technology in the context of organisations, and this holds true

today. In light of this, it is timely to explore the extent to which broader IS theories apply

to the specific context of e-learning.

Although e-learning is not technically an IS, e-learning is facilitated by the

use of specialised IS. A widely accepted model of IS success is that of DeLone and

McLean (1992) which has become known as the D&M IS Success Model and has

been used extensively in the measurement of IS success for over 20 years (DeLone &

McLean, 2003). The original DeLone and McLean (1992) taxonomy contained six

variables: system quality, information quality, use, user satisfaction, individual

impact, and organisational impact. The revised model (DeLone & McLean, 2003) consists of six interrelated


factors (system quality, information quality, service quality, intention to use/use, user

satisfaction, and net benefits) to measure the dependent variable of Information

System Effectiveness. The current study proposes that by incorporating aspects of

traditional L&D evaluation dimensions, an updated version of the D&M IS Success

Model can be applied to create an Organisational E-learning Success Model.

Given that the literature in educational settings is more extensive than that of

organisational e-learning, this research aligns with Lee-Post’s (2009, p. 62) assertion

that there is “a need to integrate and formulate a holistic and comprehensive

model for evaluating e-learning”. As such, the primary objective of this study is to

address this need and formulate a model of factors critical to e-learning success in

organisations. The major contribution of this study is the application and in-depth

investigation of the D&M IS Success Model in an organisational e-learning

environment. In order to ensure that evaluation models are relevant to an e-learning

context, it is necessary to assess the currently used elements of the

research framework to determine how they might apply, whether they are necessary, and whether new

relevant factors need to be incorporated. This thesis aims to fill this gap in the current

literature in this domain.

Definition of E-learning

E-learning has become a generic term to encompass a multitude of

instructional content or learning experiences enabled by electronic technologies, such

as the Internet, Web 2.0 applications, intranets and extranets (Bondarouk & Ruël,

2010). The relaxed terminology among researchers and practitioners—with e-

learning also being known as online learning, web-based training, and web-based


learning (Moore, Dickson-Deane, & Galyen, 2011)—has resulted in a multitude of

definitions. A review of literature for this thesis found that there is an inconsistent

use of terminology and authors tend to use distance learning, e-learning, and online

learning interchangeably. Due to these inconsistencies in existing studies, research

relating to a range of terminologies will be referred to and discussed throughout this

thesis. The more commonly cited definitions in the literature are presented in Table 1.


Table 1

Common definitions of e-learning

Bondarouk and Ruël: “any type of learning situation in which instructional context is delivered through the use of computer networked technology, primarily over an intranet, or through the Internet, where and when required” (Bondarouk & Ruël, 2010, p. 149)

Johnson, Hornik, and Salas: “training or educational initiatives which provide learning material in online repositories, where course interaction and communication and course delivery are technology mediated” (Johnson, Hornik, & Salas, 2008, p. 357)

Rosenberg: “the use of Internet technologies to deliver a broad array of solutions that enhance knowledge and performance” (Rosenberg, 2001, p. 28)

Sambrook: “any learning activity supported by information and communication technologies (ICT)” (Sambrook, 2003, p. 191)

Sun, Tsai, Finger, Chen, and Yeh: “the use of telecommunication technology to deliver information for education and training” (Sun, Tsai, Finger, Chen, & Yeh, 2008, p. 1183)

The American Society for Training and Development: “a wide set of applications and processes, such as Web-based learning, computer-based learning, virtual classrooms, and digital collaboration. It includes the delivery of content via Internet, intranet/extranet (LAN/WAN), audio- and videotape, satellite broadcast, interactive TV, and CD-ROM” (ASTD, 2010, np)

Welsh, Wanberg, Brown, and Simmering: “the use of computer network technology, primarily over an intranet or through the Internet, to deliver information and instruction to individuals” (Welsh, Wanberg, Brown, & Simmering, 2003, p. 246)

All the definitions presented recognise the transfer of information through a

technological medium; however, there is little consensus as to what form the

technology takes. Taking into account the previous definitions of e-learning offered,

the definition of e-learning for the purposes of this research is:


E-learning is any learning and development initiative which utilises computer

technology to facilitate learning when and where required.

Research Problem

Following a review and synthesis of the literature in the key areas of e-

learning, traditional evaluation, and IS evaluation, the overall purpose of the research

was to determine:

What are the critical elements for evaluating the success of e-learning initiatives

in an organisational setting?

From this broader research purpose, and the research framework that will be

presented in Chapter Two, three research questions were identified. The research

questions link directly to three key elements of the D&M IS Success Model which

have been incorporated into the research framework. The research questions are

outlined below:

1. How does system quality apply in the context of organisational e-

learning and what is the nature of this factor?

2. How does information quality apply in the context of organisational e-

learning and what is the nature of this factor?

3. How does service quality apply in the context of organisational e-

learning and what is the nature of this factor?

Overview of Methodology

This thesis has taken the form of applied research. As with basic research,

the purpose of this thesis is to contribute to knowledge and theory to explain the


phenomenon under investigation (Patton, 2002). Furthermore, it aims to go one step

further and “contribute knowledge that will help people to understand the nature of a

problem in order to intervene” thereby “allowing human beings to more effectively

control their environment” (Patton, 2002, p. 217). A core understanding of applied

research is that it is conducted to test applications of basic theory and disciplinary

knowledge to real-world problems and experiences (Patton, 2002). In order to

address this applied research approach, a qualitative research method has been used.

Specifically, a single case study was seen as the appropriate design to facilitate data

collection.

Data were primarily collected from multiple semi-structured interviews in the

case organisation. In addition, other sources such as organisational documentation

were used. A systematic approach to data analysis was taken in order to search for

meaning within the data (Hatch, 2002). The coding process for this research followed

the suggestion of Beekhuyzen et al. (2010): that organisational coding is a good

place to start building a classification scheme. Organisational categories were broad

topics that were established prior to interviews (Maxwell, 2005). In the case of this

research, the theoretical framework provided broad topics to begin coding, for

example ‘information quality’, ‘system quality’, and ‘service quality’. The results of

this analysis are presented in Chapter Four.

Thesis Structure

This thesis is divided into five chapters: Introduction, Literature Review,

Methodology, Findings, and Discussion and Conclusion. Following this introductory

chapter, the literature pertaining to this study on e-learning, workplace learning, adult


learning, traditional training evaluation, and IS evaluation is reviewed (Chapter

Two—Literature Review). First, an overview of e-learning, its benefits and

potential drawbacks, is given, followed by a review of adult learning as it applies to

e-learning. Traditional models of training evaluation and IS evaluation are then

examined as the focal factors in this study. The chapter concludes with the

presentation of a research framework which addresses the overall research purpose,

and a discussion about the framework elements of focus for this thesis. Research

questions are presented throughout this final discussion.

Chapter Three—Methodology addresses the research approach this study has

utilised to investigate the research questions proposed in Chapter Two—Literature

Review. Initially the qualitative research approach is outlined, followed by an

overview of the case study research design which includes data collection methods,

sampling, and piloting of the interviews. The data analysis techniques employed are

presented, followed by a discussion about the quality of the research methodology

used. Finally, the case organisation is introduced and the chapter concludes with an

overview of ethical considerations, boundaries, and limitations.

The qualitative findings are presented in Chapter Four—Findings. The

chapter begins with an overview of the context of the investigation at the case

organisation. This includes an overview of the participants who were interviewed

and their key attributes. The findings are then presented in relation to the research

framework and each of the research questions.

Chapter Five—Discussion and Conclusion reviews the findings in the

preceding chapter and provides a discussion of the findings in relation to the

literature presented in Chapter Two—Literature Review. The theoretical and

practical contributions of this research are discussed, along with the study’s research


limitations. Directions for future research are then suggested. Finally, this thesis

concludes with a summary of its content and key messages.

Chapter Summary

This first chapter of the thesis has outlined the background and justification

of the research. In particular it presented a broad overview of the key literature

relating to the potential critical elements to e-learning evaluation. The chapter then

offered the overall purpose of the research and the research questions, and outlined

the methodology and structure of the thesis. The next chapter explores the theoretical

basis for this research, and presents the research framework to guide the study along

with a more detailed discussion of the relevant literature and subsequent

development of research questions.


Chapter Two—Literature Review

Chapter Overview

The previous chapter provided an introduction to the research and an

overview of the thesis structure. This chapter provides a review of the literature in

the areas of workplace e-learning and evaluation. The chapter structure is presented

in Figure 1.

Figure 1: Chapter Two Structure (Workplace Learning Context; Workplace E-learning; Training Evaluation; Information Systems Evaluation; Multiple Perspectives of Evaluation; Research Framework)

Introduction

This investigation brings together three bodies of literature. The first main

body of literature relates to e-learning, the second relates to traditional L&D

evaluation, and the third relates to IS evaluation. A review of the literature reveals

that research in the e-learning domain has largely avoided the applicability of IS

research. Given this observation, it is reasonable to conclude that e-learning

implementation decisions and practice (such as design, use, delivery, and success

measures) could be overlooking useful or additional considerations. McFarlan (1987)


predicted over 20 years ago that researchers from different disciplines would need to

form partnerships in order to research IS technology in the context of organisations,

and this holds true today. In light of this, it is timely to explore the extent to which

broader IS theories apply to the specific context of e-learning. A number of other

literature areas which could provide insight into e-learning evaluation, such as adult

learning, were also considered. The key fields requiring consideration are shown in

Figure 2.

Figure 2: Literature review fields (Workplace Learning; Adult Learning; E-learning; IS Evaluation; Traditional L&D Evaluation; E-learning Evaluation)

The chapter begins with an overview of the workplace learning context,

followed by a review of workplace e-learning and its associated benefits and

potential drawbacks. Adult learning and its application to e-learning is then


discussed. Traditional training evaluation and IS evaluation are then examined as the

focal factors of this study. The chapter concludes with the presentation of a research

framework which addresses the overall research purpose, and a discussion about the

framework elements of focus for this thesis. Research questions are presented

throughout this final discussion.

Workplace Learning Context

Learning is vital on many levels: for the individual, the organisation, and

Australia’s economic performance as a whole (Department of Education Science and

Training, 2003). Learning helps individuals to develop new skills and keep pace with

new technologies and management practices. This in turn results in increased

productivity, stronger business potential, and higher profitability (Department of

Education Science and Training, 2003). With greater recognition of the importance

of continual learning, the broader work in the field of learning has moved into the

organisational arena, with a focus on learning within organisations for the purpose of

personal or professional development (Becker, 2007). Learning in the context of

organisations, often referred to as workplace learning, refers to learning or training

activities undertaken in the workplace, with the goal of enhancing individual and

organisational performance (Rosenberg, 2006). Workplace learning is different to a

single training course in that it moves from the traditional classroom-based learning

to considering learning as integral to an individual’s job, occurring within the

workplace as an ongoing process. Rylatt (2000, p. 5) recognises this distinction in his

definition which sees workplace learning as “a sustained and high leverage

development of people in line with organisational outcomes”. Salas and Cannon-

Bowers (2001) suggest that organisations are embracing workplace learning and


moving from the traditional view of stand-alone training to training as an integrated

strategic component of the organisation.

Furthermore, workplace learning is distinctly different to learning that takes

place in educational settings. Wang et al. (2010) suggest four distinctions between

these two environments:

1. Employees are adult learners who enter workplace settings with prior life

experiences, different educational backgrounds, and working history, and as

such have different learning needs and expectations, and distinctive learning

characteristics. They have distinct job responsibilities, which require different

types and levels of expertise.

2. The goal of formal learning in educational institutions is knowledge transfer,

as compared to learning in the workplace which serves organisational goals

and needs; it focuses on organisational systems, structures, policies, and

institutional forms of knowledge to link individual and organisational

learning.

3. Workplace learning is more contextual and dynamic than in educational

settings; knowledge in the workplace is disseminated within an organisation

and arises from employees’ daily activities and interaction with the working

environment.

4. Workplace learning can be described as a community of practice where social

networking happens which allows the creation and transfer of knowledge

among individuals and groups.


Workplace E-learning

As organisations become more competitive and the need for highly skilled

employees increases, there is a greater need for a responsive and innovative

workforce. A key factor in the ability of an organisation to create a climate for the

rapid acquisition of new knowledge and skills is the provision of planned and formal

training (Hayes & Allinson, 1997). One such form of planned training to have

emerged in recent years is e-learning. E-learning is a valuable training and

development solution, and has greatly impacted the way training initiatives are

delivered and how learning occurs for many organisations (Moore et al., 2011;

Waight & Stewart, 2005b).

E-learning had its beginnings in the educational arena. However, unlike

educational settings, e-learning in corporate environments is an under-researched

area. A proliferation of e-learning research based on formal courses in the

educational setting exists. However, it is recognised that “learning is a phenomenon

that is situated in a specific cultural context” (Tynjälä, 2008, p. 132), and therefore

learning in organisations is vastly different to that in educational arenas (Wang et al.,

2010). Several researchers have begun to explore e-learning in organisations with a

specific focus on the areas of: perceptions of, and reactions to, e-learning (Baldwin-

Evans, 2004); human resource development implications (Brown, Murphy, & Wade,

2006); strategies for workplace e-learning (Servage, 2005); pedagogical challenges

(Tynjälä & Häkkinen, 2005); e-learning development (Wang et al., 2010); valuing

the adult learning in e-learning settings (Waight & Stewart, 2005a, 2005b); and e-

learning use and job outcomes (Chen, 2010).

Although it is safe to assume that the general goal of e-learning in the organisational

arena is to enhance organisational and individual performance (Rosenberg, 2006), it


is unfortunate that (in practice) there is often a misalignment between learner needs

and corporate interests (Brink, Munro, & Osborne, 2002; Servage, 2005; Wang et al.,

2010). Wang et al. (2010) summarise this gap between corporate interests and learners’

needs:

“For individuals, although knowledge can be learned by participating in e-

learning programs, more often they do not think e-learning is helpful since

the knowledge learned cannot help improve their work performance. For

organisations, e-learning is generally designed without meeting the

organisational vision and mission” (Wang et al., 2010, p. 167).

To address this misalignment, McGraw (2001) suggests five strategies which

aid organisations to meet the needs of both the learner and the

business when implementing e-learning. These strategies are summarised by Ismail

(2001, p. 331):

1. A common language and vision to describe e-learning for the organisation and its linkages to business needs.

2. Governing principles and organisation-wide support policies.

3. Creation of content that makes learning compelling, engaging, and relevant to target audience needs.

4. Support for individual learner profiles, including job- or role-based competencies, interests, and long-term career goals.

5. A standards-driven technical architecture that can link to existing systems and be accessed efficiently.

In an ideal situation, learning activities in the workplace should aim to

address corporate interests, individual needs, work performance, and consider the


social context of learning which allows the creation and transfer of knowledge

(Wang et al., 2010). This incorporates the fundamental elements of a learning

environment: (1) the learner, (2) the learning content, (3) the social context, and (4)

other learning stakeholders (Illeris, 2003, 2004). These guiding principles should be

no different for e-learning applications. However, the benefits of e-learning

(particularly greater cost-effectiveness and flexibility) potentially result in

organisations implementing e-learning for the cost benefit of the organisation and not

in order to best meet learners’ needs.

An effective learning application, such as e-learning, should take into

consideration the four previously mentioned elements and their potential interactions

(Wang et al., 2010), as represented in Figure 3. Based on this holistic perspective, e-

learning is not just a standalone training course to facilitate transfer of knowledge but

rather a learning application that impacts on a number of stakeholders. If best

practice in development of an e-learning application is to consider all stakeholders,

then best practice evaluation should also consider these elements and they should be

reflected in evaluating its success.

Figure 3: Learning in the workplace (Wang et al., 2010)


Definition of e-learning

E-learning has become a generic term to encompass a multitude of

instructional content or learning experiences enabled by electronic technologies, such

as the Internet, Web 2.0 applications, intranets and extranets (Bondarouk & Ruël,

2010). The relaxed terminology among researchers and practitioners—with e-

learning also being known as online learning, web-based training, and web-based

learning (Moore et al., 2011)—has resulted in a multitude of definitions. Many other

authors have used the term e-learning and not provided a definition (for examples see

Baldwin-Evans, 2004; Chiu, Hsu, Sun, Lin, & Sun, 2005; Fisher, Wasserman, &

Orvis, 2010; Hernández, Gorjup, & Cascón, 2010; Hogarth & Dawson, 2008;

Hutchins & Hutchison, 2008; Jung, 2010; Lin, 2011; Lu & Chiou, 2010; Shivetts,

2011), working on the assumption that there is a general understanding regarding

what constitutes e-learning. When terms are used synonymously, showing that

there is no consensus in definition, researchers can face difficulties in

performing meaningful research, making cross-study comparisons, and building

on outcomes from previous studies (Moore et al., 2011).

Servage (2005, p. 305) has expressed concern with these variations in terms,

stating that there is an “utter lack of consistency” in terminology surrounding e-

learning. Although there are differences in terminology and some definitions are

broader than others, Servage’s concerns are undue as most definitions contain similar

elements. Moore et al. (2011) shared the concerns of Servage (2005) that there is a

lack of consistency in terminology. As a result, Moore et al. (2011) performed a

study to assess how researchers defined the learning environment and what they

identify as the differences between distance learning, e-learning, and online learning.

The study concluded that participants perceive a difference between the terms and


that different characteristics are attributed to each of the learning environments; in

short, the participants struggled to find consensus as to what term should be used in

what situation.

Moore et al.’s (2011) findings were mirrored in the review of literature for

this thesis where it was found that there is an inconsistent use of terminology and

authors tend to use distance learning, e-learning, and online learning interchangeably.

This is an important result to highlight, as although this research focuses solely on e-

learning, due to the previously noted inconsistencies, research relating to a range of

terminologies will be referred to and discussed. The more commonly cited

definitions in the literature were outlined in Chapter 1 and are revisited in Table 2.


Table 2
Common definitions of e-learning

Bondarouk and Ruël: "any type of learning situation in which instructional context is delivered through the use of computer networked technology, primarily over an intranet, or through the Internet, where and when required" (Bondarouk & Ruël, 2010, p. 149)

Johnson, Hornik, and Salas: "training or educational initiatives which provide learning material in online repositories, where course interaction and communication and course delivery are technology mediated" (Johnson, Hornik, & Salas, 2008, p. 357)

Rosenberg: "the use of Internet technologies to deliver a broad array of solutions that enhance knowledge and performance" (Rosenberg, 2001, p. 28)

Sambrook: "any learning activity supported by information and communication technologies (ICT)" (Sambrook, 2003, p. 191)

Sun, Tsai, Finger, Chen, and Yeh: "the use of telecommunication technology to deliver information for education and training" (Sun, Tsai, Finger, Chen, & Yeh, 2008, p. 1183)

The American Society for Training and Development: "a wide set of applications and processes, such as Web-based learning, computer-based learning, virtual classrooms, and digital collaboration. It includes the delivery of content via Internet, intranet/extranet (LAN/WAN), audio- and videotape, satellite broadcast, interactive TV, and CD-ROM" (ASTD, 2010, np)

Welsh, Wanberg, Brown, and Simmering: "the use of computer network technology, primarily over an intranet or through the Internet, to deliver information and instruction to individuals" (Welsh, Wanberg, Brown, & Simmering, 2003, p. 246)

In comparing and contrasting these definitions it is apparent that all the

definitions generally recognise that e-learning is the transfer of information through a

technological medium. There is little consensus, however, as to what form the technology takes. In one case, it is specified that interaction and communication


takes place; however, most state that information is only delivered, which assumes

that no interaction needs to take place.

Therefore, the definition of e-learning for the purposes of this research is:

E-learning is any learning and development initiative which utilises computer

technology to facilitate learning when and where required.

Regardless of whether researchers can reach a consensus on a common

definition and terminology it is important to know how e-learning is being used in

organisations and how the success or failure of e-learning implementation can be

evaluated effectively. Before this can be investigated it is important to understand the

context of e-learning in its entirety, including the benefits and drawbacks which will

now be discussed.

Benefits of e-learning

E-learning has been, and continues to be, attractive to organisations because it

has the ability to address barriers that may exist with traditional face-to-face training

methods. Organisations engage in e-learning for a number of reasons: to provide

consistent training across geographical boundaries, to reduce delivery cycle time, to

increase learner convenience, to reduce information overload, to improve tracking, to

lower expenses, and to save time (Chen, 2010; Welsh et al., 2003).

Specifically, benefits to the organisation include the ability to offer training to

a greater number of employees whilst achieving greater consistency in training

delivery, and improved tracking of course completion and testing (Hill & Wouters,

2010; Noe, 2005; Welsh et al., 2003). Furthermore, organisations are able to easily

update training materials and disseminate to employees in an efficient manner (Chen,


2010). Waight and Stewart (2005b) performed a case study to investigate the e-

learning context in four Fortune 500 organisations which actively use e-learning. All

four organisations cited increased access and the ability to reach a geographically

dispersed workforce as their primary reason for adopting e-learning. This was closely

followed by three organisations listing reduced costs as a driving influence for the

introduction of e-learning.

A further potential benefit of e-learning is its ability to be a cost-saving

measure, particularly in terms of reduced travel and accommodation costs, facility

costs, and time lost during traditional off-the-job training (Welsh et al., 2003).

Although the initial investment in e-learning can be high, once an e-learning course

is designed, implemented, and in use, the long-term costs of training can be greatly

reduced (Kathawala & Wilgen, 2004). A case study of New Zealand organisations

engaging in e-learning (Clayton, 2009) reported that e-learning is flexible enough to

suit the L&D needs of a range of different organisational situations, from training

independent contractors to internal employees, and from small companies to large

organisations.

The benefits to employees include greater flexibility in terms of where and

when they complete the training, the potential to segment their training for just-in-

time training needs, and access to a greater variety and number of courses (Hill &

Wouters, 2010; Noe, 2005; Welsh et al., 2003). A clear benefit for employees is that

they do not need to be in one place at the same time and therefore do not need to travel. This flexibility means e-learning is suitable for employees on different

schedules (for example shift workers). In more advanced e-learning courses, learners

have the ability to customise material to meet their needs (Lee, Yoon, & Lee, 2009).


In addition, e-learning can cater to different learning speeds and learning preferences,

for example, learners can repeat sections of content.

Overall, most authors seem to agree that the main benefit of e-learning for the

learners is increased accessibility: the ability to use the technology anytime and

anywhere which allows users to proceed at their own pace and engage in autonomous

work (Abrami et al., 2006; Lu & Chiou, 2010; Womble, 2008).

Drawbacks and limitations of e-learning

Although the benefits of e-learning are numerous, it would be unrealistic to

think that drawbacks and limitations do not exist. Learning and training are

inherently complex concepts and one criticism of e-learning applications is that they

are overly simplistic (Brown et al., 2006). Although e-learning use in organisations

continues to rise, many of the applications fail to motivate employees to learn (Wang

et al., 2010), perhaps due to their simplistic nature. For example, a study by Brown,

Murphy, and Wade (2006) of attitudes towards e-learning in organisations found the

primary barrier was the delivery environment: 28% reported that, due to motivational

issues and interruptions, e-learning is not as effective as traditional face-to-face

training. A further contributing factor is the lack of consideration of pedagogical and

organisational issues necessary for effective e-learning. The development of e-

learning tends to focus on the technical issues of design, rather than how it can meet

the organisation’s vision and mission, thus resulting in a program which individuals

perceive to be ineffective in improving work performance (Wang et al., 2010).

Other barriers perceived to be important by learners were the age of trainees

and their lack of IT skills, accessibility of training, a lack of engaging material, and

finally the lack of flexibility and interaction as compared to traditional training


(Brown et al., 2006). Some empirical evidence has led to other concerns about one of

the most fundamental characteristics of e-learning: the lack of face-to-face

interaction. E-learning can be isolating and an array of authors have explored the

impact of this characteristic. See Table 3 for a summary of these authors’ works:

Table 3
Summary of selected works on interaction and isolation in e-learning

Arbaugh (2000): Found support for two hypotheses about interaction: "perceived interaction difficulty will be negatively associated with student satisfaction with an Internet-based course"; and "perceived instructor emphasis on interaction will be positively associated with student satisfaction with an Internet-based course".

Beaudoin (2002): Assessed whether limited interaction online (low-visibility students) compromises learning in an online environment. Results suggest that fully engaged, highly participatory learners tend to perform strongly in graded assignments, but that minimal online participation does not compromise grades. The grades suggested that low-visibility students dedicate more time to reflection on and processing of course material, which translates into stronger assignments than those submitted by students participating at an average level.

Burnett, Bonnici, Miksa, & Kim (2007): Assessed which dimensions of interaction (frequency, intensity, and topicality) contribute to student satisfaction or dissatisfaction. Results indicated some support for the statement that the less frequent the interaction, the more likely it is that students will express dissatisfaction with the course.

Cobb (2009): Measured social presence (a predictor of interaction) and satisfaction in an online course. Results showed that students in online courses feel comfortable relating and interacting in the online environment, and are satisfied with online courses.

Derouin, Fritzsche, & Salas (2005): Performed an extensive literature review on the current state of e-learning. They presented lack of engagement as a challenge for e-learning development, and suggested collaboration and interaction as ways to engage learners.

Garrison, Anderson, & Archer (2000): Presented a model of community of inquiry comprising three elements essential to an educational transaction: cognitive presence, social presence, and teaching presence. The authors suggest that these elements will be affected when the medium of communication changes from traditional face-to-face to others such as computer-mediated communication.

Johnson, Gueutal, & Falbe (2009): Investigated factors which affect learning effectiveness. Results showed that trainee interaction is positively related to satisfaction and course performance.

Muilenburg & Berge (2005): Reported on a study that determined the underlying constructs that comprise student barriers to online learning. The single most important barrier to students learning online was a lack of social interaction.

Richardson & Swan (2003): Examined the relationship of social presence, perceived learning, and satisfaction with the instructor in an online college course. A positive relationship was found between social presence and perceived learning, and between social presence and perceived satisfaction with the instructor. Students' perceptions of social presence also served as a predictor of perceived learning.

Sun et al. (2008): Investigated the critical factors affecting learners' satisfaction in e-learning. They predicted that learner-perceived interaction with others would positively influence perceived learner satisfaction with e-learning. The results, however, were not significant and this prediction was not supported.

Although e-learning, like any form of L&D, has its drawbacks, a review of

the benefits suggests that e-learning will only increase in terms of the number of

organisations adopting e-learning solutions and the number of e-learning courses

offered. The Australian Flexible Learning Framework Benchmarking Report

(Australian Flexible Learning Framework, 2010, p. 1) supports this sentiment stating

that “e-learning is now an integral component of training for Australian businesses”


with results of the 2010 survey showing that the use of e-learning in organisations

continues to increase. Sixty per cent of employers surveyed said that they expect

their organisation’s use of e-learning to increase in the next two years (versus 49% in

2009). In light of this, understanding what factors are important in evaluating these

courses will become critical to ensuring positive outcomes from e-learning.

Adult Learning Applied to E-learning

Adult learning principles are critical considerations when analysing the

quality of information delivered in an organisational e-learning context. Waight and

Stewart (2005a, p. 341) suggest that “a strong foundation in learning theories is

highly desirable of an e-learning team” and as such andragogy (how adults learn)

should inform the design of e-learning courses. Although most adult learning

theories were developed prior to technology-enhanced learning, and the links to e-

learning are unclear, in many cases they can be inferred and will be presented

throughout this review. The implication of adult learning theory in the workplace—

and as an extension e-learning—is that learners are more motivated once learning

objectives have been set out to meet their needs.

Although a number of key authors have contributed to the field (Delahaye,

2005; Delahaye, Limerick, & Hearn, 1994; Merriam, 1987), Malcolm Knowles has

long been recognised as the early champion of the notion that adults may use

learning processes different to those of children (Delahaye, 2005). Knowles popularised the term andragogy over 30 years ago to label the assumptions of adult learning, and

since then it has emerged as a dominant framework for teaching adults (Holton,

Wilson, & Bates, 2009). Andragogy is broadly defined as the "art and science of helping adults learn" (Knowles, 1990, p. 54); Knowles, Holton and Swanson (2005, p. 60) have since posited that it is "an intentional and professionally guided activity that aims at change in an adult person". Over this period, the distinction between child learners and adult learners was reframed as one between learners of low and high maturity.

This distinction reflects the psychological perspective that we become adults when

“we arrive at a self-concept of being responsible for our own lives, of being self-

directing” (Knowles et al., 2005, p. 64). Table 4 presents a summary of the

assumptions of the andragogical model (core adult learning principles), and therefore

the different approaches appropriate to those of low or high learner maturity.


Table 4
Assumptions of the andragogical model and their implications for e-learning

Learner's Need to Know. Summary: adults want to know why they need to learn something before learning it. Implications for e-learning: orientation session, self-evaluation, record keeping to track progress.

Self-Concept of the Learner. Summary: the self-concept of adults is heavily dependent upon a move toward autonomous and self-directed learning. Implications for e-learning: computer conferences, self-directed learning, no competition; share in evaluation, mutual inquiry.

Prior Experience of the Learner. Summary: prior experiences of the learner provide a rich resource for learning. Implications for e-learning: group discussion, case method, projects, meaningful problems, context of everyday life, simulations, peer helping, debates, role playing.

Readiness to Learn. Summary: adults typically become ready to learn when they experience a need to cope with a life situation or perform a task. Implications for e-learning: models, counselling, tasks related to developmental stages.

Orientation to Learning. Summary: adults' orientation to learning is life-centred; education is a process of developing increased competency levels to achieve their full potential. Implications for e-learning: problem-solving exercises, threaded discussions, class calendar.

Motivation to Learn. Summary: motivation for adult learners is internal rather than external. Implications for e-learning: activities that promote development of positive self-concept, deal with time constraints, respectful climate, stimulating tasks, enthusiastic atmosphere.

Adapted from Knowles et al. (2005) and Colton and Hatcher (2004)


To summarise Table 4, an individual is redefined as mature when their self-

concept is one of being a self-directed learner rather than a dependent learner. A

mature individual has had opportunities to accumulate a wealth of experience that

becomes an increasingly rich resource for learning. Their readiness to learn becomes

oriented toward developmental tasks that provide them with desired knowledge. In

addition, their orientation towards learning shifts from one of subject-centredness to

one of performance and task centredness. And finally, mature learners are motivated

internally, rather than by external pressures (Knowles, 1980).

A key distinction between pedagogical and andragogical learning strategies is

the level of independence or self-direction. High maturity learners often desire self-

direction, in that they decide what will be learned, how it will be learned, and what

will be assessed, as compared to low maturity learners who often have these

decisions made for them (Delahaye, 2005). As e-learning has the potential to allow

learners to be independent and self-directed (Berge & Giles, 2008), it promises a way

to apply andragogical principles to learning interventions. The six assumptions (as

introduced in Table 4) can be reviewed in light of their application to e-learning.

Learner’s need to know: Mature learners want to know what learning will

occur, how it will be learned, why learning is important, and how it will be assessed.

For e-learning, a situation which often puts the learner in isolation without a

traditional face-to-face facilitator, this requires the development of alternative tools

to raise the awareness of the need to know.

Self-concept of the learner: Mature learners by nature have a self-concept of

being responsible for their own decisions. E-learning is an opportunity for adult

educators to provide learners with autonomy in their learning experience, allowing


the learner autonomy of time, place, and pace of e-learning (Abrami et al., 2006; Lu

& Chiou, 2010). E-learning can be a tool for independent, self-directed learning.

Prior experience of the learner: Mature learners come into an educational

experience with a greater range of experiences than that of immature learners. This

can have both positive and negative effects in that experience can serve as a basis for

new learning, but it can also mean that adults have pre-developed mental habits and

biases which can affect learning new ideas (Knowles et al., 2005). For e-learning

therefore, this means providing opportunities for individualisation of the training,

incorporating techniques that tap into the experience of the learner.

Readiness to learn: Mature learners will be ready and willing to learn when

they see a need for the knowledge in order to cope effectively with real life

situations. Therefore, if e-learning is to be successful it too will need to be seen by

the learner as important, and the learner will need to be convinced of the

appropriateness of the e-learning course for their stage of development.

Orientation to learning: Mature learners are life-centred in their approach to

learning in that they are motivated to learn when learning is task-centred or problem-

centred, and they see utility in the learning. New information is most effectively

received when presented in the context of real-life situations. Therefore in e-learning

it will be important that course content and theory are presented in a practice-oriented

context, incorporating real life situations or events to which learners can relate.

Motivation to learn: Mature learners are more motivated towards learning

that helps them solve problems or results in internal payoffs rather than external

payoffs such as promotions or salary increases. In the context of e-learning, adults

will be most motivated when they believe they can learn the new material, the

learning will help them with a problem, and it is important to them.


While much literature exists surrounding instructional methods for e-learning,

web-based learning, and online learning, there have been very few attempts to apply

these principles of adult learning to their instruction, and even fewer attempts to

evaluate programs against their alignment with adult learning principles (Colton &

Hatcher, 2004). Colton and Hatcher (2004) attempted to fill this gap with the

development of The Online Adult Learning Inventory. This study was exploratory in

nature, combining quantitative and qualitative methods in addition to a Delphi panel

research method, resulting in a content-valid instrument for evaluating online courses.

Information on instructional methods by each adult learning principle was also

collected (Table 4 introduced earlier presents a summary of how the principles might

be operationalised in an e-learning setting).

Delahaye and Smith (1998) consolidated work by previous authors on adult

learning and proposed ten learning principles (Delahaye & Smith, 1998, p. 12). These are practical principles which guide L&D initiatives.

These core principles also provide a sound foundation for planning an e-learning

course. They are outlined in Table 5, with a description of their implications for e-

learning.


Table 5
Mature learner learning principles and implications for e-learning

Whole or part learning. Summary: the information is presented as a complete whole, or in sequenced, reasonably sized parts; which is appropriate will differ depending on the situation and the information. Implication for e-learning: does the e-learning situation require the learner to gain information as a complete whole in order to learn, or as a set of successive parts to develop knowledge or skills? The learner's ability to use e-learning systems and their willingness to participate may also affect how the e-learning course is structured.

Spaced learning. Summary: learning should be spaced to allow information to be assimilated before more is presented; spacing can refer to spreading learning over a period of time, or to breaking up the learning activities. Implication for e-learning: spacing e-learning courses may provide time for learners to assimilate information and so maximise learning.

Active learning. Summary: the learner should be actively involved in the learning process. Implication for e-learning: learners will need to play an active role in e-learning experiences, and need to be given the opportunity to reflect on these experiences.

Feedback. Summary: both the learner and the facilitator should receive feedback; the learner needs feedback on their learning, and the facilitator receives feedback to confirm the learner's understanding. Implication for e-learning: e-learning is no different to other learning experiences, and feedback will need to be given in the e-learning environment.

Overlearning. Summary: practice beyond the level of perfect recall so that learners do not forget information; learning experiences need to encourage this practice. Implication for e-learning: e-learning courses should incorporate activities that allow the learner to practise (multiple times if they wish) to reinforce the knowledge learned.

Reinforcement. Summary: it may be appropriate to incorporate positive or negative reinforcement into the learning process; learners can either experience a positive outcome or remove a negative situation. Implication for e-learning: this principle can be incorporated into e-learning courses, particularly the ability to reinforce positive outcomes in learning activities.

Primacy and recency. Summary: the most important information should be presented first or last, as learners tend to recall information presented at these points better. Implication for e-learning: the most important information should be presented at either the beginning or the end of the course.

Meaningful material. Summary: material must be meaningful to the learner; it must initially be relevant to prior information or experiences, and then be considered important for the learner's future. Implication for e-learning: learners will need to be able to see the relevance of the e-learning to ensure sufficient motivation to learn.

Multiple sense learning. Summary: learners are best engaged when provided opportunities to engage different senses, for example hearing and visualising; the learning experience is better for the learner when it is more stimulating. Implication for e-learning: e-learning provides opportunities for learners to be actively engaged through different senses, for example videos, music, and/or reading.

Transfer of learning. Summary: learning should be structured to ensure transfer back to the workplace; a danger is that learners can perform new tasks or exhibit new knowledge in the training environment, yet back on the job a transfer of learning does not occur. Implication for e-learning: e-learning courses should aim to be as relevant as possible to the work environment, with activities set in the context of the work environment, to assist transfer back to the workplace.

In addition to these ten generic considerations, Delahaye and Smith (1998)

added a further five principles that are exclusive to mature learners:

Learner responsibility: Mature learners are self-directed and take

responsibility for their own learning. For e-learning, this means that

learners will need to be given the opportunity to self-direct their

learning.

Learning for life applications: Mature learners see learning as a

lifelong pursuit rather than a one-off activity. E-learning courses

provide an opportunity for learners to engage in continuing education.


Learning by reflection on experience: Mature learners learn by

reflecting on previous experiences and the resulting outcomes in order

to determine the most appropriate ways of behaving in the future.

Learners need opportunities while completing e-learning courses to

reflect on past experiences and identify knowledge they have

previously gained.

Support and respect for fellow learners: The social environment is

important to mature learners. Importance needs to be placed on the

shared experience of learning and gaining the respect of fellow

learners. It is important to consider this social aspect to ensure a

supportive environment in an e-learning context.

Learning by experimenting: Mature learners need the opportunity to

put into practice their learning and experiment with new ways and

ideas. E-learning provides an effective way for learners to experiment

and try things multiple times until they are satisfied.

Training Evaluation

Organisations make significant investments in many forms of training, and

the need to evaluate training initiatives is acknowledged by practitioners and

academics alike. However, the most significant problem is that there is no clear

definition of e-learning effectiveness (Hodges, 2009). Few organisations

comprehensively evaluate their training programs in a manner meaningful to their

business (Kraiger, 2002; Nickols, 2005; Twitchell et al., 2000). A number of reasons

have been suggested as to why organisations fail to conduct training evaluation; one

of the most common reasons being that it is not required by the organisation


(Kraiger, 2002; Twitchell et al., 2000). In addition, management is often not

interested in evaluation data—evaluation is not considered important or a priority

(Kraiger, 2002). Furthermore, it has been suggested that organisations are frustrated

with the state of evaluation research and guidance on how to best execute evaluation

programs (Kraiger, 2002). Moller et al. (2008) report that program evaluation is

rarely planned and that, when it is, companies do not know what to measure or how to use the information. As such, this study aims to investigate how evaluation models

apply in the context of e-learning, and furthermore provide some guidance on this

topic to practitioners and academics alike.

The choice of evaluation criteria is a critical decision when evaluating the

effectiveness of L&D (Arthur, Bennett, Edens, & Bell, 2003). The integration of new

technologies into the learning process presents new complications to the already

challenging nature of evaluation (Galloway, 2005). Although the obvious and

currently accepted evaluation methods for e-learning are traditional L&D models, due

to the technical nature of e-learning it is important to broaden the scope to

encompass IS evaluation as well. Furthermore, a review by Twitchell et al. (2000)

found that evaluation methods have largely remained static in the last 40 years.

Therefore the following will discuss both L&D evaluation—in particular

Kirkpatrick’s model (1976)—and IS evaluation—in particular DeLone and

McLean’s D&M IS Success Model (DeLone & McLean, 1992, 2003).

Traditional learning and development evaluation

Currently there is no single theory that exists which has been shown to

predict e-learning effectiveness (Hill & Wouters, 2010). However, general L&D

evaluation models have been applied to various studies.


Research suggests that evaluation can provide information about the

efficiency and effectiveness of training programs (Kraiger, 2002; Nickols, 2005;

Salas & Cannon-Bowers, 2001). Training evaluation from a traditional training and

development perspective refers to “a system for measuring whether trainees have

achieved learning outcomes” (Kraiger, Ford, & Salas, 1993, p. 312). This usually

includes “the systematic collection of descriptive and judgemental information

necessary to make effective training decisions related to the selection, adoption,

value, and modification of various training activities” (Goldstein & Ford, 2002). This

information can be used to address two primary issues: (1) whether the training

objectives are achieved, and (2) whether accomplishing these objectives results in

enhanced performance (Goldstein, 2002).

Measures of training effectiveness seek to “explicate why training did or did

not achieve its intended outcomes” (Kraiger et al., 1993, p. 312). Effectiveness is

assessed by identifying and measuring various factors relating to training outcomes

and the transfer of training (Tannenbaum, Mathieu, Salas, & Cannon-Bowers, 1991).

Generally, the issues of effectiveness tend to be broader than those of evaluation

(Kraiger et al., 1993). As can be seen in this discussion, the terms evaluation,

efficiency and effectiveness tend to be used interchangeably. However, for

consistency this research adopts the term evaluation.

Kirkpatrick’s (1976) model of evaluation is widely recognised in both

practitioner and academic literature as a means of assessing whether training has

been successful at an individual and organisational level, as well as for creating an

evaluation strategy (Goldstein & Ford, 2002; Kraiger, 2002; Nickols, 2005; Salas &

Cannon-Bowers, 2001; Sutton & Stephenson, 2005). Kirkpatrick’s model has

remained popular for over 30 years, mostly due to its potential to simplify training


evaluation which is an otherwise complex process (Bates, 2004). A number of

authors have stated that Kirkpatrick’s classic model is also applicable in an e-

learning context (Galloway, 2005; Kramer, 2007; Moller et al., 2008; Ruiz, Mintzer,

& Leipzig, 2006). The following sections outline the four levels of the model.

Level one—reaction

Reaction is referred to as the degree to which participants react favourably to

the training (Kirkpatrick, 1976). In general, Reaction (the first level of Kirkpatrick’s

model) is measured as the trainee’s overall assessment of the L&D course following

its delivery, or how satisfied they were with the course (Brown, 2005). This feedback

is an instant reaction of the learner, often in the form of a survey following the

training, and is the most commonly used of the four evaluation levels (Rylatt, 2000).

This data provides information as to whether the participants found the program

valuable. In an e-learning context, this data collection method does not change and

can actually be easier as surveys can be administered automatically at the end of an

e-learning session.
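As an illustration of how straightforward this automation can be, the following is a minimal sketch of capturing a level-one reaction survey at course completion. The field names, the five-point scale, and the completion hook are hypothetical assumptions for illustration, not tied to any particular learning management system.

```python
# Minimal sketch of an automatically administered level-one (Reaction) survey.
# Field names, the 5-point scale, and the completion hook are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ReactionResponse:
    learner_id: str
    course_id: str
    satisfaction: int   # 1 (very dissatisfied) to 5 (very satisfied)
    relevance: int      # 1 to 5: perceived relevance to the job
    comments: str
    submitted_at: datetime

def on_course_complete(learner_id: str, course_id: str) -> ReactionResponse:
    """Hypothetical hook an LMS could call when a learner finishes a course."""
    # In practice the ratings would come from a survey form presented at
    # course completion; fixed values stand in for them here.
    return ReactionResponse(
        learner_id=learner_id,
        course_id=course_id,
        satisfaction=4,
        relevance=5,
        comments="Clear content; more workplace examples would help.",
        submitted_at=datetime.now(timezone.utc),
    )

print(on_course_complete("emp-042", "induction-101"))
```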

Although common, evaluation methods which only use the Reaction level of

the model are often criticised for a number of reasons. Satisfaction with training does

not necessarily equate to learning or a return on investment (ROI) to the

organisation. Kirkpatrick’s (1976) original model did not imply a link between the

levels of evaluation; however, a recent study by Cobb (2009) has shown that there is

a strong relationship between a trainee’s initial reaction and changes in on-the-job

behaviour (equated with the third level of Kirkpatrick’s model). Satisfaction is one of

the major factors used to evaluate e-learning courses, along with cost efficiency and

learning resources (Waight & Stewart, 2005a). A further limitation of level one


responses is that external factors may influence negative feedback from participants,

for example a lack of interest in the topic, external distractions, or resentment for

time taken away from their job for training (Galloway, 2005). Sitzmann, Brown,

Casper, Ely and Zimmerman (2008) confirmed this in a meta-analysis of trainee

reactions; their results suggested that trainee reactions mainly capture aspects of the

training environment rather than reflecting a true measure of the training content or

outcomes.

At an organisational level, a criticism of Reaction-level data is that it is not

perceived as valuable by organisation decision-makers (Chapman, 2004; Nickols,

2005; Sutton & Stephenson, 2005). Reaction-level data is appropriate for direct

feedback to trainers, but does not demonstrate training achievements nor value to

management (Kraiger, 2002).

Level two—learning

The second level—Learning—refers to the degree to which participants

acquire the intended knowledge, skills, attitudes, confidence, and commitment based

on their participation in a training event (Kirkpatrick, 1976). Learning can be

evaluated via observation at a learning event and hard data acquired by measuring

completion of learner tasks (Rylatt, 2000).

A number of steps can be taken to accurately measure Learning. These

include: conducting pre-tests and post-tests to assess knowledge, skills, and abilities

before and after training; conducting performance assessments; comparing results

with a control group who did not undertake training; or comparing results to previous

learning initiatives (Rylatt, 2000). As with level one (Reaction), if measured by self-

report data, the limitation of this level of evaluation is that the responses are


subjective—responses may be reflective of how the participant felt about the training

rather than what they actually learnt.
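To make the pre-test/post-test logic concrete, the following is a minimal sketch, with entirely hypothetical scores, group labels, and function names, of comparing the mean gain of a trained group against an untrained control group.

```python
# Minimal sketch of a level-two (Learning) evaluation: compare pre/post-test
# gains for trainees against a control group that did not take the course.
# All data and names here are hypothetical illustrations.
from statistics import mean

def mean_gain(pre_scores, post_scores):
    """Average improvement from pre-test to post-test."""
    return mean(post - pre for pre, post in zip(pre_scores, post_scores))

# Hypothetical test scores (percentage correct) for two groups.
trainees_pre, trainees_post = [55, 60, 48, 70], [78, 82, 75, 88]
control_pre, control_post = [57, 62, 50, 68], [58, 61, 55, 70]

trained_gain = mean_gain(trainees_pre, trainees_post)
control_gain = mean_gain(control_pre, control_post)

# A gain well above the control group's suggests learning attributable
# to the course rather than to retesting or external factors.
print(f"Trainee gain: {trained_gain:.1f}, control gain: {control_gain:.1f}")
print(f"Net learning effect: {trained_gain - control_gain:.1f} points")
```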

Level three—behaviour

Behaviour is the degree to which participants apply what they learned during

training when they are back on the job. Phillips (1996) suggests that level three

evaluations serve the following functions: (1) determining success in accomplishing

organisational goals, (2) identifying strengths and weaknesses in the training and

development process, (3) identifying which participants were the most successful,

and (4) providing an opportunity to reinforce major points to the participants.

Although changed workplace behaviour is one of the main goals of any training program, behaviour, like results and ROI, is among the hardest outcomes to evaluate, and little literature exists to provide guidance on evaluating these stages. Galloway (2005) suggests that level three has the potential to be more relevant now than when the model was first developed, particularly in the context of new

technologies such as e-learning. Computer-based, on-the-job performance testing

provides a way to address the difficulties traditionally associated with evaluation at

this level. With computer-based performance testing it is possible to measure

whether assignments were completed correctly and apply time frames for testing. An

example of the metrics which can be used for evaluating computer-based tasks are:

Process (were the correct tasks performed?), Sequence (were the tasks performed in

the correct order?), Results (were the correct results obtained?), and Time (were the

results obtained within time constraints?) (Galloway, 2005). Galloway (2005, p. 24)

suggests that with the aid of computer-based testing, on-the-job application of

learning attained can truly be tested in a way that was previously impossible; the test


provider can “see whether the applicant performed the desired steps, trace the order

of the steps, and view the finished product and/or result. Time constraints can also be

monitored”.
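These four metrics lend themselves to automated checking. The sketch below is one illustrative reading of them, not Galloway's own implementation: the task names, log format, and time limit are hypothetical. It scores a single logged attempt on Process, Sequence, Results, and Time.

```python
# Illustrative level-three (Behaviour) check against Galloway's (2005) four
# metrics for computer-based performance testing. The log format, task names,
# and threshold are hypothetical.
EXPECTED_TASKS = ["open_record", "verify_details", "apply_update", "save_record"]
TIME_LIMIT_SECONDS = 300

def evaluate_attempt(performed_tasks, result_correct, elapsed_seconds):
    """Score one logged attempt on Process, Sequence, Results, and Time."""
    return {
        # Process: were the correct tasks performed (regardless of order)?
        "process": set(performed_tasks) == set(EXPECTED_TASKS),
        # Sequence: were the tasks performed in the correct order?
        "sequence": performed_tasks == EXPECTED_TASKS,
        # Results: was the correct outcome obtained?
        "results": result_correct,
        # Time: were the results obtained within the time constraint?
        "time": elapsed_seconds <= TIME_LIMIT_SECONDS,
    }

# Hypothetical log: right tasks, wrong order, correct result, within time.
attempt = evaluate_attempt(
    ["open_record", "apply_update", "verify_details", "save_record"],
    result_correct=True,
    elapsed_seconds=240,
)
print(attempt)  # {'process': True, 'sequence': False, 'results': True, 'time': True}
```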

Level four—results

The final level—Results—assesses to what degree targeted organisational

outcomes occur as a result of the training event and subsequent reinforcement

(Kirkpatrick, 1976). Whilst Reaction, Learning, and Behaviour address the impact of

training at an individual level, Results assesses the impact of training on factors of

organisational performance such as efficiency and cost, production rates, quality,

frequency of accidents, and sales. Many managers and academics believe this is the

most important level to evaluate (Rylatt, 2000) because the resulting data can be used

to justify or improve training and development efforts (Galloway, 2005). However, it

is the most difficult level to obtain data about due to issues of causality. It is hard to

isolate effects on organisational performance and definitively state that positive

results at levels one, two, and three directly affect organisational performance issues,

and that they were not the result of another contextual factor.

As with any novel business initiative, the relative newness of e-learning opens it to potential criticism from higher management regarding its cost effectiveness, necessity, and impact on organisational performance (Galloway,

2005). Thus, the need for a holistic evaluation model that provides evidence of

learning and individual improvement as well as ROI is critical. The limitations of

Kirkpatrick’s (1976) model are well documented (see Table 6) and further

developments have been made in an attempt to build a more comprehensive model of

training evaluation. For example, Phillips (1996) suggested a fifth level to this model:


ROI. This level focuses on the net benefit of training in monetary terms. Phillips

(1996) argued that this extra level results in objective data, which can be used by

evaluators to make important decisions about continued funding, and to some degree adds a level of credibility to the training program (Galloway, 2005). It is, however, possible to interpret the Results level as including ROI.
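Phillips' fifth level is conventionally expressed as a percentage of programme costs. As a worked illustration, the formulation below follows the ROI calculation generally attributed to Phillips (1996); the dollar figures are hypothetical.

\[
\mathrm{ROI}\,(\%) \;=\; \frac{\text{net programme benefits}}{\text{programme costs}} \times 100 \;=\; \frac{\text{benefits} - \text{costs}}{\text{costs}} \times 100
\]

For example, a course costing $40,000 that yields $100,000 in measured benefits gives an ROI of (100,000 - 40,000) / 40,000 x 100 = 150%.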


Table 6
Limitations of Kirkpatrick's (1976) model of evaluation

Incompleteness of the model: The model presents an oversimplified view of training effectiveness that does not consider individual or contextual influences in the evaluation of training. A wide range of organisational, individual, and training design and delivery factors can influence training effectiveness before, during, or after training. Characteristics of the organisation and work environment, and characteristics of the individual trainee, are also crucial input factors. (Bates, 2004; Holton, 1996)

The assumption of causality: There is an assumption that the four levels of criteria represent a causal chain such that positive reactions lead to greater learning, which produces greater transfer and subsequently more positive organisational results. Research, however, has largely failed to confirm such causal linkages. (Alliger & Janak, 1989; Bates, 2004; Noe & Winkler, 2009)

Levels are positively intercorrelated: Linked to the assumption of causality, a set of essentially positive interrelationships is thought to exist among levels of training evaluation. (Alliger & Janak, 1989; Kraiger, 2002)

Incremental importance of information: There is an assumption that each level of evaluation in the model provides data that is more informative than the last. This assumption has incorrectly generated the perception that establishing level four results will provide the most useful information about training program effectiveness. (Alliger & Janak, 1989; Bates, 2004; Newstrom, 1978; Noe & Winkler, 2009)

No purpose of evaluation: The model has no flexibility to relate the outcomes used for evaluation to the training needs, the program learning objectives, and strategic reasons for training. The model focuses on outcomes and process, and pays little attention to inputs. (Delahaye & Smith, 1998; Kraiger, 2002; Noe & Winkler, 2009)

Collection of outcomes in an orderly manner: The model implies that data should be collected in an orderly manner: level 1, followed by level 2, followed by level 3, and so on, whereas realistically outcomes should be measured when appropriate. (Kraiger, 2002; Noe & Winkler, 2009)


Holton (1996) strongly argues that Kirkpatrick’s (1976) four-level evaluation

model is really only a taxonomy of outcomes, and is fundamentally flawed as an

evaluation model. Although Holton recognises the contribution the model has made

to the field of training evaluation, he argues that it does not take into account the

many contextual factors that may affect the transfer of learning. In response, Holton

(1996) developed the Transfer of Training Model which focuses on individual

performance (see Figure 4).

Figure 4: Transfer of training model (Holton, 1996)

The primary outcomes of training identified by Holton (1996) are learning,

individual performance, and organisational results. The model assumes that there are

three primary factors that affect the transfer of training: trainee reactions, motivation

to learn, and ability. Although Kirkpatrick’s (1976) model sees trainee reaction as a

primary outcome of training, Holton (1996) views it as an intervening variable that

has an impact on training. Further differences between the models are that individual

performance is used instead of behaviour, and that primary and secondary influences

on outcomes are included.


Similarly, to address the criticisms of Kirkpatrick’s (1976) model, and to

develop a more comprehensive model of evaluation criteria, Kraiger, Ford and Salas

(1993) took a different approach and developed a classification scheme for

evaluating learning outcomes based on existing learning constructs. Each of these

outcomes is discussed briefly below.

Cognitive outcomes

Cognition is concerned with variables relating to the quantity and type of

knowledge, and the relationships between these knowledge elements (Kraiger et al.,

1993). Cognitive outcomes can be broken into three measures which are useful for

evaluating training: verbal knowledge, knowledge organisation, and cognitive

strategies. These measures are used to assess how familiar trainees are with

principles, facts, techniques, and procedures or processes presented in the training.

Overall, cognitive outcomes measure what was learnt in training, not how trainees will use this knowledge back on the job, and as such they can be related to level two

(Learning) of Kirkpatrick’s (1976) model.

Skill-based outcomes

Skill-based outcomes are used to assess the development of technical or

motor skills (Kraiger et al., 1993). Typically, goal orientation and linking of

behaviours in an organised manner are characteristics of skill development. This

development occurs in three stages: (1) initial acquisition of skills, (2) skill

compilation, and (3) skill automaticity. The process of acquiring skills and the use of

skills on the job relates to Kirkpatrick’s (1976) level two (Learning) and level three

(Behaviour) criteria.


Affective outcomes

Affective outcomes are based on Gagne’s (1984) reasoning that attitudes can

determine behaviour and performance, and as such should be included as a learning

outcome. Kraiger et al. (1993) built on this premise to include motivation and affective

outcomes. Although level one (Reaction) of Kirkpatrick’s (1976) model can be an

affective measure as it looks at a trainee’s perception of the training, Kirkpatrick did

not include affectively-based measures as indicators of learning.

Table 7 shows a synthesis of these outcomes, how they are measured, and

their relationship to Kirkpatrick’s four-level framework and Phillips’ fifth level.


Table 7
Outcomes used in training evaluation (Noe & Winkler, 2009, p. 201)

Cognitive outcomes. Examples: safety rules, electrical principles, steps in an appraisal interview. How measured: written tests, work samples. What is measured: acquisition of knowledge. Relationship to framework: Level 2.

Skill-based outcomes. Examples: jigsaw use, listening skills, coaching skills, aeroplane landings. How measured: observation, work samples, ratings. What is measured: behaviour, skills. Relationship to framework: Levels 2 and 3.

Affective outcomes. Examples: satisfaction with training, beliefs regarding other cultures. How measured: interviews, focus groups, attitude surveys. What is measured: motivation, reaction to program, attitudes. Relationship to framework: Levels 1, 2, and 3.

Results. Examples: absenteeism, accidents, patents. How measured: observation, data from information systems or performance records. What is measured: organisation payoff. Relationship to framework: Level 4.

Return on investment. Example: dollars. How measured: identification and comparison of costs and benefits of the program. What is measured: economic value of training. Relationship to framework: Level 5 (Phillips).


However, these models were originally developed to evaluate traditional

classroom training programs, and questions have been raised as to whether they are

applicable as training evolves to include technology-based methods such as e-

learning (Galloway, 2005). The need for an evaluation model that provides evidence

of learning as well as financial results is well documented, yet due to the complex

nature of this type of evaluation a model has not eventuated. As summarised by

Galloway (2005, p. 24), the newness of e-learning (like any novel business initiative)

“opens itself to potential scepticism with regard to effectiveness, necessity, increased

asset value, cost effectiveness, and increased production”. As traditional approaches

of training evaluation such as Kirkpatrick’s (1976) four-level taxonomy continue to

influence common practice in organisations, it seems timely to focus on the need for

specific e-learning evaluation models which incorporate these traditional approaches

with the appropriate aspects of IS evaluation. Although much has been suggested

since Kirkpatrick’s model was originally developed, this research suggests that the

basics of the original model are still relevant and as such are incorporated into this

study. However, the criticisms and shortcomings that have been discussed are also

considered.

Information systems evaluation

Although e-learning is not technically an IS, e-learning is facilitated by the

use of specialised IS. IS encompass “the Information Technology, infrastructure,

systems, procedures and human resources that are required to collect, store, manage

and communicate information that supports and enhances the operations of an

organisation. IS include Enterprise Resource Planning systems, Electronic

Commerce, Enterprise Information Technologies, Computer-based Information


Systems and hardware infrastructure” (Jamieson, 2007). Research in IS and

information technology (IT) fields has considered technology adoption and factors

affecting IS success.

A widely accepted model of IS success is that of DeLone and McLean

(1992), which has become known as the D&M IS Success Model and has been used

extensively in the measurement of IS success for over 20 years (DeLone & McLean,

2003). As previously noted, the initial DeLone and McLean (1992) taxonomy contained six success categories: system quality, information quality, use, user satisfaction, individual impact, and organisational impact. These six dimensions were the result of a review of the literature at that time. The revised model (2003) consists of six interrelated factors

(system quality, information quality, service quality, intention to use/use, user

satisfaction, and net benefits) to measure the dependent variable Information System

Effectiveness. The updated D&M IS Success Model can be seen in Figure 5,

followed by an explanation of each of the dimensions.

Figure 5: DeLone and McLean’s (2004) updated D&M IS Success Model


System quality

System quality measures technical success—the desired characteristics of the

system itself—which produces the information (DeLone & McLean, 1992, 2003;

Nielsen, 2005). A number of studies (Etezadi-Amoli & Farhoomand, 1996; Goodhue

& Thompson, 1995; Teo & Wong, 1998; Wixom & Watson, 2001) typically measure

system quality in terms of “ease-of-use, functionality, reliability, flexibility, data

quality, portability, integration, and importance” (DeLone & McLean, 2003, p. 13).

The quality of the system has a direct influence on individual impacts (measured as

quality of work environment and job performance) (DeLone & McLean, 2003).

Information quality

Information quality is the measurement of output from the IS. It stresses

characteristics of the information and the way it is presented according to the needs

of the users (Nielsen, 2005). Information quality was defined as quality of the

content, accuracy, precision, currency, reliability, timeliness, completeness,

relevance, and format required as perceived by the end user (DeLone & McLean,

2003; Negash, Ryan, & Igbaria, 2003; Nielsen, 2005).

Service quality

The service quality dimension was added to the updated model to ensure the

effectiveness focus was not only on the product itself but the services function as

well (DeLone & McLean, 2004). Service quality refers to the level of service

received by IS users and the way in which the service is provided by the IS

department or providers/maintainers of the system (DeLone & McLean, 2003; Pitt,

Watson, & Kavan, 1995). This is principally measured as user satisfaction with the


service provided (DeLone & McLean, 2003; Pitt et al., 1995). Service quality is

viewed as the difference between the expected service from the IS department and

the perceived service received by the end-user (Parasuraman, Zeithaml, & Berry,

1988).
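This gap view lends itself to a simple numerical illustration. The sketch below is purely hypothetical (Python, with invented ratings on a seven-point scale; the attribute names borrow from the service quality elements discussed later in this chapter) and computes per-attribute gap scores and an overall gap in the SERVQUAL style:

    # Illustrative SERVQUAL-style gap calculation with hypothetical ratings
    # on a 1-7 scale; these values are invented for demonstration only.
    # Gap for an attribute = perception (P) minus expectation (E); a negative
    # gap means the service fell short of what users expected.
    expectations = {"responsiveness": 6.0, "assurance": 5.5, "empathy": 5.0}
    perceptions = {"responsiveness": 5.0, "assurance": 5.8, "empathy": 4.2}

    gaps = {attr: round(perceptions[attr] - expectations[attr], 2)
            for attr in expectations}
    overall_gap = round(sum(gaps.values()) / len(gaps), 2)

    print(gaps)         # {'responsiveness': -1.0, 'assurance': 0.3, 'empathy': -0.8}
    print(overall_gap)  # -0.5: overall, the service is perceived below expectations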

Use, intention to use and user satisfaction

System use focuses on the utilisation and interaction of the IS by individuals,

groups, or organisations (Nielsen, 2005; Straub, Limayem, & Karahanna-Evaristo,

1995). DeLone and McLean (2003) suggest an alternative measure of system use is

intention to use, which is appropriate depending on the research context. Use can

often be difficult to interpret as it includes a multitude of dimensions such as

mandatory or voluntary, informed or uninformed, and effective or ineffective. In

some instances measuring intention to use (which is an attitude) may be a worthwhile

alternative as it relates to behaviour (DeLone & McLean, 2003).

User satisfaction refers to “the user’s response to the use of the output of an

enterprise information system, the psychological state of the user after using the

enterprise information system” (Santa, 2009, p. 39). It is also defined as the extent to

which the user of the IS perceives an improvement in job performance. User

satisfaction is closely tied to user involvement, particularly during the phases of the

analysis, design, and implementation of an IS. In addition, Baroudi, Olson and Ives

(1986) argued that user involvement in IS development is considered a very

important way of obtaining a system of an acceptable quality and also a way to

guarantee successful implementation of the enterprise IS. Thus, user involvement can

be employed as a dimension to measure system effectiveness. Many researchers

(Berthon, Pitt, Ewing, & Carr, 2002; Bokhari, 2005; Chen, Soliman, Mao, & Frolick,


2000; Doll & Torkzadeh, 1988; Downing, 1999; Mahmood, Burn, Gemoets, &

Jacquez, 2000; Norman, Ngai, & Cheng, 2002; Somers, Nelson, & Karimi, 2003;

Zviran & Pliskin, 2005) have widely reviewed research in end-user satisfaction and

concluded that user satisfaction is one of the most commonly used measures to assess

the effectiveness or success of an IS within an organisation.

Net benefits

Net benefits was an improvement on the original model (DeLone & McLean, 1992): the separate individual impacts and organisational impacts of the original model were collapsed into one descriptor of the final success variable.

Individual impact. Individual impact refers to the influence that information

from the IS has on the attitude of the user in regards to the user’s job (Santa, 2009). It

includes the personal improvements and also the overall consequences on the

performance of the department or business unit in relation to what effect the

information from the IS has on management decisions. This impact occurs when the

information is received and interpreted by the users and applied to their jobs

(DeLone & McLean, 1992; Nielsen, 2005).

Organisational impact. Organisational impact draws from research that

investigated the influence of implemented IS on organisational performance (DeLone

& McLean, 2003; Nielsen, 2005). According to Saarinen (1996) organisational

impact relates to the benefits of the investment in technological innovation.

DeLone and McLean (2003, 2004) have shown the adaptability of the D&M

IS Success Model by applying it to the context of e-commerce success. Measures for

each of the factors were adjusted to accurately capture the e-commerce context. For

example, measures of information quality were completeness, ease of understanding,


personalisation, relevance, and security. Elements of service quality unique to an e-

commerce setting were assurance, empathy, and responsiveness. Petter, DeLone and

McLean (2008) state that whilst recent research provides strong support for the

D&M IS Success Model, more research is needed—particularly empirical research—

to establish the strength of interrelationships across different contextual boundaries.

E-learning is one such context which lends itself to the application of the D&M IS

Success Model.

Although the D&M IS Success Model has been applied in many different

domains it has received little attention in the area of e-learning (Klobas & McGill,

2010; Petter et al., 2008). In recent years, researchers have begun to link the two and

a limited number of studies have resulted. Holsapple and Lee-Post (2006) interpreted

the dimensions of the D&M IS Success Model in the context of educational e-

learning and developed an E-learning Success Model. Metrics were also included for

each of the model’s six dimensions (see Table 8). For example, system quality

measures the characteristics of ease of use, user-friendliness, stability, security, speed,

and responsiveness. Holsapple and Lee-Post validated the model with an action

research methodology, which resulted in a slight change to the model in which user

satisfaction was moved from being a ‘use’ dimension to a factor of ‘system

outcomes’. Lee-Post’s (2009) application of the model in educational settings has

found the model to be valid; however, the author calls for further research to explore

the applicability of the success model in other areas of e-learning besides the higher

education setting.


Table 8

Educational success metrics

DeLone & McLean dimension: e-learning success metrics

System Quality: easy-to-use; user-friendly; stable; secure; fast; responsive.

Information Quality: well organised; effectively presented; of the right length; clearly written; useful; up-to-date.

Service Quality: prompt; responsive; fair; knowledgeable; available.

Use: PowerPoint slides; audio; script; discussion board; case studies; practice problems; Excel tutorials; assignments; practice exam.

User Satisfaction: overall satisfaction; enjoyable experience; overall success; recommend to others.

Net Benefits (positive aspects): enhanced learning; empowered; time savings; academic success.

Net Benefits (negative aspects): lack of contact; isolation; quality concerns; technology dependence.


Very recently, Alsabawy, Cater-Steel, and Soar (2011, 2012) presented their research-in-progress on 'Measuring E-learning System Success' at the Pacific Asia Conference on Information Systems. Building on the work of Holsapple and Lee-Post (2006), Lee-Post (2009), and Wang, Wang, and Shee (2007), Alsabawy et al. (2011) propose an evaluation methodology model to assess e-learning system success. Their work adds further support for the use of the D&M IS Success Model in an e-learning context, arguing that "the DeLone and McLean model is believed to be one of the most important measurements which can be used to address this issue in the e-learning field" (Alsabawy et al., 2011, p. 5); however, their research is limited to the education sector, took a quantitative approach, and only proposed particular relationships in the model. Nonetheless, the research of Alsabawy et al. (2011, 2012) strengthens the argument that a holistic view which incorporates IS models into e-learning research is needed. The current research therefore aims to address the limitation of these recent studies, which, while pursuing such a holistic view, are situated only in an educational context.

Research Framework

This study proposes that although there have been many new technological

developments in training delivery, the underlying dimensions of success have not

changed. The D&M IS Success Model is an existing success-measurement

framework that has been applied in a wide range of studies since its publication in

1992 (DeLone & McLean, 1992), and has shown to be an effective measure of IS

success. Although Kirkpatrick’s (1976) model of evaluation made valuable

contributions to the thinking and practice of training evaluation (Bates, 2004), there

is now a necessity to address the need for evaluation methods that are appropriate to


assess the application of technologies such as e-learning that did not exist 30 years

ago. It is proposed that by incorporating aspects of traditional L&D evaluation

dimensions an updated version of the D&M IS Success Model can be applied to

organisational e-learning success measurement.

Whilst e-learning success has been studied extensively in the educational

setting, the literature addressing how best to evaluate e-learning success in organisations is

less developed. A number of authors have explored factors or intervening variables

that could potentially have an effect on e-learning and distance education success in

educational settings. This research has considered a variety of measures such as e-

learning quality (Jung, 2010), learning outcomes (McClelland, 2001; Motiwalla &

Tello, 2000), teaching practices (Savenye, Olina, & Niemczyk, 2001), learning styles

(Byrne, 2002), and cost-benefits (Smith, 2001). Pittinsky and Chase (2000)

developed comprehensive guidelines and benchmarks in a study and report

prepared for six tertiary institutions that are leaders in distance education. Their

study developed 24 benchmarks for internet-based distance education in seven

categories: institutional support, course development, teaching/learning, course

structure, student support, faculty support, and evaluation and assessment.

Lee-Post (2009, p. 62) commented that there is “a need to integrate and

formulate a holistic and comprehensive model for evaluating e-learning … there is

also a need to broaden the viewpoint of learning success from a result to a process

perspective”. A further limitation of these studies is their focus on factors and

intervening variables that impact on e-learning success. Thus, “it is difficult to

understand and isolate success factors of e-learning as there is a lack of consensus of

what constitutes success of e-learning” (Lee-Post, 2009). Given that the literature in

educational settings is more extensive than that of organisational e-learning, this


research aligns with Lee-Post’s assertions, and as such the primary objective of this

study is to address this need and formulate a model of factors critical to e-learning

success in organisations.

Although limited in number, studies have begun to use respecified versions

of the D&M IS Success Model for evaluation purposes in organisations (see Chen,

2010; Wang et al., 2007; Wu & Wang, 2006). Chen (2010) sought to link e-learning

use to job outcomes using the D&M IS Success Model. Although Wang et al. (2007)

had developed and validated a scale based on the D&M IS Success Model for the e-

learning context, Chen (2010) used constructs from traditional IS studies. Results

indicate a link between e-learning use and job outcomes; however, Chen cautions

that:

“the link between system use and perceived outcomes cannot be established

on the basis of a single empirical study … Further empirical studies gathering

data from multiple sources, including supervisors, are recommended … In

addition, important environmental variables, such as group support and

organisational culture, facilitating conditions (e.g., rewards), and individual

learning capabilities, are not included in this study”.

Chen (2010) suggests that further research is needed that takes into

consideration these environmental influences.

Although both of these studies have made progress in utilising the D&M IS

Success Model to evaluate organisational e-learning, both have failed to conduct in-

depth analysis to determine how the model applies in an e-learning context and what

exactly they should be measuring.


The creation of the original D&M IS Success Model was driven by a process

understanding of IS and their impacts. As DeLone and McLean (2003, p. 16)

themselves highlighted: “this process model has just three components: the creation

of a system, the use of the system, and the consequences of this system use. Each of

these steps is a necessary, but not sufficient, condition for the resultant outcome(s)”.

The research framework developed for this study maintains the three components of

system creation, system use, and system consequences as reflected in Figure 6. This

e-learning success research framework guides the data collection for this study and,

as highlighted in red, the focus of this research is to investigate the system creation

elements in an e-learning context. Although important, system use and system

consequences are not part of this study. As can be seen, all fundamental elements of

the D&M IS Success Model have been maintained, and the four levels of

Kirkpatrick’s (1976) model have been mapped to this.

Figure 6: E-learning success research framework

[Figure 6 depicts the three process components and their elements: system creation (information quality, system quality, and service quality), system use (intention to use, use, and user satisfaction, aligned with Level 1 - Reaction), and system consequences (net benefits, aligned with Level 2 - Learning, Level 3 - Behaviour, Level 4 - Results, and Level 5 - ROI). The system creation elements are highlighted as the focus of this research.]
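To make the framework and its scope concrete, the sketch below encodes Figure 6 as a simple data structure. It is illustrative only: the identifier names are invented, and the alignment of evaluation levels to elements reflects a reading of Figure 6 and the interview design described in Chapter Three, not a definitive specification.

    # Illustrative (hypothetical) encoding of the e-learning success research
    # framework: the three D&M process components and their elements.
    framework = {
        "system creation": ["system quality", "information quality", "service quality"],
        "system use": ["intention to use", "use", "user satisfaction"],
        "system consequences": ["net benefits"],
    }

    # Kirkpatrick (1976) / Phillips (1996) evaluation levels as aligned in
    # Figure 6 (Level 1 sits with user satisfaction; the remaining levels
    # sit with the outcomes side of the model).
    evaluation_levels = {
        "Level 1 - Reaction": "user satisfaction",
        "Level 2 - Learning": "system consequences",
        "Level 3 - Behaviour": "system consequences",
        "Level 4 - Results": "net benefits",
        "Level 5 - ROI": "net benefits",
    }

    # Only the system creation elements are in scope for this study (RQ1-RQ3).
    study_focus = framework["system creation"]
    print(study_focus)  # ['system quality', 'information quality', 'service quality']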


Framework elements of focus

The D&M IS Success Model has become widely cited, and many studies

have considered the applicability of the model in a variety of research contexts. In

order to ensure that evaluation models are relevant to an e-learning context, it is

necessary to perform an assessment of the currently used elements of the research

framework to assess how they might apply, if they are necessary, and if new relevant

factors need to be incorporated. The following is an overview of empirical research

related to each of the system creation elements and the associated research questions

developed for this study.

System quality

System quality has traditionally been measured in terms of “ease-of-use,

functionality, reliability, flexibility, data quality, portability, integration, and

importance” (DeLone & McLean, 2003, p. 13). It is assumed that the technical

elements of an e-learning system would typically be measured similarly to that of

other IS; however, there may be additional factors that are important.

A number of authors have investigated the factors of system quality relevant

to an e-learning context. Wang et al.’s (2007) measures of system quality include:

The e-learning system provides high availability.

The e-learning system is easy to use.

The e-learning system is user-friendly.

The e-learning system provides interactive features between users and the

system.

The e-learning system provides a personalised information presentation.


The e-learning system has attractive features to appeal to users.

The e-learning system provides high-speed information access.
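Items such as these are typically administered as Likert-type ratings and aggregated into a single construct score. As a purely hypothetical sketch (the response values and the seven-point scale are assumptions for illustration, not drawn from Wang et al. or from this study):

    # Hypothetical scoring of a multi-item system quality scale: one
    # respondent's (invented) 7-point Likert ratings for items like those
    # above are averaged into a single construct score.
    item_responses = {
        "availability": 6, "ease_of_use": 5, "user_friendliness": 5,
        "interactivity": 4, "personalisation": 3, "attractiveness": 4,
        "access_speed": 6,
    }
    system_quality_score = sum(item_responses.values()) / len(item_responses)
    print(round(system_quality_score, 2))  # 4.71 on a 1-7 scale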

Najmul Islam (2012) explored the role of perceived system quality as a

motivation to continue e-learning system use in an educational setting. An

exploratory study was conducted to probe the system quality factors, as it was

recognised that not all variables are equal in an educator’s usage of an e-learning

system. From Najmul Islam’s study (2012, p. 31) four traits emerged as relevant to e-

learning and system use:

Access: Degree of accessibility, responsiveness, and availability of the

e-learning system.

Ease of Use: Degree to which an individual perceives that using the e-

learning system is free of effort.

Integration: The way the e-learning system allows data to be

integrated from various existing course pages.

Reliability: The dependability of the e-learning system operation.

As Najmul Islam (2012) recognised, not all variables found to be relevant in the previously discussed studies will be relevant in organisational e-learning. As such, this study aims to explore the relevant traits of system quality in an organisational e-learning context:

Research Question 1: How does system quality apply in the context of

organisational e-learning and what is the nature of this factor?


Information quality

Adult learning principles have previously been highlighted as important

considerations in e-learning course design. In addition to adult learning principles, personal relevance and authenticity have been shown to be important in

the design of learning materials. The “alignment of learning tasks to related work can

engage learners in the learning process and assist learners in transferring their

knowledge and skills to the workplace” (Waight & Stewart, 2005a, p. 340).

Walker and Fraser (2005) developed the Distance Education Learning

Environments Survey (DELES) in order to better investigate and measure constructs

specifically related to distance education learning environments in post-secondary

distance education. The resulting DELES consists of six learning environment

constructs: instructor support, student interaction and collaboration, personal

relevance, authentic learning, active learning, and student autonomy. These

constructs were initially tested on a sample of 680 students for their association with

enjoyment of distance education. Although all six constructs were found to be

significantly correlated, personal relevance and authentic learning were found to

have the strongest association with enjoyment.

Wang et al. (2007, p. 1804) developed the following items to measure

information quality in an e-learning system using a survey methodology:

The e-learning system provides information that is exactly what you need.

The e-learning system provides information you need at the right time.

The e-learning system provides information that is relevant to your job.

The e-learning system provides sufficient information.

The e-learning system provides information that is easy to understand.

The e-learning system provides up-to-date information.


These measures give a general overview as to what information quality means;

however, this study aims to investigate the characteristics of information quality in

greater depth. It is predicted that these adult learning principles, particularly the ten

practical principles presented by Delahaye and Smith (1998), and the constructs of

personal relevance and authentic learning (Walker & Fraser, 2005) will be relevant to

information quality in an e-learning context. As such the second research question is:

Research Question 2: How does information quality apply in the context of

organisational e-learning and what is the nature of this factor?

Service quality

The final factor of consideration in this research is service quality.

Traditionally service quality is measured through user satisfaction with the IS.

However, user satisfaction is a separate factor of the D&M IS Success Model (2003)

(and the research framework of this research), and thus an investigation into the

nature of service quality as a standalone factor is required.

Wang et al. (2007) considered a broader range of issues in the development

of their survey measures. They include:

The e-learning system provides a proper level of online assistance and

explanation.

The e-learning system developers interact extensively with users during the

development of the e-learning system.

The IS department staff provides high availability for consultation.

The IS department responds in a cooperative manner to suggestions for future

enhancements of the e-learning system.


The IS department provides satisfactory support to users using the e-learning

system.

Similarly, Ozkan, Koseler, and Baykal (2009) used a survey methodology to

validate their ‘Hexagonal e-learning assessment’ model. This study addressed the

need for Learning Management System (LMS) evaluation and measurement of e-

learning system success in higher education. In the context of educational e-learning,

the authors identified student tracking, course/instructional authorisation,

instructional design tools, course management, knowledgeability, and security as elements of service quality. Ozkan, Koseler, and Baykal (2009) interpreted service quality to be a 'technical issue'; however, in the context of organisational e-learning it

is anticipated that this factor would incorporate broader elements. As such, the final

research question is:

Research Question 3: How does service quality apply in the context of

organisational e-learning and what is the nature of this factor?

Chapter Summary

This chapter outlined key academic literature on e-learning evaluation from

the perspective of traditional learning and development (L&D) evaluation, and IS

evaluation. Figure 6 presented a research framework which brought together these

two perspectives, and is the basis of the research questions developed to guide the

study. It was highlighted that the focus of this research is the system creation

elements. The following chapter, Chapter Three—Methodology, presents the

research design and methodology for this research study.


Chapter Three—Methodology

Chapter Overview

This chapter describes the methodology underpinning this study and provides

a rationale for employing a qualitative research approach, specifically a case study

design, to explore the critical elements of e-learning evaluation in organisations. The

chapter structure is outlined in Figure 7.

Figure 7: Chapter Three Structure

[Figure 7 outlines the structure of the chapter: The Research Question; Overview of Qualitative Research Approach; Operationalising the Research Question; Data Collection Methods; The Research Setting; Ethical Considerations; Sampling; Data Analysis; Quality of the Approach; Conclusion.]


The Research Question

The overall purpose of the research is to determine:

What are the critical elements to evaluate the success of e-learning initiatives

in an organisational setting?

It is evident from the literature review in Chapter Two that the present study

encompasses two different, yet similar, research areas: traditional L&D evaluation

and IS evaluation. The resultant research framework mapped Kirkpatrick’s (1976)

and Phillips’ (1996) levels of evaluation to the elements of the D&M Success Model

(DeLone & McLean, 1992, 2003) to create research constructs. Using this research

framework, it is proposed that all of these resultant constructs are potentially

important to consider when evaluating the success of implementing e-learning in

organisations. However, the scope of this research is restricted to investigating the

system creation elements. Eisenhardt (1989) noted that a recognition of a priori

constructs helps guide understanding and theory building. These constructs will

guide the investigation process in order to address the main research problem. It is

important to note that although the constructs within Kirkpatrick’s model and the

D&M IS Success Model are identified as potentially important in the literature

review they are tentative and may not exist or be of importance (or equal importance)

in any resultant findings or theory. Furthermore, this research took the stance of

Merriam (1998, p. 121) that “being open to any possibility can lead to serendipitous

discoveries”.

The following research questions are based upon the overall purpose and

research framework developed in Chapter Two—Literature Review:


Research Question 1: How does system quality apply in the context of

organisational e-learning and what is the nature of this factor?

Research Question 2: How does information quality apply in the context of

organisational e-learning and what is the nature of this factor?

Research Question 3: How does service quality apply in the context of

organisational e-learning and what is the nature of this factor?

While the arguments and research questions presented in Chapter Two are

principally derived from literature and solidly underpinned by theory, it is necessary

to explore them further empirically. This thesis has taken the form of applied

research. Whilst the purpose, like any basic research, is to contribute to knowledge

and theory to explain the phenomenon under investigation (Patton, 2002), it aims to

go one step further and “contribute knowledge that will help people to understand the

nature of a problem in order to intervene; therefore allowing human beings to more

effectively control their environment” (Patton, 2002, p. 217). A core understanding

of applied research, such as this study, is that it is conducted to test applications of

basic theory and disciplinary knowledge to real-world problems and experiences

(Patton, 2002). The following chapter outlines the qualitative approach taken in this

applied research, and how this was operationalised in a case study research design.

No research is ideal and often tradeoffs are made in design strategies (Patton, 2002);

as such this chapter will justify why certain decisions were made regarding the

methodological approach of this research.


Overview of Qualitative Research Approach

This study is based on a qualitative methodology—the study of social

phenomena (Marshall & Rossman, 2011). It is argued that qualitative research is an

approach to research rather than a set of techniques (Morgan & Smircich, 1980).

Denzin and Lincoln (2011, p. 3) define qualitative research as “a situated activity that

locates the observer in the world ... Qualitative researchers study things in their

natural settings, attempting to make sense of, or interpret, phenomena in terms of the

meanings people bring to them”.

Qualitative research methods are increasingly being used in evaluation

studies, particularly evaluation of information technology and computer systems

(Kaplan & Maxwell, 2005). Leininger (1992, p. 401) argues that “the goals of

qualitative research are not to 'measure' something but rather to understand fully the

meaning of phenomena in context and to provide thick accounts of phenomena under

study”. Dick (1990, p. 4) also argues that “by attaching numbers to phenomena, you

limit what can be taken into account”. This research approach is deemed particularly

helpful for studies such as this one where it is important to determine “what might be

important to measure” rather than taking measures (Kaplan & Maxwell, 2005, p. 31).

In this way the qualitative nature of this dissertation enables an understanding of how

traditional frameworks apply in a modern e-learning context. Furthermore, a

qualitative approach was deemed appropriate for this study due to the form of

research questions, which ask ‘how’ questions, rather than being causal, relationship

based, or asking how many, or how much, in which case a quantitative survey may

have been more appropriate (Silverman, 2007; Yin, 2003).

Brantlinger (1997) suggests categories of crucial assumptions in qualitative

research, which are important to consider as they have the ability to shape the


method selection. These are presented in Table 9, with an interpretation of how they

relate to this research.

Table 9

Crucial assumptions in qualitative research

Nature of the research
Assumptive continuum: Technical and neutral < > Controversial and critical
Application to this research: This research is technical and neutral, intending to conform to traditional research within the management discipline.

Relationship of the participants
Assumptive continuum: Distant and objective < > Intimate and involved
Application to this research: The researcher positions themself as distant and objective to the participants' lives.

Direction of gaze
Assumptive continuum: Outward toward others < > Inner contemplation and reflection
Application to this research: This research is outward towards others by externalising the research problem.

Purpose of the research
Assumptive continuum: Professional and private < > Useful to participants and the site
Application to this research: The researcher sees the purpose as both professional (as partial fulfilment of a higher research degree) and also as useful and informative to the participants and the participating case organisation.

Intended audience
Assumptive continuum: Scholarly community < > The participants themselves
Application to this research: The intended audience is both the scholarly community and the participants themselves and the participating organisation.

Researcher's political position
Assumptive continuum: Neutral < > Explicitly political
Application to this research: The researcher views themself as neutral in this situation with no political agenda.

Adapted from: Brantlinger (1997) and Marshall and Rossman (2011).

Unit of analysis

This research has two units of analysis relevant to each of the research

questions: the individual and the two groups of stakeholders (users and L&D


professionals). Patton (2002) suggests that groups can be selected as a unit of

analysis when there are distinguishable characteristics that separate people into

groups and these characteristics have implications for research. The primary focus of

data collection is impacted by the units of analysis (Patton, 2002). In the present

study the focus is on individuals in the setting. However, analysis is guided by a

focus on both comparisons across individuals, and across stakeholder groups.

Decisions about sample size and strategy also depend on the unit of analysis chosen

to study. The sample strategy for this research is outlined later in this chapter.

Case study research design

The research strategy employed in this study is a single case design (Yin,

2003). More specifically, it is an analysis of a single case organisation that has

recently adopted a new LMS and is now delivering e-learning courses via this LMS.

Yin (2003, p. 13) defined a case study as:

“an empirical inquiry that investigates a contemporary phenomenon within

its real-life context, especially when the boundaries between phenomenon and

context are not clearly evident; and in which multiple sources of evidence are

used”.

Stake (2005) states that case studies are not a methodology but rather a choice

of what is to be studied. In contrast, Creswell (2012, p. 97) views case studies

as a methodology:

“Case study research is a qualitative approach in which the investigator

explores a real-life, contemporary bounded system (a case) or multiple

bounded systems (cases) over time, through detailed, in-depth data collection


involving multiple sources of information ... and reports a case description

and case themes”.

For this research the researcher viewed case studies as a real-life setting to

investigate a contemporary phenomenon in a single case by utilising appropriate data

collection methods. In this situation, the single case study was seen as the appropriate

design to facilitate data collection rather than the methodology.

There are a number of reasons why a case study approach was considered the

most appropriate research design to adopt. First, a single case study research design

was deemed most appropriate due to the breadth versus depth of information trade-

off, the purpose of the study, the resources available, the time available, and the

interests of those involved (Patton, 2002). Second, case studies are a common

approach to the conduct of qualitative research (Burns, 2000) and enable the

researcher to collect relevant data to develop further understanding of a particular

phenomenon or topic (Creswell, 2003; Stake, 2005). Third, case studies are

considered appropriate where little theory exists in relation to a concept, and the goal

is to build theory (Eisenhardt, 1989). In particular, case study research has the

strengths of allowing for the generation of new or novel theories, generating theory

that is likely to result in constructs and hypotheses which are testable, and producing

theories which are empirically valid (Eisenhardt, 1989). Eisenhardt (1989, p. 546)

explains further:

“Although a myth surrounding theory building from case studies is that the

process is limited by investigator’s preconceptions, in fact, just the opposite

is true. This constant juxtaposition of conflicting realities tends to ‘unfreeze’

thinking, and so the process has the potential to generate theory with less


researcher bias than theory built from incremental studies or armchair,

axiomatic deduction”.

Building theory from case studies is a research strategy that involves using

one or more cases to create theoretical propositions from case-based, empirical

evidence (Eisenhardt, 1989; Eisenhardt & Graebner, 2007) and replication logic

(Eisenhardt, 1989; Yin, 2009). In this research, a case study will provide rich data

around evaluation of e-learning courses and e-learning systems in order to develop a

testable model for future empirical investigation.

Data collection methods

Burns (2000) identified three principles for case study data collection which

were followed in the design of this research: use multiple sources, maintain a clear

chain of evidence, and record data. There are at least six sources of evidence that

may be used during a case study: documentation, archival records, interviews, direct

observation, participant-observation, and physical artefacts (Yin, 2009). Whilst the

primary data collection method of this research was semi-structured interviews, other

information was also collected. All data collection methods are explained below.

Semi-structured interviews. The use of convergent interviewing (Dick, 1990)

was identified as an appropriate data collection method. The questions used in the

convergent interviewing process emerged from the literature review and research

framework. Convergent interviewing can be utilised as both a data collection and

data analysis method. Convergent interviewing is an iterative process which provides

the opportunity during the data collection process to refine interview questions and

determine appropriate participants and sample size (Dick, 1990). The aim of this

iterative process is to identify areas of agreement and disagreement between


participants until convergence occurs, and any divergence remaining can be

adequately explained.

As suggested by Creswell (2003), an interview protocol was developed for

recording information during the interviews (see Appendix 1). The protocol includes

instructions to the researcher including opening statements, the key interview

questions, probes to follow key questions, and space for recording the interviewee’s

comments. Three categories of questions were devised to elicit information from

interviewees: the e-learning system in general, use of e-learning, and outcomes of e-

learning. These categories of interview questions were developed in response to the

literature review and the research framework in order to obtain the relevant

information required to address the research questions. For example, the first

category of questions concerned the factors of system quality, information quality,

and service quality. Questions concerning use were informed by the factors of use

and intention to use. Finally, questions concerning outcomes were motivated by the

factors of satisfaction, learning, behaviour, and net benefits.

All interviews were digitally recorded using a voice recorder, transcribed and

then reviewed by the interviewer for accuracy and content. The pilot study interviews were transcribed and analysed by the researcher as each interview was

conducted so that the convergent interviewing cyclical process of “design, data

collection, interpretation, redesign, data collection, reinterpretations, redesign” (Dick,

1990, p. 5) could take place.

Organisational contact. The organisational contact was the first point of

contact for the researcher within the case organisation. Discussions initially took

place to gather a range of available data relating to the organisation in general, as

well as the background to the recent implementation of e-learning. Information


gathered included company size and structure as well as specific information about

current evaluation practices.

Organisational documentation. Organisational documentation gathered

included e-learning design specification and development documents, and evaluation

approaches currently used at the case organisation. Each of these documents added

context to the understanding and analysis of the interviews. Document analysis is potentially rich and is often a way for researchers to supplement other

methods such as participant observation and interviews (Marshall & Rossman,

2011). Documents can also provide background information that helps establish the

rationale for case or participant selection (Marshall & Rossman, 2011).

Direct observation. Observation was engaged in as an informal method of

data collection for this research. This involved “hanging around in the setting” and

“getting to know people” (Marshall & Rossman, 2011, p. 139) in order to understand

the social setting and capture the “context within which people interact” (Patton,

2002, p. 262). As noted by Patton (2002, p. 262), “understanding context is essential

to a holistic perspective”.

Sampling

Whilst negotiating access to the case organisation, the sponsor was asked to

provide the contact details of a list of possible participants who had participated in e-

learning. These participants were classified into two stakeholder groups: e-learning

users and L&D professionals. Then, as part of the interviewing process, these

participants were asked for referrals to other possible participants. This approach is

referred to as snowball sampling (Cavana, Delahaye, & Sekaran, 2001; Glesne,


1999), and is recommended by Dick (1990) as an appropriate way to select

participants during a convergent interviewing process.

Sample size

Purposeful sampling was used in this research and can be further classified into

a range of different methods (Patton, 2002). In this study, typical case sampling was

used to allow the researcher to “illustrate or highlight what is typical, normal,

average” (Patton, 2002, p. 243). A total of 14 interviews with 15 interviewees (two

participants were interviewed together) were conducted at the case organisation. This

number was deemed to be appropriate as the researcher reached ‘saturation of data’

(Marshall & Rossman, 2011) during the convergent interviews, and further

interviews were not likely to yield significantly new information. Chapter Four—

Results, contains an overview of the interviewees and their key attributes.

Piloting the interviews

Prior to data collection at the case organisation, a pilot study was undertaken

to test the developed interview questions and the interviewing process to ensure that

the research process and the data collection and analysis methods would be

appropriate and would achieve the desired research outcomes. Pilot studies are useful

not only for testing the questions, but also from a practice perspective to demonstrate

the ability of the researcher to manage the research (Marshall & Rossman, 2011).

This became particularly evident when managing the perspective from which the

participants responded to the questions: either as a user or an L&D professional.

L&D professionals had a tendency to respond from a user perspective, and the users

had a tendency to want to respond from a broader perspective rather than just as a


personal user. This issue was resolved by the researcher by reminding the participant

about their perspective as each question was asked. For example, “As a user, what do

you believe makes a good e-learning system?”

Three steps were taken to refine and pilot the questions. Initially the questions

were tested individually with a non-expert for basic understanding and clarity.

Valuable feedback was gained on the interview design, for example the ordering of

questions, and on the level of understanding, for example did they ‘make sense’.

After initial refinement, feedback was then sought from three subject matter experts

regarding the content of the questions. Following feedback from the subject matter

experts, the final stage involved piloting the questions within an organisation to test

the interview process as it applied to a workplace setting. A table including the

original pilot questions, an analysis of the questions, and the refined questions can be

seen in Appendix 2.

Data Analysis

According to Hatch (2002, p. 148), “data analysis is a systematic search for

meaning”. A number of steps were taken in the process of data analysis so that the

researcher could gain the greatest insight and make an interpretation of the meaning

of the data (Creswell, 2003). The steps taken follow the guidelines set out by

Creswell (2003) and are outlined below:

Step One: The data was organised and prepared for analysis. This

involved the transcription of interviews, scanning hard copy

documents, and typing up field notes. The data was then loaded into

NVivo Version 9.2 in preparation for coding. NVivo was chosen for


analysis as it “enables the researchers to see the data well, as it

accurately reflects the data back to the researcher” (Beekhuyzen et al.,

2010, p. 1).

Step Two: All data was read to gain first impressions and look for

general ideas and meaning in the information.

Step Three: Analysis of the data began with a coding process of

organising information into broad categories.

Coding involves “linking, breaking up and disaggregating the data so that

once coded, the data look different, as they are seen and heard through the category

rather than the research event” (Morse & Richards, 2002, p. 115). It is suggested that

the first level of coding should organise the data into broad categories. These

categories facilitate comparison between things in them, and help the researcher to

develop theoretical concepts (Maxwell, 2005), as was required for this research.

Maxwell (2005) separates the process of categorising into three categories:

organisational, substantive, and theoretical. Morse and Richards (2002) suggested

three similar approaches to coding: descriptive, topic, and theoretical coding. The

coding for this research followed the suggestion of Beekhuyzen et al. (2010) that

organisational coding (equivalent to descriptive) is a good place to start building a

classification scheme. This is because organisational categories are broad areas or

topics that can be established prior to interviews (Maxwell, 2005). In the case of this

research, the theoretical framework provided broad topics to begin coding, for

example ‘information quality, ‘system quality’, and ‘service quality’.

These organisational categories were then applied to the transcripts to begin

coding for patterns. One of the primary goals of coding is to find repetitive patterns


of action and consistencies in the data (Saldana, 2009). As such data were

categorised into patterns which were used as a guide during the coding process:

Similarity (things happen the same way).

Difference (they happen in predictably different ways).

Frequency (they happen often or seldom).

Sequence (they happen in a certain order).

Correspondence (they happen in relation to other activities or events).

Causation (one appears to cause another) (Hatch, 2002, p. 155).

During the categorising process a codifying process was applied to group,

regroup, and relink, in order to organise and group similarly coded data into

‘families’ of parent nodes and children nodes (Saldana, 2009). See Appendix 3 for a

table of the complete coding classification.
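As a minimal sketch of how such a parent/child node structure can be represented (the node names and the helper function below are invented for illustration; the actual classification scheme is given in Appendix 3):

    # Hypothetical fragment of a hierarchical coding scheme: organisational
    # (parent) nodes drawn from the research framework, with invented child
    # nodes. The real scheme and node definitions appear in Appendix 3.
    codebook = {
        "system quality": ["ease of use", "reliability", "access"],
        "information quality": ["relevance", "currency", "presentation"],
        "service quality": ["support availability", "responsiveness"],
    }

    def code_segment(segment_text, parent, child, coded_data):
        """Attach a transcript segment to a parent/child node pair."""
        if child not in codebook.get(parent, []):
            raise ValueError(f"'{child}' is not a child of '{parent}' in the codebook")
        coded_data.setdefault((parent, child), []).append(segment_text)

    coded = {}
    code_segment("The system was really easy to navigate.",
                 "system quality", "ease of use", coded)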

Quality of the approach

Assessing the rigour of qualitative work can be a challenge due to the inconsistencies in standardised procedures (Morse, Barrett, Mayan, Olson, & Spiers, 2002), as compared to those that exist in evaluating the rigour of quantitative research.

Qualitative research has been criticised for its failure to adhere to traditional validity

and reliability criteria. However, validation and reliability strategies do exist to

ensure the accuracy of qualitative studies (Creswell, 2012). The goal of validation in

qualitative research is to establish ‘trustworthiness’ in order to achieve credibility,

authenticity, transferability, dependability, and confirmability (Lincoln & Guba,

1985). Creswell (2012) recommends engaging in at least two verification procedures


in order to establish the credibility and trustworthiness of a study. A number of

strategies were employed to ensure the rigour of this research. They are

outlined below.

Validation strategies

Prolonged engagement and persistent observation. This strategy was used to

establish credibility of the research (Creswell, 2012). The researcher travelled

interstate to spend time in the field, rather than complete all data collection via phone

interviews, in order to build trust with participants and learn the culture of the

organisation. Once engaged with participants in the field, the researcher was able to

make decisions about what was relevant to the purpose of the study.

Triangulation. Triangulation provides a means to establish validity and

credibility to the findings. Triangulation involves the use of multiple and different

sources, methods, investigators, and theories to substantiate evidence. As previously

discussed, this research utilised a number of different data collection methods.

Peer review or debriefing. A peer debriefer plays the role of devil’s advocate,

asking the hard questions about methods, meanings, and interpretations (Lincoln &

Guba, 1985). Under the same premise as inter-rater reliability in quantitative

research, debriefing acts as an external check of the research. A number of debriefing

sessions took place over the time of this research with the research supervisors.

Rich, thick description. An effort was made to describe in detail the

participants and the setting under study. This enables the reader to make decisions

about whether the findings can be transferred to other settings and under what

conditions.


Reliability

The researcher sought to achieve “intercoder agreement” when coding the

transcripts, which is one of the main methods of addressing reliability in qualitative

research (Creswell, 2012). A process, as outlined by Creswell (2012), was followed:

A codebook was developed which outlined the coding structure and

the definitions associated with each code (see Appendix 3).

After initial coding of the first two to three transcripts, the researcher

met with supervisors to examine the codes, their names, and the

definitions. At this stage the codebook was updated to include both

nodes and subnodes.

The researcher and supervisor then coded a transcript together to see

if they agreed on the same text segments that were coded.

Once agreement was reached, the codebook was updated and the

researcher continued to code the remaining transcripts.
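The check described above is consensus-based rather than statistical. Where a numeric summary is wanted, a simple statistic such as percentage agreement over jointly coded segments is one common complement (the sketch below uses invented codes, not data from this study):

    # Hypothetical illustration of simple percentage agreement between two
    # coders over the same four transcript segments; codes are invented.
    coder_a = ["system quality", "service quality", "information quality", "system quality"]
    coder_b = ["system quality", "information quality", "information quality", "system quality"]

    agreements = sum(a == b for a, b in zip(coder_a, coder_b))
    percent_agreement = 100 * agreements / len(coder_a)
    print(f"{percent_agreement:.0f}% agreement")  # 75% agreement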

The Research Setting—Tracks

All data collected were from a large Australian rail organisation. A key issue

in qualitative (and particularly case study) research is purposeful selection of a site

which can best help us understand the problem and research questions (Creswell,

2003). The case organisation was chosen as an appropriate site for two reasons.

First, the organisation had recently adopted a new LMS. Second, the organisation

currently invests in the development of e-learning and intends to increase the number

of e-learning courses delivered in the future.

The case organisation cannot be identified by name in accordance with the

terms of the ethical clearance for this research, and as such it is referred to


throughout this thesis using the pseudonym ‘Tracks’. Although the Australian state

in which it operates cannot be disclosed, a brief description of the organisation is

provided.

As is predominant in the Australian rail industry, Tracks is 100% state

government-owned, and is based in and services one Australian state. Tracks

operates and maintains the state's suburban, interurban, and rural rail

network for passenger and freight services. Currently Tracks employs over 10,000

people who are geographically dispersed across the state. E-learning was initially

adopted as a training approach in 2008, covering topics such as the use of financial

systems, security transit procedures, and safety-related policies and practices.

As stated on their website, Tracks’ top priority is the safety of all people who

use their services, which means the safety of not only its customers, but also

employees, contractors, and the community. To ensure the safety of all people, effective training methods are a key priority, with e-learning a current focus.

Levels of e-learning

Tracks has an internally developed classification for the levels of e-learning

that they offer to their employees, which helps guide the development of further e-

learning. Based on levels of interactivity, this classification ranges from Level 1 (low

interactivity) to Level 4 (simulations reflecting real-life situations). At present, most

e-learning offered is situated at either end of the spectrum (Levels 1 and 4), with

little development at Level 2 (moderate interactivity) and Level 3 (complex/advanced

interaction). As such, it was important to take this into consideration when choosing

and targeting the sample group in order to ensure it was representative of the

majority of e-learning delivered at Tracks.


Access to Tracks

Access to Tracks was negotiated over a five-month period. This research is

part of a project within a Cooperative Research Centre for Rail Innovation. The

project has a steering committee with six individuals each representing an

organisation participating in the project—predominantly the L&D or Human

Resource Manager from these organisations. These steering committee participants

became initial contact points to engage with an organisation to collect data, of which

Tracks was chosen as the case organisation.

Initial contact was made in person in November 2011 with the representative

of Tracks, a manager in their L&D section, where the researcher verbally outlined

the research proposal as background to the study. This representative became the

sponsor of the research and assisted in negotiating formal access through the

organisation. Saunders et al. (2009) suggest a number of strategies for gaining

organisational access, several of which were utilised in this research:

the researcher made themself familiar with the organisation, and tried

to understand the background and the services they offer before

making contact;

sufficient time was allowed to negotiate access and build relationships

with key people in the organisation;

the researcher used existing contacts within the organisation, using

them as a starting point to develop new contacts and develop access

incrementally;

a clear account of the purpose and type of access required was

provided at the beginning of the process in the format of a research

proposal;

Page 95: Critical Success Elements for the Design and ...eprints.qut.edu.au/60240/1/Kristal_Reynolds_Thesis.pdf · “Critical Success Elements for the Design and Implementation of Organisational

81

the research proposal was presented in suitable business language; and

finally

the possible benefits of the research were highlighted up front to the

organisation.

It was also understood that access is a continual iterative process, where

further access may be required after initial entry and data collection (Saunders et al.,

2009). As such, all interviewees were asked at the end of their interviews if the

researcher could contact them again if further information or clarification was

required. Table 10 summarises the access timeline and associated activities

undertaken to gain entry into the organisation.

Table 10

Summary of access timeline

16 November 2011: Meeting with Tracks' representative on site; verbal outline of research proposal.

6 December 2011: Written research proposal emailed to sponsor to gain approval from senior managers.

February 2012: Access granted pending ethics approval.

9 March 2012: Ethics approval received (researcher institution).

20 March 2012: Contact with key participants (e-learning development officer, e-learning instructional designer, and manager to identify users and managers) to organise interview schedules.

27 March 2012: Meeting with key participants and sponsor on site. Data collection (interviews) formally commenced. Interviewed two members of the e-learning (L&D) team.

20 April 2012: Conducted on-site interviews with simulator users and managers.

19 April – 24 May 2012: Conducted remaining interviews via phone.

24 May 2012: Data collection formally concluded.


Ethical Considerations

The research reported in this thesis was granted ethics approval by the

Queensland University of Technology (QUT) Human Ethics Committee (QUT

approval number: 1200000100) in line with standard ethical guidelines and the

National Statement on Ethical Conduct in Human Research (Australian Government,

2007). A variety of ethical issues were taken into account prior to conducting this

study in line with Patton’s (2002) comprehensive framework of ethical issues within

research projects, including explaining purpose, promises and reciprocity, risk

assessment, confidentiality, informed consent, and data access and ownership.

The purpose of the study was explained clearly to the interviewees. This was

reinforced in the information on the Participant Information Form that was presented

in lay language appropriate to the audience. In regards to promises and reciprocity, it

was clearly outlined in the Participant Information Form that although the project may

not directly benefit them, the final reports provided to the case organisation would

provide an insight into how learning technologies can be used most effectively in the

rail industry. Participants were also offered the opportunity to request a copy of the

final report. A risk assessment was conducted prior to commencing the research and

this study was considered to be low risk; it was deemed that there was no risk beyond

normal day-to-day living associated with participation in the project.

Confidentiality of all participants was ensured in this research. All comments

and responses were treated confidentially, and the names of individuals were not stored

with the transcripts. Participants were also assured that the level of analysis conducted

and the reporting of findings would not allow for the identification of individuals.

The interviews were recorded using an audio device and later transcribed; the recordings were deleted at the end of the project. Transcripts could only be accessed by the research team and were


stored in a secure location. Interviewees were given the option, by indicating on the

consent form, to read the transcript for verification purposes prior to final inclusion.

Participants have a right to privacy and to be informed about all aspects of the research (Zikmund, 2000), and as such informed consent was sought from the

case organisation and all participants. Initial access was granted by a senior manager

in the case organisation. All participants were provided with an information sheet

and consent form relating to the study (see Appendix 4) prior to conducting

interviews. At all times it was made clear that interviewees were free to withdraw

from the study at any stage.

Boundaries and Limitations

There are criticisms of single case study designs which need to be acknowledged, the main one being the use of a single case rather than a multiple case design. The choice of single case design means that the results do not allow for

comparisons between organisations. Although a multiple case design would have

increased the transferability of this research, it was outside the scope of the study. At

no point does this study claim to be generalisable, but rather to investigate and

develop significant theory, which Yin (2003) suggests is an appropriate use of single

case design. It should be noted that the aim of this study was to further explore the

concepts in the research framework, not to fully explain the relationships or develop

generalisable propositions.

Although a single case study is not statistically generalisable, Symon and

Cassell (2012) claim that non-generalisability is a myth and there are other modes in

which case study findings are generalisable, for example isomorphic learning, the idea that lessons from an event can be applied in other settings (Symon & Cassell, 2012). In the case of this research, cross-organisational isomorphism may apply, in

which case the findings can be applied to different organisations in the same sector.

The qualitative nature of this research means that the findings could

potentially be subject to other interpretations. The chance of this was reduced by following the protocols outlined in this chapter, by undertaking multiple reviews and debriefing sessions with supervisors to discuss agreement on the findings, and by using an intercoder agreement process to ensure consistency in the data analysis and the resultant findings.

Chapter Summary

This chapter provided a detailed and comprehensive outline of the qualitative

research approach and research design of the case study reported in this thesis. An

overview of case study design was provided, followed by a detailed explanation of

the data collection methods, sampling strategy, and data analysis. The ethical

considerations of this study were also discussed, as well as issues of validity and

reliability in qualitative research. The following chapter provides the findings of the

study.


Chapter 4—Findings and Discussion

Chapter Overview

Chapter Three established and justified the methodological approach and data

collection methods used for conducting this research. This chapter presents the

findings and discussions as they apply to the research questions. The findings are a

result of the analysis of the data collected via interviews with users of e-learning, and

L&D professionals at the case organisation Tracks, as well as organisational

documents. The purpose of these interviews was to gain a full understanding of how

the system creation elements of the research framework apply in the context of

organisational e-learning.

The analysis begins with an overview of the context of the investigation at the

case organisation—e-learning at Tracks. Table 11 displays the attributes of the

participants who were interviewed, including a brief synopsis of their role at Tracks,

any exposure they had to e-learning in the past, and other key attributes. The names

of participants have not been used and have been replaced with aliases to protect

their anonymity. Following this is a discussion about the e-learning team within

Tracks’ L&D department, their roles, what they currently do, what they are currently

developing, and their future plans. Finally, an overview is provided of Tracks’

current e-learning evaluation.


Table 11

Overview of participant attributes

Each entry lists the participant's alias (gender, age group); role; environment (traditional e-learning or simulator); role classification (L&D or e-learning user); and prior exposure to e-learning.

EMILY (Female, Gen X); E-learning instructional designer; Traditional; L&D. Prior exposure: completed traditional e-learning at Tracks; involved in the development of e-learning at Tracks.

DAISY (Female, Gen X); E-learning development officer; Traditional; L&D. Prior exposure: completed traditional e-learning at Tracks; involved with developing processes for the implementation and development of e-learning at Tracks.

PERCY (Male, Gen X); Training manager; Simulator; L&D. Prior exposure: piloted e-learning programs with students in a previous role as a teacher; completed traditional e-learning at Tracks; used and managed the simulators at Tracks.

THOMAS (Male, Gen X); Principal trainer; Simulator; L&D. Prior exposure: completed traditional e-learning at Tracks; used simulators and part-task simulators at Tracks.

GORDON (Male, Gen X); Train driver; Simulator; User. Prior exposure: completed simulator training; used computer-based scenario role-play to fix simulated train faults.

HENRY (Male, Baby Boomer); Test guard/train guard; Simulator; User. Prior exposure: completed simulator training; used part-task simulators (computer-based scenario role-play to fix train faults).

JAMES (Male, Gen X); Train driver; Simulator; User. Prior exposure: completed simulator training; completed computer-based assessments at Tracks College.

EDWARD (Male, Baby Boomer); Train guard; Simulator; User. Prior exposure: completed simulator training; used part-task simulators (computer-based scenario role-play to fix train faults).

TOBY (Male, Baby Boomer); Train driver; Simulator; User. Prior exposure: completed e-learning in an MBA course; completed simulator training; completed traditional e-learning at Tracks.

ANNIE (Female, age unknown); Station training manager; Traditional; User. Prior exposure: completed traditional e-learning at Tracks; completed e-learning courses at university.

HENRIETTA (Female, age unknown); Support services training manager; Traditional; User. Prior exposure: used the simulators at Tracks; completed traditional e-learning at Tracks; completed e-learning courses outside of Tracks; involved in a webinar course development.

FLORA (Female, Gen X); Development manager; Traditional; User. Prior exposure: tested new e-learning courses being developed at Tracks.

ALFIE (Male, Gen Y); Appointed duty manager/acting as station manager; Traditional; User. Prior exposure: used e-learning to deliver training in a previous position at Tracks; tested new e-learning courses being developed at Tracks.

BILLY (Male, Baby Boomer); Fleet manager/depot manager; Traditional; User. Prior exposure: tested new e-learning courses being developed at Tracks; completed computer product training (Excel and Word).

BERTIE (Male, Baby Boomer); Operational support manager; Traditional; User. Prior exposure: tested a new e-learning course being developed at Tracks; completed e-learning courses outside of Tracks.


Contextual Overview

E-learning at Tracks

Tracks currently has 16,000–17,000 employees. There are approximately

5000 employees at Tracks who do not have access to computers as part of their day-

to-day job. This makes the task of training a large and geographically dispersed workforce challenging. In order to continually upskill and ensure the competency of its workforce, Tracks procured a new LMS, which went live in April 2012, and is positioning itself to deliver a large amount of its training via e-learning. The following

outlines the e-learning team that was put in place to support the new LMS and

delivery of e-learning, and the e-learning courses that are currently being delivered at

Tracks.

E-learning team

The e-learning team is a project team established two years prior to the

research to enact processes related to the implementation of the new LMS and e-

learning courses. The team consists of an e-learning instructional designer, an e-

learning development officer, an LMS administrator, and an e-learning marketing

position. Two key members of this project team were interviewed; an overview of

their roles follows:

Emily and Daisy are a contract instructional designer and an e-learning

development officer respectively. The instructional designer’s role is to liaise with

the business on content and the learning specifications for e-learning courses. This

includes devising an approach, creating a storyboard, and then developing the e-

learning course. Another aspect of this role, in conjunction with the e-learning


development officer, is to establish standards for e-learning design. This involves

setting guidelines which cover the whole spectrum of the design and development

process from instructional design aspects to the technical specifications and

publishing standards.

The e-learning development officer holds a permanent position assisting the instructional designer with testing e-learning courses in the LMS, upskilling other team members in using the LMS, and helping to establish the processes for how the team will work with the training managers and other business representatives to develop future e-learning courses. This position also involves working alongside the vendors to build e-learning courses for Tracks and assisting in project managing the process with the Senior Course Review Officers. In this case the vendors were external e-learning developers. The e-learning development officer explained that in the future the e-learning team will need to expand once the contractors leave, in order to take up the additional roles they currently perform.

The project team is still in the process of establishing roles within the group.

The roles of e-learning coordinator, LMS administrator, and the e-learning marketing

position are recognised as important to the team; however, they are yet to be defined.

There are currently no curriculum developers in the e-learning team and there are

presently no plans to include this skill set, as the long-term plan is to use external

vendors to develop e-learning courses. The e-learning team is currently creating a

small number of courses in-house to support the rollout of the new LMS.


E-learning courses at Tracks

A number of participants during the interviews described e-learning at Tracks

as being in its infancy. There are currently 150 modules on the LMS that are

technically classified as e-learning training. Of those, 114 are basic Microsoft guides

(equating to the Level One classification described in Chapter Three). The remainder are mostly generic courses for corporate employees (e.g., e-health). Most recently a

safety management system (SMS) e-learning course was developed, and this is in the

process of being piloted in different business units.

Some of the other modules are only online tests. Daisy explains further:

“the LMS produces an assessment, so it's not actually course based... those

results actually goes to the allocated training manager who requested that

assessment to be produced. So it's not just actually e-learning... It’s having a

quiz type assessment, straight up... It could be from straight from [sic]a face-

to-face or a blended type learning... I think moving forward, they'll be

probably looking at doing more of those types of things.”

Current evaluation at Tracks

Participants were asked how e-learning is currently evaluated at Tracks and to

explain the process to the researcher if they had sufficient knowledge.

Overwhelmingly, participants responded that they had little knowledge of the

evaluation process, or they assumed that there would be an evaluation process but

they weren’t aware of the details. When questioned further about any evaluation they

remembered completing after finishing a course, participants generally spoke about

Reaction level feedback (Level 1 in Kirkpatrick’s model). Typically this is a


feedback form at the end of a course used to give personal feedback on the course and the trainer. A key issue for evaluation is the extent to which these forms are actually used once submitted by trainees.

Trainees are in the habit of filling out basic evaluation forms, but they generally

don’t see the effects of them post-training:

“At the end... yes... every course that you do, there is always an evaluation

form done. Whether it’s taken seriously or not, that’s another thing.”

(Edward).

From an L&D perspective, evaluation is seen as a standard part of the process

of delivering training, regardless of the delivery method; however, at Tracks this is

currently limited to Level One (learner’s reactions) and what trainees thought of the

course. The e-learning team has a set of standard questions that are sometimes

modified depending on the course. These standard questions ask participants to rate

aspects of the course on a scale from one (poor) to five (excellent), followed by the

opportunity to describe the least and most valuable parts of the course. Interestingly,

although the e-learning team sees this as standard evaluation, the participants either don't remember completing the forms or don't see this reaction-level data collection as a formal evaluation technique. The following is an example of generic questions

participants would be asked after completing an e-learning course:

1. The course aim and learning outcomes were clearly outlined during

the course.

2. The online training delivery was effective.

3. The course was presented in a logical sequence and it provided

appropriate feedback at each step.

4. The course was interesting and engaging.


5. The use of video/voice over/interactivity helped me gain new

information.

6. The course content was relevant, realistic and easy to understand.

7. The assessment questions were clear, appropriate and easy to follow

(if relevant).

(Extract from LMS e-learning Course Evaluation Document)

Although a large part of e-learning delivery from the perspective of the e-learning

team is the LMS (which is also further discussed under ‘System Quality’), the above

generic questions focus solely on the course itself and the training material or

content, and fail to ask questions about the LMS as an entity in itself.

In addition to Level One evaluation, the L&D participants understood the

need for some form of ROI evaluation in the e-learning design and implementation

process, and as such have added a new ‘ROI’ step in the flow chart of the evaluation

process (Tracks, 2012). However, the objective of this step is to inform decision

making in discussions with the L&D Managers prior to submitting a Projects and

Curriculum Submission Form for approval by the E-learning Advisory Centre. This

approval allows the L&D Managers to initiate the process to procure a vendor to

design and develop the e-learning course. Although this initial decision making tool

is an important step in the process, the current flowchart finishes at delivery of the

course and does not allow for any higher level evaluation after delivery.

Percy discussed the differences between simulator evaluation and traditional e-learning evaluation. Percy had earlier commented on the large number of people who have been put through simulator training at Tracks: “We’ve trained 600 people in under six months”. However, it is interesting that there is no evaluation undertaken above Level One (Reaction):


“Facilitator: Do you know if the simulation is evaluated at any other

level within Tracks and how it’s evaluated?

Percy: No, it isn’t.

Facilitator: They don’t do any return on investment type

evaluation?

Percy: Not at this point in time.”

When then asked the same question about traditional e-learning, Percy commented:

“That’s a good question. No, I don’t know. I suspect it probably isn’t at

the moment, which doesn’t mean it isn’t, but I’m assuming it probably

isn’t. In general, we’re not very good at evaluating materials”

Although from this participant's statements it seems that neither simulator training nor traditional e-learning is being evaluated, it was inferred that only traditional e-learning requires evaluation. The participant felt that simulator training at Tracks has

already proven itself to be an effective training mechanism due to the positive

feedback received (essentially only Level One in the Kirkpatrick (1976) model). The

implication was that as this method had been used for an extended period of time it

did not require more rigorous evaluation. Percy did, however, concede that the large volume of people who have been put through simulator training means that there are now more skilled train drivers than there are trains, and that a more rigorous evaluation process may have alerted management to this over-training before it occurred. Participants felt, however, that traditional e-learning has not yet proven itself, and therefore needs to be subject to a more rigorous evaluation process.


Themes Presented by Research Questions

Beekhuyzen et al. (2010) describe the process of qualitative analysis as being

much like the looking glass; it is a process of fracturing the data (or smashing the

glass) into manageable pieces, then reconstructing it to reflect back a view of reality.

Following this, the chosen theory is used to guide the investigation. The following presentation of findings takes this approach by using the theoretical framework developed in Chapter Two as a guide to reconstruct the data and present

it in a meaningful way. A model is used to represent each element and the factors

that were found to be important. This is particularly useful as diagrams “help us

disentangle the threads of our analysis and present results in a coherent and

intelligible form” (Dey, 1993, p. 192).

The major themes that emanated from the data relevant to the various

elements of the IS and L&D evaluation models that apply in e-learning include:

system quality, information quality, support quality, learner preferences, and change

management. A discussion of each follows.

Research question 1: How does system quality apply in the context of

organisational e-learning and what is the nature of this factor?

Before the system quality factor can be explored in an organisational e-

learning environment, it is important to define what a system is in the context of this

case study. At Tracks, and as explained in Chapter Two (Literature Review), an e-learning system refers to both the LMS and the e-learning courses. An LMS is how

users access e-learning, and then how their managers can track their training and


completion of e-learning courses. An LMS also has the ability to produce reports.

Daisy explains how the LMS interacts with other IT systems at Tracks:

“It also produces report [sic]as well. So, if managers need to see a report on

how a particular course has gone... This LMS system within L&D will also be

linked to another HR system… So it will actually put against their record, the

employee record, if they've completed a particular course or if their

percentage pass mark—things like that. So, this LMS is basically going to be

tracking employees’ results.”

Interviewees were asked what they believe makes a good e-learning system

and what elements they would look for in assessing system quality. System quality in

the D&M IS Success Model is typically measured in terms of “ease of use,

functionality, data quality, portability, integration and importance” (DeLone &

McLean, 2003, p. 13). In an e-learning environment, system quality was found to

mean format/structure, accessibility, legitimacy, flexibility, administrative functions,

functionality, ease of use, and long-term knowledge resource. Figure 8 displays the

elements that were found to be important in assessing system quality, and is followed

by an explanation of each of these subthemes identified in the data. Additional

example narratives relating to qualitative assessment of system quality are displayed

in Appendix 5.


Figure 8: Elements of system quality

Structure

The participants spoke about a number of factors related to the structure of an

e-learning system, including module sections, order of modules, agenda, assessment

items, navigation, format of LMS, and functionality of the simulator system. Each of

these will be discussed in turn.

Navigation was typically described by interviewees as needing to be intuitive and logical; it covers how a trainee initially accesses the LMS and how they navigate to a module. Any new system can be complicated to learn, and trainees can be confused when confronted with a complex new system. Alfie discusses the difficulties of a complex LMS for a trainee:

“People only go in there to do training, and there's a few other things

associated with the e-learning tool. It becomes quite confusing... the website

itself has got to be formatted so that people know where to access things

easily.”

Simple formatting can alleviate these problems, for example a simple front page that

isn’t busy, directs the learner where to go, and uses different colours or appearances

for sections so that trainees can quickly identify what courses they have completed.


In addition, the formatting of the e-learning courses themselves is important to

consider, such as a well-communicated agenda and modules that are appropriately divided into sub-modules or sections, which support a trainee's ability to self-pace

their own learning. Flora explains:

“I think it's good to have it in sections ... like sub-modules. The one that I did

was quite good because it sort of had an agenda on the left hand side and

then you, it played on the right hand side so you could sort of track where you

were going and how the different bits were going to fit together, [it built]

quite a convincing case of what is was teaching you. Keep it quite structured

... is the best way to go.”

The similarity in navigation across different e-learning modules is also of

importance to trainees. Bertie explains further:

“It's got to be similar across the board, across all training packages. The

system’s got to be common, if you like so that there's not different systems in

different e-learning packs. So if they are using one pack then they familiarise

themselves with one system, it's got to be the same as when they do a different

e-learning course.”

The structure of a simulator environment, however, is different to that of traditional e-learning. Rather than modules, sections, and agendas, structure in the learner's mind refers to the way the simulator functions and its layout.

Simulators are expected to have identical layouts to the real train, and users

experience the same layout, functionality, and movement as if they were actually


driving a train. The users saw these as positive benefits of simulators over other

technology they had used such as traditional e-learning.

Ease of use

In addition to the format/structure of the LMS and courses being logically

presented, interviewees commonly spoke about the need for the LMS and e-learning

courses to be easy to use. Regardless of the participants’ classification as an L&D

professional, simulator user, or traditional e-learning user, ease of use was

acknowledged as important:

“...easy to use. While I'm fairly familiar with computers and operating

systems, other users aren't so proficient. So a system [has] got to be easy to use.”

(Bertie)

E-learning is intended to solve the problem of training a large number of people;

however, all those people might not necessarily have the same computer skills or

self-efficacy, and these needs must be taken into consideration. A potential barrier to

e-learning is that trainees who lack confidence with computers may be put off by the

technology before they even start a course, and so the LMS needs to have a high

level of usability. From an L&D perspective the administrative functions need to be

easy to use; it should be easy to upload and replace courses. In addition, it should be

easy to generate any information that is needed, for example reports and assessment

results.


Functionality

This element, which emerged from the data as a component of system quality, refers to whether the system works as intended. Interviewees spoke about a

range of functionality issues such as technical errors, technical specifications, and

timing out of courses. From the designer’s perspective, functionality plays a large

part in the initial development process. Emily explains the main things they consider

when deciding whether or not something is suitable for e-learning in regards to

functionality:

“delivery environment so things like bandwidth, technical environment that

users will be accessing, the standard of computers, and whether they've got

access to soundcards for audio, headphones and that sort of thing.”

From another perspective, users expressed frustration with technical problems

that stop or delay them from accessing and completing e-learning courses,

particularly when trying to fit them into already busy work schedules:

“I think obviously good technology, because when I did the e-learning

module for [manager] the videos didn't work... good seamless technology,

you know minimal technical error is usually best because I think it's

frustrating if things don't work.” (Flora)

Logins were initially identified by users as a potential problem area for

system functionality. Although users see this as part of the system, this is difficult to

incorporate into the LMS and e-learning evaluation as logins usually refer to the

supporting corporate systems or infrastructure, for example the login to the computer

to run the software or access the network. This, however, is not the system itself. A


number of interviewees were also concerned about the functionality of e-learning

systems in terms of timing out once they had logged in, or needing to pause a course

to do other work and then re-access the course at the same point at which they left it.

Flora, who was interviewed as a user, but is also a manager, spoke about her staff

and the potential concern surrounding this:

“He was doing an online e-learning module around safety and then I needed

him to do something urgently and he's like “do I close it down and look

incomplete”, and I think whatever you decide your system is for timing and

timing out, and as long as you make that clear when the person starts, that

this how it works.... I feel unsure about if I leave this it will look like it took

me five hours to do it. Which it didn't, I just minimised it and had to do other

things.”(Flora)

Billy, who is also a user and a manager, supported this view:

“for whatever reason if he walks away from his desk, it goes into sleep mode

or whatever, but when he comes back it's still there for him. Or one where he

can go away to do a job and actually close out of it, but when he goes back

in, he goes back into the same spot that he was up to last time.”

This disrupted environment was experienced firsthand by the researcher

during an interview with Alfie when he needed to pause the interview multiple times

to answer phones, talk to fellow employees, or talk to customers. Other technologies,

such as online surveys, have a pause function where a person can stop and start at

their convenience, and in a busy environment like that at Tracks this was identified as a

useful function within e-learning.


Legitimacy

An interesting factor which was not identified in either L&D or IS literature

is a sense of legitimacy of the system. This factor is based on the e-learning user's trust in the organisation delivering the e-learning, and on the authenticity of the e-learning content. The legitimacy factor encompasses aspects such as voice-over accents and branding, on which users base assumptions about the trustworthiness of the e-learning system. Flora explains:

“I think a bit of branding so that you feel confident about it being connected

to your company or your organisation... I definitely remember noticing, oh

this is an American product, off-the-shelf and I made a whole lot of

assumptions about what that meant, so the validity of the training course,

which were perhaps were[sic] accurate or inaccurate, we'll never know. It

may be, when things do have, there is a foreign element, that could be

distracting.”

When asked about the way training materials can be presented in e-learning,

and if they would like options like voice-overs, Billy also commented on accents:

“If it was a perfect world where you had every opportunity for everything, yes

I would possible say a voice-over, as long as it wasn't a real broad yank

talking.”

As well as representing legitimacy, this comment from Billy also highlights

the concept of alignment between the individual and the system. Whilst participants did not like voice-overs with a foreign accent, with an Australian voice-over they felt the learning was more relevant to them.


Long-term knowledge resource

This element is the potential ability for an LMS to become a personal

resource library. It includes the ability to track training, keep personal records, re-

access past e-learning, and retain information for future reference.

A common demand from e-learning users was the ability to re-access training

they have already completed, and the ability to save key information from the

courses into easy-to-access factsheets. Although e-learning is an efficient mechanism

for delivering training, unlike face-to-face courses where trainees often have work

booklets to take away with them, e-learning courses often only have a certificate of

completion for trainees to print out. It is difficult to remember everything in a

training course and some form of reference material for future use would assist the

trainee back on the job after training.

Flora said that she would like reference material, but rather than a hard copy she would be happy to have the ability to re-access the completed modules:

“If it's offered to you as a one off, once you've done it you can't access it

again, that's not ideal. But if you can perhaps do the course, gain your

accreditation for it, but then in the future go yes, I remember I learnt that,

and go back and log in, it could add a lot of value ... I know that having

done[that] sort of Microsoft Office based training in the past, I would love

that because you think it makes perfect sense to me while I'm doing pivot

tables and I get back to my desk and I'm like what was that? And you go to

your book but that's not quite the same as being on your computer.”

Once again, simulators present a different scenario from traditional e-learning courses. Within a simulator system there is the ability to capture the learner's profile


in real-time and build up a bank of training history for the trainee. The option to

record everything the learner does, including the footage of them whilst using the

simulator, means that the learner could potentially build a personal portfolio of

achievement or, from a manager’s point of view, a record of competencies that

haven’t yet been attained. Management can capture real-time footage as well as the

inputs and scenarios they create in the system, and in real-time place bookmarks in

the footage of particular instances they want to review. It becomes an explicit way of

capturing whether the trainees have met competencies. Thomas explains:

“My preference would be to capture that digitally because you've got that

opportunity. So these simulators here, for example, when we do have

assessments, if somebody's not meeting the outcomes of the course, we

actually record that assessment as evidence of why they didn't meet the

outcomes of the course. We can go back to it at any time and it captures

everything, even the footage.”

Flexibility

This element addresses the ability of trainees and designers to modify the

system to their own individual preferences and to interact with it at their own pace.

Participants reported that in a workplace setting things don’t always go to plan; the

way things are done or the way decisions are made is often non-linear, and training

needs to be flexible enough to replicate this environment. At the moment simulators

have the capacity to incorporate this flexibility; however, best practice for e-learning

in the future should be to incorporate greater flexibility.


With simulator training, flexibility becomes particularly important as trainers have the ability to change scenarios and the workings of the simulator to suit the needs of the trainee. Percy further explains why the simulators at Tracks meet his needs for flexibility and self-pacing:

“From a trainer's point of view, you really need to have the flexibility to be

able to inject faults and/or events, let's call it events, into that scenario and

also be able to record that ... And I think what’s been successful about our

training is that the trainers control the faults on the simulator, and they

interact with the participant, and based on the participant’s decision making

processes, that’s what triggers off what they’re going to do next. So it’s not

controlled ... they can mix it up, and based on the participant’s strengths and

weaknesses.”

Participants who had had the opportunity to participate in both simulators and

traditional e-learning courses generally didn’t look favourably on traditional e-

learning after using simulators because it doesn’t have the same flexibility:

“We aren’t very fond of it because it’s very rigid; it hasn’t got any flexibility

in it... that’s the positive I see out of the simulator training. It’s fluid and you

can engage with the participant, whereas something like that, it’s fixed, and it

might not necessarily be a wrong step, but if you do something and it doesn’t

like it, you have to start the whole process all over again.”(Percy)


Accessibility

Accessibility to the system is of particular importance to organisations such

as Tracks that have a dispersed workforce. An important question is whether the system can be used on a variety of computers at different locations and whether it will support this access. Bertie explains:

“It's got to be able to be used in different locations. Some of our areas that

[sic] don't actually have access to computers, so a bit of flexibility is involved

with that. I know in [Tracks] we're looking at setting up sites for that, for the

remote learning or e-learning. So that we’ve got have a number of computers

where they can go and actually sit at these computers with a standard logon

and logon on to do the e-learning at these sites.”

Due to the nature of the workforce at Tracks, the rail crew have a lot of

downtime whilst they are on standby waiting for trains and traditionally they have

not had access to computers. Thomas explains that in his role as trainer he can see

the potential of making these systems more accessible to employees:

“I think there's a big hole in what we do, because you could have desktops in

a depot or in a location where train crew are and you could say to them, look

guys, you've got six months go to through and do this training and

assessment. When they've got a spare minute, they can go and do it. A lot of

crew really are keen to keep their skills up and often complain that they don't

have that opportunity. This provides you with a big opportunity to do that.”


Research question 2: How does information quality apply in the context

of organisational e-learning and what is the nature of this factor?

Information quality, or the measurement of output from the IS, is typically

defined as quality of content, accuracy, precision, currency, reliability, timeliness, completeness, relevance, and the format required. This study also found accuracy,

relevance and format to be of importance in an e-learning environment. In addition,

the elements of ease of understanding, interaction, alignment, and nature of content

were found to be of relevance to information quality (see Figure 9). An explanation

of the subthemes identified in the data is presented in the following section.

Additional example narratives relating to qualitative assessment of information

quality are displayed in Appendix 6.

Figure 9: Elements of information quality

As with system quality, the differentiation between a traditional e-learning

environment and a simulator environment became apparent. As mentioned

previously, in traditional e-learning systems information quality refers to the output

from the system and generally refers to training content. In a simulator environment

the training content becomes the simulation experience, and the impact on the trainee

in terms of how real the experience is. If system quality is equated to ‘do all the bells


and whistles work?’ information quality is ‘can I drive the train, and would I know

what to do if something goes wrong?’ This difference is noticeable when assessing

interaction. The following sections discuss the elements that were found to be

important in assessing information quality. The differences between simulator and

traditional environments are also highlighted when relevant.

Format

Participants were asked to identify the format in which they prefer to have

information delivered to them via e-learning. A key finding that emerged was that

participants favour a number of different formats for information and, more importantly, that regardless of what the formats are, there needs to be a blend of different types. Participants indicated a preference for content other than plain text on

the screen, for example diagrams, questions and answers, videos, and hyperlinks to

external content. Participants explain their preferences further:

“diagrams ... Question and answering using diagrams et cetera. Pictures

relating to different parts of the train and mechanisms and duties.” (Henry)

“Not reading much, because reading is just like reading a book ... So I think

normal video would be better.” (Toby)

“I actually like the audio so I don’t have to read the instructions or read the

text on the screen.” (Henrietta)

“my only preference is, where possible, graphics to go with it, not just

words. So you could actually see what he should have seen, what he did each

page.” (Billy)


Nature of content

Participants were also asked what type of training content they prefer to have

delivered via e-learning. Typically, nature of content refers to the type of knowledge

being delivered. This was touched on when participants spoke about knowledge-

based versus skills-based content; however, a number of other factors such as

stability of content, safety-critical content, and location of trainees impact on the

nature of content suitable for e-learning.

A number of participants suggested that anything can be converted to e-

learning, depending on the approach that is taken and the resultant quality. However,

most participants commented on specific information appropriate for an e-learning

environment. The appropriateness of safety-critical material for e-learning was of

particular importance for participants. In general, participants felt that e-learning is

not appropriate for safety-critical material because if it were misinterpreted or not understood it could result in dire consequences:

“If it’s not safety-critical, and it doesn’t matter how you do it, you’re

generally heading in the right direction ... Training content that is safety

critical related, I think, should not be on e-learning ... Because you need the

human interaction in regards to question answers. When you use an e-

learning tool you can never ever get an appropriate answer from a computer

... If you put something on an e-learning tool they might misinterpret it. So

you've got to be particularly careful about how you use an e-learning tool,

particularly if/when it comes to legislation and safety.” (Edward)


Safety is also an important consideration from a design perspective when

L&D professionals decide whether or not a course is suitable for e-learning, as Emily

explains:

“whether or not it's safety-critical or not is one of the key considerations

within this organisation as to whether or not they would consider it suitable

for e-learning delivery.”

Participants felt that the e-learning at Tracks is currently not mature enough to be

deemed appropriate for something as critical as safety.

In contrast, Henrietta mentioned that content such as safety is good for an e-

learning format as it is generally stable and doesn’t need to be frequently updated.

However, this may be because she was making the point that content needs to be

stable, rather than it being related to safety content specifically. Furthermore, e-learning is seen as a safe choice when it is cost effective:

“safety will be a great one to do because it’s something that involves a high

level of participant base, a high universal base need. But if you are having to

update the content and that involves a huge cost each time every six weeks,

then it doesn’t become cost effective... For example, currently we are about to

launch a pre-induction component to our induction. So that’s quite an

effective way of one taking e-learning and incorporating it into a sort of

blended approach.” (Henrietta)

Stability of material, or the frequency with which material needs to be updated, is

also a factor which L&D professionals believe impacts whether something is

appropriate to develop into e-learning. Emily explains:


“whether or not the content is sort of stable in terms of you know, does it get

updated often and if it does, then it probably isn't that suitable for e-

learning.”

Induction material was identified as a particular type of content that would be

suitable for e-learning. This is because induction is generally knowledge-based

content rather than learning a new skill. In addition, induction training tends to be

compliance- or competency-based training, which generally doesn’t need to be put

into practice immediately and requires no interaction:

“Anything that's really just information based where you don't need to have

the two-way communication. And that really means that most of our courses

could really be converted over to e-learning ... most of our induction courses

could be e-learning.”(Annie)

Furthermore, compliance training courses often need to be delivered to a large

number of employees, which would be time consuming and costly if done via face-

to-face methods, but is time- and cost-efficient via e-learning. Percy explains:

“It’s just an information dump, and people can just read it. For example, we

had to get 5,000 staff to watch a particular video that went for five minutes,

and then obviously the benefits of putting it on LMS and then we can keep

track of who has and hasn’t watched it.”

Similarly, management courses, which tend to increase a manager’s knowledge base

rather than teach new skills, were identified as appropriate for e-learning:

“In terms of management development, you know things like code of conduct,

fraud and corruption. Those sort of policy and procedure type compliance —


that level of compliance driven, policy and procedural information knowledge

based—that can work quite well as well in terms of updates and briefings.”

(Henrietta)

Nature of content also relates to whether the content needs to be

straightforward and not too technical. The characteristics of a workforce such as rail mean that many employees, such as train drivers, don't use computers very often,

and if e-learning is too complicated they won’t get the maximum benefit. James

explains:

“If it’s straightforward stuff, it’s not too technical. It’s just, we don’t sort of

use computers that often here. Some blokes do but most of us in general

don’t. I’m speaking on my behalf, my personal experience, I’m not much of a

computer freak or whatever. There’s a couple that are. Just the basics. If it’s

pretty basic to use. I know enough to do what I need to do.”

Content relating to the use of technology was identified as appropriate for e-

learning. For office-based employees who use computers and systems every day, fill in online forms, and use numerous computer programs, it seems appropriate to train them using a technology-based format. Compared to soft skills, which generally require interaction and communication, this type of content was seen as more appropriate to deliver via an e-learning course rather than in a traditional face-to-face

environment:

“when you're doing that task, you're doing it at the computer, so doing it as

an e-learning module is probably quite effective. Whereas doing an e-

learning module on managing performance, you don't do that at a computer.


You do that with people ... Perhaps if it's people-based skills, or

communication-based skills, e-learning might be only part of a broader

training package for people.” (Flora)

Relevance

Relevance was the most common concern when participants were asked

about the type of training content they prefer to have delivered via e-learning. This

element refers to applicability of the information to the target audience, applicability

of the content to their job, and in a simulator environment how ‘real’ the training

was. For example:

“So as long as it’s relevant to the job and you know how to use it that’s the

main thing for me and you can identify it and actually carry it out in your

actual day to day work.” (Henry)

These comments were also supported by Thomas:

“I know that there's a lot of stuff out there that anyone can get into.

Everyone's got access to it now. But most of what they do is things like

computer program training, so how to use Excel or how to use Word properly

or whatever. That's all well and good if you work in an office. For our train

crew, not so much; there's not a lot out there that they have—that is valuable

to their role at this point in time, so it really needs to be targeted to the role.”

A potential problem is that if the training is not perceived as relevant, staff will be

less engaged. Thomas explains:

“For the average person in an office who uses Excel all the time or uses

Word, great. For train crew who never use Excel or Word, it's not targeted at

them. So you're not actually reaching out and engaging them, saying we've

got this really cool system, we want you to be involved in it and this is what's

in it for you. You're not getting buy in from them.”

As mentioned previously, information quality in the context of simulation

training is about the experience, and relevance equates to realism (how realistic the

simulator is). One of the interviewees explains further:

“As close to reality as possible I guess... The older ones... The tracks and

lines we were going through were fictional where they didn’t exist, whereas

the new [simulator] is computer generated, images in front of you are

actually based on actual lines. So you were already familiar with what’s

going on. I think it’s much improved. I think it gets back to the reality

aspect.” (Gordon)

The concept of the realism of simulators in information quality differs from

that of ‘real’ as presented in system quality. ‘Real’ in information quality refers to

the relevance of the content, such as the graphics; for example, whether the train

lines are the same as the ones the users will be driving after completing training. In

comparison, ‘real’ in system quality refers to the layout, functionality, and movement

of the simulators.

Ease of understanding

Regardless of the suitability of the content or the format of the information,

participants reported that if it is not easy to understand, the training is not considered

effective. A component of ease of understanding is the language used: it needs to be

plain English, appropriate both for the target participants and for the content being

trained. Annie commented on the relationship between ease of understanding and

successfully completing the course:

“I guess the outcome at the end, whether I can actually understand the

information and pass the assessment; that'd be part of [what] I'd consider

whether it's a reasonable sort of a program or not on e-learning. So if I was

not successful, I would question whether it was me or whether it was the

actual program.”

Interaction

This element was discussed in relation to interaction increasing engagement,

and whether the e-learning course keeps the trainee’s interest. It does not incorporate

how users interact with the system from a technical perspective, as this is essentially

navigation and format, which were categorised under system quality as discussed

previously.

Participants commented that as well as having appropriate and relevant

content, they also want instruction that is engaging and keeps their interest. Annie

explains:

“I would rate it by whether it kept my interest... so I can immerse myself into

it... I guess that comes to the interactive part. How interactive I am in it, you

know, whether or not I am just reading which usually is—well, for the two

courses that I saw it was read the page, click for the next page, do a couple of

model questions before undertaking your assessment. I didn't find it terribly

exciting, you know. It wasn't a really riveting topic and maybe if it had had a

bit more video, maybe, might have helped for the particular topic.”

Interaction also gives trainees an opportunity to practise the knowledge and

skills they are learning, and provides an opportunity to reinforce that knowledge.

This is shown in an example given by Percy:

“a lot of the e-learning models here, like you just basically refer to your

Powerpoint slides, and they ask some questions, and you think... I felt, I don’t

even know what I’ve read for the last 10 slides because I haven’t been paying

any attention. And then it’s just like a guessing game. You can usually get

most of them or if you only get one wrong, they’ll send you back and you do it

all again.”

Alignment

As with relevance, an organisation can have good quality information, but if

learning goals have not been clearly defined from the initial development stage, and

there is no alignment between the content, goals, and assessment, then the e-learning

system may not be successful. Emily explains further from an L&D professional’s

perspective:

“first of all being very clear on the learning outcomes and the learning

objectives that you kind of want out of the e-learning to begin with ... Then

making sure that all of the activities that you design within the e-learning

course contribute to meeting those objectives. I guess it comes down to really

making sure that you are giving them opportunities to develop the knowledge

they need ...”

Although assessment is usually an automatically incorporated component of

e-learning, Emily discusses why an assessment is not always necessary:

“a badly designed assessment is as bad as having nothing at all, really. I'm

actually of the thinking that an assessment at the end is not necessarily

always required if you've got relevant and meaningful interactivity

throughout the actual module. I mean an assessment can be good to provide

... the questions need to be well written ... they need to actually measure the

learning objectives that you sort of have in mind to begin with. A lot of

assessments just measure recall, really ... if they’re badly written questions

then they're not really going to be effective as an evaluation tool anyway”.

Although alignment between learning goals, outcomes, and content is

generated from L&D at the design stage, a number of users also discussed the

concept of alignment. Specifically, Edward highlighted that although they are often

judged on aspects such as time to complete training, this does not align with the

safety messages that are conveyed to them on a daily basis:

“According to the computer, you are judged on time. I mean, again, work

doesn’t want to judge you on time. They don’t want you to hurry up, work,

they want you to take your time. Safety is critical. So, I don’t know how they

mark people on time on the computer then.”

Content accuracy

The final aspect of information quality is the accuracy of content, which also

includes accuracy of assessments. Similar to legitimacy in system quality, a lack of

content accuracy can lead to a loss of trust in the system, content, and training

overall. Alfie explains problems he has encountered with accuracy of assessments:

“we actually had a lot of problems with the actual way it was set up in

regards to the assessment ... what happened was we had a different series of

questions ... From that base of 500 questions a candidate will get asked 100.

The questions will be multiple choice; select one answer or different

responses. None were written; no written response answers. From those

questions we had a lot of problems because the questions themselves didn't

have the correct answers, or they had multiple answers or other problems.”
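
The assessment design Alfie describes, in which a candidate is served 100 multiple-choice items drawn from a base of 500, lends itself to a simple automated check. The following is an illustrative sketch only, written in Python for this discussion; it is not drawn from Tracks’ systems, and the item fields are hypothetical. It shows how such a bank could be screened for the faults reported: items with no correct answer keyed, or with several.

```python
import random

def faulty_items(questions):
    """Return the ids of items that do not have exactly one keyed answer:
    the faults reported were questions with no correct answer available,
    or with multiple answers keyed as correct."""
    faulty = []
    for q in questions:
        keyed = [opt for opt in q["options"] if opt["correct"]]
        if len(keyed) != 1:
            faulty.append(q["id"])
    return faulty

def draw_assessment(bank, n=100, seed=None):
    """Draw n distinct items from the bank (e.g. 100 from a base of 500)."""
    return random.Random(seed).sample(bank, n)

# Hypothetical three-item bank used to demonstrate the check.
bank = [
    {"id": 1, "options": [{"text": "A", "correct": True},
                          {"text": "B", "correct": False}]},
    {"id": 2, "options": [{"text": "A", "correct": True},
                          {"text": "B", "correct": True}]},   # two keys
    {"id": 3, "options": [{"text": "A", "correct": False},
                          {"text": "B", "correct": False}]},  # no key
]
print(faulty_items(bank))  # -> [2, 3]
```

Run whenever the bank is updated, a check of this kind would catch such content accuracy problems before they reach trainees and erode trust in the assessment.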

Content accuracy in simulators becomes particularly important because it

relates to how accurate the ‘real’ experience is. The advantage of simulators is that

they are a direct replication of a real-life experience; however, if this accuracy is

lost, it defeats the purpose of simulators:

“...a good example would be braking into a station. You can imagine a

train—the pure size of it, the braking capabilities of it is lower than say, a

motorcar or something. With a simulator one thing you notice when you’re

coming to a platform at 90kms an hour, put the brakes on and you would be

stopping halfway along the platform. You do that in the real world … Choom,

out the other end you go.” (Gordon)

Participants commented that although the simulators offer a good replication

of their working environment, they are not perfect, and there are situations such as

braking that could be improved in order to ensure accuracy.

Research question 3: How does service quality apply in the context of

organisational e-learning and what is the nature of this factor?

Support quality is traditionally known in the D&M IS Success Model

(DeLone & McLean, 2003) as service quality. Service quality generally refers to the

level of service received by the users of the system and the way in which the service

is provided by the IS department or other provider. Similarly, participants spoke

about two clear groups of elements: types of support and expectations about that

support (see Figure 10). However, in an e-learning environment it is more accurate

to label this factor as support quality rather than service quality, because trainees are

looking for things that will support their learning and their use of the LMS. In other

applications of the D&M IS Success Model, such as e-commerce, service quality was

appropriate as it related to service to customers. Therefore, from this point on,

service quality will be relabelled as support quality for this research. Two key

elements emerged from the data, as shown in Figure 10, and an explanation of each

of these subthemes is provided in the following

sections. Additional example narratives relating to qualitative assessment of support

quality are displayed in Appendix 7.

Figure 10: Elements of support quality

Types

Participants were asked what support, in an ideal world, they think should be

provided when they complete an e-learning course. Responses included: no support

required, built-in support/glossary, tutorials, phone/hotline, L&D plan support,

comfort levels, and a trainer in the room. Each of these forms of support is

explained in further detail in this section.

No support required. A number of participants commented that when moving

away from a face-to-face to an e-learning environment, the technology used and the

simplicity and design of the e-learning system mean that ideally no support should

be required:

“In an ideal world, they wouldn't need any support. It should be simple

enough to use and/or simple enough that you can give them very basic

support and instructions.” (Thomas)

Built-in support/glossary/tutorials. From an L&D perspective, participants

reported that a focus of support should be technical assistance for the learners to be

able to initially access e-learning and then, once they have access, learning support

about the actual content:

“Support to be able to actually know how to access the e-learning; if it's

going to be via the LMS because that's another system that they're going to

have to know how to navigate in order to actually get access to the content

...” (Emily)

Furthermore, built-in tutorials were suggested by participants as a potentially

effective support mechanism, particularly for people who are not as computer literate

and require the guidance that would traditionally be provided in face-to-face training:

“For those people who aren’t as comfortable with technology, it definitely

helps to have a little intro as to how it works. I guess the other thing is

sometimes navigation all through the site. Even sometimes, as obvious as one

would think, so having a how-to thing ... I quite like that kind of pre-learner

thing. Where you know it has a little avatar and it takes you to this bit and

takes you to that bit, so I don’t actually have to read instructions. It [sic]

literally just kind of—it’s a visual demo.” (Henrietta)

Phone/hotline. The majority of interviewees suggested that a hotline or phone

number to call would be an essential type of e-learning support. Although e-learning

removes the physical face-to-face interaction, most participants would still like

access to someone they can talk to and ask questions of, rather than emailing or

looking up the answers in a built-in help function:

“I think if you had somebody on the other end of the phone pick up and say,

okay, I'm having problems actually operating or I don't understand how to

get into it, you need to have that facility.” (Thomas)

“I think there should be a hot phone number, where you can ring if you get in

any difficulty or you don't understand something, even though you've read it

three times, if you still can’t understand it there should be someone you can

ring up who's an expert on that e-learning topic and say, look, I just don't get

this, can you explain it to me in more simple terms.” (Billy)

Learning and development plan support. It was identified that, as well as

technical support, training plan support is important: knowing which courses are

available, how they are grouped together or complement each other, and being able

to choose those that are appropriate to the individual learner. Flora explains

further:

“... perhaps there's also [a need for] access to someone to say well I know

that I'm now on this new job and I need to do a set of training courses but I'm

not exactly sure which ones are suited to me. And I know you'll have some of

that conversation with your manager and with your team that you work with

but perhaps that's something for L&D as well, to offer some guidance around

which courses are relevant to you or that type of thing.”

Trainer presence. An advantage of the simulator system is that although it is

a highly technical form of training, the trainees never feel lost or alone as they still

have a trainer in the simulator to guide them. Gordon explains further:

“Well I think that the way they had it set up was pretty spot on. When we

were in the cab we had an instructor outside, like a little control room and

they were able to communicate to us through a loudspeaker within the cab

and we could talk to them and there was that support base there. If we had a

question right there and then you’d go, hey what’s this, and they’d speak to

us.” (Gordon)

In a traditional e-learning setting this can be achieved with a blended learning

approach, where e-learning is used as a complementary training method to face-to-

face instruction. One participant suggested that in a blended format trainees get the

best of both situations, and training content can be divided and presented depending

on what is most appropriate to the content.

Expectations

The participants had a number of expectations in relation to the support that

should be offered as part of e-learning; they want timely, easily accessible, and

effective/useful support, with knowledgeable support staff.

Timeliness. E-learning is often seen as a faster and more efficient method of

training that can be integrated into a normal workday. This means, however, that any

support that is required needs to be timely and prompt, otherwise employees will

become frustrated with e-learning. Examples of comments about timeliness include:

“The good parts would be it’s 24 hours, and it's in a format where you don't

have to wait for the person to come back to you, you know, two hours later.

Because if you're e-learning you really want the answer there and then on the

spot, so you can move onto the next module ... to me that’s the whole concept

of e-learning. I've got 10 minutes downtime, I want to spend some time on a

computer learning a program, but I don't want anything to interfere with it.”

(Billy)

“So it needs to be a pretty quick response ... that phone call is actually

answered in a timely way and not [sic]—because if you're sitting in front of

the computer and you're doing something and you've got a query, it's no good

getting a call back half an hour later because you're already in the middle of

you know, whatever the course is.”(Annie)

Knowledgeable. Particularly in the case of simulators, participants identified

that they would want to know that the trainers themselves know how to use the

simulators, to reassure the learner of the trainer’s ability to teach them. This is

potentially because a simulator is quite a technical environment which requires a

particular skills base. Similarly, if a trainee has a question about an e-learning course

they are completing, they want to know that the support person they contact will be

able to answer their questions:

“I would be looking for the person knows [sic] the content of the courses,

that they're a specialist in the area and can answer questions that come up.”

(Annie)

Effective/useful. A number of participants commented that they would rate

any support on how effective or useful it was. Furthermore, it was also identified that

support would be considered effective if the number of calls placed to support

reduced over time:

“I'm talking about the initial tutorials on the system. You could be rated on

the effectiveness of the tutorial, whether it was clear, precise. If you

understood it straight away. The need to review it multiple times [might

indicate a lack of effectiveness] because you didn't understand it because it

wasn't as clear.” (Bertie)

“I guess a good measure as well would be that the calls placed would reduce

over time. Because you're not then just answering one offs but you're

enabling users. That the answers you're giving are not just saying click here

but saying the reason to click here is because, you know, it means you're

doing this or it means you can access that, and so it's sort of giving those

quality answers that mean you understand what you're doing, not just being

told how to fix it this time.” (Flora)
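
Flora’s suggested measure, that calls placed should reduce over time, can be stated concretely. The sketch below is an illustration only (Python, with invented figures, not data from this study): it computes the trend in support calls per active e-learning user across reporting periods, where a negative slope is consistent with support that enables users rather than repeatedly rescuing them.

```python
from statistics import linear_regression  # Python 3.10+

def call_rate_trend(calls_per_period, users_per_period):
    """Slope of support calls per active user across successive periods;
    a negative slope suggests users are being enabled rather than
    repeatedly calling about the same problems."""
    rates = [c / u for c, u in zip(calls_per_period, users_per_period)]
    periods = list(range(len(rates)))
    slope, _intercept = linear_regression(periods, rates)
    return slope

# Invented figures: monthly calls fall while the user base grows slightly.
print(call_rate_trend([120, 95, 70, 60], [400, 410, 420, 430]))  # negative
```

Normalising by the number of active users matters here; raw call counts could fall simply because fewer people are using the system.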

Awareness of support. The final expectation identified by participants is that

they know that the support service is available. Any service is redundant if users

don’t know of its availability. This seems logical; however, the following discussion

with Annie suggests that a lack of awareness can occur:

“Interviewee: Yeah, now I don't know whether we do [provide support]. Do

we?

Facilitator: I'm not sure.

Interviewee: I'm not sure either and if I'm not sure that means if we do it we

don't advertise it very well.”

New factors which emerged during interviews

In addition to the previously identified system creation elements directly

related to the D&M IS Success Model (DeLone & McLean, 1992, 2003), which were

found to be important in an organisational e-learning environment, two further

elements were identified as factors integral to the success of an e-learning system.

These were learner preferences and change management.

Learner preferences

As well as their preference for the format of information, a number of

individuals also referred to their personal learning style, or learner preferences, as

impacting on their resultant learning from e-learning or satisfaction with e-learning

in general (see Figure 11). Although this incorporates some elements of previous

factors, particularly information quality elements such as pace and interactivity, the

following are more specific to the individual learner, equating to an individual

learner profile.

Although it is difficult to cater to the needs of every learner, it was identified that

individual needs should be considered at the design stage.

Figure 11: Elements of learner preferences

Preference for face-to-face training

Although participants recognised the benefits of e-learning, a number of them

still prefer face-to-face training. This was not always due to the technology, but for

reasons such as networking possibilities at traditional training and the preference to

be away from the work environment when undertaking training. As previously

discussed, in an e-learning context this can be provided for by integrating e-learning

into a blended approach:

“Because I'm a real people person, I guess, and I really, whenever I've gone

to training courses I really enjoy being out of the office and I feel that I

concentrate then better on what I'm there to focus on and learn. I always meet

one or two new people that I then add to my professional network.” (Flora)

Again, this is an advantage of simulator training. Although it is

technologically based, there is still a component of face-to-face interaction that

trainees desire:

“A lot of people like it because it's not classroom based, it's not the stodgy

old sit in the classroom, there's a lecturer at the front and they talk, talk, talk

and then at the end of the day you absorb it. So it really lends itself to the

learning style of the people that we train ... Train crew aren't wired—train

crew make good train crew because they're touchy-feely learners. They're not

an auditory learner as such, as much as other areas are.” (Thomas)

Hands-on learning

A further benefit of a blended approach is that it allows for hands-on learning.

As was previously identified in ‘nature of content’, some content is seen as more

appropriate for e-learning than others, and participants identified that some content,

particularly that which involves communication, really requires hands-on experience:

“There are obviously some courses that would benefit more from a hands-on

component of it as well. That could be a course split physically into two,

where you do ... it in an e-learning environment and then follow up with a

hands-on course of training after that as well ... Visual representations of

how to do things followed by doing it themselves would be useful. It's the old

see and do training technique. So you see it first and then you do it.” (Bertie)

Presentation of information

This element refers to the alignment between the way the information is

presented and the learning needs of the e-learning user. In order to get the best

outcomes from the training, the e-learning course needs to be designed to meet all

preferred learning styles, including the opportunity to read information, listen to

information, and practise with activities to reinforce the learning:

“The learning needs and a clear kind of understanding of the user from its

technical perspective ... It goes beyond just ... dumping [content] into a [sic]

e-learning environment ... But if you can move beyond that and use a wide

range of techniques to address all components of the learners’ needs. Such

as, you know, the VAK—visual audio kinaesthetic—if possible then that can

make a very ... be a very appropriate tool ...” (Henrietta)

Individual differences

Rather than different learning styles, this element relates to individual

differences that could potentially impact on learning. For example, different cultural

or ethnic backgrounds could affect the level of language skills or the type of

language used. Furthermore, difficulties such as hearing or visual impairments need

to be taken into account in the design of e-learning courses. Alfie

explains:

“But, when you think about it, people who were training may come from

different backgrounds whose English may not be their first language. They

may have hearing difficulties. They may have learning difficulties that we're

not aware of. So all these things have got to be taken into consideration when

you're, basically, pushing an e-learning type tool, training tool, onto

particularly [sic] employees.”

Change management

The second factor which emerged during the interview process was the

concept that e-learning is situated within the organisational context of change

management (see Figure 12). The process of deciding to use e-learning, followed by

the design, implementation, and evaluation, was seen as the most important part of

change management, particularly by the L&D professionals. An e-learning champion

was also seen as an element of this factor.

Figure 12: Elements of change management

General process

Deciding to use e-learning within an organisation to deliver training is no

different from the implementation of any other business initiative. It requires the

development of defined business processes and protocols to ensure everyone

understands the reason for implementation, the key goals to be met, and the roles for

each person involved in the e-learning system. Daisy describes the process of e-

learning at Tracks from an L&D perspective and the importance of defining the

process from the first day the organisation decides to include e-learning in their

training initiatives:

“For me, a good e-learning system is all about the process. Because if you

don't have the process set up correctly, you can't even produce an e-learning

course, basically. And we started off with a methodology but we haven't

really followed it through and we're still refining it and I think we're still

trying to figure out if that's the correct one for us or something. But what we

actually lack here at the moment is defining that clear process. And that

process will actually start from the business, content-wise, you know the

purpose of it, is it best to actually have it online or face-to-face, all that kind

of stuff. Going right through to if there's an assessment attached.” (Daisy)

Similarly, Emily describes how important the process is to the development

of e-learning:

“the process that sort of underly [sic] the development and use of the system

is really important as well ... If you actually don't have that down pat, you

don't have an e-learning [sic]. For me, that is a whole system which we're yet

to really tie down here so that would make a good e-learning system.”

(Emily)

Vendor process

A specific part of the e-learning process is the coordination with external

vendors when they are involved in developing the modules. The first step of the

process is the decision to use external vendors rather than develop e-learning in-

house. The next step is to ensure that guidelines are set and can be communicated to

the vendor to ensure the desired quality is met. These guidelines also become

important for future e-learning courses to ensure that there is consistency in the look

and quality of the suite of e-learning courses that the organisation offers to

employees:

“Also in working with an external vendor, just to make sure that, you know

they're actually, they've got solid you know design and development processes

in place ... The guidelines and everything that we're establishing is good for

that purpose as well ... So internal stakeholders know what to look for, even

when they're reviewing deliverables from external vendors and what baseline

standards should be.” (Emily)

Content development process

This element involves the business process of deciding what training content

will be developed into e-learning. This element encompasses wider aspects than just

choosing the information to develop into e-learning. For example, who is responsible

for choosing the appropriate content, what steps need to be taken to develop an initial

idea into an e-learning module, and what processes are put in place to ensure content

is updated to remain current and relevant. Daisy discussed what the current process

at Tracks entails:

“What they're trying to do here is adapt e-learning to that current process

whereby they have to fill out, what is now known as a PIP form so the

business actually fill out their need and requirement for learning and they'll

work with the training manager to define whether it should be face-to-face or

online ... there's steps after that for how you'll build it. So you've got [a]

storyboard and all that kind of stuff that follows.” (Daisy)

Even when processes are established, an issue can arise if all stakeholders are not

aligned in their understanding of what is required of them or how they fit into the

process. This can stem from a lack of education of all stakeholders when initially

implementing the e-learning system. Emily discusses how this relates to the situation

at Tracks:

“In terms of the training managers and even the business stakeholders, they

don't necessarily always realise that their role is to actually ... check that the

content is correct. Because it's not actually just as easy as you know, doing

text update and then—there's a whole other process associated with it. The

text itself might be easy to update but then the whole process around

replacing courses and archiving old versions and all of that sort of thing, that

still hasn't quite been worked out.” (Emily)

Evaluation processes

As previously noted, some elements of e-learning at Tracks are in their

infancy, and thus evaluation processes have not been developed to any large extent.

However, evaluation processes are as important as initial implementation processes

in order to ensure that the desired outcomes are obtained from the training. Emily

describes the evaluation process challenges they currently face at Tracks:

“Evaluation processes ... we don't really know at this stage, how often we’re

going to review and evaluate e-learning courses and even evaluating the

feedback, the user feedback evaluations. There's not really a process around

that. We get the data but what do we do with the data? How often do we sort

of collect the … [data] … and then actually make changes. How often do we

release updated versions of the modules ... Also a process around even

deciding what changes are appropriate and that again needs to go back to

the training managers.” (Emily)

E-learning champion

A champion is someone who promotes or advocates a new

idea or initiative in order to create understanding and confidence. These champions

are often at a peer level rather than advocating top-down from management. New

initiatives, and especially new technologies, can seem daunting to employees who

are not familiar with them; however, participants reported that peer champions can

help to provide support and assurance, building trust and easing uncertainty:

“Even having, I think for other systems they've got like system champions and

stuff. Like high-end users who use the system very well and therefore act as

an advocate in their divisions. That sort of thing is usually effective because it

becomes oh, one of us. Not someone from somewhere else telling us we

should do X or Y.” (Flora)

Similarly, it was suggested that getting people comfortable with basic e-

learning before engaging with a large number of e-learning courses is important to

build up the confidence of employees:

“That's really the idea behind, it was [an e-learning team member’s] idea

actually to internally develop some stuff with them and kind of take them

through the process with a lot more support so that they become a bit more

enthusiastic ... I guess with anything that's new, especially technology-

related, could be a bit of a black box for the people who aren't used to using

it. So, just getting their comfort levels up a little bit more.” (Emily)

At a higher level, new training initiatives need to be not only promoted but also

supported by upper levels of management. To do this, management needs to have a

clear understanding of the importance of training and the value of using e-learning.

The attitudes of management will potentially filter down to other employees and

impact the success or failure of training initiatives such as e-learning. Thomas

discussed why training champions at the management level are so important, and

why training is often seen as the ‘poor cousin’ in organisations:

“I think the biggest issue is going to be you've really got to champion this

stuff out amongst the business because training's always been the poor cousin

... people see, particularly management, see training as a necessary evil. We

have to do it to be compliant and make sure that our people meet the bare

minimum, but that's all it is and it's a cost, because we have to take people off

the roster for a day to go and do some training. So I don’t think the vast

majority of management, in particular who have never been involved in

training, have a particularly good view of training or understand its

importance and that will be bundled in or bundled on to the simulator stuff

because they don’t understand what we're aiming at ... there's huge potential

there ... So in the long-term we're going to have to sell it and really make

people understand what that value is to the business and it's not just a

monetary value, not at all. In fact, monetary wise, you're probably no better

off than having a paper assessment.”

Chapter Summary

This chapter has presented the findings as they relate to the research

questions of this study. The chapter began by outlining the participants and their key

attributes, followed by an overview of the context of this research—e-learning at

Tracks.

It has been shown that the three system creation elements—system quality,

information quality, and support quality—are important in an organisational e-

learning context. The dimensions of these elements have been outlined in detail to

create a comprehensive picture of system creation in an organisational e-learning

context. In addition, two further factors—learner preferences and change

management—also emerged from the data.

A discussion of the results as they relate to relevant literature will be

presented in the following chapter, as will limitations and implications of the results.

Chapter 5—Discussion and Conclusion

Chapter Overview

This chapter provides a final discussion and interpretation of the research

findings in relation to the research questions and the literature presented in Chapter

Two—Literature Review. The contributions to theory are discussed followed by the

practical implications for organisations to improve the success of their e-learning

initiatives. The limitations of the study are recognised along with recommendations

for future research into the use of e-learning in organisations. This chapter concludes

by providing an overall summary of this thesis. The chapter structure is outlined in

Figure 13.

Figure 13: Chapter Five Structure

Review of research framework

The overall purpose of this research was to determine: what are the critical

elements to evaluate the success of e-learning initiatives in an organisational

setting? An extensive review of the literature on e-learning, traditional L&D, and IS

evaluation identified the need to further explore the applicability of broader IS

theories to the specific context of organisational e-learning. Based on this literature a

research framework was built to guide this research. The framework was presented in

Chapter 2 and is revisited in Figure 14.

Figure 14: E-learning success research framework [figure: system creation (system quality, information quality, and service quality) leads to system use (intention to use, use, and user satisfaction/Level 1 reaction), which in turn leads to system consequences (net benefits: Level 2 learning, Level 3 behaviour, Level 4 results, and Level 5 ROI)]

The research framework presented the three components which are suggested

as critical to e-learning success: system creation, system use, and system

consequences. The focus of this research was to investigate the system creation

elements in an e-learning context, as guiding factors which have the potential to

impact on system use and system consequences. From this research framework three

research questions were derived:

Research Question 1: How does system quality apply in the context of

organisational e-learning and what is the nature of this factor?

Research Question 2: How does information quality apply in the context of

organisational e-learning and what is the nature of this factor?

Research Question 3: How does service quality apply in the context of

organisational e-learning and what is the nature of this factor?

Some of the findings of this study are consistent with the literature, which is

predominantly based in education; however, this is one of very few studies situated

in an organisational context. These findings therefore extend previous research and

add depth of understanding to the gap in the literature about the evaluation of

e-learning in organisations. The findings of this research as they relate

to the research questions and the relevant literature will be discussed in the following

section.

Discussion of Findings and Theoretical Implications

System quality

Research question one asked ‘how does system quality apply in the context of

e-learning and what is the nature of this factor?’ This research found that the

elements critical to system quality in an organisational e-learning context are

structure, accessibility, legitimacy, flexibility, functionality, ease of use, and long

term knowledge resource (refer to Figure 8 in Chapter 4).

Technical elements such as structure, accessibility, and functionality indicate

support for the definition of system quality in a traditional IS context which states

that system quality measures technical success—the desired characteristics of the

system itself, which produces the information (DeLone & McLean, 1992, 2003;

Nielsen, 2005). However, elements such as legitimacy and long-term knowledge

resource seem to be specific to the organisational e-learning context. An overview of

the overlap between elements found in this study and existing studies is presented in

Table 12. In instances where existing studies refer to a similar concept by a different

name, the name is provided in brackets for ease of reference.

Table 12

System Quality—Comparison of current study results to existing studies

The comparison studies and their contexts are: DeLone & McLean (1992, 2003), traditional IS; DeLone & McLean (2004), e-commerce; Holsapple & Lee-Post (2006) and Lee-Post (2009), educational; Wu & Wang (2006), organisational (knowledge management system success); Wang et al. (2007), organisational; Chen (2010), organisational. Where a study refers to a similar concept by a different name, that name is shown in brackets.

Structure: Wu & Wang (2006) (User-friendly interface); Chen (2010)

Accessibility: DeLone & McLean (2004) (Availability); Wang et al. (2007) (Availability)

Legitimacy: no comparison study

Flexibility: DeLone & McLean (1992, 2003); DeLone & McLean (2004) (Adaptability)

Functionality: DeLone & McLean (1992, 2003); Wang et al. (2007)

Ease of Use: DeLone & McLean (1992, 2003); DeLone & McLean (2004) (Usability); Holsapple & Lee-Post (2006) and Lee-Post (2009) (Ease of use; User-friendly); Wu & Wang (2006); Wang et al. (2007); Chen (2010) (Buttons for operations are clearly and easily understood)

Long-term Knowledge Resource: no comparison study

Typically, in a traditional IS context, system quality is measured in terms of

“ease-of-use, functionality, reliability, flexibility, data quality, portability,

integration, and importance” (DeLone & McLean, 2003, p. 13). As can be seen in

Table 12, there are similarities between the findings of this study and a number of

these elements, namely ease-of-use, functionality and flexibility. This seems logical,

as regardless of the type of system, users could expect to require a system that is not

difficult to use, functions as intended, and is flexible enough to meet the needs

specific to the learning situation in which it is being used. The remaining elements of

structure, accessibility, legitimacy, and long-term knowledge resource, were found to

be of importance in this study, but these are not traditionally measures of system

quality.

Participants spoke mostly about structure contributing to the quality of a

system. Although at first glance this may seem similar to functionality, structure

refers to how the e-learning is structured, the modules and their order, an agenda to

guide learners, navigation around the modules and the LMS, and the format of the

LMS. In comparison, functionality refers to whether the system functions as

intended, and whether technical issues prevent the user from using the system and

navigating the LMS and/or e-learning modules to their satisfaction. Structure was

included as an element of the studies by Wu and Wang (2006) and Chen (2010),

although referred to as ‘user-friendly interface’ by Wu and Wang. Interestingly, these

two studies were also based in an organisational context, building support for

structure as a critical component of system quality in an organisational e-learning

context.

When comparing the system quality findings of this research to more recent

studies, a number of further similarities and differences were found. A key similarity

is that all eight comparative articles reference ‘ease of use’, or similar terms, such as

usability (DeLone & McLean, 2004), user-friendly (Holsapple & Lee-Post, 2006;

Lee-Post, 2009), and buttons for operations are clearly and easily understood (Chen,

2010) as elements of system quality. Further similarities were seen with participants

reporting the desire for ‘flexibility’ in a system; flexibility is an element which was

originally used as a measure in the studies by DeLone and McLean (1992, 2003,

2004). However, this factor has not been included in more recent studies. This is

concerning, as a key benefit of e-learning is its potential to be a flexible delivery

option for training. Furthermore, the andragogical principle of self-concept of the

learner (Knowles et al., 2005) suggests that adult learners want to be responsible for

their own learning, and flexibility in the system provides a mechanism for

independent, self-directed learning.

A key difference between existing research and this study is the discovery of

‘legitimacy’ and ‘long-term knowledge resource’ as two new key elements of system

quality. Legitimacy relates to a feeling of trust on behalf of the e-learning user in the

organisation delivering the e-learning, and the authenticity of the e-learning content.

This element encompasses aspects such as voice-over accents, branding, and

assumptions that are made about the legitimacy of the e-learning depending on these

aspects. Participants voicing concerns about the legitimacy of e-learning could be

attributed to the large number of new technologies given to employees, and a desire

to know ‘this isn’t just another piece of technology’. Similarly, the functionality of

the LMS as a long-term knowledge resource provides a benefit to employees that

they had not previously foreseen when using other forms of new technologies. Users

want the ability to create their own ‘virtual filing cabinet’ of

resources, training, and reference materials, and see e-learning as having the

potential to provide this ongoing resource.

Information quality

The second research question of this study investigated ‘how does

information quality apply in the context of e-learning and what is the nature of this

factor?’ This research revealed that ease of understanding, content accuracy,

relevance, interaction, alignment, format, and nature of content are elements critical

to information quality in an organisational e-learning context (refer to Figure 9 in

Chapter 4). Table 13 provides an overview of the comparison between these

elements and those relating to information quality in existing studies. As in Table 12,

similar concepts by a different name are provided in brackets. A discussion of these

comparisons follows.

Table 13

Information Quality—Comparison of current study results to existing studies

The comparison studies and their contexts are as in Table 12. Where a study refers to a similar concept by a different name, that name is shown in brackets.

Ease of Understanding: DeLone & McLean (1992, 2003) (Understandability); DeLone & McLean (2004); Holsapple & Lee-Post (2006) and Lee-Post (2009) (Clearly written); Wu & Wang (2006) (Logical & fit; Understandable & practical); Wang et al. (2007); Chen (2010) (Precise & clear)

Content Accuracy: DeLone & McLean (1992, 2003); DeLone & McLean (2004) (Completeness); Wu & Wang (2006) (Consistent wording); Chen (2010)

Relevance: DeLone & McLean (1992, 2003); DeLone & McLean (2004); Holsapple & Lee-Post (2006) and Lee-Post (2009) (Up-to-date; Useful); Wu & Wang (2006) (Available at a time suitable for its use; Important and helpful; Meaningful); Wang et al. (2007) (Exactly what you need; At the right time; Relevant to job); Chen (2010) (Complete & sufficient; Helps solve my problems)

Interaction: no comparison study

Alignment: no comparison study

Format: DeLone & McLean (1992, 2003); Holsapple & Lee-Post (2006) and Lee-Post (2009) (Effectively presented)

Nature of Content: no comparison study

Traditionally described as the measurement of output from the system,

information quality stresses characteristics of the information and the way it is

presented according to the needs of the users (Nielsen, 2005). Previously,

information quality was defined as quality of the content, accuracy, precision,

currency, reliability, timeliness, completeness, relevance, and format required as

perceived by the end user (DeLone & McLean, 2003; Negash et al., 2003; Nielsen,

2005). However, an important part of the explanation by Nielsen (2005) discussed

earlier (‘according to the needs of the users’) implies that, depending on the type of

user, there may be different measurements of information quality. In the case of this

research, it was identified that organisational e-learning users potentially have needs

different to those of educational e-learning users or traditional IS users that have

dominated prior research.

A key distinction that emerged during the analysis relating to information

quality was the difference between traditional e-learning environments and simulator

environments. Although simulators are defined as a type of e-learning at Tracks, the

application of the information quality element is different to that of more traditional

forms of e-learning. For example, information quality in traditional e-learning relates

to the output from the system, and generally refers to training content. In a simulator

environment, the training content becomes the simulation experience and the impact

on the trainee in terms of how ‘real’ the experience is perceived to be. Nichols (2003,

p. 2) clarifies the difference between the two types of interactivity:

“there are two types of interactivity, indicative and simulative. Indicative

interactivity is typified by the use of button rollovers and site navigation.

Clicking a button to start an animation or turn the page is indicative

interactivity. Simulative interactivity is interactivity that enables students to

learn from their own choices in a way that provides some form of feedback.

The ability to select between different Web pages is indicative interactivity;

the ability to fly a virtual plane in a realistic virtual environment is

simulative interactivity.”

Although the application is different, participants generally spoke about the

same elements. For example, both traditional e-learning users and simulator users

spoke about relevance and content accuracy. As can be seen in Table 13, a number of

the elements that emerged as critical for information quality in this study are also

findings in all comparison studies. A number of new elements were raised by

participants, which may be due to the needs of these particular e-learning users.

‘Ease of understanding’ and ‘relevance’ were two themes that emerged when

participants were asked how they would rate the quality of information delivered via

an e-learning system. This is in line with previous research that included these or

similar elements as constructs of information quality in their studies. Ease of

understanding (DeLone & McLean, 2004; Wang et al., 2007) was also referred to in

previous studies as understandability (DeLone & McLean, 1992, 2003), clearly

written (Holsapple & Lee-Post, 2006; Lee-Post, 2009), logical and fit,

understandable and practical (Wu & Wang, 2006), and precise and clear (Chen,

2010). Relevance was originally used in DeLone and McLean’s research (1992,

2003, 2004); however, the wording has changed more recently to ‘up to date’ and

‘useful’ (Holsapple & Lee-Post, 2006; Lee-Post, 2009), ‘available at a time suitable

for its use’, ‘important, helpful, meaningful’ (Wu & Wang, 2006), ‘exactly what you

need’, ‘at the right time’ (Wang et al., 2007), and finally ‘helps me solve my

problems’ (Chen, 2010). Regardless of the label assigned to these constructs, they all

relate to the relevance of the information to meet the organisational e-learning users’

needs.

These results are in line with Delahaye and Smith’s (1998) learning principles

unique to mature learners. These authors proposed that material must be meaningful

to the learner. This means that it must be ‘relevant’ to prior experiences and to future

learner needs. As suggested by Waight and Stewart (2005a), relevance is important

in the learning process to assist learners to transfer their knowledge and skills to the

workplace. Furthermore, many of Delahaye and Smith’s (1998) other principles,

such as spaced learning, feedback, reinforcement, and primacy and recency, all work

to ensure ease of understanding for the learner. These results add further support for

the applicability of the fundamental adult learning assumptions and principles to the

modern e-learning context.

Some clear differences were found between the results of this study and

relevant literature. Interaction, alignment, format, and nature of content were all

found to be elements crucial to information quality in the context of this study, but

were largely unrecognised in existing studies.

Interaction in e-learning has attracted increased attention recently (see

Derouin et al., 2005; Garavan, Carbery, O'Malley, & O'Donnell, 2010; Salas,

Kosarzycki, Burke, Fiore, & Stone, 2002). However, the discussions are mostly

limited to student-to-student interaction and student-to-instructor interaction (see

Muilenburg & Berge, 2005; Picciano, 2002; Sun et al., 2008). How students interact

with the training material and whether it engages them (which is the basis of this

element) has been researched to a lesser degree. However, this element does not

incorporate how users interact with the system from a technical perspective.

Interaction was found to be important, as participants indicated that interaction both


helps to keep them engaged in e-learning, and helps them practise skills to reinforce

knowledge learnt. Once again, these findings are consistent with Delahaye and

Smith’s (1998) principles of active learning (which suggests learners should be

actively engaged in the learning process), and overlearning (learners’ experiences

need to encourage practice beyond the level of perfect recall).

Surprisingly, the format in which information can be presented in e-learning

(for example using text, diagrams, questions and answers, and videos) was not

incorporated in many of the existing measures of information quality in comparative

studies (see Table 13). It was, however, identified by participants of this study as a

critical component of information quality. Studies such as Holsapple and Lee-Post

(2006) and Lee-Post (2009) interpreted information quality as dimensions to evaluate

the course content (for example clearly written, length, usefulness and currency).

This research has shown that the decisions around the format of information to

include in e-learning are equally important. Furthermore, regardless of the type of

format, a key finding was that participants would like a blend of a number of

different types. In an organisational context, blended solutions allow organisations to

take advantage of the inherent benefits of both face-to-face training and e-learning.

In addition to the format of information, the ‘nature of content’ is an equally

important consideration. This element refers to knowledge-based versus skills-based

content, stability of content, safety-critical nature, and location of trainees. The L&D

professionals in particular commented that these issues all need to be considered

when deciding whether content is appropriate for e-learning. Prior studies did not

include this as an element of information quality. However, when assessing e-

learning quality from a holistic perspective—as this study aims to do—a factor such


as content appropriateness for e-learning becomes just as important as technical

aspects of the information.

Alignment was the final element identified by participants as an important

aspect of information quality. Again, this element was not recognised in literature as

an element of information quality. Comparative studies measure information quality

from the perspective of the user, for example:

“the e-learning system provides information you need at the right

time” (Wang et al., 2007, p. 1804);

“the information provided by the e-learning system helps to solve my

problems” (Chen, 2010, p. 1637); and

“the knowledge or information ... is meaningful, understandable, and

practical” (Wu & Wang, 2006, p. 737).

Alignment, however, relates to the development of the information before it is

delivered to the users. Alignment in organisational e-learning relates to the links

between learning goals, content, and assessment. For example, it was identified in

this study that in order for assessment to be effective, it needs to be well written and

directly assess the learning goals and content. This concept of alignment has been

researched extensively in the educational arena. Biggs (2003, p. 27) states that a successful teaching system aligns learning outcomes, learning and teaching activities, and assessment. This system is called constructive alignment (Biggs, 2003, p. 25). Participants, particularly the L&D professionals, reiterated the need for this alignment if organisational e-learning is to be successful.
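To make this idea concrete, constructive alignment can be viewed as a mapping that is checkable at design time: every learning goal should be covered by content and directly assessed. The short sketch below is illustrative only; the course data and the checking function are hypothetical and are not drawn from this study or from Biggs (2003).

    # A minimal sketch of a constructive-alignment check; all data are hypothetical.
    course = {
        "goals": ["identify common hazards", "apply the reporting procedure"],
        "content": {
            "identify common hazards": ["hazard video", "photo-based quiz"],
            "apply the reporting procedure": ["procedure walkthrough"],
        },
        "assessment": {
            "identify common hazards": ["scenario test item 1"],
            # no assessment item yet maps to "apply the reporting procedure"
        },
    }

    def alignment_gaps(course):
        """Return (goal, gap) pairs where a goal lacks content or assessment."""
        gaps = []
        for goal in course["goals"]:
            if not course["content"].get(goal):
                gaps.append((goal, "no content"))
            if not course["assessment"].get(goal):
                gaps.append((goal, "no assessment"))
        return gaps

    print(alignment_gaps(course))  # [('apply the reporting procedure', 'no assessment')]

In this toy example the check flags the second goal, signalling that an assessment item should be written before the module is released.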


Support quality

The third and final research question investigated ‘how does service quality

apply in the context of e-learning and what is the nature of this factor?’ A key

finding of this study concerns the terminology used for this element. Referred to as service

quality in the updated D&M IS Success Model (DeLone & McLean, 2003), the term

‘support quality’ was found to be more appropriate in this study. Service quality has

generally referred to the level of service received by users of the system, and the way

in which the service is provided by the IS department or other provider. Although

still correct in an organisational e-learning context, the term ‘support’ was deemed

more accurate than ‘service’. Indeed, in an e-learning environment, trainees are

looking for things that will support their learning and support their use of the LMS.

As such, this research reports on support quality; however, when referencing prior

studies it incorporates service quality measures.

In prior research, support quality has principally been measured as user

satisfaction with the service provided (DeLone & McLean, 2003; Pitt et al., 1995).

User satisfaction is also a separate factor of the DeLone and McLean model (2003),

and was retained as a system consequence element in this study’s research

framework. Thus, an investigation into the nature of service quality as a standalone

factor was undertaken. From this investigation, two key elements emerged from the data: support 'types' and 'expectations'. Table 14 displays an overview of the comparison between

these elements and those relating to support quality in existing studies. As per Table

13, similar concepts by a different name are provided in brackets. A discussion of

these comparisons follows.


Table 14

Support Quality—Comparison of current study results to existing studies

Comparison studies and their contexts: DeLone & McLean (2003), traditional IS; DeLone & McLean (2004), e-commerce; Holsapple & Lee-Post (2006) and Lee-Post (2009), educational; Wu & Wang (2006), organisational (knowledge management system success); Wang et al. (2007), organisational; Chen (2010), organisational. Wu & Wang (2006) and Chen (2010) did not assess support quality. Where a comparison study measured a similar concept, it is marked X with the study's label in brackets.

Types
- No support required
- Built-in support/glossary/tutorials: Wang et al. (2007), X (Online assistance & explanation)
- Phone/hotline
- L&D plan support
- Trainer in room

Expectations
- Timeliness: DeLone & McLean (2003), X (Responsiveness); DeLone & McLean (2004), X (Responsiveness); Holsapple & Lee-Post (2006) and Lee-Post (2009), X (Prompt; Responsive)
- Knowledgeable: DeLone & McLean (2003), X (Assurance); DeLone & McLean (2004), X (Assurance); Wang et al. (2007), X
- Effective/Useful: X (Effective support overall)
- Awareness of support


A notable omission in prior research is the analysis of the nature of support quality. Comparative studies tend to focus on users' satisfaction with the support provided; however, they are yet to detail what support users desire or their

expectations about that support. As can be seen in Table 14, none of the prior studies

reviewed explicitly state that type and expectations are two critical considerations of

support quality.

This research acknowledges that the types, and therefore the expectations, of

support quality will differ depending on the context or organisational situation. Even

though some of these elements may be context specific, others may also be

generalisable to other contexts. A key finding of this research is that participants

desire access to a number of different support types depending on their situation or

personal needs. For example, built-in support or tutorials, a phone number or hotline

to call, or a trainer in the room. Wang et al. (2007) similarly identified online

assistance as an item of support quality.

In contrast, a number of participants in the case organisation suggested that if

the e-learning is of a high enough quality then no support should be needed. This is

congruent with the andragogical principle of 'self-concept of the learner' (Knowles

et al., 2005). The principle suggests that adult learners want autonomy in their

learning experience. The perception of e-learning is that it allows learners complete autonomy; as such, learners either should not need, or do not want, support—to be effective, the system should not require it.

An interesting finding, which was also not identified in prior literature, was

the desire for L&D planning support. This goes beyond purely technical support

whilst using the modules to include professional development support. This type of

support could potentially educate learners about the courses that are available, how


they are grouped together or complement each other, and how to choose those that

are appropriate to the individual learner.

Similarities between prior research and results of this study were found when

examining the expectations elements that participants identified as important. Again,

these elements depend on the context and types of support that are provided. For

example, timeliness would be critical if the support was a hotline, but would not be

relevant if there was built-in support or tutorials. Timeliness was identified as the

most important expectation of support quality. Similarly, other studies identified

factors such as responsiveness (DeLone & McLean, 1992, 2003; Holsapple & Lee-

Post, 2006; Lee-Post, 2009), and promptness (Holsapple & Lee-Post, 2006; Lee-Post,

2009) as measures of support quality. Furthermore, having knowledgeable support staff was seen as an important factor both in this study and in Wang et al.'s

(2007) measure of support quality. Overall, this research has shown that support

quality does apply in the context of organisational e-learning, and that the breadth of

elements to consider is greater than that presented in prior studies.

New Factors which Emerged During Analysis

In addition to the existing system creation elements which were expected to

apply in an organisational e-learning environment, two further elements were

identified in the thematic analysis as factors integral to the success of an e-learning

system. These were learner preferences and change management.


Learner preferences

Like support quality, learner preferences are specific to the individual learner,

and therefore it is expected that the elements of learner preferences will differ

depending on the context and demographic of employees. Learner preferences relate

to an individual learner profile: how an individual’s learning style or characteristics

could potentially impact on learning or satisfaction with e-learning in general. These

individual characteristics could be things such as age, gender, ethnic background,

and disabilities. Although elements of system quality and information quality appear

to be relevant to this factor, learner preferences is more an acknowledgement that

each learner is different, and these learner differences need to be considered at the

system creation stage of e-learning. In this case study, four elements emerged from

the data as critical to learner preferences. These were a preference for face-to-face

training, hands-on learning, presentation of information, and individual differences.

Delahaye (2011) suggests that diversity in individuals in the workplace

should be celebrated, and that this diversity can provide options and perceptions that

can enhance the organisation's future. Due to its nature, e-learning, more than any other training method, has the ability to cater to diversity and individual learner

differences. For example, e-learning can provide the ability to have verbal

explanations, diagrams, written word, or videos to facilitate learning, selected by the

learner according to their preferences. Sections not understood can then be replayed

or revised as often as required. The flexibility that e-learning provides has the

potential to result in improved learning outcomes for all trainees.


Change management

Change management and the processes surrounding change (the decision to engage in e-learning as a training method, and the design, implementation, and evaluation of e-learning) were seen as among the most important factors of a successful e-learning initiative. The L&D professionals, who in many cases are the

individuals most impacted by these processes, displayed the highest level of concern

surrounding the need for change management to be considered. Five elements were

identified as critical to change management in an organisational e-learning context:

general process, vendor process, content development process, evaluation process,

and e-learning champion.

Recognising that workplace learning is a unique setting and distinctly

different to educational settings, Rylatt (2000, p. 6) proposed eight principles to help

guide organisations that want to truly engage in workplace learning and develop the

capacity of their people. These principles are fundamental to workplace learning and

are important to consider regardless of the training medium. Reflecting on the results of this study, and the elements that emerged from the data regarding change management, these principles are also appropriate to consider in an e-learning context. These

principles are outlined in Table 15.


Table 15

Workplace learning change principles (Rylatt, 2000, p. 6)

1. Must be greater than change: To combat the effects of change on an organisation, it is argued that learning must be rapid and continuous. The learning must not just keep pace with change, but be able to go beyond the basic expectations of coping with change, to a point where individuals use learning to capitalise on change.

2. Must be systematic and interactive: Successful workplace learning requires well planned and integrated inputs to ensure a high quality of design, delivery, and assessment. Successful workplace learning results in positive business outcomes, competency improvement, and highly satisfied people. A lack of attention to inputs will quite likely result in undesired and unplanned chaos. In addition to being systematic, workplace learning requires constant nurturing with an interactive approach blending strategies, policies, programs, and resources.

3. Must be geared to business outcomes: Clear, strong, and robust linkages between workplace learning outcomes and both short- and long-term business needs of the organisation are required. To meet this requirement, all planning processes must understand and reflect vital internal and external business issues.

4. Must provide meaning, self-worth, and sustenance: Workplace learning must focus on a wider range of issues than traditional training material, for example, increasing self-esteem and self-worth, conflict resolution, career planning, and maintaining emotional and physical wellbeing. In addition, these opportunities must be available to all employees.

5. Must be learner-driven: Individuals will more likely see the benefits of learning and training if the integrity and mental diversity of each person is valued. Taking time and effort to invest in learning will help individuals develop a clearer understanding of their learning potential.

6. Must be competency-based: Central to the success of workplace learning is the identification, development, and assessment of relevant and measurable competencies. Competency-based learning should link all desired knowledge, skills, and attitudes to the business challenges of the organisation.

7. Must be just-in-time: Increasingly, learning in organisations is required 'right now'. To meet this demand and succeed, learning systems need to be driven by a commitment to support approaches that facilitate this delivery, such as ensuring communication channels are open and respected, and maintaining an online helpdesk.

8. Must expand into new frontiers of knowledge: Knowledge management needs to be seen as an important part of business strategy. Whilst acknowledging past practices and lessons learnt, modern workplace learning needs to push boundaries. All people in the business should be encouraged to share their knowledge through well thought out and flexible work practices.


These eight principles are applicable to e-learning, and in a number of instances align with its benefits: e-learning can be learner-driven and competency-based, provides just-in-time training, and is an ideal medium for expanding into new frontiers of knowledge.

Change management processes are not easily applied to the three system

processes (system creation, system use, and system consequences) of the e-learning

success research framework. However, it is envisaged that the three processes should

be situated within an appropriate environment of change management. This means

that organisations should recognise this as an important consideration in the process

of developing a successful organisational e-learning initiative. Figure 15 displays the

resultant Organisational E-learning Success Framework of this research, which

incorporates the guiding research framework of this study with the addition of

learner preferences and the change management context. The model also itemises the

elements of the original system creation factors—system quality, information quality,

and service quality—specific to organisational e-learning.


Figure 15: The Organisational E-learning Success Framework

[Figure 15 depicts the three system processes situated within a change management context.
System creation: System Quality (structure; ease of use; functionality; legitimacy; long-term knowledge resource; flexibility; accessibility); Information Quality (format; nature of content; relevance; ease of understanding; interaction; alignment; content accuracy); Service Quality (types; expectations); and Learner Preferences.
System use: Intention to Use; Use (Level 3 — Behaviour).
System consequences: User Satisfaction (Level 1 — Reaction); Net Benefits (Level 2 — Learning; Level 4 — Results; Level 5 — ROI).]
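For readers who find a concrete representation useful, the system creation factors of Figure 15 can also be expressed as a simple checklist data structure. The sketch below is an illustrative assumption rather than an instrument developed in this research; the element names are drawn from the figure, while the dictionary layout and the helper function are hypothetical.

    # The Figure 15 system creation factors encoded as a design-stage checklist.
    # The structure and helper are illustrative; only the element names come
    # from the framework itself.
    SYSTEM_CREATION_FACTORS = {
        "system quality": [
            "structure", "ease of use", "functionality", "legitimacy",
            "long-term knowledge resource", "flexibility", "accessibility",
        ],
        "information quality": [
            "format", "nature of content", "relevance", "ease of understanding",
            "interaction", "alignment", "content accuracy",
        ],
        "service quality": ["types", "expectations"],
        "learner preferences": [
            "preference for face-to-face training", "hands-on learning",
            "presentation of information", "individual differences",
        ],
    }

    def unaddressed_elements(design_notes):
        """Return, per factor, the elements a design plan has not yet addressed.

        design_notes maps an element name to the design team's note for it.
        """
        return {
            factor: [e for e in elements if e not in design_notes]
            for factor, elements in SYSTEM_CREATION_FACTORS.items()
        }

    # Example: a plan that has so far only considered two elements.
    print(unaddressed_elements({"ease of use": "pilot tested", "relevance": "SME review"}))

Used at the system creation stage, such a checklist makes it harder for a project team to overlook an element of the framework before implementation.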


Overall, the findings of this research have made valuable theoretical

contributions in the areas of evaluation of e-learning, e-learning in organisations, and

L&D generally. Whilst traditional L&D evaluation and IS evaluation provide a

useful basis for investigating the critical elements of e-learning evaluation, neither

area fully addresses the situation in a holistic way. This study has added to theory by

developing a holistic Organisational E-learning Success Framework (Figure 15) to

inform further research and practice. Finally, this thesis has also added to e-learning

literature by expanding studies from predominantly educational contexts into

organisational settings.

Contributions to Practice

Overall, this research has highlighted the need for evaluation of e-learning to

be approached from a holistic perspective, and identified elements critical to success

of e-learning design and implementation. Organisations need to ensure that the L&D

opportunities provided to their employees, such as e-learning, support positive

learning outcomes and provide an ROI to the organisation. A number of observations

were made during this research that have practical implications for organisations.

They are outlined below.

Beyond Kirkpatrick: Although Kirkpatrick’s (1976) model of evaluation is

applicable to e-learning (as discussed in Chapter Two), a comprehensive

approach to evaluate the success of e-learning initiatives is required. The

integration of new technologies into the learning process introduces new

variables, which can be assessed by using an approach broader than that of

Kirkpatrick’s model alone.


Evaluation versus success: The results of this thesis have highlighted that

different stakeholders evaluate e-learning in different ways. Due to the need

to justify the cost, organisations tend to evaluate training initiatives in terms

of tangible outputs and data. In contrast, users evaluate at a different level and

assess different factors. The impact is that traditional evaluation methods assess e-learning only once the user has completed it. The Organisational E-learning Success Framework developed in this study aids organisations to plan and design for 'success' to meet the expectations of a range of stakeholders.

LMS implementation: The results of this research have revealed that an e-

learning system is more than just the e-learning courses, and refers to both the

LMS and the e-learning courses contained within the LMS. This has

implications for organisations that decide to expand their training to include

e-learning, and will therefore need to procure an LMS. It is important that

time is taken to assess what will be required of the LMS, what courses are intended to be uploaded to the LMS, and the outputs that will be required (for example, reporting); a brief illustrative sketch follows this list. These issues will be critical to the functioning and

evaluation of e-learning courses in the future.

Integrating e-learning as a new training method: A critical factor of e-

learning success is the acceptance of e-learning as a new training method.

This research has identified specific factors for organisations to consider to

facilitate acceptance of e-learning as a method of training.
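As flagged in the LMS implementation point above, the pre-procurement assessment can be captured as a small structured checklist. The sketch below is purely illustrative; the categories echo the points discussed, but every field name and example value is hypothetical rather than drawn from the case organisation.

    # A hypothetical pre-procurement checklist for an LMS; illustrative only.
    lms_requirements = {
        "courses to host": ["safety induction", "reporting procedures"],
        "content formats": ["video", "text", "interactive quiz"],
        "required outputs": ["completion reports", "assessment results by site"],
        "support options": ["built-in tutorials", "phone/hotline"],
    }

    def unanswered(requirements):
        """Return requirement categories that have not yet been filled in."""
        return [category for category, items in requirements.items() if not items]

    print(unanswered(lms_requirements))  # [] once every category has entries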

The model developed during this research shows that evaluation is a process

that needs to be considered from the outset of e-learning design and implementation

right through to delivery and outcomes. This means that the outcomes that are

desired from e-learning courses can influence a wider range of factors such as the


purchasing of an LMS to support course delivery. The Organisational E-learning

Success Framework is proposed as a practical tool to guide organisations in their use

of e-learning systems.

Research Limitations

This study has a number of limitations which influenced the design and

results and therefore need to be acknowledged when discussing the significance and

generalisability of the research. Due to the time constraints of a 12-month Master's

project it was not possible to complete a multiphase study to further explore and test

the relationships of the constructs investigated in this qualitative study. This means

that we cannot assume any relationships between the constructs of the research

framework. However, the potential for future research involving this framework is

discussed in the following section.

A further limitation is the generalisability of the results. As previously noted

in Chapter 3—Methodology, the aim of this study was to explore in depth the factors of system creation, not to explain potential relationships between factors or develop generalisable propositions. That said, although generalisability is limited because data were collected in only one organisation, it is argued that the employees of Tracks

would have similar characteristics to employees of other organisations in the same

industry, or organisations that have a geographically dispersed mix of white and blue

collar workers. This study provides a basis for replication in other organisations in

the future, thus providing an opportunity to develop greater generalisability.


Directions for Future Research

This study aimed to investigate the critical factors to evaluate the success of

e-learning initiatives in organisations. Although these are well developed in

traditional IS literature, and have begun to be explored in educational e-learning,

they are new to the organisational e-learning environment. If organisational e-

learning is to be adequately evaluated in the future, there is a need for future research

to clearly define these concepts and create appropriate measures for them in an e-

learning environment. Furthermore, replication of this study in other contexts, as

well as extending this research to include a quantitative methodology to test for

relationships between constructs would increase the generalisability. As summarised

by Richards (2005, p. 131): “the discoveries and accounts are not usually

generalisable beyond the small study, but the construct created can usually be tried

out in other settings. To do so is important”. Other researchers should use the results

of this research as a basis to extend the field of organisational e-learning.

At times throughout the presentation of findings and discussion in this thesis

a distinction has been made between the results of e-learning users and those of

the L&D professionals. Empirical research involving multiple stakeholder groups in

the domain of training evaluation is limited (Michalski & Cousins, 2000). Training

evaluation in organisations today, including e-learning evaluation, tends to be

restricted to Kirkpatrick’s (1976) first level of evaluation (Reaction). In particular,

course level trainee satisfaction is assessed, and in very few cases learning and

behaviour changes (Bassi, Benson, & Cheney, 1996; Tan, Hall, & Boyce, 2003).

This is in part due to the fact that training professionals focus on what is important to

them, which are usually trainee reactions or learning measures. They often fail to

consider criteria which would be of interest to other stakeholders such as


performance reports or financial indicators (Chapman, 2004; Nickols, 2005; Sutton

& Stephenson, 2005). Sutton and Stephenson (2005, p. 363) addressed this point, stating that the fundamental problem with training evaluations is that they are "designed by professionals, for use by training professionals...". Accordingly, there would be many benefits in considering a multiple-stakeholder approach to exploring the critical success factors in e-learning evaluation. Stakeholder-based

evaluation provides a useful conceptual frame to address this problem and build on

the results of the current research.

Thesis Summary

This thesis has examined e-learning in organisations, with a focus on the

elements critical to evaluation and success. The study highlighted the need for a

holistic approach to e-learning evaluation. Furthermore, it has shown that both traditional training evaluation approaches and the D&M IS Success Model are applicable to the organisational e-learning context and, when combined, can provide this holistic approach.

Practically, this thesis has reported the need for organisations to consider

evaluation at all stages of e-learning from design through to implementation. It has

also shown that the processes surrounding the development of e-learning are just as

important as the e-learning modules themselves. This thesis has also highlighted that

adult learning principles remain critical considerations in the design of successful e-

learning regardless of the delivery platform, and that new technology still

requires thoughtful consideration of the learner’s needs. Ultimately, a sophisticated

system with many attractive features is no substitute for a learning approach with a

strong foundation in the core adult learning principles.


References

Abrami, P. C., Bernard, R., Wade, A., Schmid, R., Borokhovski, R. T., & Surkes, M.

(2006). A review of e-learning in Canada: A rough sketch of the evidence,

gaps and promising directions. Canadian Review of Learning and

Technology, 32(3), 1-70.

Alliger, G. M., & Janak, E. A. (1989). Kirkpatrick's levels of training criteria: Thirty

years later. Personnel Psychology, 42(2), 331-342.

Alsabawy, A. Y., Cater-Steel, A., & Soar, J. (2011). Measuring e-learning system

success. Paper presented at the PACIS 2011: Quality Research in Pacific

Asia, Brisbane, Australia.

http://aisel.aisnet.org/cgi/viewcontent.cgi?article=1014&context=pacis2011

Alsabawy, A. Y., Cater-Steel, A., & Soar, J. (2012). A model to measure e-learning

systems success. In Z. Belkhamza & S. Azizi Wafa (Eds.), Measuring

organizational information systems success: New technologies and practices

(pp. 293-317). Hershey, PA: Business Science Reference.

Arbaugh, J. B. (2000). Virtual classroom characteristics and student satisfaction with

internet-based MBA courses. Journal of Management Education, 24(1), 32-

54.

ASTD. (2010). E-Learning glossary. Retrieved July 14, 2011, from

http://www.astd.org/LC/glossary.htm

Australian Flexible Learning Framework. (2010). 2010 Employer e-learning

benchmarking survey (pp. 25). Canberra, ACT.

Australian Flexible Learning Framework. (2011). 2011 E-learning benchmarking

survey: Final report. Canberra, ACT.


Australian Government. (2007). National statement on ethical conduct in human

research. Canberra: Australian Government.

Baldwin-Evans, K. (2004). Employees and e-learning: What do the end-users think?

Industrial and Commercial Training, 36(6/7), 269.

Baroudi, J. J., Olson, M. H., & Ives, B. (1986). An empirical study of the impact of

user involvement on system usage and information satisfaction.

Communications of the ACM, 29(3), 232-238.

Bassi, L., Benson, G., & Cheney, S. (1996). The top ten trends. Training &

Development, 50(11), 29-33.

Bates, R. (2004). A critical analysis of evaluation practice: The Kirkpatrick model

and the principle of beneficence. Evaluation and Program Planning, 27(3),

341-347. doi: 10.1016/j.evalprogplan.2004.04.011

Beaudoin, M. F. (2002). Learning or lurking? Tracking the "invisible" online student.

Internet and Higher Education, 5, 147-155.

Becker, K. L. (2007). Unlearning in the workplace: A mixed methods study.

Queensland University of Technology, Brisbane, Queensland.

Beekhuyzen, J., Nielsen, S., & von Hellens, L. (2010). The Nvivo looking glass:

Seeing the data through the analysis. Paper presented at the 5th Conference

on Qualitative Research in IT, Brisbane, Australia.

Berge, Z., & Giles, L. (2008). Implementing and sustaining e-Learning in the

workplace. International Journal of Web-Based Learning and Teaching

Technologies, 3(3), 44.

Berthon, P., Pitt, L., Ewing, M., & Carr, C. L. (2002). Potential research space in

MIS: A framework for envisioning and evaluating research replication,

extension, and generation. Information Systems Research, 13(4), 416-427.


Biggs, J. (2003). Teaching for quality learning at university: What the student does

(2nd Ed.). Berkshire, United Kingdom: Open University Press.

Bokhari, R. H. (2005). The relationship between system usage and user satisfaction:

A meta-analysis. Journal of Enterprise Information Management, 18(1/2),

211-234.

Bondarouk, T., & Ruël, H. (2010). Dynamics of e-learning: Theoretical and practical

perspectives. International Journal of Training and Development, 14(3), 149-

154.

Brantlinger, E. A. (1997). Knowledge, position, and agency: Activism and inward

gaze as a natural next step in local inquiry. Paper presented at the annual

meeting of the American Educational Research Association, San Diego, CA.

Brink, B., Munro, J., & Osborne, M. (2002). Online learning technology in an SME

work-based setting. Education Technology and Society, 5(2), 81-86.

Brown, L., Murphy, E., & Wade, V. (2006). Corporate eLearning: Human resource

development implications for large and small organizations. Human Resource

Development International, 9(3), 415-427.

Burnett, K., Bonnici, L. J., Miksa, S. D., & Kim, J. (2007). Frequency, intensity and

topicality in online learning: An exploration of the interaction dimensions

that contribute to student satisfaction in online learning. Journal of Education

for Library and Information Science, 48(1), 21-35.

Burns, R. B. (2000). Introduction to research methods (4th ed.). Frenchs Forest,

NSW: Pearson Education Australia.

Byrne, R. (2002). Web-based learning versus traditional management development

methods. Singapore Management Review, 24, 59-68.


Cavana, R. Y., Delahaye, B. L., & Sekaran, U. (2001). Applied business research:

Qualitative and quantitative methods. Milton, Queensland: John Wiley &

Sons, Inc.

Chapman, D. D. (2004). Preferences of training performance measurement: A

comparative study of training professionals and non-training managers.

Performance Improvement Quarterly, 17(4), 31-49. doi: 10.1111/j.1937-

8327.2004.tb00319.x

Chen, H. J. (2010). Linking employees’ e-learning system use to their overall job

outcomes: An empirical study based on the IS success model. Computers

& Education, 55(4), 1628-1639. doi: 10.1016/j.compedu.2010.07.005

Chen, L. D., Soliman, K. S., Mao, E., & Frolick, M. N. (2000). Measuring user

satisfaction with data warehouses: An exploratory study. Information &

Management, 37, 103-110.

Chiu, C. M., Hsu, M. H., Sun, S. Y., Lin, T. C., & Sun, P. C. (2005). Usability,

quality, value and e-learning continuance decisions. Computers and

Education, 45, 399-416.

Cobb, S. C. (2009). Social presence and online learning: A current view from a

research perspective. Journal of Interactive Online Learning, 8(3), 241-254.

Colton, S., & Hatcher, T. (2004). The development of a research instrument to

analyze the application of adult learning principles to online learning. Paper

presented at the Academy of Human Resource Development International

Research Conference, Austin, TX.

Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed

method approaches (2nd ed.). Thousand Oaks, California: Sage

Publications, Inc.


Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among

five approaches (3rd ed.). Thousand Oaks, CA: Sage Publications.

Delahaye, B. L. (2005). Human resource development: Adult learning and

knowledge management (2nd ed.). Qld, Australia: John Wiley & Sons

Australia, Ltd.

Delahaye, B. L. (2011). Human resource development: Managing learning and

knowledge capital (3rd ed.). Prahran, VIC: Tilde University Press.

Delahaye, B. L., Limerick, D. C., & Hearn, G. (1994). The relationship between

andragogical and pedagogical orientations and the implications for adult

learning. Adult Education Quarterly, 44(4), 187-200. doi:

10.1177/074171369404400401

Delahaye, B. L., & Smith, B. J. (1998). How to be an effective trainer: Skills for

managers and new trainers. Brisbane: John Wiley & Co.

DeLone, W. H., & McLean, E. R. (1992). Information systems success: The quest for

the dependent variable. Information Systems Research, 3(1), 60-95.

DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean Model of

Information Systems Success: A ten-year update. Journal of

Management Information Systems, 19(4), 9-30.

DeLone, W. H., & McLean, E. R. (2004). Measuring e-commerce success: Applying

the DeLone & McLean Information Systems Success Model. International

Journal of Electronic Commerce, 9(1), 31-47.

Denzin, N. K., & Lincoln, Y. S. (2011). Introduction: The discipline and practice of

qualitative research. In The Sage Handbook of Qualitative Research (4th ed.,

pp. 1-19). Thousand Oaks, CA: Sage.


Department of Education Science and Training. (2003). Adult learning in Australia:

A consultation paper. Canberra, ACT: Legislative Services.

Derouin, R. E., Fritzsche, B. A., & Salas, E. (2005). E-Learning in organizations.

Journal of Management, 31(6), 920-940.

Dey, I. (1993). Qualitative data analysis: A user-friendly guide for social scientists.

London: Routledge.

Dick, R. (1990). Convergent interviewing (Version 3 ed.). Chapel Hill, Qld:

Interchange.

Doll, W. J., & Torkzadeh, G. (1988). The measurement of end-user computing

satisfaction. MIS Quarterly, 12, 259-274.

Downing, C. E. (1999). System usage behaviour as a proxy for user satisfaction: An

empirical investigation. Information & Management, 35, 203-216.

Eisenhardt, K. M. (1989). Building theories from case study research. The Academy

of Management Review, 14(4), 532-550.

Eisenhardt, K. M., & Graebner, M. E. (2007). Theory building from cases:

Opportunities and challenges. Academy of Management Journal, 50(1), 25-

32. doi: 10.5465/amj.2007.24160888

Etezadi-Amoli, J., & Farhoomand, A. F. (1996). A structural model of end user

computing satisfaction and user performance. Information &

Management, 30(2), 65-73. doi: 10.1016/0378-7206(95)00052-6

Fisher, S. L., Wasserman, M. E., & Orvis, K. A. (2010). Trainee reactions to learner

control: An important link in the e-learning equation. International Journal of

Training and Development, 14(3), 198-208.

Gagne, R. M. (1984). Learning outcomes and their effects: Useful categories of

human performance. American Psychologist, 39, 377-385.


Galloway, D. L. (2005). Evaluating distance delivery and e-learning: Is Kirkpatrick's

model relevant? Performance Improvement, 44(4), 21.

Garavan, T. N., Carbery, R., O'Malley, G., & O'Donnell, D. (2010). Understanding

participation in e-learning in organizations: A large-scale empirical study of

employees. International Journal of Training and Development, 14(3), 155-

168.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based

environment: Computer conferencing in higher education. The Internet and

Higher Education, 2(2-3), 87-105.

Glesne, C. (1999). Becoming qualitative researchers (2nd ed.). United States:

Addison Wesley Longman.

Goldstein, I. L., & Ford, J. K. (2002). Training in organizations. Needs assessment,

development and evaluation. Belmont, CA: Wadsworth.

Goodhue, D. L., & Thompson, R. L. (1995). Task-technology fit and individual

performance. MIS Quarterly, 19(2), 213-236.

Hatch, J. A. (2002). Doing qualitative research in education settings. Albany: State

University of New York Press.

Hayes, J., & Allinson, C. W. (1997). Learning styles and training and development in

work settings: Lessons from educational research. Educational Psychology,

17(1-2), 185-193. doi: 10.1080/0144341970170114

Hernández, A. B., Gorjup, M. T., & Cascón, R. (2010). The role of the instructor in

business games: A comparison of face-to-face and online instruction.

International Journal of Training and Development, 14(3), 169-179.


Hodges, A. (2009). Corporate e-learning: How three healthcare companies

implement and measure the effectiveness of e-learning. Doctor of

Philosophy, The University of Alabama, Tuscaloosa, Alabama.

Hogarth, K., & Dawson, D. (2008). Implementing e-learning in organisations: What

e-Learning research can learn from instructional technology (IT) and

organisational studies (OS) innovation studies. International Journal on

E-Learning, 7(1), 87-105.

Holsapple, C. W., & Lee-Post, A. (2006). Defining, assessing, and promoting e-

learning success: An information systems perspective. Decision Sciences

Journal of Innovative Education, 4(1), 67-85.

Holton, E. F. (1996). The flawed four-level evaluation model. Human Resource

Development Quarterly, 7(1), 5-21.

Holton, E. F., Wilson, L. S., & Bates, R. A. (2009). Toward development of a

generalized instrument to measure andragogy. Human Resource Development

Quarterly, 20(2), 169-193. doi: 10.1002/hrdq.20014

Hutchins, H. M., & Hutchison, D. (2008). Cross-disciplinary contributions to e-

learning design: A tripartite design model. Journal of Workplace Learning,

20(5), 364.

Illeris, K. (2003). Workplace learning and learning theory. Journal of Workplace

Learning, 15(4), 167-178. doi: 10.1108/13665620310474615

Illeris, K. (2004). A model for learning in working life. Journal of Workplace

Learning, 16(7/8), 431-441.

Ismail, J. (2001). The design of an e-learning system: Beyond the hype. The Internet

and Higher Education, 4(3–4), 329-336.


Jamieson, K. (2007). Information systems decision making: Factors affecting

decision makers and outcomes. Central Queensland University.

Johnson, R. D., Gueutal, H., & Falbe, C. M. (2009). Technology, trainees,

metacognitive activity and e-learning effectiveness. Journal of Managerial

Psychology, 24(6), 545-566.

Johnson, R. D., Hornik, S., & Salas, E. (2008). An empirical examination of factors

contributing to the creation of successful e-learning environments.

International Journal of Human-Computer Studies, 66, 356-369.

Johnson, S. D., & Aragon, S. R. (2003). An instructional strategy framework for

online learning environments. New Directions for Adult and Continuing

Education, 2003(100), 31-43. doi: 10.1002/ace.117

Jung, I. (2010). The dimensions of e-learning quality: From the learner’s perspective.

Educational Technology Research and Development, 1-20. doi:

10.1007/s11423-010-9171-4

Kaplan, B., & Maxwell, J. (2005). Qualitative research methods for evaluating

computer information systems. In J. Anderson & C. Aydin (Eds.), Evaluating

the organizational impact of healthcare information systems (pp. 30-55):

Springer New York.

Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and

development handbook: A guide to human resource development. New York,

NY: McGraw-Hill.

Klobas, J. E., & McGill, T. J. (2010). The role of involvement in learning

management system success. Journal of Computing in Higher Education,

22(2), 114-134.


Knowles, M. S. (1980). The modern practice of adult education: From pedagogy to

andragogy (2nd ed.). New York: Cambridge Books.

Knowles, M. S. (1990). The adult learner: A neglected species. Houston: Gulf.

Knowles, M. S., Holton, E. F., & Swanson, R. A. (2005). The adult learner: The

definitive classic in adult education and human resource development (6th

ed.). Burlington: Elsevier.

Kraiger, K. (2002). Decision-based evaluation. In K. Kraiger (Ed.), Creating,

implementing, and managing effective training and development: State-of-

the-art lessons for practice (pp. 331-375). San Francisco, CA: Jossey-Bass.

Kraiger, K., Ford, J. K., & Salas, E. (1993). Application of cognitive, skill-based, and

affective theories of learning outcomes to new methods of training

evaluation. Journal of Applied Psychology, 78(2), 311-328. doi:

10.1037/0021-9010.78.2.311

Kramer, H. (2007). Measuring the effect of e-learning on job performance (Doctoral

Dissertation 3288849) Nova Southeastern University, Florida, United States.

Retrieved from ProQuest Central; ProQuest Dissertations & Theses (PQDT)

database.

Lee-Post, A. (2009). e-Learning success model: An information systems perspective.

Electronic Journal of e-Learning, 7(1), 61-70.

Lee, B. C., Yoon, J. O., & Lee, I. (2009). Learners’ acceptance of e-learning in South

Korea: Theories and results. Computers & Education, 53(4), 1320-1329.

doi: 10.1016/j.compedu.2009.06.014

Lee, W. W., Owens, D. L., & Benson, A. D. (2002). Design considerations for web-

based learning systems. Advances in Developing Human Resources, 4(4),

405-423. doi: 10.1177/152342202237519


Leininger, M. (1992). Current issues, problems, and trends to advance qualitative

paradigmatic research methods for the future. Qualitative Health Research,

2(4), 392-415. doi: 10.1177/104973239200200403

Lin, K.-M. (2011). e-Learning continuance intention: Moderating effects of user e-

learning experience. Computers & Education, 56(2), 515-526. doi:

10.1016/j.compedu.2010.09.017

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.

Lu, H. P., & Chiou, M. J. (2010). The impact of individual differences on e-learning

system satisfaction: A contingency approach. British Journal of Educational

Technology, 41(2), 307-323.

Marshall, C., & Rossman, G. B. (2011). Designing qualitative research (5th ed.).

Thousand Oaks, California: Sage Publications, Inc.

Maxwell, J. (2005). Qualitative research design: An interactive approach. Thousand

Oaks, CA: Sage Publications.

McClelland, B. (2001). Digital learning and teaching: Evaluation of developments

for students in higher education. European Journal of Engineering

Education, 26, 107-115.

McFarlan, F. W. (1987). The information systems research challenge. Boston,

Massachusetts: Harvard Business School Press.

McGraw, K. L. (2001). E-learning strategy equals infrastructure. Retrieved from

http://learningcircuits.org/2001/jun2001/mcgraw.html

Merriam, S. B. (1987). Adult learning and theory building: A review. Adult

Education Quarterly, 37(4), 187-198. doi: 10.1177/0001848187037004001

Merriam, S. B. (1998). Qualitative research and case study applications in

education. San Francisco: Jossey-Bass.


Michalski, G. V., & Cousins, J. B. (2000). Differences in stakeholder perceptions

about training evaluation: A concept mapping/pattern matching investigation.

Evaluation and Program Planning, 23(2), 211-230. doi: 10.1016/s0149-

7189(00)00005-7

Mohmood, M. A., Burn, J. M., Gemoets, L. A., & Jacquez, C. (2000). Variables

affecting information technology end-user satisfaction: A meta-analysis of

the empirical literature. International Journal of Human-Computer Studies,

52, 751-771.

Moller, L., Foshay, W. R., & Huett, J. (2008). The evolution of distance education:

Implications for instructional design on the potential of the web. TechTrends,

52(3), 70-75. doi: 10.1007/s11528-008-0158-5

Moore, J. L., Dickson-Deane, C., & Galyen, K. (2011). e-Learning, online learning,

and distance learning environments: Are they the same? The Internet and

Higher Education, 14(2), 129-135. doi: 10.1016/j.iheduc.2010.10.001

Morgan, G., & Smircich, L. (1980). The case for qualitative research. The Academy

of Management Review, 5(4), 491-500.

Morse, J. M., Barrett, M., Mayan, M., Olson, K., & Spiers, J. (2002). Verification

strategies for establishing reliability and validity in qualitative research.

International Journal of Qualitative Methods, 1(2), 13-22.

Morse, J. M., & Richards, L. (2002). Readme first for a user's guide to qualitative

methods. Thousand Oaks, CA: Sage.

Motiwalla, L., & Tello, S. (2000). Distance learning on the internet: An exploratory

study. The Internet and Higher Education, 2, 253-264.

Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A

factor analytic study. Distance Education, 26(1), 29-48.


Najmul Islam, A. K. M. (2012). The role of perceived system quality as educators'

motivation to continue e-learning system use. AIS Transactions on Human-

Computer Interaction, 4(1), 25-43.

Negash, S., Ryan, T., & Igbaria, M. (2003). Quality and effectiveness in Web-based

customer support systems. Information and Management, 40(8), 757-768.

doi: 10.1016/s0378-7206(02)00101-5

Newstrom, J. W. (1978). Catch-22: The problems of incomplete evaluation of training.

Training and Development Journal, 32(11), 22-24.

Nichols, M. (2003). A theory for e-learning. Journal of Educational Technology &

Society, 6(2), 1-10.

Nickols, F. W. (2005). Why a stakeholder approach to evaluating training. Advances

in Developing Human Resources, 7(1), 121-134. doi:

10.1177/1523422304272175

Nielsen, J. L. (2005). Critical success factors for implementing an ERP system. In L.

von Hellens, S. Nielsen & J. Beekhuyzen (Eds.), Qualitative case studies on

implementation of enterprise wide systems (pp. 211-231). London: Idea

Group Publishing.

Noe, R. A., & Winkler, C. (2009). Employee training and development: For

Australia & New Zealand (1st ed.). Burr Ridge: Irwin McGraw-Hill.

Norman, A., Ngai, E. W. T., & Cheng, T. C. E. (2002). A critical review of end-user

information system satisfaction research and a new research framework.

Omega The International Journal of Management Science, 30, 451-478.

Ozkan, S., Koseler, R., & Baykal, N. (2009). Evaluating learning management

systems: Adoption of hexagonal e-learning assessment model in higher


education. Transforming Government: People, Process and Policy, 3(2), 111-

130. doi: 10.1108/17506160910960522

Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1988). SERVQUAL: A multiple-

item scale for measuring consumer perceptions of service quality.

Journal of Retailing, 64(1), 12-40.

Patton, M. Q. (2002). Qualitative research & evaluation methods (3rd ed.).

Thousand Oaks, London: Sage Publications.

Petter, S., DeLone, W. H., & McLean, E. R. (2008). Measuring information systems

success: Models, dimensions, measures, and interrelationships. European

Journal of Information Systems, 17(3), 236-263. doi: 10.1057/ejis.2008.15

Phillips, J. J. (1996). ROI: The search for best practices. Training & Development,

50(2), 42-47.

Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction, presence,

and performance in an online course. Journal of Asynchronous Learning

Networks, 6(1), 21-40.

Pitt, L. F., Watson, R. T., & Kavan, C. B. (1995). Service quality: A measure of

information systems effectiveness. MIS Quarterly, 19(2), 173-187.

Pittinsky, M., & Chase, B. (2000). Quality on the line: Benchmarks for success in

internet-based distance education. Washington, DC: National Education

Association.

Richards, L. (2005). Handling qualitative data: A practical guide. Thousand Oaks,

CA: Sage Publications Ltd.

Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses

in relation to students' perceived learning and satisfaction. Journal of

Asynchronous Learning Networks, 7(1), 68-88.


Rosenberg, M. J. (2001). E-learning: Strategies for delivering knowledge in the

digital age. New York: McGraw-Hill.

Rosenberg, M. J. (2006). Beyond e-learning: Approaches and technologies to

enhance organizational knowledge, learning, and performance. San

Francisco: Pfeiffer.

Ruiz, J. G., Mintzer, M. J., & Leipzig, R. M. (2006). The impact of e-learning in

medical education. Academic Medicine, 81(3), 207-212.

Rylatt, A. (2000). Learning unlimited: Practical strategies for transforming learning

in the workplace of the 21st century (2nd ed.). Warriewood, NSW: Business

+ Publishing.

Saarinen, T. (1996). An expanded instrument for evaluating information system

success. Information & Management, 31, 103-108.

Salas, E., & Cannon-Bowers, J. A. (2001). The science of training: A decade of

progress. Annual Review of Psychology, 52, 471-499.

Salas, E., Kosarzycki, M. P., Burke, C. S., Fiore, S. M., & Stone, D. L. (2002).

Emerging themes in distance learning research and practice: Some food for

thought. International Journal of Management Reviews, 4(2), 135-

153.

Saldana, J. (2009). The coding manual for qualitative researchers. Thousand Oaks,

CA: Sage Publications Inc.

Sambrook, S. (2003). E-learning in small organisations. Education & Training,

45(8/9), 506-516.

Santa, R. (2009). An investigation of the alignment between technological innovation

effectiveness and operational effectiveness. CQ University.


Saunders, M., Lewis, P., & Thornhill, A. (2009). Research methods for business

students (5th ed.). England: Pearson Education Limited.

Savenye, W. C., Olina, Z., & Niemczyk, M. (2001). So you are going to be an online

writing instructor: Issue in designing, developing, and delivering an online

course. Computers and Composition, 18, 371-385.

Servage, L. (2005). Strategizing for workplace e-learning: Some critical

considerations. Journal of Workplace Learning, 17(5/6), 304-317.

Shivetts, C. (2011). E-Learning and Blended Learning: The importance of the learner

a research literature review. International Journal on E-Learning, 10(3), 331-

337.

Silverman, D. (2007). A very short, fairly interesting and reasonably cheap book

about qualitative research. London: SAGE Publications Ltd.

Sitzmann, T., Brown, K. G., Casper, W. J., Ely, K., & Zimmerman, R. D. (2008). A

review and meta-analysis of the nomological network of trainee reactions.

Journal of Applied Psychology, 93(2), 280-295. doi: 10.1037/0021-

9010.93.2.280

Smith, L. J. (2001). Content and delivery: A comparison and contrast of electronic

and traditional MBA marketing planning courses. Journal of Marketing

Education, 23, 35-44.

Somers, T. M., Nelson, K., & Karimi, J. (2003). Confirmatory factor analysis of the

end-user computing satisfaction instrument: Replication within an ERP

domain. Decision Sciences, 34(3), 595-621.

Stake, R. E. (2005). Qualitative case studies. In N. K. Denzin & Y. S. Lincoln (Eds.),

The sage handbook of qualitative research (3rd ed., pp. 443-466). Sage:

Thousand Oaks, CA.

Page 195: Critical Success Elements for the Design and ...eprints.qut.edu.au/60240/1/Kristal_Reynolds_Thesis.pdf · “Critical Success Elements for the Design and Implementation of Organisational

181

Stone, R. J. (2010). Human resource management (7th ed.). Brisbane, Australia:

John Wiley & Sons.

Straub, D., Limayem, M., & Karahanna-Evaristo, E. (1995). Measuring system

usage: Implications for IS theory testing. Management Science, 41(8), 1328-

1342.

Sun, P.-C., Tsai, R. J., Finger, G., Chen, Y.-Y., & Yeh, D. (2008). What drives a

successful e-Learning? An empirical investigation of the critical factors

influencing learner satisfaction. Computers & Education, 50(4), 1183-1202.

Sutton, B., & Stephenson, J. (2005). A review of 'return on investment' in training in

the corporate sector and possible implications for college-based programmes.

Journal of Vocational Education and Training, 57, 355-374.

Symon, G., & Cassell, C. (2012). Qualitative organizational research: Core methods

and current challenges. Thousand Oaks, CA: Sage Publications Inc.

Tan, J. A., Hall, R. J., & Boyce, C. (2003). The role of employee reactions in

predicting training effectiveness. Human Resource Development Quarterly,

14(4), 397-411. doi: 10.1002/hrdq.1076

Tannenbaum, S. I., Mathieu, J. E., Salas, E., & Cannon-Bowers, J. A. (1991).

Meeting trainees' expectations: The influence of training fulfillment on the

development of commitment, self-efficacy, and motivation. Journal of

Applied Psychology, 76(6), 759-769. doi: 10.1037/0021-9010.76.6.759

Teo, T. S. H., & Wong, P. K. (1998). An empirical study of the performance impact

of computerization in the retail industry. Omega, 26(5), 611-621. doi:

10.1016/s0305-0483(98)00007-3

Tracks. (2012). Project and Curriculum Process. Australia: Tracks.

Page 196: Critical Success Elements for the Design and ...eprints.qut.edu.au/60240/1/Kristal_Reynolds_Thesis.pdf · “Critical Success Elements for the Design and Implementation of Organisational

182

Twitchell, S., Holton, E. F., & Trott, J. W. (2000). Technical training evaluation

practices in the United States. Performance Improvement Quarterly, 13(3),

84-109. doi: 10.1111/j.1937-8327.2000.tb00177.x

Tynjälä, P. (2008). Perspectives into learning at the workplace. Educational

Research Review, 3(2), 130-154. doi: 10.1016/j.edurev.2007.12.001

Tynjala, P., & Hakkinen, P. (2005). E-learning at work: Theoretical underpinnings

and pedagogical challenges. Journal of Workplace Learning, 17(5/6), 318-

336.

Waight, C. L., & Stewart, B. L. (2005a). Valuing the adult learner in e-learning: Part

one—a conceptual model for corporate settings. Journal of Workplace

Learning, 17(5/6), 337.

Waight, C. L., & Stewart, B. L. (2005b). Valuing the adult learner in e-learning: Part

two—insights from four companies. The Journal of Workplace Learning,

17(5/6), 398-414.

Walker, S. L., & Fraser, B. J. (2005). Development and validation of an instrument

for assessing distance education learning environments in higher education:

The distance education learning environments survey (DELES). Learning

Environments Research, 8, 289-308.

Wang, M., Ran, W., Liao, J., & Yang, S. J. H. (2010). A performance-oriented

approach to e-learning in the workplace. Educational Technology and

Society, 13(4), 167-179.

Wang, Y. S., Wang, H. Y., & Shee, D. Y. (2007). Measuring e-learning systems

success in an organizational context: Scale development and validation.

Computers in Human Behavior, 23, 1792-1808.

Page 197: Critical Success Elements for the Design and ...eprints.qut.edu.au/60240/1/Kristal_Reynolds_Thesis.pdf · “Critical Success Elements for the Design and Implementation of Organisational

183

Welsh, E. T., Wanberg, C. R., Brown, K. G., & Simmering, M. J. (2003). E-learning:

Emerging uses, empirical results and future directions. International Journal

of Training and Development, 7(4), 245-258.

Wixom, B. H., & Watson, H. J. (2001). An empirical investigation of the factors

affecting data warehousing success. MIS Q., 25(1), 17-32. doi:

10.2307/3250957

Wu, J.-H., & Wang, Y.-M. (2006). Measuring KMS success: A respecification of the

DeLone and McLean's model. Information and Management, 43(6), 728-739.

doi: 10.1016/j.im.2006.05.002

Yin, R. K. (2003). Case study research: Design and methods (Third ed.). Thousand

Oaks, California: Sage Publications, Inc.

Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand

Oaks, CA: Sage Pub.

Zviran, M., & Pliskin, N. (2005). Measuring user satisfaction and perceived

usefulness in the ERP Context. Journal of Computer Information Systems,

45(3), 43-52.

Page 198: Critical Success Elements for the Design and ...eprints.qut.edu.au/60240/1/Kristal_Reynolds_Thesis.pdf · “Critical Success Elements for the Design and Implementation of Organisational

184

Appendix 1 - Interview protocol

This is the consent form which you have already seen and read. Do you have any

questions about anything on the form?

In general terms, my research is about e-learning, which is any training or development

initiative that uses a computer to deliver content when and where required. In

particular, I am looking to identify what is most important in evaluating e-learning

courses and programs. Your organisation is spending a lot of time and money investing

in e-learning and we want to find out how it can be used to its greatest advantage and

what the key elements are when judging the effectiveness or otherwise of this type

of technology.

You have been identified to me by [referrer’s name] as a user of e-

learning/manager of employees who have completed e-learning. Firstly, could I

please clarify your role—what you do here?

Throughout this interview, as you answer questions, I would like you to think

about them from the point of view of an e-learning user/simulator user/learning

and development professional. (May include traditional e-learning and

simulators—explain questions could have two answers.)

1. What exposure have you had to e-learning in the past?

I would like to ask you about some specific aspects of e-learning.

Initially I’ll ask about the quality of an e-learning system. From what I

understand, at Tracks an e-learning system means both the LMS and the e-

learning modules (or for simulator users the entire simulation system). Does

that make sense?

2. What do you believe makes a good e-learning system (simulation system)?

3. What elements would you look for in assessing system quality?

Simulator Users Only: I have some questions specifically about your experience

using the simulator.

4. How does a simulator environment meet your (employees’) training needs?

(practice skills, respond in real-time etc.)


5. If you were to later compare the training you have just completed (offered at

Tracks) to another simulator, what things would you use to compare your

training outcomes?

PROBE: How do you assess the training/what do you look at when you come

out of training/rate experience/what skills you've gained? For example, you might consider how confident you are with the training.

Now I’ll ask you about information delivered via e-learning:

6. What type of training content do you prefer to have delivered (to your

employees) via e-learning?

7. What format do you prefer to have information delivered (to your employees)

via e-learning?

8. From a user/L&D perspective, if I asked you to rate the quality of how

information is delivered in an e-learning system, what would be the factors that

you would look at in giving your rating?

The last question for this section is about service or support:

9. What service or support do you think users should be provided as part of e-

learning?

10. From an L&D perspective, what do you expect to be provided to support users

of e-learning?

11. What support do you expect when your employees are doing e-learning?

PROBE: How would you measure the quality of service or support provided?

12. Do you know how e-learning is currently evaluated?

If yes, PROBE: Can you explain the evaluation process to me?

Do you receive feedback on your results/what your staff’s results were?

That was the final formal question I have for you. Is there anything else you

would like to add that you think would be of use?

THANK YOU FOR YOUR TIME


Appendix 2 - Questions for pilot and revised for research

Information quality

Pilot questions:
- What types of information do you prefer to have delivered in e-learning?
- If I was going to ask you to rate the quality of how information is delivered in an e-learning system, what would be the factors that you would look at in giving your rating?
- What did you like about the information that can be delivered via e-learning?
- What did you dislike about the information that can be delivered via e-learning?

Comments from pilot interview: The first question confused interviewees ("What do you mean by information?"); it was suggested it could be broken into content and format to make it clearer. The second question worked better. The like/dislike questions were not asked: the interviewees were struggling with the concept of 'information', and the questions were too similar to "what do you like/dislike about e-learning" to distinguish a difference.

Revised questions:
- What types of training content do you prefer to have delivered (to your employees) in e-learning?
- What format do you prefer to have information delivered (to your employees) via e-learning?
- If I was going to ask you to rate the quality of how information is delivered in an e-learning system, what would be the factors that you would look at in giving your rating? From a user/L&D perspective?

System quality

Pilot questions:
- What do you believe makes a good e-learning system?
- What do you believe makes a poor quality e-learning system?

Comments from pilot interview: Worked ok, but could be worded differently for better understanding.

Revised questions:
- What do you believe makes a good e-learning system?
- What elements would you look for in assessing system quality?

Service quality

Pilot questions:
- What service or support do you expect to be provided as a part of e-learning?
- Who provides that service? Who would you like to provide that service/support?

Comments from pilot interview: Felt awkward asking the second question; it mostly came out in the question before.

Revised questions:
- What service or support do you expect to be provided as a part of e-learning?
- PROBE: How would you measure the quality of service or support provided?

Current evaluation

Pilot questions:
- Do you know how is e-learning currently evaluated? If yes, PROBE: Can you explain the evaluation process to me? Do you receive feedback on your results/what your staff's results were?

Comments from pilot interview: Good.

Revised questions:
- Do you know how e-learning is currently evaluated? If yes, PROBE: Can you explain the evaluation process to me? Do you receive feedback on your results/what your staff's results were?

Other questions

Pilot questions:
- Is there anything else you would like to add that you think would be of use?
- USER: Can you describe the most recent e-learning course in which you have participated?
- L&D: Can you describe the most recent e-learning course in which your employees have participated?

Comments from pilot interview: Got them thinking about e-learning, but didn't really add anything to finding out about the model.

Revised questions:
- OPENING Q: What exposure have you had to e-learning in the past?
- CLOSING Q: Is there anything else you would like to add that you think would be of use?


Appendix 3 - Coding classification

Attitude
- Positive: Positive attitude associated with response.
- Negative: Negative attitude associated with response.

Participant
- E-learning user: Has used/been the trainee in an e-learning environment.
- L&D professional: Has been part of the design, implementation of e-learning.
- Traditional E-learning Environment: Levels 1–3 of e-learning at Tracks.
- Simulator Environment: Level 4 of e-learning at Tracks.

System Quality: The desired characteristics of the system itself; typical measures include ease of use, functionality, reliability, flexibility, data quality, portability, integration, and importance.
- Structure: Module sections, order of modules, agenda, navigation, format of LMS, simulator functionality, appearance.
- Ease of use: User-friendly interface, pace (easy clicking through).
- Functionality: Works as intended, technical errors, timing out (ability to pause course).
- Legitimacy: Trust, authenticity, voice-over accents, branding, confidence in organisation.
- Long-term knowledge resource: Track training, personal records, re-access to past e-learning (personal resource library), retention of information.
- Flexibility: Ability to choose various information types, e.g. turn sound on/off.
- Accessibility: Technical elements that mean it can be used on a variety of computers at different locations.

Information Quality: Measure of output from the system; typically defined as quality of content, accuracy, precision, currency, reliability, timeliness, completeness, relevance, and format required.
- Format: Text, video, multimedia, blended, etc.
- Nature of content: Systems-based, skills versus knowledge, static information (not changed often), safety content.
- Relevance: To target audience, to job, of content, assessments.
- Ease of understanding: Language (plain English), appropriate to content.
- Interaction: Does it keep interest, engaging, opportunities for practice (not how the user interacts with the system).
- Alignment: Defined learning goals, clearly set out, alignment between content and learning goals, alignment between content and assessment.
- Content accuracy: Is information correct, are assessment questions and answers correct.

Support Quality: Level of service received by the users, and the way in which the service is provided; typically measured via satisfaction with service.
- Types: Helpline, glossary, Q & A, training development plan support, technical, no support required, trainer in room.
- Expectations: Timeliness, accessibility, usefulness, decreased over time, does it get used? Awareness (is support advertised? do participants know it's available?), knowledgeable assistance.

Learner Preferences: Typically learning styles, adult learner principles.
- Preference for face-to-face: As compared to e-learning.
- Hands-on learning: Preference for, blended learning.
- Presentation of information: Not just a preference for format, but how it enhances learning (not categorised as learner styles).
- Individual differences: Different cultural/ethnic backgrounds, language skills, learning difficulties, disabilities, etc.

Change management
- General process: Issues surrounding the process of implementation and maintenance of e-learning.
- Vendor process: Coordinating with external vendors and processes surrounding this.
- Content development: Process for what content to develop into e-learning.
- Evaluation process: Process, what to evaluate, when, how often.
- E-learning champion: An advocate of e-learning to champion within the organisation.


Appendix 4 - Participant information sheet and consent form

PARTICIPANT INFORMATION

FOR QUT RESEARCH PROJECT – Interview –

A MODEL OF E-LEARNING EVALUATION: A CASE STUDY

QUT Ethics Approval Number 1200000100

RESEARCH TEAM

Principal Researcher: Kristal Reynolds, Masters student, QUT

Associate Researchers: Dr Karen Becker, QUT; A/Prof. Cameron Newton, QUT; Dr Kieren Jamieson, Central Queensland University

DESCRIPTION

This project is being undertaken as part of a Masters study for Kristal Reynolds,

investigating the use and evaluation of learning technologies in the rail industry.

The purpose of this project is to identify what factors are important in evaluating e-learning

courses/programs.

The research team requests your assistance because you can give us valuable insight into

your experiences with and perceptions of the use of technology in a learning environment.

PARTICIPATION

Your participation in this project is voluntary. If you do agree to participate, you can

withdraw from participation at any time during the project without comment or penalty.

Your decision to participate will in no way impact upon your current or future relationship

with your employer, QUT or with the CRC for Rail Innovation. Your participation will

involve an audio recorded interview at an agreed location that will take approximately 1 hour

of your time.

EXPECTED BENEFITS

It is expected that this project will not directly benefit you. However, the reports will provide

your organisation with an insight into how learning technologies can be used most

effectively in the rail industry.

RISKS

There are no risks beyond normal day-to-day living associated with your participation in this

project.

PRIVACY AND CONFIDENTIALITY

All comments and responses will be treated confidentially. The project is funded by the CRC

for Rail Innovation; however, the funding body will not have access to the raw data obtained

during the project.

The interviews will be recorded using an audio device, and will be transcribed and then

deleted at the end of the project. Transcripts will only be accessed by the research team and

individual names will not be stored with the interview transcripts for reasons of

confidentiality. If you wish to read the transcript for verification purposes prior to final

inclusion, please indicate this on the attached information sheet.

The findings from this research will be reported within a Masters thesis, and elements of it

may be reported at conferences and in journals. In all of these situations, neither individuals

nor organisations will be identified, and although it might be possible for you to identify

your own comments, the level of information provided about participants will not allow for

identification by other people.

CONSENT TO PARTICIPATE


We would like to ask you to sign a written consent form (attached) to confirm your

agreement to participate.

QUESTIONS/FURTHER INFORMATION ABOUT THE PROJECT

If you have any questions or require any further information about the project, please contact one

of the research team members below.

Kristal Reynolds – Masters student, School of Management – QUT Business School, 3138 5218, [email protected]

Dr Karen Becker – Senior Lecturer, School of Management – QUT Business School, 3138 2743, [email protected]

CONCERNS/COMPLAINTS REGARDING THE CONDUCT OF THE PROJECT

QUT is committed to research integrity and the ethical conduct of research projects.

However, if you do have any concerns or complaints about the ethical conduct of the project

you may contact the QUT Research Ethics Unit on 3138 5123 or email

[email protected]. The QUT Research Ethics Unit is not connected with the research

project and can facilitate a resolution to your concern in an impartial manner.

Thank you for helping with this research project.

Please keep this sheet for your information.


CONSENT FORM FOR QUT RESEARCH PROJECT – Interview –

A MODEL OF E-LEARNING EVALUATION: A CASE STUDY

QUT Ethics Approval Number 1200000100

RESEARCH TEAM CONTACTS

Kristal Reynolds – Masters student, School of Management – QUT Business School, 3138 5218, [email protected]

Dr Karen Becker – Senior Lecturer, School of Management – QUT Business School, 3138 2743, [email protected]

A/Prof Cameron Newton – Associate Professor, School of Management – QUT Business School, [email protected]

Dr Kieren Jamieson – Senior Lecturer, School of Information & Communication Technology, Central Queensland University, [email protected]

STATEMENT OF CONSENT

By signing below, you are indicating that you:

- have read and understood the information document regarding this project
- have had any questions answered to your satisfaction
- understand that if you have any additional questions you can contact the research team
- understand that you are free to withdraw at any time, without comment or penalty
- understand that you can contact the Research Ethics Unit on 3138 5123 or email [email protected] if you have concerns about the ethical conduct of the project
- understand that the project will include audio recording
- agree to participate in the project

Please indicate:

- I wish to read a copy of the transcript from my interview for verification purposes prior to final inclusion.
- I wish to receive a copy of the final report from this research project.

Name

Signature

Date

Please return this sheet to the investigator.


Appendix 5 - Example narratives relating to descriptions of system quality

Structure

Allows the participant to interface at their own pace [Percy]

It's easy to navigate... whether it's logical and easy to follow [Annie]

Not too many sequences backwards and forwards [Billy]

When you log into our e-learning... there's a lot of things to go into, but people don't necessarily use those different aspects of the program. People

only go in there to do training, and there's a few other things associated with the e-learning tool. It becomes quite confusing... the website itself has got

to be formatted so that people know where to access things easily [Alfie]

The front page isn't too busy but it has the information behind it... I think just changing colour or appearance when you've completed things... I think

it's good to have it in sections....like sub-modules. The one that I did was quite good because it sort of had an agenda on the left hand side and then

you, it played on the right hand side so you could sort of track where you were going and how the different bits were going to fit together, and it sort of

built, I thought, quite a convincing case of what is was teaching you. I think keep it quite structured, I think is the best way to go [Flora]

It's got to be similar across the board, across all training packages. The systems' got to be common, if you like so that there's not different systems in

different e-learning packs. So if they are using one pack then they familiarise themselves with one system, it's got to be the same as when they do a

different e-learning course [Bertie]

You know how long this is going to take... Therefore you can pull out of it at key, relevant points... you can do chunks, you can go if you want sort of

one hour, two hours, but sort of fairly small, maybe 30 minute components. Because it can be quite tiring [Henrietta]

With our simulators... the layouts are pretty much exactly the same as if you were sitting on a train. So, the only difference is pretty much that you’re

driving, you feel like you’re in a train and it’s got the movements and everything [James]

It captures footage, real-time footage, all the control inputs, we can see the screen of what they were seeing at the time, we can see any - I guess the

inputs that they make on our procedural trainer, so if they had to go to a location or virtual train and rectify a fault, we can chase them and follow them

in the replay. We can put markers in there so that we can easily go back to that time in the assessment and say, hang on, they fell below the mark here,

bang, bang, bang and you can see it all the way along. You can just pull it up any time [Thomas]

Ease of Use

Usability would be the big one for me [Emily]

User-friendly, easy to operate [Henry]

If it’s pretty basic to use [James]

For me it's something that's clear and easy to navigate... whether it's logical and easy to follow... easy instructions, and you keep it simple [Annie]

It's definitely easy to use. While I'm fairly familiar with computers and operating systems, other users aren't so proficient. So a systems' got to be easy

to use [Bertie]

For me, a large part of it is usability. So, in terms of the LMS... is it intuitive? Are users able to kind of access courses, search for courses, find courses,


see their training histories and do all of those sort of functions that they need to do without too much training or instruction [Emily]

Well, it's got to be user-friendly. When I say that, it's got to be easy to login for starters because that's where we had some initial problems. Then the

website itself has got to be formatted so that people know where to access things easily [Alfie]

Basically one that's easily accessible. The person can sign into the program and start e-learning [Billy]

From the backend point of view from an administrator's point of view, is it easy to upload courses? Is it easy to replace courses? For managers, is it

easy to, you know, generate reports, get the sort of reporting information that you need? [Emily]

Functionality

Delivery environment so things like bandwidth, technical environment that users will be accessing, the standard of computers, and whether they've got

access to soundcards for audio, headphones and that sort of thing [Emily]

I think obviously good technology, because when I did the e-learning module for [manager] the videos didn't work... So obviously, good seamless

technology, you know minimal technical error is usually best because I think it's frustrating if things don't work. So maybe a less is more approach,

until you get confidence about how your technology works [Flora]

I guess support in terms of making sure the server is running on enough bandwidth so you can actually get through - you know just that practical

element of trying to make sure the e-learning package actually works and doesn’t slow down or freezes [Henrietta]

It's got to be easy to login for starters because that's where we had some initial problems [Alfie]

I find sometimes with anything computer-based, it can be difficult getting your login sorted out and those kinds of administration issues can be a bit of

a nightmare sometimes [Annie]

One of my staff the other day just mentioned this to me. He was doing an online e-learning module around safety and then I needed him to do

something urgently and he's like do I close it down and look incomplete, and I think whatever you decide your system is for timing and timing out, and

as long as you make that clear when the person starts, that this how it works. If you save it or if you don't, or if you log out, this is what it will mean. I

feel unsure about if I leave this it will look like it took me five hours to do it. Which it didn't, I just minimised it and had to do other things. And I

don’t know, I think people are concerned about what I also think is a reasonable concern about the timing and time out feature works... Because I think

you have to be clear that the reason we're going to this is so it's better integrated into the working day. So it's not time out of the office as it has been

with face-to-face classes. So to have that you have to have an easy use for going in and out of it, perhaps, throughout the day [Flora]

For whatever reason if he walks away from his desk, it goes into sleep mode or whatever, but when he comes back it's still there for him. Or one where

he can go away to do a job and actually close out of it, but when he goes back in, he goes back into the same spot that he was up to last time [Billy]

Legitimacy

I think a bit of branding so that you feel confident about it being connected to your company or your organisation... I don't know about whether other

people have commented on accent or the presenter or the voice. The one I did was American. Then perhaps if it was an Australian accent or even other

accents, or a mix of accents. I don't know if that's important or not... I definitely remember noticing, oh this is an American product. Off the shelf and

I made a whole lot of assumptions about what that meant, so the validity of the training course, which were perhaps were accurate or inaccurate, we'll

never know... It may be, when things do have, there is a foreign element... that could be distracting [Flora]

If it was a perfect world where you had every opportunity for everything, yes I would possible say a voice over, as long as it wasn't a real broad yank

talking [Billy]

Long-term knowledge resource

The only drawback I find is the fact that I don't have anything to take away with me to refer to later... I don't really remember any of the contents so if I had to undertake an investigation, I'd have to go and source where I can re-read the information [Annie]

Being able to keep records of what you've done, being able to see your own training history... If it's offered to you as a one off, once you've done it you

can't access it again, that's not ideal. But if you can perhaps do the course, gain your accreditation for it, but then in the future go yes, I remember I

learnt that, and go back and log in. It could add a lot of value... I know that having done sort of Microsoft Office based training in the past, I would

love that because you think it makes perfect sense to me while I'm doing pivot tables and I get back to my desk and I'm like what was that? And you

go to your book but that's not quite the same as being on your computer [Flora]

My preference would be to capture that digitally because you've got that opportunity. So these simulators here, for example, when we do have

assessments, if somebody's not meeting the outcomes of the course, we actually record that assessment as evidence of why they didn't meet the

outcomes of the course. We can go back to it at any time and it captures everything, even the footage... Well it captures footage, real-time footage, all

the control inputs, we can see the screen of what they were seeing at the time, we can see any - I guess the inputs that they make on our procedural

trainer, so if they had to go to a location or virtual train and rectify a fault, we can chase them and follow them in the replay. We can put markers in

there so that we can easily go back to that time in the assessment and say, hang on, they fell below the mark here, bang, bang, bang and you can see it

all the way along. You can just pull it up any time [Thomas]

Flexibility

Flexibility, self-paced... Something that allows the participant to interface at their own pace... From a trainer's point of view, you really need to have

the flexibility to be able to inject faults and - or events, let's call it events, into that scenario and also be able to record that... And I think what’s been

successful about our training is that the trainers control the faults on the simulator, and they interact with the participant, and based on the participant’s

decision making processes, that’s what triggers off what they’re going to do next. So it’s not controlled... they can mix it up, and based on the

participant’s strengths and weaknesses... The part task simulator which is interactive software...We aren’t very fond of it because it’s very rigid; it

hasn’t got any flexibility in it... that’s the positive I see out of the simulator training. Its fluid and you can engage with the participant, whereas

something like that, it’s fixed, and it might not necessarily be a wrong step, but if you do something and it doesn’t like it, you have to start the whole

process all over again [Percy]

So it's not just a yes no or A, B, C, D answer and you've got a number of different options and you can cruise around in different branches. Because

there's no such thing as a linear response in the workplace, so I might make a decision two steps into the process that I could make four steps into the

process, or six steps in and still have the same outcome. So there's lots of opportunity there [Thomas]

Accessibility

It's got to be able to be used in different locations. Some of our areas that don't actually have access to computers, so a bit of flexibility is involved

with that. I know in Tracks we're looking at setting up sites for that, for the remote learning or e-learning. So that we’ve got have a number of

computers where they can go and actually sit at these computers with a standard logon and logon on to do the e-learning at these sites [Bertie]

Something that's accessible by everyone... coming from a crewing background, crew have a lot of down time for example, stand by, waiting for trains

and they don't have a lot of computer access anyway. They can do their job without ever touching a computer ever. But if it's accessible through

desktops, I think you'll find a lot of people would use it, so it really needs to be accessible... I think there's a big hole in what we do, because you could

have desktops in a depot or in a location where train crew are and you could say to them, look guys, you've got six months go to through and do this

training and assessment. A lot of crew really are keen to keep their skills up and often complain that they don't have that opportunity.[Thomas]

Note: Entries are displayed as transcribed.


Appendix 6 - Example narratives relating to descriptions of information quality

Format

I think it's probably good to have a blend so that it's a bit of variety in it. I'm pretty sure that's ours did have that; it had some videos, it had some

hyperlinks [Annie]

It’s probably better to have a blended approach [Henrietta]

Diagrams... Question and answering using diagrams et cetera. Pictures relating to different parts of the train and mechanisms and duties [Henry]

Not reading much, because reading is just like reading a book and just stress out your eyes. So I think normal video would be better [Toby]

I actually like the audio so I don’t have to read the instructions or read the text on the screen [Henrietta]

Some text you read, some voices you hear, some videos you see [Flora]

What I liked about that particular presentation was the fact that there was a script... it basically said to you what was seen... The videos are also good

[Alfie]

My only preference is, where possible, graphics to go with it, not just words. So you could actually - see what he should have seen, what he did each

page [Billy]

Nature of content

If it’s not safety critical, and it doesn’t matter how you do it, you’re generally heading in the right direction... Training content that is safety critical

related, I think, should not be on e-learning... Because you need the human interaction in regards to question answers. When you use an e-learning tool

you can never ever get an appropriate answer from a computer... If you put something on an e-learning tool they might misinterpret it. So you've got to

be particularly careful about how you use an e-learning tool, particularly if - when it comes to legislation and safety [Edward]

It's good for theoretical content; for example, for management courses... [My manager] actually helped me to do something on e-learning online in

regards to management a couple of weeks ago. It was a short course, but I actually got something out of it. I felt that... it’s not safety critical related, so

even if I did misinterpret or misunderstand it, it's not going to kill anyone [Alfie]

In terms of the type on content, whether or not the content is sort of stable in terms of you know, does it get updated often and if it does, then it

probably isn't that suitable for e-learning... Whether it needs to be delivered to a wide audience that is geographically dispersed because if it does that

would be quite suitable... Whether or not it's safety critical or not is one of the key considerations within this organisation as to whether or not they

would consider it suitable for e-learning delivery [Emily]

Safety will be a great one to do because it’s something that involves a high level of participant base, a high universal base need. But if you are having

to update the content and that involves a huge cost each time every six weeks, then it doesn’t become cost effective... For example, currently we are

about to launch a pre-induction component to our induction. So that’s quite an effective way of one taking e-learning and incorporating it into a sort of

blended approach [Henrietta]

From my perspective of development... it currently is not just the learner needs but also the budget, the cost of it. So therefore you need to look at

taking content that will need to be changed on a highly regular basis, so therefore it will end up being so - it doesn’t make it cost effective [Henrietta]


Whether it needs to be delivered to a wide audience that is geographically dispersed because if it does that would be quite suitable [Emily]

Anything that's really just information based where you don't need to have the two-way communication. And that really means that most of our

courses could really be converted over to e-learning... most of our induction courses could be e-learning [Annie]

It's probably induction type material, because I don't think that needs to be particularly interactive [Flora]

I know lots of people think a lot of the corporate induction type training or generic training on the SMS training is a good avenue. I think we tend to do

a lot of briefings, and I think that’s really what it should be for as opposed to briefings that are generally delivered by anyone – they don’t have any

trainer assessed quals, and usually the quality is pretty average. It’s just an information dump, and people can just read it. For example, we had to get

5,000 staff to watch a particular video that went for five minutes, and then obviously the benefits of putting it on LMS and then we can keep track of

who has and hasn’t watched it [Percy]

In terms of management development, you know things like code of conduct, fraud and corruption. Those sort of policy and procedure type compliance - that level of compliance driven, policy and procedural information knowledge based - that can work quite well as well in terms of updates and briefings [Henrietta]

If it’s straightforward stuff, it’s not too technical. It’s just - we don’t sort of use computers that often here. Some blokes do but most of us in general

don’t. I’m speaking on my behalf, my personal experience, I’m not much of a computer freak or whatever. There’s a couple that are. Just the basics. If

it’s pretty basic to use. I know enough to do what I need to do [James]

I guess non-technical. I think for things like driver trainers and things like that, that they need to have that trainer upfront so they can ask and question,

you know the theories and the systems. So your non-technical is I guess what I think [Annie]

We've got a safety system. So it's sort of already quite a paper based form filling, fill in fields and you've got to fill in the right fields and code things.

That sort of stuff I think is quite good for - but it's also like when you're doing that task, you're doing it at the computer, so doing it as an e-learning

module is probably quite effective. Whereas doing an e-learning module on managing performance, you don't do that at a computer. You do that with

people... Perhaps if it's people based skills, or communication based skills, e-learning might be only part of a broader training package for people. But

if it's systems based, or computer-based tasks, then it's probably quite well suited to e-learning. Especially if you can go back to it [Flora]

If you're creative, anything... this is all from the crewing perspective, essentially any part of the job can be trained and assessed, theoretically. So any

of it, to be honest, any of the crew job can be done - can be trained and assessed or upskilled using the e-learning. As I said, it just depends on the

approach you take with it [Thomas]

I think any e-learning program - any lesson that could have been taught by a person standing at the front of the room just talking to me can be done by

e-learning. The only difficulties with e-learning I think is where you've actually got to have a hands-on component, because it's difficult for e-learning

to know if the person actually got the grasp of what you had to do in the hands-on part [Billy]

Relevance

Relevant to the job. Relevance definitely... With the traditional e-learning it's basically computer-based and it's you on the computer so you don't get -

for us, you don’t get the practical side... relevant to what you’re doing to your job I think that’s the main thing... Specifically related to the job... So as

long as it’s relevant to the job and you know how to use it that’s the main thing for me and you can identify it and actually carry it out in your actual

day to day work... How relevant it is to your work [Henry]

I know that there's a lot of stuff out there that anyone can get into. Everyone's got access to it now. But most of what they do is things like computer

program training, so how to use Excel or how to use Word properly or whatever. That's all well and good if you work in an office. For our train crew,


not so much; there's not a lot out there that they have - that is valuable to their role at this point in time, so it really needs to be targeted to the role...

For the average person in an office who uses Excel all the time or uses Word, great. For train crew who never use Excel or Word, it's not targeted at

them. So you're not actually reaching out and engaging them, saying we've got this really cool system, we want you to be involved in it and this is

what's in it for you. You're not getting buy in from them [Thomas]

With my experience on that one that we did, like I said it was basically - you had a picture and there was noises to go with it so you just clicked the

mouse and go to different parts of the train. So it’s close to reality as possible I guess... The older ones... The tracks and lines we were going through

were fictional where they didn’t exist, whereas the new Waratah one is computer generated images in front of you are actually based on actual lines.

So you were already familiar with what’s going on. I think it’s much improved. I think it gets back to the reality aspect [Gordon]

A good simulation system should be as practically equivalent to the real life gadgets as possible [Toby]

Realism... The closer you get to real the better. So it's got to really reflect the environment that they're working in. You need good graphics, you need

realistic lines. So a track that they work over on a daily basis would be good [Thomas]

If it’s relevant to your day to day job... how real is it to what you do on a daily basis... with our simulators, the layouts are pretty much exactly the

same as if you were sitting on a train. So, the only difference is pretty much that you’re driving, you feel like you’re in a train and it’s got the

movements and everything [James]

Ease of understanding

It should be easy to understand [Toby]

I guess the outcome at the end, whether I can actually understand the information and pass the assessment; that'd be part of I'd consider whether it's a

reasonable sort of a program or not on e-learning. So if I was not successful, I would question whether it was me or whether it was the actual program

[Annie]

It's got to be in plain English [Billy]

Generally the literacy level needs to be there. So you need to be clear about which language you’re going to use et cetera [Henrietta]

Interviewee: I think the other thing that they were interesting in when I did the test for them was about language, and I thought that the language was

fine.

Facilitator: So whether it was appropriate, do you mean, for the person?

Interviewee: Yeah, exactly. I didn't think that they used any complex language or technical HR type language or yeah, I didn't think that the language

was in any way disengaging. It was very plain speak, which I think is the point, or the goal, when you want to talk about managing performance

[Flora]

Interaction

Something that's interactive, engaging [Percy]

I would rate it by whether it kept my interest, whether - so I can immerse myself into it and generally I find no - for those sorts of programs. I guess

that comes to the interactive part. How interactive I am in it, you know, whether or not I am just reading which usually is - well, for the two courses

that I saw it was read the page, click for the next page, do a couple of model questions before undertaking your assessment. I didn't find it terribly

exciting, you know. It wasn't a really riveting topic and maybe if it had have had a bit more video, maybe, might have helped for the particular topic

[Annie]

In the long-term I'd like to see programs that are designed where it becomes quite interactive [Thomas]

There should an interactive part attached to that simulation as well [Toby]


A lot of hands-on using the systems would be ideal. Visual representations of how to do things followed by doing it themselves would be useful. It's

the old see and do training technique. So you see it first and then you do it [Bertie]

In terms of the actual sort of content side of things, it just needs to be well designed and it needs to be the sort of interactivity that's embedded in the

content... because a lot of what you do see is just basically online information... you might as well just send out a PDF, if you're going to that. So,

meaningfully interactivity [Emily]

Because the approach needs to really give them opportunities to practice the knowledge or skills that they need to you know, be able to attain once

they've actually completed it [Emily]

A lot of the e-learning models here; like you just basically refer to your Powerpoint slides, and they ask some questions, and you think... I felt, I don’t

even know what I’ve read for the last 10 slides because I haven’t been paying any attention. And then it’s just like a guessing game. You can usually

get most of them or if you only get one wrong, they’ll send you back and you do it all again [Percy]

Because some e-learnings, as you are doing, it explains you could do this better, this and this better - though it was more like children playing games

on computers... We have a simulator. If you don't do it correctly it will say, no, go back... It can be like different modules. First you show them what

you have to do, if it is a task-based thing. So it can be some demonstration of a task. How it can be done. Then the second part would be that you do

that task online, or on a computer yourself. Then, as I said before, that interactive part comes into play, if the computer interactively tells the person

that, no you are wrong here. Or instead of using a negative word 'wrong', it might say that this would have been a better way of doing this task. That

way the computer can also play a part in e-learning... So that is interactive. That's good. It has to be interactive, so that it gives you constant feedback

[Toby]

But it’s actually quite kinaesthetic because you then have to kind of follow through various quizzes and tests... So that was really quite a clever

approach. It isn’t the standard, you know, read a bit of text on the screen and then move from one text driven scenario to another text driven scenario.

So I prefer when it’s not just - so the best e-learning for me is when it’s a high level of interaction and it’s pretty sophisticated in terms of when you

are being assessed it's not obvious. You do really need to keep your wits about you. When you are assessed, you get immediate feedback [Henrietta]

Alignment

First of all being very clear on the learning outcomes and the learning objectives that you kind of want out of the e-learning to begin with. Then just

really, then making sure that all of the activities that you design within the e-learning course contribute to meeting those objectives. I guess it comes

down to really making sure that you are giving them opportunities to develop the knowledge they need to be able to - ultimately, it's to be able to do

their, part of their job; any activities that you design within the e-learning [Emily]

A badly designed assessment is as bad as having nothing at all, really. I'm actually of the thinking that an assessment at the end is not necessarily

always required if you've got relevant and meaningful interactivity throughout the actual module. I mean an assessment can be good to provide, you

know, if you want to compare - but it needs to be well - the questions need to be well written, they need to be, they need to actually measure the

learning objectives that you sort of have in mind to begin with. A lot of assessments just measure recall, really... if they're badly written questions then they're not really going to be effective as an evaluation tool anyway [Emily]

Basically an application that relates – is clearly... has very clearly defined goals in terms of learning outcomes and a clear understanding of the trainer

profile [Henrietta]

There was one tool in there, an assessment tool, which I thought sort of was just kind of put in there and I don't know that it was very well integrated in

terms of how you would use it [Flora]

According to the computer, you are judged on time. I mean, again, work doesn’t want to judge you on time. They don’t want you to hurry up, work,

they want you to take your time. Safety is critical. So, I don't know how they mark people on time on the computer then [Edward]

Accuracy

When we actually did the assessment - well, we actually had a lot of problems with the actual way it was set up in regards to the assessment... what

happened was we had a different series of questions that we - we had a base of 500 questions. From that base of 500 questions a candidate will get

asked 100. The questions will be multiple choice; select one answer or different responses. None were written; no written response answers. From

those questions we had a lot of problems because the questions themselves didn't have the correct answers, or they had multiple answers or other

problems [Alfie]

A good example would be braking into a station. You can imagine a train - the pure size of it, the braking capabilities of it is lower than say, a

motorcar or something. With a simulator one thing you notice when you’re coming to a platform at 90ks an hour, put the brakes on and you would be

stopping halfway along the platform. You do that in the real world … Choom, out the other end you go [Gordon]

Note: Entries are displayed as transcribed.


Appendix 7 - Example narratives relating to descriptions of support quality

Types

No support required

In an ideal world, they wouldn't need any support. It should be simple enough to use and - or simple enough that you

can give them very basic support and instructions. I mean having somebody on the other end of the phone would

probably be the best outcome, would be the best opportunity I guess for them. If they're not sure, they can come back

to it [Thomas]

I think the concept of e-learning is to not have a trainer at the end of the day, so for you to try and understand the

content you need to have other people with you in the room [Alfie]

Built-in support/glossary/tutorials

Support to be able to actually know how to access the e-learning; if it's going to be via the LMS because that's

another system that they're going to have to know how to navigate in order to actually get access to the content...

From a content point of view to actually have support to, if they have any questions about the actual content itself, so,

to at least have you know, a statement saying refer to your manager or contact this department or a number or

something. Or if they have any requests for further training in that area, to have some sort of a contact or support

number [Emily]

For those people who aren’t as comfortable with technology, it definitely helps to have a little intro as to how it works. I guess the other thing is sometimes navigation all through the site. Even sometimes, as obvious as one would think, so having a how-to thing... I quite like that kind of pre-learner thing. Where you know it has a little avatar and it takes you to this bit and takes you to that bit, so I don’t actually have to read instructions. It literally just kind of - it’s a visual demo [Henrietta]

Certainly there's a need for a tutorial based on how to actually get around the system. The actual user functions [Bertie]

Maybe some sort of question portal as well. That you can say I didn't understand something and that goes to a Q and A question board or an administrator who can get back to you with an answer [Flora]

Interviewee: As far as the content, so if you were training and assessing safe working, I think that needs to be built into the program. So if people are getting a bit off track or aren't getting to the level they should be, they should be guided by the program.

Facilitator: Almost a built-in help system to guide you through it.

Interviewee: Exactly.

Facilitator: So rather than a person, the system itself becomes the support.

Interviewee: Yeah, because otherwise it becomes too labour intensive and the whole point of it is to reduce the - I mean training is a hugely labour-intensive process, so that's the whole point of e-learning, is to reduce that intensity [Thomas]


Phone/Hotline

I think if you had somebody on the other end of the phone pick up and say, okay, I'm having problems actually operating or I don't understand how to get into it, you need to have that facility [Thomas]

Certainly when you're completing an assessment of any sort and to be to know whom to call to relay any issues that are coming about. If it's shutdown or if it's not working or not responding as it should be. There should be some support in that area [Bertie]

I think there should be a hot phone number, where you can ring - if you get in any difficulty or you don't understand something, even though you've read it three times, if you still can’t understand it there should be someone you can ring up who's an expert on that e-learning topic and say, look, I just don't get this, can you explain it to me more simple terms [Billy]

In an ideal world, a helpline. When I went to log in to do the course, I couldn't see the course to click on. So I had to phone the administrator of the test case and she had to talk me through, and I imagine that would happen with a lot of people if it's the first one or two times that they log on, because it won't be familiar [Flora]

I think what would be good would be like a helpdesk where you could ring and say you know I'm experiencing this issue or some other issue and maybe that could be related to the content of the course or it might be actually navigating your way through the course [Annie]

I know you said to focus on myself as the user and I use a computer every day. But the reality of Tracks is that's not the case for all staff. So I think a helpline just in terms of tech support is probably, at least to start up [Flora]

Learning and Development Plan Support

The list of courses and how they're grouped together, or perhaps a good - being able to set up a bit of a training plan for yourself... Maybe someone - because the first kind of helpline I suggested was more of a tech support, so if my video isn't playing or my click isn't working, what do I do, but perhaps there's also access to someone to say well I know that I'm now on this new job and I need to do a set of training courses but I'm not exactly sure which ones are suited to me. And I know you'll have some of that conversation with your manager and with your team that you work with but perhaps that's something for L and D as well, to offer some guidance around which courses are relevant to you or that type of thing [Flora]

Trainer in the room

The good thing about having the trainer as well is because they’re interacting, and the trainer plays basically like a narrator and a role play at the same point in time. So if they have an issue, they’re in the signal box, they’re doing it with the trainer who’s playing the signaller, so it’s quite realistic [Percy]

Well I think that the way they had it set up was pretty spot on. When we were in the cab we had an instructor outside, like a little control room and they were able to communicate to us through a loudspeaker within the cab and we could talk to them and there was that support base there. If we had a question right there and then you’d go, hey what’s this, and they’d speak to us [Gordon]

Expectations

Timeliness

You'd rate it on the timeliness of it. If you took an hour to respond to it or take a day to respond to it [Bertie]

The good parts would be it's 24 hours, and it's in a format where you don't have to wait for the person to come back to you, you know, two hours later. Because if you're e-learning you really want the answer there and then on the spot, so you can move onto the next module. That's the whole – to me that’s the whole concept of e-learning. I've got 10 minutes downtime, I want to spend some time on a computer learning a program, but I don't want anything to interfere with it [Billy]

So it needs to be a pretty quick response... that phone call is actually answered in a timely way and not - because if you're sitting in front of the computer and you're doing something and you've got a query, it's no good getting a call back half an hour later because you're already in the middle of you know, whatever the course is [Annie]

Instant support. You don't want to send an email and wait for somebody to come back to you two days later and say, yeah, this is how you do it [Thomas]

Knowledgeable

I would be looking for the person knows the content of the courses, that they're a specialist in the area and can answer questions that come up [Annie]

Support? First of all they have to obviously know how to use the thing so they can actually teach you [Henry]

Because if I was to think that hypothetically that the trainer hypothetically had no idea what he was doing, I would be saying, this is a waste of time. But because they knew what they were doing, it was very beneficial [Gordon]

Effective/Useful

I'm talking about the initial tutorials on the system. You could be rated on the effectiveness of the tutorial, whether it was clear, precise. If you understood it straight away. The need to review it multiple times because you didn't understand it because it wasn't as clear... Whether it resolves your issue or not. You would rate it against that [Bertie]

I guess a good measure as well would be that the calls placed would reduce over time. Because you're not then just answering one offs but you're enabling users. That the answers you're giving are not just saying click here but saying the reason to click here is because, you know, it means you're doing this or it means you can access that, and so it's sort of giving those quality answers that mean you understand what you're doing, not just being told how to fix it this time. So I think that would be good quality support. That people would actually use it and not just ask the person next to them who did it last week. Yeah because I think sometimes if you ring up and you get put on hold, or you can't reach anyone, then you'll stop using something like that [Flora]

Interviewee: Yeah, now I don't know whether we do that. Do we?

Facilitator: I'm not sure.

Interviewee: I'm not sure either and if I'm not sure that means if we do it we don't advertise it very well [Annie]

Note: Entries are displayed as transcribed.