
Methods to Increase ITIL Adoption

Nuno Encantado Faria

Thesis to obtain the Master of Science Degree in

Information Systems and Computer Engineering

Supervisors: Prof. Miguel Leitão Bignolas Mira da Silva
Prof. Rui António dos Santos Cruz

Examination Committee

Chairperson: Prof. Francisco João Duarte Cordeiro Correia dos Santos
Supervisor: Prof. Miguel Leitão Bignolas Mira da Silva

Member of the Committee: Prof. Rúben Filipe de Sousa Pereira

October 2018


Acknowledgments

I would like to express all my gratitude to everybody who contributed to making this thesis possible.

To my supervisors: Prof. Miguel Mira da Silva, who with his skill, knowledge and experience accompanied and helped me, especially by building the bridge with the companies involved in this work; and Prof. Rui Cruz, for all his commitment and pleasure in supporting and advising me. I thank both for all the knowledge, motivation, critical thinking and advice given to me, which broadened my experience and greatly helped me in this work.

To the teams in both involved companies, for their support, shared experience, dedicated time and precious data for the demonstration and evaluation of my proposal.

To my friends, for all the amazing moments in my life and for their special support and understanding throughout this journey.

And finally, I would like to thank with all my heart my mother, father, brother and family for all their sacrifice, love, support and presence throughout my life. No words can express all my gratitude to them.


Abstract

Despite the many benefits Information Technology Infrastructure Library (ITIL) can provide to companies, its adoption is still lacking. The barriers, especially the difficulty of its implementation, lead companies to make mistakes and to abandon it. The appearance of Critical Success Factors (CSFs), adoption models and road-maps to help achieve success in this complicated process reflects the increased effort to solve this problem, but they remain, to some extent, high-level solutions. Using the Design Science Research Methodology (DSRM), this thesis proposes two methods that contribute to increasing ITIL adoption through technology and evaluation, focused on people and processes, by improving the effects of two CSFs. These methods were demonstrated in two companies to assess their relevance: one from the banking sector (for the first method) and the other from the Information Technology (IT) consulting area (for the second method). Using the principles of Österle et al., critical analysis and the Moody and Shanks quality framework, the two methods were validated and evaluated, showing their effective potential to increase ITIL adoption.

Keywords

Information Technology Infrastructure Library (ITIL), ITIL Implementation, ITIL Adoption, Critical Success Factor (CSF), Design Science Research Methodology (DSRM).


Resumo

Apesar dos inúmeros benefícios que o ITIL pode fornecer às empresas, ainda há falta da sua adoção. As barreiras, principalmente a dificuldade na sua implementação, levam a que as empresas cometam erros, abandonando-a. O aparecimento de CSFs, modelos de adoção e road-maps para ajudar a alcançar o sucesso neste processo complicado é representativo do crescente esforço para resolver este problema, sendo ainda, em certos aspectos, soluções de alto nível. Usando DSRM, são propostos nesta tese dois métodos que, pela melhoria dos efeitos de dois CSFs, contribuem para aumentar a adoção do ITIL através da tecnologia e avaliação centradas nas pessoas e processos. Estes métodos foram demonstrados em duas empresas para avaliar a sua relevância: uma do sector bancário (para o primeiro método) e a outra da área de consultoria de IT (para o segundo método). Usando os princípios de Österle, a análise crítica e a framework de qualidade de Moody e Shanks, os dois métodos foram validados e avaliados, mostrando o seu potencial efectivo para aumentar a adoção do ITIL.

Palavras Chave

Information Technology Infrastructure Library (ITIL), Implementação de ITIL, Adoção de ITIL, Critical Success Factor (CSF), Design Science Research Methodology (DSRM).


Contents

1 Introduction
1.1 Research Methodology
1.2 Document Structure

2 The Problem

3 Theoretical Background
3.1 Information Technology Infrastructure Library (ITIL)
3.2 Multiple Criteria Decision Analysis (MCDA)
3.2.1 Outranking Methods
3.2.2 Analytical Hierarchy Process (AHP)
3.2.3 Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH)
3.3 Monitoring and Evaluation Systems
3.3.1 Implementation-Focused Monitoring and Evaluation Systems
3.3.2 Results-Based Monitoring and Evaluation Systems

4 Related Work
4.1 Critical Success Factor (CSF)
4.1.1 Origin and Classification
4.1.2 Relations between CSFs
4.2 ITIL Adoption Models
4.2.1 ITIL Adoption Model with TAM
4.2.2 ITIL Adoption Model with UTAUT
4.3 ITIL Implementation Roadmap

5 Research Proposal
5.1 Objective
5.2 ITIL Tool Selection Method
5.2.1 Identify the Criteria and Define their Performance Levels
5.2.2 Weight the Criteria and Evaluate their Performance Levels
5.2.3 Test the Tools and Analyze their Documentation
5.2.4 Analyze the Results
5.3 ITIL Processes' Performance Evaluation Method
5.3.1 Select the Evaluation Criteria and Metrics
5.3.2 Define the Analysis Period and its Metrics' Targets
5.3.3 Calculate the Metrics in the Analysis Period
5.3.4 Analyze the Results and Evaluate them According to the Selected Criteria
5.4 Principles of the Evaluation Methodology
5.4.1 Design Science Research Evaluation Framework
5.4.2 Four Principles of Österle et al.
5.4.3 Moody and Shanks Quality Framework

6 ITIL Tool Selection Method
6.1 Demonstration
6.1.1 Identify the Criteria and Define their Performance Levels
6.1.2 Weight the Criteria and Evaluate their Performance Levels
6.1.3 Test the Tools and Analyze their Documentation
6.1.4 Analyze the Results
6.2 Evaluation
6.2.1 Design Science Research Evaluation Framework
6.2.2 Four Principles of Österle et al.
6.2.3 Moody and Shanks Quality Framework

7 ITIL Processes' Performance Evaluation Method
7.1 Demonstration
7.1.1 Select the Evaluation Criteria and Metrics
7.1.2 Define the Analysis Period and its Metrics' Targets
7.1.3 Calculate the Metrics in the Analysis Period
7.1.4 Analyze the Results and Evaluate them According to the Selected Criteria
7.2 Evaluation
7.2.1 Design Science Research Evaluation Framework
7.2.2 Four Principles of Österle et al.
7.2.3 Critical Analysis
7.2.4 Moody and Shanks Quality Framework

8 Conclusion
8.1 Lessons Learned
8.2 Limitations
8.3 Main Contributions
8.4 Communication
8.5 Future Work

Bibliography

A Criteria Weighting Judgments Matrix
B Sensitivity Analysis
C Performance Analysis


List of Figures

1.1 Design Science Research Methodology
3.1 ITIL v3 core
4.1 Relations between key factors
4.2 ITIL adoption model using TAM
4.3 ITIL adoption model using UTAUT
4.4 Roadmap for ITIL implementation
6.1 MACBETH judgments matrix and numerical scale for criterion "Activities"
6.2 Weighting scale for the criteria for each process or data source presented in Table 6.1
6.3 Overall value scores of the alternatives
6.4 Robustness analyses
7.1 Performance analysis support system
7.2 Distances to the ITIL process's targets from the demonstration of the method
7.3 Prediction precision and process status deviation from the demonstration of the method
8.1 Validation and evaluation profiles of the ITIL tool selection method (blue line) and the ITIL processes' performance assessment method (orange dots)
A.1 MACBETH judgments matrix for the criteria weighting
B.1 Complete sensitivity analysis


List of Tables

4.1 CSFs in ERP implementations
4.2 Comparison of CSFs in ITIL implementations
4.3 Comparison of Pollard and Cater-Steel CSFs in ITIL
4.4 Mapping between ITIL CSFs and their classes
6.1 Mapping between assessment criteria and process/data sources
6.2 Mapping between evaluation criteria and ITIL recommendations for the selected processes and data sources
7.1 Mapping between selected metrics and criteria and their implementation
C.1 First subperiod performance results
C.2 Second subperiod performance results
C.3 Performance comparison between subperiods


Acronyms

AHP Analytical Hierarchy Process

AT Attitude Towards use

BI Behavioral Intention to use

CSF Critical Success Factor

DACH Deutschland, Austria and Switzerland

DM Decision Maker

DSRM Design Science Research Methodology

ERP Enterprise Resource Planning

IS Information System

IT Information Technology

ITIL Information Technology Infrastructure Library

ITSM Information Technology Service Management

KPI Key Performance Indicator

MACBETH Measuring Attractiveness by a Categorical Based Evaluation Technique

MCDA Multiple Criteria Decision Analysis

OGC Office of Government Commerce

PEU Perceived Ease of Use

PU Perceived Usefulness

TAM Technology Acceptance Model


U actual system Use

UK United Kingdom

USA United States of America

UTAUT Unified Theory of Acceptance and Use of Technology


1 Introduction

Contents

1.1 Research Methodology
1.2 Document Structure


The evolution of technology is leading the world to a "second machine age" [1], where service quality is becoming more than an option. By investing in strategic quality improvement, companies can better face disruption, along with economic difficulties and increasingly demanding customers.

Information Technology Infrastructure Library (ITIL) has many benefits [2–5] that have increased interest in it in many countries [4, 6–11], making it a widely accepted methodology [12]. However, many companies still make mistakes when trying to implement it [13], due to the great number of barriers to its adoption [14]. One of the most impactful barriers is the difficulty of its implementation, mostly because ITIL does not provide advice on how to implement its best practices [15]. Because of that, many companies abandon their intentions to implement ITIL [9], leading to the research problem: the lack of ITIL adoption.

To address this problem, two methods focused on two Critical Success Factors (CSFs) for ITIL adoption are proposed, due to the importance of these elements for a successful ITIL implementation, as evidenced in the literature review (cf. Chapter 4). The objective is to create a mechanism that, through technology and evaluation focused on people and processes, contributes to increasing ITIL adoption.

The first method focuses on selecting ITIL tools based on Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH), increasing the positive effects of the "ITIL tool selection" CSF; the second is a method to evaluate the performance of the selected ITIL processes, based on the building actions of results-based monitoring and evaluation systems, which aims to increase the positive effects of the "monitoring and evaluation of ITIL implementation" CSF.

To demonstrate their use, the two methods were applied in two companies (one for each method). For the method to select ITIL tools, a company from the banking sector was selected, since it had doubts about the software tool it would use to implement four ITIL processes. The method to evaluate the performance of ITIL processes was demonstrated in an Information Technology (IT) consulting company, since it wanted to improve the performance of its ITIL processes.

To validate and evaluate the proposal and its results, the four principles of Österle et al. [16], critical analysis, and the Moody and Shanks quality framework [17] were used. From that evaluation, it could be concluded that the methods can contribute to increasing ITIL adoption, since they are able to improve the two CSFs for ITIL adoption on which they focus.

To communicate the results to the proper audiences and obtain scientific appraisal, the two methods were demonstrated to their practitioners, and a scientific paper was submitted and presented at an international conference [18].


1.1 Research Methodology

The methodology applied in this research was the Design Science Research Methodology (DSRM). DSRM incorporates the principles, practices and procedures required to carry out design science research, providing a nominal process model for doing it and a mental model for presenting and evaluating this kind of research in Information Systems (ISs) [19].

Aiming to create and evaluate IT artifacts such as constructs (vocabulary and symbols), models (abstractions and representations), methods (algorithms and practices) and instantiations (implemented and prototype systems) [20], this methodology intends to overcome explanatory research paradigms, such as the descriptive and interpretive ones borrowed from the social and natural sciences [19].

DSRM is an iterative methodology comprising the following phases [19]:

• Problem identification and motivation: definition of the specific research problem and justification of the value of a solution. The problem should be conceptually atomized so that an artifact can be developed that effectively provides a solution capturing its complexity. In this research, the identified problem is the lack of ITIL adoption (Chapter 2).

• Define the objectives for a solution: identification of the objectives of a solution from the problem definition and knowledge of the state of the problem and of possible and feasible solutions. The objectives can be quantitative or qualitative. For the identified problem, the objective of the solution is the creation of a mechanism that, through technology and evaluation focused on people and processes, contributes to increasing ITIL adoption (Chapter 5); the related work presents the most relevant concepts and models to address this problem (Chapter 4).

• Design and development: determination of the artifact's desired functionality and its architecture, followed by its construction. A design research artifact can be any designed object with an embedded research contribution. This research's artifacts are two methods focused on two CSFs (Chapter 5).

• Demonstration: use of experimentation, simulation, case study, proof or another appropriate activity to demonstrate the use of the artifact to solve one or more instances of the problem. The artifacts were demonstrated in two companies: the first from the banking sector (Chapter 6) and the second an IT consulting company (Chapter 7).

• Evaluation: observation and measurement of how well the artifact supports a solution to the problem. This requires knowledge of relevant metrics and analysis techniques in order to compare the results observed from the use of the artifact with the objectives of a solution. The artifacts were validated and evaluated using the four principles of Österle et al. [16], critical analysis and the Moody and Shanks quality framework [17] (Chapters 6 and 7).

• Communication: communication of the problem and its importance, the artifact, its utility and novelty, the rigor of its design, and its effectiveness to researchers and other relevant audiences. This step was accomplished with the demonstrations of the two methods to their practitioners and the submission and presentation of a paper at an international conference (as described in Chapter 8).

With its strong focus on the organizational context in the solution design, this methodology was appropriate for this research (see Figure 1.1), since a solution for the identified problem needs to combine theory with organizational acceptance in order to extend the boundaries of their capabilities [20].

Figure 1.1: Design Science Research Methodology (DSRM). Adapted from [19] to this research work.

1.2 Document Structure

This document is divided into eight chapters and three appendixes, described as follows:

1. Introduction (Chapter 1) provides the context of the thesis, explains the research methodology used and describes the structure of the document.

2. The Problem (Chapter 2) details the motivation and the research problem.

3. Theoretical Background (Chapter 3) explains the theory behind the construction of the artifacts.

4. Related Work (Chapter 4) presents an overview of the literature in the research area, explaining the most relevant concepts and models to solve the identified problem.

5. Research Proposal (Chapter 5) identifies the objective of the solution, describes the produced artifacts and explains the principles used in the proposed methodology to evaluate the artifacts.

6. ITIL Tool Selection Method (Chapter 6) explains how this artifact was used to prove its capability to solve one or more instances of the research problem and presents the results of applying the evaluation methodology to this method.

7. ITIL Processes' Performance Evaluation Method (Chapter 7) explains how this method was used to prove its capability to solve one or more instances of the research problem and presents the results of applying the evaluation methodology to this artifact.

8. Conclusion (Chapter 8) describes how the results were communicated to the proper audiences, summarizes the main conclusions, lessons learned, limitations and main contributions of this research, and presents some proposals for future work.

9. The Appendixes present:

A - Criteria Weighting Judgments Matrix

B - Sensitivity Analysis

C - Performance Analysis


2 The Problem


This chapter defines the specific research problem and justifies the value of a solution, corresponding to the first step of DSRM: problem identification and motivation.

As technology innovation increases, constraints are removed and new possibilities are created, which affects people's lives and enterprises [21].

In recent years, technology has evolved so much that this new era is now being called "the second machine age" [1], in comparison to the first one, which arose with the Industrial Revolution.

But despite the positive aspects this can bring to business, ranging from better forecasts to quicker ways to modify processes and structures, only a small share of companies have mastered the use of technology to improve their productivity, performance and profit levels [21]. One of the characteristics identified in these "digital masters" is that they see technologies as tools to transform their processes, empowering their employees and improving their relations with customers, instead of as mere goals or signals to send to their investors [21].

Service quality is intrinsically related to improving relations with customers by listening to their needs. Lewis and Booms [22] defined it as a "measure of how well the service level delivered matches customer expectations", and Cronin and Taylor [23], besides removing expectations from the equation, considered customers' perceptions as the basis of the service quality level. Both considered the same objective for service quality: improving customer satisfaction.

In this new era, where ever more companies face disruptive technologies that constantly change the rules of the game, along with economic difficulties and increasing demands from customers, methodologies that improve service quality in a strategic way become more than just an option.

ITIL is now a widely accepted methodology for improving service quality by increasing its effectiveness and efficiency. Benefits from its adoption were identified in [2–5], going beyond service quality improvement, from reduction in IT downtime to raised IT staff morale and documented, consistent IT processes across the organization. All of this led to an increased interest in ITIL adoption in many countries, as evidenced in studies in China [7], Australia [6, 8, 9], the United States of America (USA) [6, 9], Norway [4], the United Kingdom (UK) [9, 10], Malaysia [11], and Deutschland, Austria and Switzerland (DACH) [9].

But despite the benefits of using ITIL and the wide interest in it, many organizations are still far from full adoption of this methodology or have not implemented it at all [9].

Shang and Lin [14] identified several barriers to ITIL adoption in their multi-case study:

• Dissatisfied customers, due to the gap between the degree of improved service quality and customers' perception;

• Inability to satisfy customers' specific needs in time;

• Extra costs incurred in education and management;

• Time lag between the investment in the ITIL project and the performance outcome;

• Conflicts between urgent needs for quality improvement and cost considerations;

• Difficulties in implementation;

• Employee resistance;

• Lack of integration ability.

Difficulty in implementation is one of the most common and impactful barriers to ITIL adoption, mostly because ITIL offers a set of best practices but does not provide advice on how to implement them [15].

This absence of a guide to successful implementation leads many companies to make mistakes that compromise the entire investment in ITIL, with the consequent waste of time and money for no benefit. Some of these mistakes were identified in [13]:

• Lack of management commitment;

• Excessive time spent on complicated process diagrams;

• Failure to create work instructions;

• Failure to assign process owners;

• Too much concentration on performance;

• Exaggerated ambition;

• Failure to maintain momentum;

• Allowance of departmental demarcation;

• Ignorance of the importance of ITIL maintenance;

• ITIL implementation based only on book memorization.

This dark side of ITIL can make companies abandon their intentions of adopting it, especially because it requires a lot of effort and resources that can end up yielding no benefit.

In short, the problem is the lack of ITIL adoption.


3 Theoretical Background

Contents

3.1 Information Technology Infrastructure Library (ITIL)
3.2 Multiple Criteria Decision Analysis (MCDA)
3.3 Monitoring and Evaluation Systems


This chapter is divided into three sections: ITIL, Multiple Criteria Decision Analysis (MCDA) and Monitoring and Evaluation Systems.

Section 3.1 explains the origin of ITIL and how it evolved through time, starting with its definition and the goal that led to its first version, and then detailing the components of its latest version (ITIL v3). Section 3.2 explains some of the most used MCDA methods. Finally, Section 3.3 provides a summary of the most used types of monitoring and evaluation systems.

3.1 Information Technology Infrastructure Library (ITIL)

ITIL is a set of good practices to be applied to the infrastructure, operations and management of IT services, being now "the most widely accepted approach to Information Technology Service Management (ITSM) in the world" [12]. In this section, the origin and evolution of ITIL are first explained, followed by a detailed look at its latest version (ITIL v3).

ITIL originated in the 1980s, introduced and distributed by the Office of Government Commerce (OGC) in the UK in order to promote efficient and cost-effective IT operations as a consequence of the growing dependence on IT.

The initial version (ITIL v1) consisted of a set of more than 30 volumes covering all of IT Service Management, leading to its recognition as a reference framework in this area.

In order to make it more accessible and affordable, ITIL v2 was created in 2000 as a consolidation of the v1 volumes into logical sets divided into two main areas: service delivery (focused on the services needed to adequately support the business) and service support (focused on ensuring customer access to the appropriate services).

ITIL v3 appeared in 2007 as an extension of v2 and, with its 2011 update, is the current version. Contrary to ITIL v2, which was more focused on processes, v3 gives more importance to the lifecycle, with a holistic perspective on the full service cycle that covers all IT parts of organizations and the supporting components needed to deliver services to the customer.

Five components constitute the core of ITIL v3, as seen in Figure 3.1:

• Service Strategy: provides the view of business and information technology alignment and the guidance for service management as a strategic asset of the organization. It includes the processes: Financial Management, Service Portfolio Management and Demand Management [24].

• Service Design: provides guidance for the design and development of services and their management processes. It includes the processes: Service Level Management, Service Catalogue Management, Supplier Management, Availability Management, Capacity Management, IT Service Continuity and Information Security [25].

• Service Transition: provides guidance for managing the complexity of changes to services and their management processes, executed in Service Operation from the encoding in Service Design as a consequence of requirement changes in Service Strategy. It includes the processes: Change Management, Service Asset and Configuration Management, Release and Deployment Management, Knowledge Management, Transition Management and Support, Service Validation and Testing, and Evaluation [26].

• Service Operation: provides guidance on delivering and supporting services with effectiveness and efficiency in order to increase value for the customer and the service provider. It includes the processes: Incident Management, Problem Management, Request Fulfillment, Access Management and Event Management [12].

• Continual Service Improvement: provides guidance on ways to increase the quality of the design, introduction and operation of services and their linkage with Service Strategy, Design and Transition. It includes the processes: Service Measurement, Service Reporting and Service Improvement [27].

Figure 3.1: ITIL v3 core. Source: [12]

3.2 Multiple Criteria Decision Analysis (MCDA)

MCDA is "a collection of formal approaches which seek to take explicit account of multiple criteria in helping individuals or groups explore decisions that matter" [28]. In this section, a summary of some of the most used MCDA methods is provided.


3.2.1 Outranking Methods

For each criterion, partial preference functions are defined, which may correspond to natural attributes on a cardinal scale, or may be constructed as ordinal scales, without needing to satisfy all the properties of value functions; only ordinal preferential independence is necessary. In this method, if there is enough evidence to justify that an alternative a is at least as good as another alternative b, and no strong argument to the contrary, taking all criteria i into account, we can conclude that a outranks alternative b if Zi(a) ≥ Zi(b) for all criteria i [28].

A state of indifference is not necessarily implied when this does not occur. When comparing two alternatives, the result can be one outranking the other (definite preference), indifference or incomparability [28].
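As a concrete illustration, the sketch below (with hypothetical per-criterion performances) implements the basic dominance-style test described above; full outranking methods such as ELECTRE or PROMETHEE build concordance and discordance measures on top of this core relation.

```python
# Minimal sketch of the basic outranking relation: a outranks b when
# Z_i(a) >= Z_i(b) on every criterion i. The performance values below
# are hypothetical; real outranking methods add thresholds to this.
def outranks(a, b):
    """True if alternative a is at least as good as b on all criteria."""
    return all(za >= zb for za, zb in zip(a, b))

# Per-criterion performances Z_i of three hypothetical alternatives.
Z = {"a": (7, 5, 9), "b": (6, 5, 8), "c": (8, 4, 9)}

print(outranks(Z["a"], Z["b"]))  # True: a outranks b
print(outranks(Z["a"], Z["c"]))  # False
print(outranks(Z["c"], Z["a"]))  # False: a and c are incomparable
```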

3.2.2 Analytical Hierarchy Process (AHP)

AHP uses additive preference functions to evaluate alternatives. First, a hierarchy of criteria (a value tree) is built and the alternatives are identified. Then, assuming ratio scales for all judgments, pairwise comparison is used to score the alternatives on each criterion and to weight the criteria. Finally, using a weighted summation of its scores on the different criteria, an overall score is obtained for each alternative, allowing all the alternatives to be compared [28, 29].
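A minimal sketch of these two steps, assuming a small hypothetical pairwise-comparison matrix for three criteria and already-normalized per-criterion scores for two alternatives (the consistency check of the judgments is omitted):

```python
# Sketch of AHP: derive criterion weights from a reciprocal pairwise-
# comparison matrix via its principal eigenvector, then rank the
# alternatives by weighted summation of their per-criterion scores.
import numpy as np

# pairwise[i, j]: how strongly criterion i is preferred to criterion j
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()  # normalized criterion weights

# Hypothetical per-criterion scores of two alternatives (normalized).
scores = {"tool A": np.array([0.2, 0.5, 0.4]),
          "tool B": np.array([0.8, 0.5, 0.6])}
for name, s in scores.items():
    print(f"{name}: overall score {weights @ s:.3f}")
```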

3.2.3 Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH)

MACBETH is a method for multicriteria value measurement [30, 31]. For each alternative, the Decision Maker (DM) quantifies its relative attractiveness with the help of semantic judgments about the differences in attractiveness of several stimuli. Two elements are compared at a time, in an initial, iterative questioning procedure that requests only a qualitative preference judgment. The consistency of those answers is then automatically verified by the MACBETH decision support system [32].

By solving a linear programming problem, this system can also generate a numerical scale representative of the DM's judgments, as well as weighting scales for all criteria [33–35]. It is then possible to obtain overall value scores for all the alternatives and to perform sensitivity and robustness analyses, which allow the elaboration of an informed recommendation.
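The linear program behind this scale generation can be illustrated as follows. This is only a toy formulation under simplifying assumptions (fixed minimum gaps per judgment category, the worst stimulus anchored at zero); the actual M-MACBETH system uses a more elaborate formulation together with the consistency verification mentioned above.

```python
# Toy LP in the spirit of MACBETH scale generation: given qualitative
# difference judgments between pairs of stimuli (category 1 = very weak
# ... 6 = extreme), find a numerical scale consistent with them.
import numpy as np
from scipy.optimize import linprog

stimuli = ["tool A", "tool B", "tool C"]  # hypothetical alternatives
# (better, worse, category): e.g. A is moderately (3) better than B.
judgments = [(0, 1, 3), (1, 2, 2), (0, 2, 5)]

n = len(stimuli)
# v[better] - v[worse] >= category, rewritten as A_ub @ v <= b_ub.
A_ub = np.zeros((len(judgments), n))
b_ub = np.zeros(len(judgments))
for row, (better, worse, cat) in enumerate(judgments):
    A_ub[row, worse], A_ub[row, better] = 1.0, -1.0
    b_ub[row] = -float(cat)

# Anchor the worst stimulus at 0 and minimize the spread of the scale.
A_eq = np.zeros((1, n)); A_eq[0, 2] = 1.0
res = linprog(c=np.ones(n), A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=[0.0], bounds=[(0, None)] * n)
for name, value in zip(stimuli, res.x):
    print(f"{name}: {value:.1f}")  # -> A: 5.0, B: 2.0, C: 0.0
```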

3.3 Monitoring and Evaluation Systems

A monitoring and evaluation system provides an organization with information on progress toward achieving stated targets and goals, but also evidence as the basis for any necessary corrections to policies, programs or projects [36]. This section summarizes two different types of monitoring and evaluation systems.

3.3.1 Implementation-Focused Monitoring and Evaluation Systems

This is the traditional type of monitoring and evaluation system, used for projects and designed to address compliance. It is focused on monitoring and assessing how well the execution of a project, program or policy is being carried out [36].

Some of the key features of this type of system are: systematic reporting on the provision of inputs and the production of outputs, provision of information on administrative, implementation and management issues, and data collection on inputs, activities and immediate outputs [36].

3.3.2 Results-Based Monitoring and Evaluation Systems

This is a more recent type of monitoring and evaluation system, which helps answer questions like "what are the goals of the organization?", "are they being achieved?" and "how can achievement be proven?". Its focus is on providing feedback on outcomes and goals, comparing how well a project, program or policy is being implemented against the expected results [36].

Some of the key features of this type of system are: data collection on outputs and on how and whether they contribute toward the achievement of outcomes, reporting with more qualitative and quantitative information on the progress toward outcomes, and provision of information on the success or failure of the strategy in achieving the desired outcomes [36].

The essential actions to build such a system are [36] (a minimal sketch of this loop follows the list):

• Formulate outcomes and goals;

• Select outcome indicators to monitor;

• Gather baseline information on the current condition;

• Set specific targets to reach and dates for reaching them;

• Analyze and report the results.
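As announced above, here is a minimal sketch of that loop with a hypothetical outcome indicator; the indicator name, baseline, target and measured value are all illustrative, not taken from the demonstrations in this thesis.

```python
# Sketch of a results-based check: compare a measured outcome indicator
# against its baseline and target for the analysis period, and report
# progress. All numbers and the indicator name are hypothetical.
baseline = {"incidents resolved within SLA (%)": 62.0}
targets = {"incidents resolved within SLA (%)": 80.0}
measured = {"incidents resolved within SLA (%)": 71.5}

for indicator, target in targets.items():
    base, value = baseline[indicator], measured[indicator]
    progress = 100 * (value - base) / (target - base)  # % of way to target
    status = "target reached" if value >= target else f"{progress:.0f}% of the way"
    print(f"{indicator}: baseline {base}, measured {value}, "
          f"target {target} -> {status}")
```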


4 Related Work

Contents

4.1 Critical Success Factor (CSF)
4.2 ITIL Adoption Models
4.3 ITIL Implementation Roadmap


This chapter covers the first half of the define the objectives for a solution step of DSRM, in which the state of the problem and feasible solutions are presented; the objective for a solution (Chapter 5) is then identified from the problem definition (Chapter 2). The chapter begins by presenting some of the most relevant work on the CSFs of ITIL and how they can contribute to successful ITIL implementations. An overview of their origin and evolution is provided first, followed by an explanation of their relations and classifications. Finally, the models and roadmap for ITIL implementation based on those factors are analyzed.

4.1 Critical Success Factor (CSF)

In this section, the origin and evolution of CSFs are presented, with an overview of the most relevant contributions, from the definition of CSFs to their classification and relations.

4.1.1 Origin and Classification

The concept of CSF was first proposed by Daniel [37] in 1961 and popularized by Rockart [38] 18 years later. In his definition, "Critical success factor (CSF) is the term for an element that is necessary for an organization or project to achieve its mission. It is a critical factor or activity required for ensuring the success of an organization or a company" [37].

In 2001, Somers and Nelson [39] developed a list of CSFs across the stages of Enterprise Resource Planning (ERP) implementations. In their study, responses from 86 organizations that had completed or were in the process of completing an ERP implementation were combined with an extensive review of the literature on IT implementation, business process reengineering and project implementations and descriptions, resulting in a list of 22 CSFs ranked by their importance (see Table 4.1).

Table 4.1: CSFs in ERP implementations [39]. Source: [6]


Despite the focus on ERP implementation, it seemed "reasonable to expect that some of those CSFs will also be important in the successful implementation of an enterprise-wide service and process framework such as ITIL" [6], since "ITSM involves organization-wide IS planning" [6] and, consequently, some ERP.

Studies of CSFs in ITIL implementations were made by Hochstein et al. [40], in their study of six large German organizations, and by Tan et al. [41], in a study of a large Australian public-sector organization. Table 4.2 shows the comparison of those results as in [6].

Table 4.2: Comparison of CSFs in ITIL implementations [40, 41]. Source: [6]

Later, in 2009, Pollard and Cater-Steel [6] studied the implementation of ITSM using the ITIL v2 framework in four companies: two located in the USA and two in Australia. Analyzing the CSFs presented in Tables 4.1 and 4.2, they compared them with those attributed to the four successful ITIL implementations studied. Some CSFs were confirmed, and new ones were identified. Table 4.3 shows the results of this study compared with those of Somers and Nelson [39], Hochstein et al. [40] and Tan et al. [41].

Table 4.3: Comparison of Pollard and Cater-Steel CSFs in ITIL [6] with previous ones [39–41]. Source: [6]


Pollard and Cater-Steel [6] concluded that the three new CSFs identified "align well with the core ITIL philosophy: the need to extend IT thinking beyond the technology to include people and process" [6], as well as "emphasize the broad reach of ITSM beyond the concerns of IT infrastructure to viewing IT as a service organization that supports end-to-end business operations" [6]. The new "customer-focused metrics" CSF evidences that, giving importance to the rising shift from technology-focused to customer-focused metrics.

The first steps in creating classes of CSFs for ITIL appeared in 2011, with a study on the adoption of this framework aimed at creating a model that could help understand it [42]. Through qualitative meta-synthesis, seven key factors were identified based on ITIL CSFs:

• Top management support;

• Communication and cooperation;

• Training and competence of the stakeholders involved in the ITIL project;

• Change management and organizational culture;

• Project management and governance;

• ITIL process implementation and applied technology;

• Monitoring and evaluation.

Later, in 2013, Ahmad et al. [15] used those seven factors as classes for the ITIL CSFs identified in an extensive literature review. 18 CSFs were identified and mapped into the seven classes (see Table 4.4).

Table 4.4: Mapping between ITIL CSFs and their classes [15]. Adapted from [15].

4.1.2 Relations between CSFs

In 2002, one year after Somers and Nelson [39] developed their list of CSFs for ERP, Akkermans and van Helden [43] studied the relations between those CSFs, proposing that they do not act in isolation. Using an ERP implementation in the aviation industry, they not only validated the importance of those factors in explaining the initial failure and eventual success of the implementation, but also found that those CSFs appeared highly correlated, with changes in one influencing others.

With the evolution of CSF research in ITIL implementations, Pollard and Cater-Steel [6] identified relations between CSFs for ITIL, evidencing dependencies among them that affected their efficiency and effectiveness. An ITIL-friendly culture was seen as a crucial first step in any ITIL implementation in order to increase its chance of success; likewise, careful software selection needed to be addressed as a process, otherwise the effects could be negative for the implementation.

The relations between CSFs were studied in more detail by Mehravani et al. [42] in their examination of CSFs and their effect on ITIL adoption. The result was a model that illustrates not only the key factors identified from the CSFs, which were the basis for the creation of the classes, but also the relations between those key factors (see Figure 4.1).

Figure 4.1: Relations between key factors. Adapted from the ITIL adoption model using TAM [42].

In this model, top management support is the most important key factor, since it impacts directly change management and organizational culture, communication and cooperation, project management and governance, and monitoring and evaluation, and indirectly ITIL process implementation and technology. This makes top management support the root of every ITIL adoption: when it increases, it boosts most of the other factors, but when it decreases, it affects them negatively.

On a second line, directly influenced by top management support, there are communication and cooperation, project management and governance, monitoring and evaluation, and change management and organizational culture. Some relations between them are identified: change management and organizational culture is not influenced by any other factor of the second line, but it is by training and competence of stakeholders, a first-line key factor less important than top management support, with the same happening with communication and cooperation; project management and governance is directly influenced by communication and cooperation and has a direct impact on monitoring and evaluation, which in turn does not affect any factor.

Finally, on the third line, there is only ITIL process implementation and technology, which does not influence any other factor and is directly affected by project management and governance.

Based on this model, Ahmad et al. [15] created a similar one for their classes of CSFs, with only one change: they considered that training and competence of stakeholders is part of change management and organizational culture and is consequently influenced by it. This way, training and competence of stakeholders would be a third-line factor directly affected by change management and organizational culture, while maintaining its direct impact on another second-line factor: communication and cooperation.

The position of training and competence of stakeholders in the relational model is still somewhat controversial, since both interpretations are valid: training indeed affects the way organizations understand ITIL and, consequently, their culture, but the way training is applied is part of the change management process. A two-way influence relation would be a hypothesis to consider.

4.2 ITIL Adoption Models

Although this is a less explored field in ITIL research, two adoption models stand out as main references. They build on technology adoption models to explain the effect of ITIL CSFs on the behavioral intention to use this framework, which is crucial to actually get people using it, since most CSFs relate to user acceptance rather than just application selection. This section explains and discusses those models.

4.2.1 ITIL Adoption Model with TAM

In Mehravani et al. [42], the proposed ITIL adoption model combines CSFs with the well-known TAM in order to represent their influence, as external variables, on its components, and consequently on ITIL adoption.

TAM is the most widely applied technology adoption model [15], proposed by Davis [44] and Davis et al. [45] to address the reasons for the rejection or acceptance of information technology and its consequent use. It is an adaptation of the Theory of Reasoned Action of Fishbein and Ajzen [46] to explain and predict the behavior of people in a specific situation [47].

In TAM, the effect of external variables is traced through beliefs, which affect attitudes, the intention to use and, consequently, actual system use. Its original version was composed of five components: Perceived Usefulness (PU), Perceived Ease of Use (PEU), Attitude Towards use (AT), Behavioral Intention to use (BI) and actual system Use (U).

In this proposed ITIL adoption model, only PU, PEU, AT and BI are present, with PU, PEU and AT being the only ones linked to ITIL CSFs (see Figure 4.2).

Figure 4.2: ITIL adoption model using TAM. Source: [42]

Top management support is the most important CSF, since it affects all the others, directly or indirectly, and all the components of TAM. PEU is directly affected by training and competence of stakeholders, change management and organizational culture, and process implementation and technology, while PU is directly impacted by communication and cooperation, project management and governance, and also change management and organizational culture.

Contrary to the first TAM version, where only PU and PEU are affected by external variables, this model shows that monitoring and evaluation can directly influence AT, since the simple fact that people know their feedback will be used to review the implementation performance makes them feel more obliged to support and cooperate.

The model gives a first approach to the relations between CSFs and adoption factors in a simple way. Still, some problems may arise, since its simplicity comes from explaining those relations at a very high level, which is a known criticism of TAM, and practical validation is still needed, since this model lacks it.


4.2.2 ITIL Adoption Model with UTAUT

In Ahmad et al. [15], the proposed ITIL adoption model uses another acceptance model: the Unified Theory of Acceptance and Use of Technology (UTAUT).

UTAUT is a technology adoption model proposed by Venkatesh et al. [48] as the result of reviewing and synthesizing eight models explaining information systems usage behavior, including TAM. It presents a unified view of user acceptance composed of four key constructs:

• Performance expectancy: "The degree to which an individual believes that the new system is helping him/her in performing the tasks in an easy and efficient way." [48]

• Effort expectancy: "The degree to which an individual believes the new system is easy to use." [48]

• Social influence: "The degree to which an individual perceives that important others believe he or she should use the new system." [48]

• Facilitating conditions: "The degree to which an individual believes that an organizational and technical infrastructure exists to support the use of the system." [48]

The first three directly determine the behavioral intention, which in turn influences the use behavior, while the fourth only has an impact on use behavior. Gender, age, experience and voluntariness of use moderate the effect of the four key constructs on intention and use behavior. As a consequence of linking the ITIL CSFs to UTAUT, the following model was proposed (see Figure 4.3).

Figure 4.3: ITIL adoption model using UTAUT. Source: [15]

Again, top management support, due to its influence on all the CSFs, is considered the most important one, having an impact on all the key UTAUT components. Monitoring and evaluation is the only CSF with a direct impact on BI, for the same reason it had on AT in TAM. The technology applied, for instance, has increased importance, not only as improving the PEU (effort expectancy) but also as a facilitating condition that can make the difference in using the framework.

The model gives a more detailed view of the effects of CSFs on ITIL adoption, but some problems may arise from applying UTAUT, which uses many variables to predict intentions and behaviors, making it far more complex than TAM. In addition, the model does not consider the impact of CSFs on inherent characteristics of the user, like experience (training should be considered as affecting it), and it still needs more validation in practical applications, especially in successful ITIL implementations.

4.3 ITIL Implementation Roadmap

Using the findings from applying the proposed UTAUT-based ITIL adoption model [15] to a failed ITIL implementation, Ahmad et al. [15] created a roadmap for future ITIL implementations (see Figure 4.4).

Figure 4.4: Roadmap for ITIL implementation. Source: [15]

This roadmap uses lessons from failure to provide a step-by-step guide to success, with the following steps:

1. Management and employee commitment: by providing the necessary resources, giving importance to the project and informing employees, management commitment and support help get employees committed to the success of the initiative.

2. Consultant selection: it is important to select a consulting company that has the expertise necessary for a smooth ITIL implementation, ideally one with ITIL Service Management Certification.

3. Process identification and selection: the organization and the consulting company should identify and select the main processes to adhere to ITIL standards, understanding the business, the roles and the culture of the organization in order to prepare a smooth change.

4. Understand the current processes, functions and roles: it is important to understand the existing resistance in the organization and to define and document well the processes to implement and the roles and responsibilities of every person involved in the ITIL initiative.

5. Identifying and understanding the key customers: it is crucial to know the customers the organization is investing in, since they can give valuable information in order to understand whether the project is a good investment and drive it to success.


6. Construct a project plan: adoption of a project management methodology and development of an implementation plan (including communications, training and awareness, and a metrics program to map improvements) that clarifies the present situation of the organization and creates a vision for the future.

7. Redesign processes to adhere to ITIL standards: in order to integrate ITIL best practices, processes need to be re-engineered. Since ITIL processes have internal dependencies, it is important to understand and follow process governance best practices to ensure that the implemented ITIL processes contribute to the IT organization's goals and make better use of the assigned resources.

8. ITIL tool selection: in order to avoid mistakes and implement ITIL processes more smoothly, a proper tool should be selected from the proper vendor.

9. Transition plan and designing training: enough time should be given to carefully consider the training requirements and goals, as well as to design the transition plan and training activities.

10. Training the employees: proper training should be provided to employees, using a common language and understanding of best practices, to ensure that policy adherence, roles and responsibilities are understood and procedures are followed.

11. Implementation of ITIL process and technology: consists of the actual ITIL implementation, which varies based on scale, required customization and degree of complexity.

12. Evaluation and improvement: this final step includes analysis of the management of the project and identification of lessons learned, considering any additional benefits and unexpected problems. To do so, the new processes should be rolled out with detailed monitoring for observation and continuous improvement. Progress measurement must also be done, using improvement criteria defined before and after the implementation.

Although it is a roadmap to success, this guide mixes the roles of the organization and of the external help, and it was created based on what should not be done, without a practical application in an ITIL implementation to show how to actually make it succeed, which is its weakest point. Another problem is that, in some steps, it does not provide methods to execute them; for example, when selecting a tool, it does not suggest any criteria to apply. Still, this guide provides valuable information from lessons taken from a real-world implementation.


5 Research Proposal

Contents

5.1 Objective
5.2 ITIL Tool Selection Method
5.3 ITIL Processes' Performance Evaluation Method
5.4 Principles of the Evaluation Methodology


This chapter is divided into four sections. Section 5.1 covers the second half of the define the objectives for a solution step of DSRM and introduces its third phase: from the problem identified in Chapter 2 and the state of the problem and feasible solutions presented in Chapter 4, the objective for a solution is inferred. Sections 5.2 and 5.3 correspond to the third phase of DSRM, presenting in detail the desired functionality of the artifacts, which aim to achieve the identified objective and solve the problem stated in Chapter 2. Section 5.4 explains the principles used in the proposed methodology to evaluate the artifacts.

5.1 Objective

The problem, as stated in Chapter 2, is the lack of ITIL adoption, which results from the many barriers and mistakes in ITIL implementation.

The objective of this proposal is to present a mechanism that, through technology and evaluation focused on people and processes, contributes to increasing ITIL adoption.

To fulfill this objective, two methods are proposed to increase ITIL adoption, focused on two CSFs:

1. Tool selection: proper tool selection helps users perceive the system as not hard to use [15].

2. Monitoring and evaluation of ITIL implementation: the aim is to determine the relevance and

fulfillment of objectives, efficiency and effectiveness by assessing the strengths and weaknesses

of an ongoing or completed project, program or policy [49].

These factors were chosen not only for their importance but mostly because most ITIL CSFs depend entirely on the organizational side (e.g., top management support), being very difficult to control by the external entities that support the organization in the ITIL implementation. For that reason, the choice fell on the factors that can be controlled externally (while always considering the organization's context) and that are focused on technology and evaluation. Processes and people are also taken into account, but as agents of these CSFs. This is why process priority and people's feedback are considered when choosing a tool and when creating an ITIL processes' performance evaluation system.

Since CSFs do not act in isolation, the two selected ones are affected by others that are difficult to control, which makes it important to state some assumptions in this proposal:

• Top management supports the ITIL initiative;

• IT staff is willing to be part of a changing environment and has the inherent ability to adapt to change;

• The organization is capable of communication and cooperation between departments;

• Change management, project management and governance consider external advice for the ITIL initiative;

• Good quality staff is allocated to ITIL;

• The organization knows its processes and customers;

• The processes to implement are previously selected;

• Training is properly provided to the staff.

Under these assumptions, the methods correspond to procedures to be incorporated into the implementation plan to improve both selected CSFs, serving as a guide for the organizations and for the entities that help them implement ITIL.

In the following sections, these methods are explained in detail. Section 5.2 details the ITIL tool

selection method based on MACBETH and Section 5.3 explains the method to evaluate the performance

of the selected ITIL processes based on results-based monitoring and evaluation systems’ building

actions.

5.2 ITIL Tool Selection Method

This section is divided into four subsections, corresponding to the sequential process that composes this method. MACBETH was chosen as the base technique for this method since it only requires qualitative judgments, instead of quantitative ones, to score alternatives and weight criteria and, with the support of its powerful decision support system M-MACBETH, can automatically compute the overall values of the alternatives and perform robustness and extensive sensitivity analyses. In the first step, the identification of the evaluation criteria and the definition of their performance levels are made (Section 5.2.1). The second step is the criteria weighting and the evaluation of their performance levels, in which criteria weights are assessed and a value function for each criterion is built (Section 5.2.2). The third step corresponds to the tool testing and document analysis for each criterion (Section 5.2.3). The final step is the analysis of the results, in which overall value scores are obtained for all alternatives. Sensitivity and robustness analyses are also performed to help give a selection recommendation (Section 5.2.4).

5.2.1 Identify the Criteria and Define their Performance Levels

This first step consists of identifying the criteria to evaluate the software tools for ITIL and defining their performance levels. For this proposal, a focus on functionality is adopted to compare tools according to their core, covering processes and people along with technology.

Three groups of criteria are proposed in this method:


• Processes: tools are useful to help align the company's needs with the ITIL implementation, as well as a way to perform processes more efficiently, improving the results of the ITIL implementation [50]. This way, tool selection must depend on process selection and has to focus fully on the tools that best support the execution of the selected processes. Three criteria compose this group: information (data used by the processes), activities (tasks that compose the processes) and measures (quantification of the processes' performance using metrics and Key Performance Indicators (KPIs)).

• Exporting Formats: it is important, for each ITIL tool, to consider how data can be extracted from processes, reports and the knowledge base to be used outside. This group is composed of one criterion: exporting formats, which is applied to tickets, reports and the knowledge base to analyze the compatible exporting formats for their data.

• Customers: considering the customer view of the ITIL tool, emphasizing “the broad reach of ITSM beyond the concerns of IT infrastructure to viewing IT as a service organization that supports end-to-end business operations” [6], is also important, following ITIL's core philosophy. The criterion of this group focuses on the data available to customers, which comes from diverse sources such as the knowledge base, the processes and their metrics.

Each tool is assessed according to the presence of each criterion as recommended by ITIL best practices for each selected ITIL process. The levels of performance are then defined considering the percentage of ITIL recommendations present in the tool for the corresponding criterion: level A (>= 75%), level B (50% - <75%), level C (25% - <50%) and level D (<25%). In any case, a DM can add more relevant criteria and change the number and range of performance levels to customize this method to the organization's more specific needs.
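
As an illustration of this rule, the following minimal sketch (in Python, with an illustrative function name) maps a criterion's coverage of ITIL recommendations to one of the four proposed levels:

```python
# Minimal sketch of the performance-level assignment. Coverage is assumed to
# be the fraction of ITIL recommendations the tool satisfies for a criterion.

def performance_level(coverage: float) -> str:
    """Map ITIL-recommendation coverage (0.0 to 1.0) to a performance level."""
    if coverage >= 0.75:
        return "A"
    if coverage >= 0.50:
        return "B"
    if coverage >= 0.25:
        return "C"
    return "D"

# Example: a tool implementing 9 of 14 recommended activities for a process.
print(performance_level(9 / 14))  # -> "B" (about 64% coverage)
```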

This step can also become less human-dependent if the criteria and performance levels to be applied become standardized. That way, every company would use the same criteria and number of performance levels, automating this step.

5.2.2 Weight the Criteria and Evaluate their Performance Levels

In this step, a value function is built for each criterion from the preferences of the DM. For each criterion, two reference performance levels are defined (“neutral” and “good”). Then, using the MACBETH semantic categories (very weak, weak, moderate, strong, very strong or extreme), the DM judges the differences in attractiveness between each two levels of performance, choosing one or more of those categories. Finally, M-MACBETH, the decision support system, solves a linear programming problem to generate a numerical value scale representative of the DM's judgments, which is then analyzed by the DM to make corrections if necessary. Using the validated value scales, M-MACBETH computes their value functions.
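
M-MACBETH's exact formulation is not reproduced here; the sketch below is a simplified stand-in that only illustrates the idea of deriving a numerical scale from qualitative judgments with linear programming, assuming ordinal thresholds for the semantic categories and a minimal-spread objective, and anchoring the result on the neutral and good reference levels:

```python
# Simplified sketch of MACBETH-style scale generation via linear programming.
# Category thresholds and the objective are assumptions, not the exact
# M-MACBETH model; the DM's judgments below are illustrative.
from scipy.optimize import linprog

LEVELS = ["A", "B", "C", "D"]  # performance levels, best to worst
CATEGORY = {"very weak": 1, "weak": 2, "moderate": 3,
            "strong": 4, "very strong": 5, "extreme": 6}

# DM judgments: (more attractive level, less attractive level, category).
judgments = [("A", "B", "moderate"), ("B", "C", "weak"),
             ("C", "D", "strong"), ("A", "C", "very strong")]

idx = {lvl: i for i, lvl in enumerate(LEVELS)}
A_ub, b_ub = [], []
for hi, lo, cat in judgments:
    # Enforce v[hi] - v[lo] >= CATEGORY[cat], written as -v[hi] + v[lo] <= -cat.
    row = [0.0] * len(LEVELS)
    row[idx[hi]], row[idx[lo]] = -1.0, 1.0
    A_ub.append(row)
    b_ub.append(-float(CATEGORY[cat]))

res = linprog(c=[1.0] * len(LEVELS), A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * len(LEVELS))  # minimize the total spread

# Anchor the scale: neutral (here level C) = 0 and good (here level A) = 100.
raw = dict(zip(LEVELS, res.x))
span = raw["A"] - raw["C"]
scale = {lvl: 100 * (v - raw["C"]) / span for lvl, v in raw.items()}
print(scale)  # e.g., {'A': 100.0, 'B': 40.0, 'C': 0.0, 'D': -20.0}
```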


Each criterion is also weighted according to ranks attributed by the DM. First, their neutral-good swings are ranked; then, just as with the performance levels, the DM uses the MACBETH semantic categories to judge the difference in attractiveness between each two neutral-good swings, which M-MACBETH uses to create a weighting scale for all criteria. In the end, the DM can validate the proposed weights, adjusting them if necessary.

This step requires a lot of human interaction, making it both manual and automatic (supported by a calculating system). Contrary to the first step, which can be totally automated using standard criteria and performance levels, this is the step that translates the company's preferences, making human interaction a crucial element. Through their judgments, companies specify which criteria and performances best match their needs, according to what was defined in the previous step. Only the generation of the numerical scales for each criterion and of the criteria weights is automated.

5.2.3 Test the Tools and Analyze their Documentation

In this third step, tool testing is performed for each criterion using free trial versions, whose purpose is to allow some tool evaluation before acquisition. Since these versions can present some limitations compared to the paid ones, the tools' documentation is also analyzed to obtain additional information. Using the ITIL recommendations, a mapping between each tool and the ITIL best practices for each criterion is made, using the percentage scales defined in step 1 for the performance levels.

5.2.4 Analyze the Results

With the performance levels for each criterion attributed to all the alternatives, they must be converted into value scores. In this last step, the value functions built in step 2 for each criterion are used for this purpose. Through a weighted summation of its value scores, an overall value score is obtained with the support system for each alternative, producing a final ranking of the alternatives. Finally, sensitivity and robustness analyses are performed, followed by validation from the DM.
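
The additive aggregation itself is simple; as an illustration, the sketch below computes overall scores from invented weights and value scores (they are not the demonstration's data):

```python
# Minimal sketch of the weighted summation used to rank alternatives.
# Weights (summing to 1.0) and per-criterion value scores are illustrative.

weights = {"information": 0.30, "activities": 0.40,
           "measures": 0.20, "exporting_formats": 0.10}

value_scores = {
    "Tool X": {"information": 100, "activities": 40,
               "measures": 0, "exporting_formats": 100},
    "Tool Y": {"information": 40, "activities": 100,
               "measures": 40, "exporting_formats": 0},
}

def overall_score(scores: dict) -> float:
    """Weighted summation of a tool's value scores across all criteria."""
    return sum(weights[c] * v for c, v in scores.items())

ranking = sorted(value_scores, key=lambda t: overall_score(value_scores[t]),
                 reverse=True)
for tool in ranking:
    print(tool, round(overall_score(value_scores[tool]), 2))
```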

Sensitivity analysis uses criteria weight variations within the limits allowed by the DM’s weighting

judgments to show what changes may be produced in the results by those variations.

The robustness analysis allows detecting the existence of dominance or additive dominance between alternatives, examining the implications for the global results of varying all or some of the parameters of the model.

Dominance of an alternative a over b occurs when a is better than b in at least one criterion and

not worse in any criterion. In M-MACBETH, this case is represented by a red triangle, meaning that the

alternative in row dominates the one in column.

Additive dominance of an alternative a over b occurs when a is always globally more attractive than b using additive aggregation. In M-MACBETH, this case is represented by a green cross, meaning that the alternative in row additively dominates the one in column.
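
As an illustration, plain dominance can be checked directly from per-criterion value scores, as in the sketch below (the scores are invented); additive dominance additionally requires examining every weight and scale combination admitted by the model, which M-MACBETH handles internally:

```python
# Sketch of a plain (Pareto) dominance check between two alternatives,
# given illustrative per-criterion value scores.

def dominates(a: dict, b: dict) -> bool:
    """True if `a` is not worse than `b` on every criterion and strictly
    better on at least one, i.e., `a` dominates `b`."""
    return all(a[c] >= b[c] for c in a) and any(a[c] > b[c] for c in a)

tool_a = {"information": 100, "activities": 80, "measures": 55}
tool_b = {"information": 70, "activities": 80, "measures": 40}
print(dominates(tool_a, tool_b))  # -> True: a "red triangle" cell in M-MACBETH
```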

There are three sub-analyses in the robustness analysis, using either local information (criteria value scales) or global information (criteria weights), namely:

• Ordinal: using local information, considers the ranking of the options within each criterion. Using

global information, considers the ranking of the criteria “neutral-good” swings.

• MACBETH: considers the judgments in the matrices of judgments for the value scales when using

local information or for the criteria weights when using global information.

• Cardinal: uses the criteria value scales when considering local information or the criteria weighting

scale when considering global information.

With these analyses completed, it is then possible to recommend an alternative.

5.3 ITIL Processes’ Performance Evaluation Method

This section details the method to evaluate the performance of the selected ITIL processes. Results-based monitoring and evaluation systems' building actions were selected as the basis for this method, since their purpose is to create a system that provides feedback on the outcomes and goals, comparing how well a project, program or policy is being implemented against the expected results [36]. In the first step, a selection of the evaluation criteria and their ITIL metrics for the selected processes is made (Section 5.3.1). The second step is the definition of the analysis period and its metrics' targets (Section 5.3.2). The third step corresponds to the calculation of the metrics' values in the analysis period (Section 5.3.3). The final step is the analysis of the results, in which the values for the analysis period are compared with their target values and the performance is evaluated according to the evaluation criteria (Section 5.3.4).

5.3.1 Select the Evaluation Criteria and Metrics

This step is based on the “formulate outcomes and goals” and “select outcome indicators to monitor” actions to build results-based monitoring and evaluation systems [36]. Therefore, criteria and their metrics are first chosen to evaluate the performance of the selected ITIL processes. ITIL proposes a set of metrics for each process according to two criteria, effectiveness and efficiency (the other metrics are considered important only for control), with the goal of achieving higher performance [12]. Those metrics and evaluation criteria should be used as recommended by ITIL as a basis for evaluation purposes. Other metrics, derived or not from those, as well as other criteria, can also be added to customize this method. After selecting the evaluation criteria and metrics to use, a mapping between them is made in order to categorize the metrics for further analysis of the results in step 4.
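
A simple data structure is enough to record this mapping; the sketch below uses invented metric names purely for illustration:

```python
# Illustrative criterion-to-metrics mapping produced in this step; the metric
# names are assumptions, not a company's validated selection.

criteria_metrics = {
    "effectiveness": ["pct_resolved_within_sla", "reopen_rate"],
    "efficiency": ["mean_handling_time", "backlog_size"],
}

# Reverse index: given a metric, find the criterion it is evaluated under.
metric_criterion = {m: c for c, metrics in criteria_metrics.items()
                    for m in metrics}
print(metric_criterion["backlog_size"])  # -> "efficiency"
```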

5.3.2 Define the Analysis Period and its Metrics’ Targets

This step consists of defining the analysis period and specifying its targets. Based on the “gather baseline information on the current condition” and “set specific targets to reach and dates for reaching them” actions to build results-based monitoring and evaluation systems [36], targets for each selected metric are defined based on the current condition of the company and on the analysis period, which must be a relevant one to assess the ITIL processes' performance. Those targets will then be crucial to step 4, in which the performance will be analyzed and evaluated.

5.3.3 Calculate the Metrics in the Analysis Period

This is an automatic step, which consists only of using a support system that periodically calculates the selected performance metrics during the analysis period. This system must show the updated values of the metrics, but also give information about how distant the current metrics' values are from the performance targets defined for the analysis period. The goal is to provide the company with crucial performance data during that period, so that actions can be taken to achieve the targets, and also to serve as the basis for step 4.
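
The sketch below illustrates the kind of metric-to-target comparison such a support system could perform; the metric names, target values and “lower is better” orientations are illustrative assumptions:

```python
# Sketch of the periodic metric-to-target comparison; all values are invented.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    target: float
    lower_is_better: bool  # e.g., handling time vs. resolution rate

    def distance(self, current: float) -> float:
        """Signed distance to the target; >= 0 means the target is fulfilled."""
        gap = self.target - current
        return gap if self.lower_is_better else -gap

metrics = [Metric("mean_handling_time_h", target=48.0, lower_is_better=True),
           Metric("resolved_within_sla_pct", target=90.0, lower_is_better=False)]
current = {"mean_handling_time_h": 52.5, "resolved_within_sla_pct": 93.0}

for m in metrics:
    d = m.distance(current[m.name])
    print(f"{m.name}: distance {d:+.1f} "
          f"({'fulfilled' if d >= 0 else 'not fulfilled'})")
```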

5.3.4 Analyze the Results and Evaluate them According to the Selected Criteria

The final step is based on the “analyze and report the results” action to build results-based monitoring and evaluation systems [36]. It consists of analyzing the results of the whole analysis period, in which the defined targets are used to evaluate the performance by comparing the calculated metrics with their target values. All the analysis and evaluation is made using the performance criteria (groups of metrics), giving insights into the strengths and weaknesses of the ITIL processes' performance.

5.4 Principles of the Evaluation Methodology

To evaluate each artifact, a methodology divided into three steps is proposed: the first corresponds to the description of the execution conditions of the evaluation, the second is the validation of the artifact, and the last is the evaluation of the artifact's functionality.

This section provides a brief explanation of the principles used in this methodology, starting with the Design Science Research Evaluation framework, followed by the four principles of Osterle, and finishing with the Moody and Shanks quality framework.


5.4.1 Design Science Research Evaluation Framework

Evaluation is a crucial step in DSRM because it verifies the adequacy of the artifact as a solution to the problem, comparing its objectives with the observed results. In the context of DSRM [20], five design evaluation methods were defined: observational, analytical, experimental, testing and descriptive. However, not much more guidance was provided on how to accomplish each of these evaluation paths.

Taking into account prior research in the area of DSRM evaluation, Pries-Heje et al. [51] proposed a

framework to fill this gap by helping design science researchers build strategies for evaluation, achieving

improved rigor in DSRM.

This framework distinguishes evaluation along three dimensions, each one having two aspects. The first dimension is the time of the evaluation, which can be ex ante (the evaluation takes place before the artifact is developed) or ex post (the evaluation occurs with the artifact already developed). The second dimension is the form of the evaluation, which can be artificial (the evaluation considers a solution in a non-realistic way) or naturalistic (the evaluation explores the performance of a solution within its real environment). The third and final dimension distinguishes the artifact between design process (result of a particular process that can be considered tangible) and design product (set of activities, tools, methods and practices that can be used to guide the flow of production).

A strategy to evaluate the artifact is also proposed based on three questions:

• When does the evaluation take place?

• How is it evaluated?

• What is actually evaluated?

5.4.2 Four Principles of Osterle et al.

Resulting from a memorandum written by 10 authors and supported by 111 full professors, these principles aim to validate artifacts and to contribute to the rigor of research, providing criteria for the work of journal and conference reviewers, for the evaluation of researchers and research organizations, and for design-oriented information systems research in the international research community.

These principles are [16]:

• Abstraction: the artifact must be applicable to a class of problems.

• Originality: the artifact must substantially contribute to the advancement of the body of knowl-

edge.


• Justification: the artifact must be justified in a comprehensible manner and must allow for its

validation.

• Benefit: the artifact must yield benefit, either immediately or in the future, for the respective stake-

holder groups.

5.4.3 Moody and Shanks Quality Framework

As a result of research on how to evaluate and improve the quality of data models, the Moody and

Shanks quality framework uses the perspective of stakeholders for that purpose and proposes the fol-

lowing quality factors [17]:

• Completeness: refers to the containment of all user and information requirements in the model.

• Integrity: refers to the definition by the model of all applied business rules.

• Flexibility: consists of the ease of applying changes in requirements without changing the model itself.

• Understandability: is the ease of perceiving the concepts and structures in the model.

• Correctness: refers to whether the model conforms to rules and conventions.

• Simplicity: refers to the containment of the minimum number of entities needed for the model, making it easy to follow and apply.

• Integration: refers to the consistency of the model with the rest of the organization.

• Implementability: is the ease of implementing the model according to defined constraints.


6 ITIL Tool Selection Method

Contents

6.1 Demonstration
6.2 Evaluation


This chapter is divided into two sections and details the demonstration and evaluation steps of DSRM

for the ITIL tool selection method.

Section 6.1 explains how experimentation, simulation, case study, proof or other appropriate activity

was used to demonstrate the capacity of this artifact to solve one or more instances of the problem,

corresponding to the fourth step of DSRM: demonstration.

Section 6.2 details how the evaluation methodology explained in Chapter 5 was applied to the artifact

and presents its results, corresponding to the fifth step of DSRM: evaluation.

6.1 Demonstration

This section demonstrates the use of the ITIL tool selection method based on MACBETH, corresponding

to the demonstration phase of DSRM for this artifact.

A company from the bank sector that wanted to implement four ITIL processes and had doubts about the software to use was selected for this purpose. The four processes that this company wanted to implement were: incident management, request fulfillment, problem management and change management.

The DM in the field study here reported was the systems manager of the company. The author of

this dissertation was the decision analyst in this method.

The software solutions assessed were BMC Remedy, ServiceNow, ZenDesk and JIRA SD, which were selected due to their representativeness in the market as a consequence of characteristics such as longevity, usability, popularity and potential for expansion.

The following subsections explain the demonstration for each step of the method.

6.1.1 Identify the Criteria and Define their Performance Levels

In this first step, meetings with the company's DM were held to validate the criteria and performance levels to be used in the model. All the proposed criteria and performance levels were validated by the DM and joined with the four selected ITIL processes or the three data sources to which they would be applied. The result was a validated mapping between the list of criteria and the selected processes or data sources (see Table 6.1).

Note that the four proposed and validated performance levels are applied to each criterion.

6.1.2 Weight the Criteria and Evaluate their Performance Levels

In this second step, the M-MACBETH decision support system was used to help the DM define reference

performance levels, weight the criteria and evaluate their performance levels.


Table 6.1: Mapping between assessment criteria and process/data sources.

First, the DM was asked to select the neutral (neither positive nor negative) and good (significantly attractive) reference levels. It was defined that all criteria would have the same neutral and good reference levels, which means that if a level C corresponded to the neutral reference level in one criterion, all others would have level C as the neutral one. For all criteria, the DM chose level A as the good reference and C as the neutral one.

Then, choosing one or more MACBETH semantic categories, the DM judged the attractiveness differences between each two performance levels. The DM defined that the judgments would be the same for all criteria. Figure 6.1 presents the validated DM's judgments matrix and the numerical scale computed by M-MACBETH for the criterion “Activities” of the process “Incident Management”.

Figure 6.1: MACBETH judgments matrix and numerical scale for criterion “Activities”.

The numerical scales were anchored on the value scores 0 and 100 which were assigned to the two

reference levels “neutral” and “good”, respectively. Those scales were proposed by the M-MACBETH

decision support system based on the set of judgments made by the DM, who then analyzed and

validated them. Using the validated value scales, M-MACBETH computed their value functions.

To weight the criteria, the neutral-good swings of all the criteria were ranked by the DM by their overall attractiveness. Then, the DM used the MACBETH semantic categories to judge the differences in attractiveness between each two of them, as shown in Figure A.1 (Appendix A).

Finally, with those judgments, M-MACBETH created a weighting scale, which was validated by the DM and is shown in Figure 6.2.

Figure 6.2: Weighting scale for the criteria for each process or data source presented in Table 6.1.

6.1.3 Test the Tools and Analyze their Documentation

To test the tools, free trial versions were used, since their purpose is to show some functionality to help

the DM make his/her decision. Complementing that, tools’ documentation was also analyzed since trial

versions have limitations on what can be tested.

With this information and looking at the ITIL recommendations for each criterion, a mapping between all criteria and ITIL recommendations was made, obtaining the performances of all four selected tools. The results are presented in Table 6.2.

Table 6.2: Mapping between evaluation criteria and ITIL recommendations for the selected processes and data sources.


6.1.4 Analyze the Results

The performances obtained in the third step were input into M-MACBETH. Using the value functions built in the second step, the software transformed the performances into value scores and calculated the overall scores for all selected tools (see Figure 6.3). JIRA SD ranked first with 73.03 overall units, followed by ServiceNow with 72.24 overall units. BMC Remedy came third with 69.46 overall units and ZenDesk was last with 68.26 overall units. The results clearly show that none of the tools performs well in all the criteria, since all scores are below 100 overall units. However, JIRA SD has the score closest to the overall score of the hypothetical alternative “Good at all”.

Figure 6.3: Overall value scores of the alternatives.

JIRA SD does not have the highest score in only three criteria: “Metrics/KPIs” for the Incident Management process, “Metrics/KPIs” for the Change Management process and “Exporting Formats” for reports. A sensitivity analysis on the weight of the criterion “Metrics/KPIs” for Incident Management showed that the weight of this criterion needed to be raised from 3.17% to 4.2% for ServiceNow to rank first, and to 9.0% for ZenDesk to be on top. The same analysis showed that, for the criterion “Metrics/KPIs” for Change Management, the weight needed to be raised from 3.17% to 9.0% for ZenDesk to rank first and, for the criterion “Exporting Formats” for reports, the weight needed to be raised from 1.59% to 4.6% to make ServiceNow the first choice, to 5.1% to put BMC Remedy on top, and to 17.4% for ZenDesk to rank first. However, the DM opted not to change the weights. The analysis detailed here is shown in Figure B.1 (Appendix B).
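
The one-way sensitivity analysis reported above can be pictured as a search for the weight at which the top-ranked alternative changes, renormalizing the remaining weights proportionally. The sketch below illustrates this with invented alternatives, scores and weights:

```python
# Sketch of a one-way weight sensitivity analysis; all data is illustrative.

def overall(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

def flip_threshold(alternatives, weights, criterion, step=0.001, limit=0.5):
    """Smallest weight of `criterion` at which the leader changes, or None."""
    base = max(alternatives, key=lambda a: overall(alternatives[a], weights))
    w = weights[criterion]
    while w <= limit:
        w += step
        scale = (1.0 - w) / (1.0 - weights[criterion])  # renormalize the rest
        trial = {c: (w if c == criterion else v * scale)
                 for c, v in weights.items()}
        if max(alternatives, key=lambda a: overall(alternatives[a], trial)) != base:
            return w
    return None

alts = {"Tool X": {"kpi": 20, "other": 80}, "Tool Y": {"kpi": 90, "other": 60}}
w0 = {"kpi": 0.0317, "other": 0.9683}
print(flip_threshold(alts, w0, "kpi"))  # weight at which Tool Y takes the lead
```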

Robustness analyses were also made with M-MACBETH. In the first robustness analysis, only ordinal data in local and global information was considered, concluding that this information was insufficient to select the best alternative, as shown on the left side of Figure 6.4.

The second robustness analysis was made using MACBETH judgments. There were no changes to

the obtained results of the previous analysis.

Finally, the third robustness analysis was made using simultaneous variations of ±1% on the weights of all criteria, not allowing negative weights. This analysis showed that JIRA SD continues to be the best alternative within these variations of the criteria weights. The right side of Figure 6.4 shows the results of this analysis, where the green crosses in the cells mean that the alternative in the row, JIRA SD, additively dominates the other alternatives in the columns: BMC Remedy, ServiceNow and ZenDesk.

Figure 6.4: Robustness analyses (left uses ordinal scale for local and global information, and right uses ordinal, MACBETH and cardinal with ±1% of global uncertainty).

Taking into account all the defined criteria and the judgments of attractiveness made by the DM,

JIRA SD was recommended to the company, since it is the best alternative considering the overall value

scores and the sensitivity and robustness analyses.

6.2 Evaluation

In this section, the evaluation methodology is applied to the ITIL tool selection method, starting with the

Design Science Research Evaluation to describe the execution conditions of the evaluation, followed by

the four principles of Osterle to validate the artifact, and finally, the Moody and Shanks quality framework

to evaluate the functionality of the artifact. In this evaluation process, feedback from the DM of the

company was collected during the demonstration of the method along with an interview after applying it.

The DM was a certified systems manager of the company with more than 9 years of experience.

6.2.1 Design Science Research Evaluation Framework

This framework was used to describe the execution conditions of the evaluation of the ITIL tool selection

method. The results were the following:


• When did the evaluation take place? The evaluation was ex post, meaning that the method was

evaluated after its construction and demonstration.

• How was it evaluated? The evaluation was naturalistic since it was conducted in a real company

facing real problems.

• What was actually evaluated? The method was considered a design process, being the result

of a particular process and not a final product.

6.2.2 Four Principles of Osterle et al.

The four principles of Osterle et al. [16] were applied to validate the method, with the following results

(see Figure 8.1):

• Abstraction: the method can be applied to any company having doubts on choosing an ITIL tool,

giving the option to add criteria and performance levels to meet all the company’s requirements.

• Originality: the method was seen as an original solution, since the DM did not know of any similar research or product for this purpose.

• Justification: the method is justified by the motivation of the problem and the related work. It is also described with clear steps and instructions and demonstrated using graphical representations of its application.

• Benefit: the method yields benefit, since it provides an easier and more complete evaluation of ITIL tools, as confirmed by the demonstration, where it helped the DM select an ITIL tool, leading to the DM's intention to continue using this method.

The four principles were achieved, thus showing the validity of the method.

6.2.3 Moody and Shanks Quality Framework

The following results were obtained from the application of this framework to the demonstration of the ITIL tool selection method and from the interview with the DM after applying the method (see Figure 8.1):

• Completeness: The method is complete since the used criteria contain all the DM’s requirements,

and the DM can include or remove criteria and change their performance levels to customize the

model to his needs.


• Integrity: The method combines interviews and observation with literature review to define the criteria and their performance levels. This way, a basis composed of some constraints is introduced, upon which the specific organization's needs are taken into account to mitigate possible errors without losing flexibility.

• Flexibility: The method is flexible since the DM can adjust it to his organization’s strategies.

• Understandability: The method uses concepts of the ITIL language, which makes it easier to understand, but the DM lacked knowledge of the decision analysis process used. Guidance is needed to overcome this difficulty.

• Correctness: According to the DM's intentions, the method is valid and correct.

• Simplicity: The method is simple since it is easy to follow and apply.

• Integration: The method helps organizations make the best decision, being consistent with the

problem.

• Implementability: The implementability of the method depends on factors such as the organization's policies and applicable laws. The company in which this method was demonstrated used it as an auxiliary decision tool.

Almost all the quality factors were accomplished; only understandability and implementability were not totally accomplished. The first factor was only partially accomplished, since the method was not easy to understand at the beginning due to some unfamiliarity with the decision analysis process itself, which was solved by a period of adaptation. The second factor was not verified, since there were several bureaucratic hurdles to implementing this solution, especially in a company of the bank sector. These results show that this method is suitable for evaluating software tools for ITIL.


7 ITIL Processes' Performance Evaluation Method

Contents

7.1 Demonstration
7.2 Evaluation


This chapter is divided into two sections and details the demonstration and evaluation steps of DSRM

for the method to evaluate the performance of the selected ITIL processes.

Section 7.1 explains how experimentation, simulation, case study, proof or other appropriate activity

was used to demonstrate the capacity of this artifact to solve one or more instances of the problem,

corresponding to the fourth step of DSRM: demonstration.

Section 7.2 details how the evaluation methodology explained in Chapter 5 was applied to the artifact

and presents its results, corresponding to the fifth step of DSRM: evaluation.

7.1 Demonstration

This section demonstrates the use of the ITIL processes’ performance evaluation method based on

results-based monitoring and evaluation systems’ building actions, corresponding to the demonstration

phase of DSRM for this artifact.

An IT consulting company that wanted to improve the performance of its ITIL processes was selected

for this purpose. The processes that this company wanted to improve were all categorized as request

fulfillment.

The following subsections explain the demonstration for each step of the method.

7.1.1 Select the Evaluation Criteria and Metrics

In this first step, criteria and their metrics were chosen to evaluate the performance of the selected ITIL process. To do that, meetings at the company were held along with a literature review, combining ITIL's recommendations for the selected process with the company's strategy and interests. Only one of the metrics recommended by ITIL was not chosen, since it was not a main priority for the company given the complexity required to implement it. Furthermore, no additional metrics, derived or not from the ITIL ones, were added.

Therefore, all the chosen metrics were selected as recommended by ITIL and mapped to the ITIL-suggested criteria: effectiveness and efficiency. Along with that, another criterion was proposed and used: load (a control criterion focused on the amount of processes' instances). This way, all the metrics became linked to a criterion (see Table 7.1, where green means the metric was implemented and red means it was not), to then analyze the results in step 4.

7.1.2 Define the Analysis Period and its Metrics’ Targets

In this second step, the analysis period and its metrics' targets were defined. It was agreed that the analysis period would be divided into two subperiods, both using the same metrics and criteria defined in step 1 and lasting two weeks each.

Table 7.1: Mapping between selected metrics and criteria and their implementation.

In the first subperiod, the company would be asked to perform the processes with their “as is” metrics (only having access to their current metrics) while, at the same time, the performance would be externally analyzed using the “to be” metrics defined in step 1. After that, the values of the “to be” metrics would be provided to the company as a performance report of that subperiod.

In the second subperiod, the company would continue to perform the same processes but, this time, would have access to and use the metrics defined in step 1 to analyze the performance during the two-week period. After that, the final results would be analyzed and compared with those from the first subperiod.

The metrics' targets were also defined based on the performance condition of the company and the analysis period. It is with those values that the performance can be analyzed and evaluated in terms of the distance to the aimed targets. Note that the values for the second subperiod were defined only after concluding the first subperiod and assessing its performance report. All the values can be seen in Table C.1 and Table C.2 (Appendix C).

7.1.3 Calculate the Metrics in the Analysis Period

To calculate the performance metrics defined in step 1, a support system was used (see Figure 7.1).

This system was created using the database of the company’s ITIL software tool to calculate and present

the selected metrics with the distance to their subperiod target values.

The presented data was periodically updated during the analysis period and the target values changed

according to the respective subperiod of analysis.

Access to this system was only provided to the company during the second subperiod of analysis, in which it had the role of supporting the monitoring of the processes' performance.

Along with the metrics defined in step 1, their targets and the distance to them, more data was provided by the system, such as processes' instances by assignee, types of processes and historical data, to give better insight into the processes' performance.
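
As an illustration of how such a system can derive two of the metrics discussed in the critical analysis (backlog size and mean handling time) from the ticket records of the ITIL tool's database, consider the sketch below; the field names and sample records are assumptions, not the company's actual schema:

```python
# Sketch of deriving backlog size and mean handling time from ticket records;
# field names and sample data are invented.
from datetime import datetime, timedelta

tickets = [
    {"opened": datetime(2018, 5, 2), "resolved": datetime(2018, 5, 4)},
    {"opened": datetime(2018, 5, 3), "resolved": None},  # still open
    {"opened": datetime(2018, 5, 7), "resolved": datetime(2018, 5, 8)},
]

backlog_size = sum(1 for t in tickets if t["resolved"] is None)

handled = [t for t in tickets if t["resolved"] is not None]
mean_handling = sum((t["resolved"] - t["opened"] for t in handled),
                    timedelta()) / len(handled)

print(f"backlog size: {backlog_size}")         # -> 1
print(f"mean handling time: {mean_handling}")  # -> 1 day, 12:00:00
```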


Figure 7.1: Performance analysis support system.

7.1.4 Analyze the Results and Evaluate them According to the Selected Criteria

In this final step, the results from both analysis subperiods were analyzed.

First, a performance analysis was made for each subperiod, comparing their values with the respective targets defined in step 2, according to the selected criteria (see Table C.1 and Table C.2 from Appendix C).

Then, using those criteria, a performance evaluation was made comparing both subperiods' performances in terms of value distance to the defined targets, which allowed their strengths and weaknesses to be identified (see Table C.3 from Appendix C). It was concluded that the second subperiod had a better performance than the first one, with a clear effectiveness improvement (one target accomplished and a shorter distance to the one not fulfilled in this criterion).

7.2 Evaluation

In this section, the evaluation methodology is applied to the ITIL processes’ performance evaluation

method, starting with the Design Science Research Evaluation to describe the execution conditions

of the evaluation, followed by the four principles of Osterle to validate the artifact, and finally, critical

analysis and the Moody and Shanks quality framework to evaluate the functionality of the artifact. In

this evaluation process, feedback was collected during the demonstration of the method, and interviews were made with practitioners after its application. The interviewed practitioners were a senior software architect with more than 15 years of experience, a business analytics consultant with more than 10 years of experience and an SAP business unit manager with more than 7 years of experience.

7.2.1 Design Science Research Evaluation Framework

This framework was used to describe the execution conditions of the evaluation of the method to assess

the performance of the selected ITIL processes. The results were the following:

• When did the evaluation take place? The evaluation was ex post since the method was evalu-

ated after its construction and demonstration.

• How was it evaluated? The evaluation was naturalistic, being conducted in a real company

facing real problems.

• What was actually evaluated? The method was considered a design process, being the result

of a particular process and not a final product.

7.2.2 Four Principles of Osterle et al.

The four principles of Osterle et al. [16] were applied to validate the method, with the following results

(see Figure 8.1):

• Abstraction: the method can be applied to any company that wants to evaluate the performance of its selected ITIL processes, giving the option to add criteria and metrics to meet all the company's requirements.

• Originality: the method was not seen as totally original, since the practitioners knew about similar approaches to evaluate the performance of processes. However, the associated formalism and rigor made the difference and were seen as a new and better way to apply an empirically known method.

• Justification: the motivation of the problem and the related work justify the method, which is described with clear instructions and demonstrated with illustrations of its application.

• Benefit: the method is beneficial, since it provides a rigorous, complete and easy way to evaluate the performance of the selected ITIL processes, as confirmed by the demonstration, where it actually helped the company evaluate and improve the performance of its ITIL processes, leading to the practitioners' intention to continue using it.


Almost all of the four principles of Osterle were achieved. Only originality was partially achieved, since the method was not totally new to the practitioners; being an empirically known approach, its novelty lay in its rigor and formalism. Despite not being completely original, the method can be applied to other companies, is well justified and has great benefit, making it valid for its purpose.

7.2.3 Critical Analysis

This analysis evaluates the capability of the method to assess ITIL processes’ performance, meaning the

capability to determine the fulfillment of its defined objectives, efficiency and effectiveness and identify

its strengths and weaknesses.

To do that, the results of its demonstration (see Appendix C) were analyzed.

From the analysis of Figure 7.2, a clear improvement in the selected ITIL process's performance can be seen as a result of the sound evaluation provided by the method.

Figure 7.2: Distances to the ITIL process's targets from the demonstration of the method. The axis corresponds to the target and the colors mean target fulfilled (green) or not fulfilled (red). If the distance value is 0, the value of the metric is equal to its target. The left bar corresponds to phase 1 and the right corresponds to phase 2.

The metrics with higher distances to their respective targets in phase 1 showed the greatest improvements in phase 2, not only shortening their distances to the target but, in some cases, fulfilling it. Backlog size and mean handling time are the metrics that best illustrate those improvements. This means that the method not only allowed the fulfillment of the defined targets to be determined, but also enabled a good identification of what could be improved (the weaknesses) and what was already good (the strengths) in the effectiveness and the efficiency of the process.

From the analysis of Figure 7.3, the contribution of the method to the improvement of the process's performance is even more prominent.

Figure 7.3: Prediction's precision and process's statuses deviation from the demonstration of the method. The left graph shows the percentage of requests below or above the predicted total amount. The right graph shows the relative distribution of the process's statuses between the two phases.

On the left side of the figure, the process's load in each demonstration subperiod is compared with its respective prediction. The smaller bar in the second subperiod (phase 2) shows an improvement compared with the first subperiod (phase 1), meaning that the prediction was more accurate in the second subperiod. This result shows that the method contributed to a better analysis of the variation of the process load, which ultimately contributed to a more precise definition of the performance targets.

On the right side of the figure, the relative distribution of the process's statuses from the first subperiod to the second one is presented. The statuses are organized from the initial ones (“open” and “in progress”) to the final ones (“resolved” and “closed”). The results evidence a clear shift from initial process statuses to more final ones as the biggest reason for the better performance. This is complemented by the values from Figure 7.2, which show problems with the backlog size and especially with the mean handling time, due to process instances stuck in initial statuses.

The improvement in this field clearly supports the contribution of the method to the detection of this major weakness in the process performance. There is still room for improvement in this company for this particular process, since some targets are not yet accomplished, as seen in Figure 7.2, which can be explained by the increasing amount of process instances in waiting statuses like “to test”, “pending information” and “resolved”, as seen in Figure 7.3.

From this analysis, it is clear that the method has a great capability to evaluate ITIL processes' performance, with a strong focus on the detection of strengths and weaknesses, as evidenced by the positive results of its demonstration, where it contributed to a significant boost in the selected ITIL process's performance.

7.2.4 Moody and Shanks Quality Framework

The following results were obtained from the application of this framework to the demonstration of the method to assess the performance of the selected ITIL processes and from the interviews with the practitioners after applying the method (see Figure 8.1):

• Completeness: The method is complete since the used criteria and metrics contain all the com-

pany’s requirements and there is the possibility to adapt the model to its needs, including or re-

moving some of the criteria and metrics.

• Integrity: The method combines interviews with literature review to define criteria and related

metrics. This way, a combination between constraints and organization’s needs is made, mitigating

possible errors without losing flexibility.

• Flexibility: The method is flexible since the practitioners can adjust it to their company’s strategies.

• Understandability: ITIL language is the only one used in this method, making it understandable.

• Correctness: The method is valid and correct according to the practitioners’ intentions.

• Simplicity: The method is easy to follow and apply, making it simple.

• Integration: The method helps organizations evaluate the ITIL processes’ performance, being

consistent with the problem.

• Implementability: The implementability of the method depends on factors such as the organization's policies. The company in which this method was demonstrated used it as an auxiliary performance evaluation tool.

Almost all the quality factors were accomplished. Only implementability was not fulfilled, since there were some bureaucratic hurdles to implementing this solution. Despite that, these results reinforce the great capability of the method to evaluate the performance of ITIL processes.


8 Conclusion

Contents

8.1 Lessons Learned
8.2 Limitations
8.3 Main Contributions
8.4 Communication
8.5 Future Work


In a world of growing technological evolution, where ever more companies face disruption along with economic difficulties and increasing demands from customers, strategic service quality improvement is more than just an option. ITIL is a widely accepted methodology to improve service quality [12], with many benefits [2–5] that have led to an increased interest in many countries [4,6–11]. Still, there are barriers to its adoption [14] that lead companies to make mistakes [13], with implementation difficulties being one of the most common and impactful barriers [15]. Those are the reasons that make companies abandon their ITIL investment, revealing a problem: the lack of ITIL adoption.

A lot of research has been done to solve this problem, mainly related to CSFs and their role as important elements for a successful ITIL implementation. Through literature review, the relations and classification of those factors were found, along with their application in adoption models and in a roadmap for successful ITIL implementation.

Following this, the proposal is composed of two methods that, through technology and evaluation focused on people and processes, contribute to increasing ITIL adoption. To fulfill that objective, the methods focus on two CSFs: tool selection, and monitoring and evaluation of the ITIL implementation. For the first CSF, a method to select ITIL tools based on MACBETH is proposed. For the second CSF, results-based monitoring and evaluation systems' building actions are the basis of the proposed method to evaluate the performance of the selected ITIL processes.

To assess the usefulness of the artifacts, two demonstrations were made, one for each proposed method. The method to select ITIL tools was demonstrated in a company of the bank sector that wanted to implement four ITIL processes and had doubts about the software to use. For the method to evaluate the performance of ITIL processes, an IT consulting company that wanted to improve the performance of its ITIL processes was selected.

To validate and evaluate the artifacts and their results, the following were used:

• The four principles of Osterle et al. [16];

• Critical analysis;

• The Moody and Shanks quality framework [17].

From this evaluation, it was shown that the proposed methods are generic enough to be applied in different companies that want to invest in ITIL. Both artifacts had very positive results, being able to improve the two CSFs in ITIL implementations that they focus on. As a consequence, it was concluded that these methods contribute to increasing ITIL adoption.

The methods and their results were communicated to the proper audiences through demonstrations to their practitioners and through the submission and presentation of a scientific paper at an international conference [18].


In the next sections, the conclusions are detailed, presenting the lessons learned during the research

(Section 8.1), the identified limitations (Section 8.2), the main contributions of the proposed methods

(Section 8.3) and the related future work (Section 8.5). The last step of DSRM, communication, in which the problem, its importance, the artifacts, their utility and novelty, the rigor of their designs and their effectiveness are communicated to researchers and other relevant audiences, is also detailed (Section 8.4).

8.1 Lessons Learned

During this research, lessons were learned from the related work, construction of the artifacts, demon-

stration and evaluation phases.

From the related work, a great effort was noted to apply CSFs to known adoption models such as TAM and UTAUT, leading to the creation of a roadmap to implement ITIL. Still, the problem remains without a solution. Despite the many benefits that ITIL can provide to companies, they still have doubts about how to implement it, due to the lack of guidelines for that purpose. It was learned that the selection of ITIL software tools and the evaluation of the ITIL implementation are two of the most important CSFs for ITIL adoption and must not be underestimated.

During the construction of the artifacts, it was learned that there is not yet a consensus on the criteria to select ITIL tools, leading to an even bigger dependence on the companies. Along with the related work, the usefulness of MACBETH for making fundamental decisions was also learned, as was the importance of results-based monitoring and evaluation systems in providing feedback on the outcomes and goals of a project, program or policy, comparing its implementation against the expected results. These two approaches were not well known in the companies where they were applied.

Finally, from the demonstration and evaluation phases, the lessons learned were mainly practical. The accuracy of the ITIL tool selection method is highly dependent on the opinion of the DM. Because of that, DMs have to know their needs exactly before starting the decision process. Regarding the method to evaluate the performance of ITIL processes, it was learned that its effectiveness depends on the evaluation of the company's situation, since it is with that data that the performance targets can be made more accurate and meaningful. Finally, despite recognizing the value of the proposed methods, companies showed resistance to change and to start using them.

8.2 Limitations

The limitations associated with the proposed methods are related to the demonstrations and to the M-MACBETH software used.


Regarding the demonstrations, it is not possible to state that the methods are applicable to every company of every size in every industry, since the demonstrations were only made in two companies (one for each method). Even so, the results, feedback and interviews from those demonstrations give indications that the method to select ITIL tools can be applied to all organizations that want to choose a software tool for ITIL, despite being focused only on its functionality, and that the method to evaluate the performance of ITIL processes can be applied to all organizations that want to increase the performance of their ITIL processes, thus giving these artifacts the potential to be applied to all the organizations that want to increase their ITIL adoption.

Regarding the second aspect, M-MACBETH has two limitations. The first one is its dated appearance, which can negatively impact the method to select ITIL tools and the software learning process. The second one is the absence of hierarchical analysis support, requiring weights to be given to all criteria along a tree. All of this led to the need for support from a decision analyst in the ITIL tool selection process, since the DM does not have the necessary knowledge.

Finally, another limitation also comes from the demonstrations, affecting the results of the Moody and Shanks quality framework and of the four principles of Osterle. In the Moody and Shanks quality framework, the factors understandability and implementability were not totally accomplished for the ITIL tool selection method, while for the method to evaluate the performance of ITIL processes only implementability was not, due to the companies' bureaucratic constraints. In addition, all the four principles of Osterle were achieved by the ITIL tool selection method, contrary to the method to evaluate the performance of ITIL processes, which did not totally achieve the originality principle.

8.3 Main Contributions

The proposed methods can bring a valuable contribution in the context of ITIL adoption. By focusing

on two of the most important CSFs for ITIL implementation, these artifacts aim to increase their positive

effects.

The first is a method for selecting ITIL software tools that uses the MACBETH decision analysis methodology, together with ITIL criteria focused on functionality, to compare and assess the tools. It provides a complete and original way to make that choice, as evidenced by the results of its demonstration (see Figure 8.1) and by the scientific appraisal. In this way, the method increases the positive effects of the “tool selection” CSF.
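As an illustration of the aggregation step behind this method, the sketch below ranks candidate tools with the additive value model, where each tool's overall attractiveness is the weighted sum of its per-criterion value scores. The criteria names, weights, and scores here are purely hypothetical; in the method itself they come from the MACBETH judgments.

# A minimal sketch (illustrative numbers) of the additive value model that
# the MACBETH judgments feed: given criteria weights and per-criterion value
# scores (0-100), each tool's overall attractiveness is the weighted sum of
# its scores, and the candidate tools are ranked by that total.

weights = {"incident_mgmt": 0.30, "change_mgmt": 0.25,
           "problem_mgmt": 0.25, "service_catalog": 0.20}

scores = {  # value score of each candidate tool on each criterion
    "Tool A": {"incident_mgmt": 80, "change_mgmt": 60,
               "problem_mgmt": 70, "service_catalog": 90},
    "Tool B": {"incident_mgmt": 70, "change_mgmt": 85,
               "problem_mgmt": 65, "service_catalog": 75},
}

def overall(tool_scores, weights):
    return sum(weights[c] * tool_scores[c] for c in weights)

ranking = sorted(scores, key=lambda t: overall(scores[t], weights), reverse=True)
for tool in ranking:
    print(f"{tool}: {overall(scores[tool], weights):.1f}")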

The second is a method for evaluating the performance of selected ITIL processes that uses steps based on the actions for building monitoring and evaluation systems, together with ITIL metrics and criteria. It provides an effective and simple way to evaluate and increase the performance of ITIL processes, as shown by the results of its demonstration (see Figure 8.1). These positive results show that this method


is capable of increasing the positive effects of the “monitoring and evaluation of ITIL implementation”

CSF.
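As an illustration of the comparison at the core of such a results-based evaluation, the sketch below checks measured ITIL process metrics against performance targets. The metric names, values, and targets are hypothetical; in the method itself, the targets are derived from the assessment of the company's situation.

# A minimal sketch (illustrative metrics and targets) of the core comparison
# in a results-based evaluation: measured values of ITIL process metrics are
# checked against the performance targets set from the company's baseline.

targets = {  # metric -> (target value, True if higher is better)
    "incidents_resolved_within_sla_pct": (95.0, True),
    "mean_time_to_restore_hours":        (4.0, False),
    "failed_changes_pct":                (5.0, False),
}

measured = {
    "incidents_resolved_within_sla_pct": 91.5,
    "mean_time_to_restore_hours":        3.2,
    "failed_changes_pct":                7.1,
}

for metric, (target, higher_is_better) in targets.items():
    value = measured[metric]
    met = value >= target if higher_is_better else value <= target
    status = "met" if met else "NOT met"
    print(f"{metric}: {value} (target {target}) -> {status}")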

By increasing the effects of those two CSFs, the application of these methods is shown to help improve ITIL adoption, since these two factors are important elements in solving that problem, which makes these artifacts important contributions.

Along with that, these methods can also provide detailed guidance to companies that intend to invest in ITIL, reducing the difficulty of its implementation, which is one of the biggest barriers to ITIL adoption.

Figure 8.1: Validation and evaluation profiles of the ITIL tool selection method (blue line) and the ITIL processes’ performance assessment method (orange dots).

8.4 Communication

Since the proposed methods have a strong practical component, they were demonstrated to practitioners in two different companies, as detailed in Chapters 6 and 7.

There was also a need to obtain a more theoretical scientific appraisal of the method to select ITIL tools, given its strong theoretical nature. To that end, a research paper was submitted and presented at an international conference [18]:

• N. Faria and M. Mira da Silva, “Selecting a Software Tool for ITIL using a Multiple Criteria Decision Analysis Approach,” in International Conference on Information Systems Development (ISD), Lund, Sweden, 2018. Ranking: A (accepted).

Note that, given its rank, the conference at which this paper was accepted provides important feedback and prestigious recognition from the scientific community, which considered the method a good application of an MCDA approach.

8.5 Future Work

In future work, further research can be performed to overcome the limitations mentioned above.

First of all, more demonstrations of both methods should be performed, applying them to organizations of different sizes and from different industries, to verify whether they can truly be used by all kinds of organizations in all industries.

Along with that, an effort must be made to research criteria that take other ITIL best practices into account, such as recommended roles and knowledge base components, in order to create criteria catalogs that expand the ITIL tool analysis.

The final aspect is related to the M-MACBETH software limitation that affects the understandability of the ITIL tool selection method. To address it, a software tool specifically designed to evaluate tools for ITIL should be developed and evaluated with support from the DMs.


Bibliography

[1] E. Brynjolfsson and A. McAfee, The Second Machine Age: Work, Progress and Prosperity in a

Time of Brilliant Technologies. New York: W.W. Norton & Company, 2014.

[2] M. Marrone and L. M. Kolbe, “Uncovering ITIL claims: IT executives’ perception on benefits and

Business-IT alignment,” Information Systems and e-Business Management, vol. 9, no. 3, pp. 363–

380, 2011.

[3] S. D. Galup, R. Dattero, J. J. Quan, and S. Conger, “An overview of IT service management,”

Communications of the ACM, vol. 52, no. 5, pp. 124–127, 2009.

[4] J. Iden and L. Langeland, “Setting the Stage for a Successful ITIL Adoption: A Delphi Study of IT

Experts in the Norwegian Armed Forces,” Information Systems Management, vol. 27, no. 2, pp.

103–112, 2010.

[5] B. Potgieter, J. Batha, and C. Lew, “Evidence that use of the ITIL framework is effective,” in 18th

Annual Conference of the National Advisory Committee on Computing Qualifications, 2005, pp.

160–167.

[6] C. Pollard and A. Cater-Steel, “Justifications, Strategies, and Critical Success Factors in Success-

ful ITIL Implementations in U.S. and Australian Companies: An Exploratory Study,” Information

Systems Management, vol. 26, no. 2, pp. 164–175, 2009.

[7] W. Zhen and Z. Xin-yu, “An ITIL-based IT Service Management Model for Chinese Universities,” in

Fifth International Conference on Software Engineering Research, Management and Applications,

2007, pp. 493–497.

[8] A. Cater-Steel and W.-G. Tan, “Implementation of IT Infrastructure Library (ITIL) in Australia:

Progress and success factors,” in 2005 IT Governance International Conference, 2005, pp. 39–

52.


[9] M. Marrone, F. Gacenga, A. Cater-Steel, and L. Kolbe, “IT service management: A cross-national

study of ITIL adoption,” Communications of the Association for Information Systems, vol. 34, no. 1,

pp. 865–892, 2014.

[10] L. Shwartz, N. Ayachitula, M. Buco, M. Surendra, C. Ward, and S. Weinberger, “Service provider

considerations for IT service management,” in 10th IFIP/IEEE International Symposium on Inte-

grated Network Management 2007, IM ’07, 2007, pp. 757–760.

[11] M. Ayat, M. Sharifi, S. Sahibudin, and S. Ibrahim, “Adoption factors and implementation steps of

ITSM in the target organizations,” in Proceedings - 2009 3rd Asia International Conference on

Modelling and Simulation, AMS 2009, 2009, pp. 369–374.

[12] OGC, ITIL: Service Operation. Norwich, UK: TSO publications, 2011.

[13] M. Sharifi, M. Ayat, A. A. Rahman, and S. Sahibudin, “Lessons learned in ITIL implementation

failure,” in Proceedings - International Symposium on Information Technology 2008, ITSim, vol. 1,

2008, pp. 8–11.

[14] S. S. C. Shang and S.-F. Lin, “Barriers to Implementing ITIL-A Multi-Case Study on the Service-

based Industry,” Contemporary Management Research, vol. 6, no. 1, pp. 53–70, 2010.

[15] N. Ahmad, N. Tarek Amer, F. Qutaifan, and A. Alhilali, “Technology adoption model and a road map

to successful implementation of ITIL,” Journal of Enterprise Information Management, vol. 26, no. 5,

pp. 553–576, 2013.

[16] H. Österle, J. Becker, U. Frank, T. Hess, D. Karagiannis, H. Krcmar, P. Loos, P. Mertens, A. Ober-

weis, and E. J. Sinz, “Memorandum on design-oriented information systems research,” European

Journal of Information Systems, vol. 20, no. 1, pp. 7–10, 2011.

[17] D. L. Moody and G. G. Shanks, “Improving the quality of data models: Empirical validation of a

quality management framework,” Information Systems, vol. 28, no. 6, pp. 619–650, 2003.

[18] N. Faria and M. Mira da Silva, “Selecting a Software Tool for ITIL using a Multiple Criteria Deci-

sion Analysis Approach,” in International Conference on Information Systems Development, Lund,

Sweden, 2018.

[19] K. Peffers, T. Tuunanen, C. E. Gengler, M. Rossi, V. Virtanen, and J. Bragge, “A Design Science

Research Methodology for Information Systems Research,” Journal of Management Information

Systems, vol. 24, no. 3, pp. 45–77, 2008.

[20] A. R. Hevner, S. T. March, J. Park, and S. Ram, “Design Science in Information Systems Research,”

MIS Quarterly, vol. 28, no. 1, pp. 75–105, 2004.


[21] G. Westerman, D. Bonnet, and A. McAfee, Leading Digital – Turning Technology into Business

Transformation. Boston, MA: HBR Press, 2014.

[22] R. Lewis and B. Booms, “The Marketing Aspects of Service Quality,” in Emerging Perspectives on

Services Marketing. Chicago: American Marketing, 1983, pp. 99–107.

[23] J. J. Cronin and S. A. Taylor, “Measuring Service Quality: A Reexamination and Extension,” Journal

of Marketing, vol. 56, no. 3, pp. 55–68, 1992.

[24] OGC, ITIL: Service Strategy. Norwich, UK: TSO publications, 2011.

[25] ——, ITIL: Service Design. Norwich, UK: TSO publications, 2011.

[26] ——, ITIL: Service Transition. Norwich, UK: TSO publications, 2011.

[27] ——, ITIL: Continual Service Improvement. Norwich, UK: TSO publications, 2011.

[28] V. Belton and T. Stewart, Multiple Criteria Decision Analysis: An Integrated Approach. Boston:

Kluwer Academic Publishers, 2002.

[29] T. Saaty, The Analytic Hierarchy Process: Planning, Priority Setting, Resource Allocation. New

York: McGraw-Hill, 1980.

[30] R. L. Keeney and H. Raiffa, Decisions with Multiple Objectives: Preferences and Value Tradeoffs.

New York: John Wiley & Sons, 1976.

[31] D. von Winterfeldt and W. Edwards, Decision Analysis and Behavioral Research. Cambridge:

Cambridge University Press, 1986.

[32] C. A. Bana e Costa, J.-M. de Corte, and J.-C. Vansnick, “M-MACBETH Version 2.5.0 User’s

Guide,” 2017. [Online]. Available: http://www.m-macbeth.com

[33] ——, “MACBETH,” International Journal of Information Technology & Decision Making, vol. 11, pp.

359–387, 2012.

[34] ——, “MACBETH (Measuring Attractiveness by a Categorical Based Evaluation Technique),” in

Wiley Encyclopedia of Operations Research and Management Science, J. J. Cochran, Ed. New

York: John Wiley & Sons, 2011, vol. 4, pp. 2945–2950.

[35] C. A. Bana e Costa and J.-C. Vansnick, “The MACBETH approach: Basic Ideas, Software, and an

Application,” in Advances in Decision Analysis, N. Meskens and M. Roubens, Eds. Dordrecht:

Kluwer Academic Publishers, 1999, vol. 4, pp. 131–157.

[36] J. Z. Kusek and R. C. Rist, Ten Steps to a Results-Based Monitoring and Evaluation System.

Washington, D.C.: World Bank, 2004.


[37] D. R. Daniel, “Management Information Crisis,” Harvard Business Review, vol. 39, no. 5, pp. 111–

121, 1961.

[38] J. F. Rockart, “Chief executives define their own data needs,” Harvard Business Review, vol. 57,

no. 2, pp. 81–93, 1979.

[39] T. M. Somers and K. Nelson, “The Impact of Critical Success Factors across the Stages of Enterprise Resource Planning Implementations,” in Proceedings of the 34th Hawaii International Conference on System Sciences, 2001, pp. 1–10.

[40] A. Hochstein, G. Tamm, and W. Brenner, “Service-oriented IT management: benefit, cost and success factors,” in ECIS 2005 Proceedings, 2005.

[41] W. Tan, A. Cater-Steel, M. Toleman, and R. Seaniger, “Implementing Centralised IT Service Management: Drawing Lessons from the Public Sector,” in ACIS 2007: 18th Australasian Conference on Information Systems, 2007, pp. 1060–1068.

[42] S. Mehravani, N. Hajiheydari, and M. Haghighinasab, “ITIL Adoption Model based on TAM,” in

International Conference on Social Science and Humanity IPEDR, vol. 5. ACSIT PRESS, 2011,

pp. 33–37.

[43] H. Akkermans and K. van Helden, “Vicious and virtuous cycles in ERP implementation: a case

study of interrelations between critical success factors,” European Journal of Information Systems,

vol. 11, no. 1, pp. 35–46, 2002.

[44] F. D. Davis, “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information

Technology,” MIS Quarterly, vol. 13, no. 3, pp. 319–340, 1989.

[45] F. D. Davis, R. P. Bagozzi, and P. R. Warshaw, “User Acceptance of Computer Technology: A

Comparison of Two Theoretical Models,” Management Science, vol. 35, no. 8, pp. 982–1003, 1989.

[46] M. Fishbein and I. Ajzen, Belief, Attitude, Intention, and Behavior: An Introduction to Theory and

Research. Reading, MA: Addison-Wesley, 1975.

[47] P. Legris, J. Ingham, and P. Collerette, “Why do people use information technology? A critical

review of the technology acceptance model,” Information & Management, vol. 40, no. 3, pp. 191–

204, 2003.

[48] V. Venkatesh, M. G. Morris, G. B. Davis, and F. D. Davis, “User Acceptance of Information Technol-

ogy: Toward a unified view,” MIS Quarterly, vol. 27, no. 3, pp. 425–478, 2003.

[49] M. Görgens and J. Z. Kusek, Making Monitoring and Evaluation Systems Work: A Capacity Development Toolkit. Washington, D.C.: World Bank, 2009.


[50] T. R. Eikebrokk and J. Iden, “ITIL implementation: The role of ITIL software and project quality,” in Proceedings - International Workshop on Database and Expert Systems Applications, 2012, pp. 60–64.

[51] J. Venable, J. Pries-Heje, and R. L. Baskerville, “FEDS: a Framework for Evaluation in Design

Science Research,” European Journal of Information Systems, vol. 25, no. 1, pp. 77–89, 2016.


A Criteria Weighting Judgments Matrix

The judgments matrix presented in this appendix is the result of applying the MACBETH semantic categories to judge the differences in attractiveness between each pair of criteria in the demonstration of the ITIL tool selection method, as explained in Section 6.1.2.
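For readers unfamiliar with the technique, the following minimal sketch (with hypothetical criteria and judgments) shows the structure that such a matrix encodes: one of the seven MACBETH semantic categories for each pair of criteria. Note that M-MACBETH validates the matrix for consistency and derives the actual weighting scale by linear programming; the naive score below is only an illustration of the recorded data, not the MACBETH procedure.

# A minimal sketch (hypothetical judgments) of a MACBETH judgments matrix:
# each ordered pair (a, b) records how much more attractive criterion a is
# than criterion b, using one of the seven semantic categories.

CATEGORIES = ["no", "very weak", "weak", "moderate",
              "strong", "very strong", "extreme"]
RANK = {c: i for i, c in enumerate(CATEGORIES)}

criteria = ["incident_mgmt", "change_mgmt", "problem_mgmt"]

judgments = {  # (a, b): judged difference in attractiveness of a over b
    ("incident_mgmt", "change_mgmt"): "weak",
    ("incident_mgmt", "problem_mgmt"): "strong",
    ("change_mgmt", "problem_mgmt"): "moderate",
}

# Naive ordering: sum the category ranks over the pairs a criterion dominates.
score = {c: 0 for c in criteria}
for (a, b), category in judgments.items():
    score[a] += RANK[category]

for c in sorted(criteria, key=score.get, reverse=True):
    print(c, score[c])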


Figure A.1: MACBETH judgments matrix for the criteria weighting.


B Sensitivity Analysis

The graphs presented in this appendix are the result of applying sensitivity analysis in the demonstration of the ITIL tool selection method, as explained in Section 6.1.4.
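A minimal sketch of what such an analysis computes, using hypothetical weights and scores: one criterion's weight is swept from 0 to 1 while the remaining weights are rescaled proportionally, and the best-ranked tool is recomputed at each step to reveal whether, and where, the ranking changes.

# A minimal sketch (illustrative numbers) of a sensitivity analysis on one
# criterion's weight: sweep it from 0 to 1, rescale the other weights so the
# total stays 1, recompute each tool's overall score, and report the leader.

weights = {"incident_mgmt": 0.30, "change_mgmt": 0.25,
           "problem_mgmt": 0.25, "service_catalog": 0.20}
scores = {
    "Tool A": {"incident_mgmt": 80, "change_mgmt": 60,
               "problem_mgmt": 70, "service_catalog": 90},
    "Tool B": {"incident_mgmt": 70, "change_mgmt": 85,
               "problem_mgmt": 65, "service_catalog": 75},
}

def overall(tool, w):
    return sum(w[c] * scores[tool][c] for c in w)

varied = "incident_mgmt"
rest = sum(v for c, v in weights.items() if c != varied)
for step in range(11):
    w_var = step / 10
    w = {c: (w_var if c == varied else v * (1 - w_var) / rest)
         for c, v in weights.items()}
    leader = max(scores, key=lambda t: overall(t, w))
    print(f"w({varied})={w_var:.1f} -> best: {leader}")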


Figure B.1: Complete sensitivity analysis.


C Performance Analysis

The tables presented in this appendix are the result of the performance analysis performed in the demonstration of the ITIL processes’ performance evaluation method, as explained in Section 7.1.4.
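A minimal sketch of the comparison these tables report, with hypothetical metrics and values: the same ITIL process metrics are measured in two subperiods, and the change between them indicates whether performance improved.

# A minimal sketch (illustrative numbers) of the subperiod comparison behind
# Tables C.1-C.3: the same ITIL process metrics are measured in each subperiod
# and the change between the two indicates whether performance improved.

first = {"incidents_resolved_within_sla_pct": 88.0,
         "mean_time_to_restore_hours": 5.1}
second = {"incidents_resolved_within_sla_pct": 92.5,
          "mean_time_to_restore_hours": 4.3}
higher_is_better = {"incidents_resolved_within_sla_pct": True,
                    "mean_time_to_restore_hours": False}

for metric in first:
    delta = second[metric] - first[metric]
    improved = delta > 0 if higher_is_better[metric] else delta < 0
    trend = "improved" if improved else "worsened"
    print(f"{metric}: {first[metric]} -> {second[metric]} ({delta:+.1f}, {trend})")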


Table C.1: Performance results for the first subperiod.

Table C.2: Performance results for the second subperiod.

Table C.3: Comparison of performance between the subperiods.
