Normalized Systems: An Assessment of Evolvability Based on Metrics
Ricardo Miguel Ferreira Coelho
Thesis to obtain the Master of Science Degree in
Information Systems and Computer Engineering
Supervisor: Prof. Paulo Jorge Fernandes Carreira
Examination Committee
Chairperson: Prof. João Emílio Segurado Pavão Martins
Supervisor: Prof. Paulo Jorge Fernandes Carreira
Member of the Committee: Prof. Diogo Manuel Ribeiro Ferreira
November 2016
“A great accomplishment shouldn’t be the end of the road,
just the starting point for the next leap forward.”
— Harvey Mackay
Acknowledgments
This journey would not have been possible without those who supported and helped me along the way.
First of all, I would like to express my eternal gratitude to my mother, Maria Vitória Ferreira, and to
my stepfather, Frederico Alexandre Pereira, for being the best parents anyone could have. Without their
support, sacrifices, and boundless love, I would never have come this far. I would also like to thank my
grandmother, Bárbara Cardeta, whose unconditional love has always motivated me.
To my supervisor, Professor Paulo Carreira, who, with his rigor and dedication, guided me through
the execution of this work, and motivated me to keep surpassing myself.
To my friends, especially my companions throughout this period, Miguel Alves and Luís Costa, who
always helped me through every moment and difficulty, and shared with me late nights, projects, and
laughter. Without their friendship, it would have been impossible to reach the goals I achieved in my
academic path.
To all of you, a heartfelt thank you!
Resumo
Evolvability is a fundamental quality of systems, valued across all areas of software development.
Over the years, techniques have been suggested to achieve this quality, with debatable results, since
they require great discipline from developers, an adequate architecture, or the use of complex tools.
Normalized Systems (NS) Theory argues that software evolvability can be achieved by applying four
well-defined principles, which can be effectively followed through the systematic reuse of architectural
and design knowledge using code generation. Despite the promising capabilities of NS, there are no
concrete assessments of their applicability to real software projects, especially regarding the evolvability
of the applications they produce.
In this work, we review the concept of evolvability and the existing approaches to achieve it, also
identifying the gaps in the literature regarding the evolvability assessment of applications developed
using NS and other approaches. We then define a case study consisting of the development of a
web-based information system using NS tools, a traditional approach, and a Model-Driven Engineering
approach. Finally, we evaluate the evolvability of the implementations using static code analysis, in the
form of code metrics that measure evolvability characteristics, and compare the results. These show
that NS tools do indeed produce highly stable and evolvable applications, even when compared with
the remaining approaches. However, they are not perfect, and several disadvantages were identified.
Keywords: Normalized Systems, Evolvability, Model-Driven Engineering, Software Design and
Architecture, Code Metrics
Abstract
Evolvability is a fundamental quality of systems that is valued across all areas of software development.
Over the years, techniques have been suggested to achieve this quality, with disputable results, mostly
because they require a great deal of developer discipline, an appropriate architecture, or the embrace
of complex new frameworks. Normalized Systems (NS) Theory posits that software evolvability rests
on four well-defined principles that can be effectively applied by systematically reusing design and
architectural knowledge through code generation. Despite their promising capabilities, NS lack a proper
evaluation of their applicability to real software projects, specifically with regard to the evolvability of the
applications they produce.
In this work, we overview the concept of evolvability and the existing approaches to achieve it, also
identifying the gaps in the literature regarding the evolvability assessment of applications developed
using NS and other approaches. We then define a case study consisting of the development of a
web-based information system using NS tools, a traditional approach, and a Model-Driven Engineering
(MDE) approach. Finally, we evaluate the evolvability of the implementations using static code analysis,
in the form of code metrics that measure evolvability characteristics, and compare the results. The
results show that NS tools indeed produce highly stable and evolvable applications, even when compared
to the remaining approaches. However, they are not perfect, and several disadvantages are identified.
Keywords: Normalized Systems, Evolvability, Model-Driven Engineering, Software Design and
Architecture, Code Metrics
Contents
Acknowledgments
Resumo
Abstract
List of Tables
List of Figures
Glossary
1 Introduction
1.1 Problem Statement
1.2 Methodology and Contributions
1.3 Document Structure
2 Concepts
2.1 Evolvability
2.2 Design Patterns
2.2.1 Design Patterns Classification
2.2.2 Design Pattern Coupling
2.3 Software Architectures
2.3.1 Influences and Properties
2.3.2 Software Architecture Construction
2.3.3 Functionality and Quality Attributes
2.4 Model Driven Engineering
2.4.1 Meta-models and Modeling Languages
2.4.2 Transformations
2.5 Normalized Systems
2.5.1 Software Entities
2.5.2 Primitive Changes
2.5.3 Normalized Systems “Theorems”
3 Related Work
3.1 Design And Architecture Influence On Evolvability
3.1.1 Comparative Case Studies
3.1.2 Surveys And Reviews
3.2 MDE Impact On Evolvability Qualities
3.2.1 Comparative Case Studies
3.2.2 Surveys And Reviews
3.3 Normalized Systems Expanders
3.3.1 Information Management System Case Study
3.3.2 Budget Management Application Study
3.3.3 Budget Management Application Revision Study
3.4 Discussion
3.4.1 Evolvability In Architecture And Design
3.4.2 Evolvability In MDE
3.4.3 Assessment Of Evolvability Qualities
3.4.4 NSX Considerations
4 Solution
4.1 Base System Definition
4.1.1 Domain Specification
4.1.2 Requirements And Constraints
4.1.3 Extensions Definition
4.2 Additional Implementation Approaches
4.3 MDE Tool Selection
4.3.1 Selection Criteria
4.3.2 Search and Decision
4.4 NSX Implementation
4.4.1 Application Generation
4.4.2 Generated Architecture
4.4.3 Business Logic Implementation
4.4.4 Extensions Implementation
4.5 Manual Implementation
4.5.1 Architecture Overview
4.5.2 Database Tier
4.5.3 Data Access Layer
4.5.4 Business Layer
4.5.5 Web Service Layer
4.5.6 Client Tier
4.5.7 Extensions Implementation
4.6 Generjee Implementation
4.6.1 Application Generation
4.6.2 Generated Architecture
4.6.3 Business Logic Implementation
4.6.4 Extension Implementation
5 Evaluation
5.1 Methodology
5.1.1 Source Code Evaluation Approach
5.1.2 Selection of System Components
5.1.3 Class Level Metrics Definition
5.1.4 Project Level Metrics Definition
5.2 Connection Between Metrics And Characteristics
5.3 Evaluation Results
5.3.1 Class Level Metrics Measurements
5.3.2 Project Level Metrics Measurements
5.4 Results Analysis And Discussion
5.4.1 NSX Implementation Key Findings
5.4.2 Manual Implementation Key Findings
5.4.3 Generjee Implementation Key Findings
5.4.4 Evolvability Impact Of Extensions
5.4.5 Results Comparisons And Conclusions
6 Conclusions
6.1 Impact
6.2 Future Work
Bibliography
A NSX Element Descriptors
List of Tables
2.1 Relationship between evolvability sub-qualities and software characteristics.
4.1 Criteria for choosing a suitable MDE tool.
5.1 Relationship between the chosen code metrics and the characteristics of evolvability.
5.2 Descriptive statistics of the class level metrics measurements for the NSX system.
5.3 Descriptive statistics of the class level metrics measurements for the manual implementation.
5.4 Descriptive statistics of the class level metrics measurements for the Generjee system.
5.5 MOOD metric suite measurements for each system.
5.6 Comparison between the characteristics of evolvability of each system implementation.
5.7 Comparison between the sub-qualities of evolvability of each system implementation.
List of Figures
2.1 Decomposition of evolvability into sub-qualities.
2.2 Main concepts of MDE and their relationship.
3.1 Structure of applications deployed by NSX.
4.1 Domain model of the bank information system.
4.2 Partial domain model of the ExNE version of the bank information system.
4.3 Diagram of the process used to select an adequate MDE tool.
4.4 NSX application descriptor file.
4.5 NSX bankcomp component descriptor file.
4.6 NSX Account data element descriptor file.
4.7 Architecture of the generated system using the NSX tool.
4.8 Architecture of the manual implementation of the system.
4.9 Class diagram of the data access layer class setup.
4.10 Architecture of the application generated by the Generjee tool.
5.1 Process diagram of the evaluation methodology.
5.2 Tukey box plots of WMC metric measurements.
5.3 Tukey box plots of DIT metric measurements.
5.4 Tukey box plots of NOC metric measurements.
5.5 Tukey box plots of CBO metric measurements.
5.6 Tukey box plots of RFC metric measurements.
5.7 Tukey box plots of LCOM metric measurements.
5.8 Bar chart of the MOOD metric suite measurements.
A.1 NSX Loan data element descriptor file.
A.2 NSX Customer data element descriptor file.
A.3 NSX Branch data element descriptor file.
A.4 NSX Division data element descriptor file.
Glossary
AHF Attribute Hiding Factor
AIF Attribute Inheritance Factor
API Application Programming Interface
BV Base Version
CBO Coupling Between Objects
CE Combinatorial Effects
COF Coupling Factor
CRUD Create, Read, Update, Delete
DAO Data Access Object
DIT Depth of Inheritance Tree
DP Design Patterns
DTO Data Transfer Object
EJB Enterprise Java Bean
ExBR Extension Business Rule
ExNE Extension New Entity
ExS Extension Search
FV Final Version
HSQLDB HyperSQL DataBase
HTML HyperText Markup Language
HTTP HyperText Transfer Protocol
JEE Java Enterprise Edition
JPA Java Persistence API
JSF Java Server Faces
JSON JavaScript Object Notation
LCOM Lack of Cohesion of Methods
MDE Model Driven Engineering
MHF Method Hiding Factor
MIF Method Inheritance Factor
MOOD Metrics for Object Oriented Design
MVC Model-View-Controller
MVVM Model-View-ViewModel
NOC Number Of Children
NSX Normalized Systems Expanders
NS Normalized Systems
OO Object Oriented
POF Polymorphism Factor
REST Representational State Transfer
RFC Response For Class
RMI Remote Method Invocation
SA Software Architecture
UML Unified Modeling Language
WMC Weighted Methods per Class
XML Extensible Markup Language
Chapter 1
Introduction
Software evolvability has been defined as the ability of a system to accommodate requirements changes
throughout its lifespan at the least possible cost, while maintaining architectural integrity [77]. This
quality has become a central concern in the software life cycle [25]. It has been estimated that low
software evolvability can increase costs by more than 30%, and development effort by 25–36% [56].
Enterprise systems that are expected to have a long lifetime (20+ years) will inevitably
undergo significant changes to their requirements and functionality, ranging from technology changes to
system merging and migration, that will degrade their overall structure and hamper evolvability [18, 49].
Experiments show that about 75% of all the defects identified during code reviews have an impact on
evolvability rather than functionality [56]. Therefore, most companies resist changing their software,
either by not implementing the necessary modifications, or by completely scrapping the current system
and building a new one according to the most recent requirements [70].
Even though evolvability is undeniably essential to the long-term success of a system, it is often ne-
glected during the development process. Indeed, its implementation is complex and costly [12]. In order
to achieve evolvability, software has to be built with enough robustness to support a set of anticipated
changes [92]. This robustness, however, represents a big up-front investment, since companies will be
designing and implementing structures based on hypothetical future requirements [92]. Since companies
seek to maximize the profit of software development, and tend to minimize the effort spent implementing
the required functionality, the result is systems that are not resilient, whose overall structure quickly
degrades once change inevitably needs to be introduced [62]. Degradations stack up and, eventually,
the system becomes too complex to manage [43]. This scenario is typical of small outsourcing companies,
for example [42], which are unsure whether they will maintain the system in the long term and, therefore,
have little motivation to invest in the intrinsic quality characteristics of the software they produce.
Approaches have been proposed to decrease the investment and time necessary to develop evolvable
systems, such as Design Patterns (DP) [93], Software Architecture (SA) [10], or MDE [76]. Unfortunately,
these techniques still do not reduce evolvability problems effectively, due to human errors, overly
complicated or vague procedures, or simply a lack of adherence to the approaches themselves [92]. NS
Theory argues that software evolvability rests on four well-defined design principles that are based on
the concept of system stability [55]. Adherence to these principles generates systems that are extremely
fine-grained at the modular level, composed of dozens of structures [71]. This extreme modularity
isolates the various tasks and data, preventing the propagation of changes [19]. However, to achieve
evolvability, these principles have to be applied from the start of the software's development, and the
degree of cognitive overload and discipline they demand can lead companies to abandon the approach.
With this in mind, the creators of NS propose that the four principles be applied through code generation,
using tools called Normalized Systems Expanders (NSX), in order to spare developers from the
complexity of NS theory and to eliminate human errors [90].
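The idea behind the expanders can be conveyed with a deliberately simplified sketch; the descriptor format, names, and output below are invented for illustration and do not reflect the actual NSX tooling:

```java
// Purely illustrative sketch of descriptor-driven code generation, the idea
// behind NS expanders. The "descriptor" here is just an element name plus a
// list of field names; the real NSX descriptors are XML files (see Chapter 4).
public class ExpanderSketch {

    // Expands a data-element name and its fields into a Java class skeleton.
    static String expand(String element, String... fields) {
        StringBuilder out = new StringBuilder("public class " + element + " {\n");
        for (String f : fields) {
            out.append("    private String ").append(f).append(";\n");
        }
        out.append("}\n");
        return out.toString();
    }

    public static void main(String[] args) {
        // Regenerating from the same descriptor always reproduces identical,
        // error-free boilerplate, independently of who runs the generator.
        System.out.print(expand("Account", "number", "balance"));
    }
}
```

Because the boilerplate is produced mechanically from the descriptor, regenerating it after a change yields consistent code with no hand-introduced defects, which is precisely the benefit NS attributes to code generation.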
Even though NS Theory posits that it can produce highly evolvable systems, there is no evidence in
the literature backing up these claims. In fact, there is a scarcity of literature documenting cases of
evolvability assessment, with studies focusing only on organizational factors such as productivity and
effort, or on a subset of the qualities and characteristics of evolvability. Furthermore, none of these
studies presents a concrete and complete evaluation methodology for determining this quality.
To bridge the gap in the study of evolvability, our work consists of (i) defining evolvability as a software
quality, presenting a proper decomposition into sub-qualities and characteristics, (ii) studying NS and
their tools, analyzing how they function and their principles to achieve evolvability, (iii) developing a
controlled case study, defining a sample system complete with domain, requirements, constraints, and
extensions, and (iv) developing a complete evaluation methodology, including a robust set of metrics that
measure software at various levels and across several characteristics. Developing an adequate case study
and evaluation method is not a trivial task, nor is properly defining evolvability. As a matter of fact,
several of these tasks have never been tackled by other researchers, and there are barely any works
offering even partial guidelines toward the goals we propose.
To assess the evolvability of NS, we derive its evolvability sub-qualities from quantitative data, obtained
by analyzing the generated application using a set of code metrics that enable the measurement of
software characteristics. This evaluation is complemented by repeating the same process with a
traditional approach and a modern MDE approach, and comparing the results, so as to situate NS
within the current state of the art.
1.1 Problem Statement
The NS approach has been applied to several real-world software projects over the last few years [43, 70,
71, 72], and its proponents claim that all of those projects were successful, producing systems that have
evolved successfully for more than five years. However, these reports focus only on development-time
measurements and on comparisons with estimated development times without NS. No measurements
of actual evolvability are presented in the whole set of papers. Furthermore, the team that developed NS
and its tools was always involved in these projects, either by implementing them alone, or by providing
assistance to the developers in the form of training and audits.
A proper evaluation of the capability of the NS approach to produce evolvable applications is of the
utmost importance to the software community, in order to understand its feasibility, and to determine
whether it is worth the initial investment and the change in development paradigm [71]. Furthermore, it is also
important to understand how the potential of NS to create evolvable software compares with the
state of the art of existing approaches. This assessment can only succeed through a proper quantitative
evaluation [74] that takes into account the several existing evolvability qualities and characteristics.
1.2 Methodology and Contributions
This work aims at defining evolvability as a software quality, and evaluating the capability of NS and their
tools to produce evolvable applications, while also comparing them to other approaches. To achieve
these goals we start by defining evolvability, its sub-qualities, and characteristics. We also present the
current approaches to achieve this quality, along with a study and analysis over the NS theory and its
concepts.
Second, we review the existing literature to understand the current gaps in the approaches presented,
and the extent to which they have been studied and compared in terms of evolvability. We also present
the NSX tools, describing their functionality and purpose.
Based on our findings, we then define a case study with the intent of evaluating the evolvability of
NS, starting by defining a base system, along with its domain and requirements. We also define a set of
extensions to this system in order to gain better insight into the impact of changes.
Following the definition of the base system, we implement it using three different approaches: NS,
traditional methods, and a modern, suitable MDE tool, following guidelines and best practices to achieve
evolvable implementations.
Finally, we quantitatively evaluate the systems obtained using techniques of static code analysis
and code metrics. We start by defining the code metrics and characteristics they measure, followed by
the acquisition and analysis of the results, and closing by deducing the evolvability of the systems and
comparing them.
The main contributions of this work are the following:
• The characterization of software evolvability requirements.
• An overview of the various approaches to software evolvability, and their capability to achieve it.
• An analysis of NS, and cases of their application.
• The definition of a comparative case study capable of evaluating a system’s evolvability.
• A quantitative evaluation of the capability of NS and their tools to achieve evolvability.
• A comparison between NS and other approaches, regarding their capacity to produce evolvable
applications.
1.3 Document Structure
The remaining portion of this document is divided into five chapters. Chapter 2 describes
evolvability as a software quality, along with the description of the current approaches to
achieve it, including an analysis of the NS theory. In Chapter 3 we review the existing literature regarding
the capability of the mentioned approaches to achieve evolvability, and how they measure that capabil-
ity. This chapter also provides a description of the NS tools, and cases of their application. Chapter 4
presents the proposal to evaluate the capacity for evolvability of NS, along with the definition of a suitable
case study to achieve that goal. This chapter also describes the implementations of the base system de-
fined in the case study, using the chosen approaches. Chapter 5 describes the evaluation methodology
used to evaluate the produced systems, while also providing an analysis of the results obtained, and a
discussion on how those results achieve the objectives of this work. Finally, in Chapter 6 we present the
conclusions obtained, the impact of our work, and future research directions.
Chapter 2
Concepts
This chapter presents a series of relevant definitions for understanding the problem discussed in this
work. Firstly, we study evolvability in more detail. Secondly, we visit some concepts regarding DP,
explaining their purpose and properties, and how those translate into coupling. Next, we focus on SA,
identifying their characteristics and aspects, followed by a study of the main ideas of MDE. Finally, key
concepts from NS theory are presented.
2.1 Evolvability
Evolvability is a multi-dimensional software quality defined as the ability of a system to cope with
changes of a corrective, adaptive, and perfective nature in its environment, requirements, and technologies,
which may have an impact in terms of structural and functional enhancements [17, 26]. Its importance starts at
the beginning of development and extends throughout the software’s life cycle, encompassing long-term,
coarse-grained high-level modifications.
The evolvability quality can be decomposed into more elemental qualities as depicted in Figure 2.1,
which help understand the characteristics of evolvable software [15, 17, 18, 26, 45, 49, 63]. According to
the definition of evolvability, its two main sub-qualities are maintainability and portability. Maintainability
itself is a very broad concept, which can be decomposed into modifiability, analyzability, and testability.
Modifiability not only takes into account the capacity of the system to change, but also its capacity to
stay stable in face of those changes. Therefore, we can define the following set of elemental qualities of
evolvability:
Analyzability characterizes the ability of a system to be analyzed and explored. It is essential to cor-
rectly evolve a system, as it allows developers to identify what needs to be changed, and analyze
the impact of those changes.
Changeability is the capability of a system to enable a specified modification to be implemented. This
ability promotes evolvability by allowing software to be changed and grow without causing negative
or destructive impact to the system.
Figure 2.1: Diagram that shows the decomposition of evolvability into more elemental sub-qualities that can be better analyzed.
Sub-Qualities: Analyzability, Testability, Portability, Changeability, Stability
Characteristics: Complexity, Cohesion, Modularity, Reusability, Coupling, Encapsulation
Table 2.1: Relationship between the identified evolvability sub-qualities and the characteristics of software that affect them. ● Characteristic influences the quality positively. # Characteristic influences the quality negatively.
Portability provides the capability to transfer a system to other technological environments. Due to the
rapid development of new technologies, portability is essential in order to adapt and offer better
qualities or increase the user base.
Testability enables systems to be tested and validated. It is important in the context of any change,
as it allows developers to make sure the system remains correct after a modification, and that the
modification was implemented correctly.
Stability characterizes the sensitivity to change of a given system. Systems with low stability will be largely affected by modifications, and will lose stability with each change, preventing evolution, since as time progresses the effects of change become progressively stronger.
As evolvability is composed of several sub-qualities, it is only natural that there are several characteristics of software that affect those qualities, and thus affect evolvability [15, 17, 18, 63]. This set of fundamental characteristics is composed of (i) complexity, (ii) cohesion, (iii) modularity, (iv) reusability, (v) coupling, and (vi) encapsulation. All of these characteristics can influence the sub-qualities of evolvability in different ways, be it positively or negatively, as shown in Table 2.1.
2.2 Design Patterns
Design Patterns identify and abstract key aspects of a common design structure, along with the participating classes and instances, their roles and collaborations, and the distribution of responsibilities. DP can be seen as predefined solutions to recurring problems in software engineering that are (i) highly reusable and (ii) knowledge preserving, and that (iii) enrich system implementations with derivable qualities [83].
The use of DP conveys properties to software that are related with increased evolvability [83, 93], which stem mainly from the high abstraction and dependency elimination characteristics of DP.
2.2.1 Design Patterns Classification
Design Patterns can be classified into three categories [93]:
Creational Patterns that abstract the instantiation process of object creation and make systems inde-
pendent of how their objects are created, composed, and represented. They encapsulate knowl-
edge about the system’s concrete classes, and hide how instances are created and put together.
Structural Patterns concern the composition of specific objects to form larger structures that are easily navigable and extensible at development-time or run-time. These patterns focus on inheritance mechanisms or object composition, depending on the need to change structure at run-time.
Behavioral Patterns concern the interaction and responsibility distribution between classes or objects.
These patterns focus on the communication aspects of systems, and shift attention away from flow
of control by prescribing well-defined object roles thus leading developers to concentrate on the
interconnection of objects.
2.2.2 Design Pattern Coupling
DP coupling results from connections between patterns, where structures play roles in more than one
DP by referencing common objects and using methods in other patterns [60]. Their interaction does not
necessarily conserve the properties of individual DP, which might be positively or negatively influenced,
depending on the used patterns and their relationship [47]. The coupling of DP can be classified in three
types:
Intersection Coupling has patterns exhibiting a uses or talks to interaction scheme.
Composite Coupling refers to patterns that have other patterns' elements as components. Composite interactions have emergent properties not found in the individual parts.
Embedded Coupling represents a has a relationship, where the parent DP includes an instance of the embedded pattern that lies entirely within the parent's structure.
2.3 Software Architectures
Every computing system with software has an architecture. SA is the structure or structures of a sys-
tem, which comprise software elements, the externally visible properties of those elements, and their
relationships. Architecture is an abstraction of a system, which suppresses details of elements that do
not affect how they use, are used by, relate to, or interact with other elements [10]. In other words, it is
not concerned with internal implementation details. The behavior of the different elements is part of an
architecture if that behavior can be observed from the point of view of another element, since that is what
influences their interaction. The architecture of a system encompasses more than one structure, with each structure having a set of specific responsibilities. There are various kinds of structures, and therefore various kinds of elements, interactions, and contexts [10].
2.3.1 Influences and Properties
SA is influenced by product requirements, the structure and goals of the organization, the technical environment, and the architect's background and experience [16]. All those influences are connected and in turn affect each other in a cyclic feedback loop called the architecture business cycle [10]. Businesses manage this cycle to handle growth, expand their enterprise area, and take advantage of previous investments in architecture. There are three fundamental properties of SA [34]:
• Architecture allows stakeholders to communicate more effectively, due to the common system
abstraction it provides.
• The earliest design decisions about a system are manifested in its architecture, which carry a
heavy weight in the system’s development, deployment, and evolution.
• The models constructed through SA represent intellectually graspable abstractions of how a sys-
tem works, and these models can be applied to systems exhibiting similar requirements, promoting
knowledge reuse.
2.3.2 Software Architecture Construction
In order to obtain a robust architecture, its construction should take into account three concepts [10, 81]:
Architectural patterns that are a description of elements and relational types, along with a set of useful
constraints on how they may be used. These constraints define a family of SA that satisfy them,
and affect the system.
Reference models that are a division of functionality together with data flow between the parts, representing a decomposition of a known problem into pieces that cooperatively solve it.
Reference architectures that are the mapping of reference models onto software elements and the
data flows between them.
Figure 2.2: A diagram of the main concepts of MDE and how they relate to each other. The hollow arrow head represents an is a relationship, and the open arrow head represents a generic relationship identified by the words next to it.
Before building a SA, an architect should first choose an architectural pattern, and build a reference
model. Then a reference architecture is conceived from those two choices, and refined into a SA [10].
2.3.3 Functionality and Quality Attributes
Functionality is the ability of the system to do the work for which it was intended [10]. This requires that
the system’s elements work in a coordinated manner to complete tasks.
Quality attributes are properties of the system that affect behavior, design, evolution, and user expe-
rience, and combined they dictate the success of an application [10, 84]. Quality attributes should be
considered throughout design, implementation, and deployment. However, quality attributes can never
be achieved in isolation, since the achievement of any quality has positive or negative effects on others.
Functionality and quality attributes are orthogonal, since the function of a system does not dictate its
qualities automatically [10]. The relation between functionality and quality attributes mainly concerns the
way functionality interacts with qualities and constrains them. For any function, the architect’s decisions
are the main factor in determining the level of quality.
2.4 Model Driven Engineering
Through models it is possible to analyze a system and plan its construction in an effective and efficient
manner, achieving more control over its life cycle. Furthermore, communication among stakeholders is
greatly improved, making it possible to share knowledge more efficiently. Since models contain a large amount of information about systems, it is possible to use them to shorten the development time, through techniques such as code generation or model interpretation [76]. This idea has become
known as MDE, a software engineering paradigm. The main concepts of MDE and their relationship are
illustrated in Figure 2.2.
In the context of MDE, systems are considered a generic concept for designating software artifacts; they may be composed of other systems and have relations with other systems. Taking this definition
into consideration, it is possible to think of models as systems, since a model also has the properties
described, and can serve as a base for other more simplified models. A model is a system that helps
define and analyze the system under study without the need to consider it directly [76].
2.4.1 Meta-models and Modeling Languages
A meta-model is a model that defines the structure of a modeling language, in essence, it is a model
of a language of models [32]. As such, meta-models are the basis of modeling languages, which are
another concept of MDE. Modeling languages enable the specification of models at a certain level of abstraction, which in turn allows the use of those models in software development [9, 80]. A modeling
language can be classified as a general-purpose modeling language, or as a domain-specific modeling
language [36, 48]. A general-purpose modeling language is characterized by a greater number of
generic constructs, which encourage use in different fields of application. One example of this type of
language is the widely known Unified Modeling Language (UML). On the other hand, a domain-specific
language focuses on a specific domain, using fewer constructs and concepts. Its narrow scope makes
it more readable and understandable to project stakeholders.
2.4.2 Transformations
In order to effectively incorporate models in software development, the notion of transformation needs
to be introduced. There are two main types of transformations:
Model-to-text transformations are used to produce software elements directly from models, the most common technique in this class of transformations being known as code generation [85].
Model-to-model transformations on the other hand, translate models into other sets of models that are
closer to the solution domain, or that fulfill specific needs. These transformations are specified
through programming languages or model transformation languages [27, 35].
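As a sketch of the model-to-text case, the following minimal Java example expands a simple model of a data entity into source text. The `EntityModel` and `CodeGenerator` names and the shape of the model are illustrative assumptions for this sketch, not the API of any particular MDE tool:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical model of a data entity: a name plus an ordered map of
// field names to types. Real MDE tools use far richer meta-models.
class EntityModel {
    final String name;
    final Map<String, String> fields = new LinkedHashMap<>();
    EntityModel(String name) { this.name = name; }
}

// A minimal model-to-text transformation: the model is expanded into
// Java source text, analogous to template-based code generation.
class CodeGenerator {
    static String generate(EntityModel m) {
        StringBuilder src = new StringBuilder("class " + m.name + " {\n");
        for (Map.Entry<String, String> f : m.fields.entrySet()) {
            src.append("    private ").append(f.getValue())
               .append(' ').append(f.getKey()).append(";\n");
        }
        return src.append("}\n").toString();
    }
}
```

Given a `Person` model with a `String name` field, `CodeGenerator.generate` emits a compilable class skeleton; a model-to-model transformation would instead produce another model structure closer to the solution domain.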
One of the great advantages and promises of MDE is the fact that it provides a systematic means to reuse architectural and design knowledge [36, 64]. MDE enables developers to reuse documented solutions even if they do not themselves possess the technical knowledge necessary to apply them. By means of transformations, MDE tools can automatically implement patterns that solve specific problems relevant in the context of the application. This aspect allows MDE to provide quality solutions in an automated way.
2.5 Normalized Systems
The NS theory was developed with the intent to provide concrete guidelines for software development
to achieve maximum evolvability. The theory spawned from the concept of system stability whereby a
system is considered stable if a bounded set of changes results in a bounded amount of impact on the system, independently of the system's operational time period [50, 55]. This means that no matter how long a system has been in development, the effort to incorporate changes to it should not increase over time and should depend only on the changes themselves.
Whenever the effort required to implement changes increases as the system grows, that is an indication that we are in the presence of Combinatorial Effects (CE) [20]. These effects are the consequence of dependencies between modules that should not exist, and result from failures in encapsulation.
Due to these dependencies, changes in a given module will have impacts on other modules whose responsibilities are independent of the original change. As the system is developed, dependencies are introduced in the code, increasing CE and thus decreasing the overall stability of the system. A
software system is said to be a normalized system if it does not have CE (and consequently is completely
stable over an infinite period) for a set of anticipated changes [20].
2.5.1 Software Entities
In NS, systems are organized in terms of modular structures [92]. Each of these modular structures is written in a certain technology environment. Technology environments can be regarded as programming
languages, frameworks, or libraries, having as main assets the constructs they provide, which allow
modules to be written. Systems can be built by a group of construct instantiations, that are defined as
software entities. For example, in the Java programming language, a class is a construct. This construct
has many possible instantiations, like Person, or Animal, which constitute software entities.
NS view modular structures essentially as actions processing data. Considering this notion, it is
possible to identify two primitive software entities [91]:
• Data entities that contain fields and links to other data entities. This type of entity only holds data, and as such, it does not provide an interface.
• Action entities that represent operations performed over data entities, consuming these entities as inputs and producing them as outputs. Since they represent actions with inputs and outputs that can call each other, action entities must provide an interface.
In order to properly define action entities, they are decomposed into tasks [50]. A task is a set of
instructions that perform a certain functionality. Depending on the type of functionality, two forms of task
can be identified:
• Functional tasks that perform a specific functional operation, for example, calculate the cost of a
group of products.
• Supporting tasks that perform cross-cutting concerns, for example, logging.
Tasks may have different versions, which refer to a revision of the original task or to a different algorithm used in parallel with existing ones. The identification of tasks should be based on change drivers and external technologies; in essence, each task should only contain instructions from its technology environment [50]. Encapsulating tasks this way allows each task to evolve independently.
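The two forms of task can be sketched in Java as separate entities, one per change driver, so that either can be modified without touching the other. The class and method names below are illustrative, not taken from the NS reference implementation:

```java
// Functional task: a specific functional operation, here computing
// the total cost of a group of products.
class CostTask {
    static double totalCost(double[] prices) {
        double total = 0.0;
        for (double p : prices) total += p;
        return total;
    }
}

// Supporting task: a cross-cutting concern such as logging, kept in
// its own entity instead of being interleaved with the functional code.
class LogTask {
    static String log(String event) {
        return "[LOG] " + event;
    }
}
```

Because `CostTask` and `LogTask` each contain instructions from a single concern, switching the logging technology, for example, leaves the pricing code untouched.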
2.5.2 Primitive Changes
As highlighted earlier, the main property of NS is that they are stable for a set of anticipated changes.
These changes are regarded as primitive changes, and constitute the fundamental modifications that
are possible in an information system [50, 92]. NS identify four primitive changes:
• Adding a data attribute or field.
• Adding a data entity.
• Adding a task.
• Adding an action entity.
Every complex change can be decomposed into various changes from this set. The modification of an
existing entity is considered a combination of deletions and additions, and therefore is not included in
this set. Since NS are completely stable, the deletion of entities does not impact the system, and entities
can simply be removed in a garbage collection approach, deleting the entities that are not used in the
system anymore. Therefore the deletion of entities is not part of the set of anticipated changes.
2.5.3 Normalized Systems “Theorems”
NS postulate that stability rests on four fundamental principles (presented by the authors as theorems)
for software evolvability [55, 92]. By following these principles thoroughly from the start of the development process, NS posit that it is possible to build a system that is completely stable and resistant to the set of primitive changes. The theorems are as follows:
1. Separation of Concerns
This theorem states that an action entity can only contain a single task. This means that all tasks
in a system should be separated in order to avoid CE. This theorem implies that action entities
should only contain one technology environment, or change driver, and all cross-cutting concerns
should be encapsulated in their own action entities. One high-level example application of this
theorem is the use of tiered architectures, where each tier is dedicated to a fundamental part of
the application.
2. Data Version Transparency
This theorem states that data entities serving as input or output for action entities need to exhibit
version transparency. This implies that the addition of fields should not affect action entities that
do not use that field, and therefore data entities should be encapsulated to wrap their various
versions. One example application of this theorem: instead of having a firstName field in a Person class and passing it explicitly to method calls, it is possible to define a Name class to contain that field. This way, Name is passed to method calls, and if a new field such as lastName needs to be introduced, it is not necessary to rewrite methods whose action entities do not use the new field.
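This wrapping can be sketched in Java as follows (the classes are illustrative, not taken from the NS reference implementation):

```java
// Data entity wrapping name fields; lastName was added in a later
// version without changing the type passed between action entities.
class Name {
    private final String firstName;
    private final String lastName;

    Name(String firstName) { this(firstName, ""); }
    Name(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }
    String getFirstName() { return firstName; }
    String getLastName() { return lastName; }
}

// Action entity written before lastName existed: because it receives
// the wrapper rather than individual fields, it needs no rewrite.
class Greeter {
    static String greet(Name name) {
        return "Hello, " + name.getFirstName();
    }
}
```

`Greeter.greet(new Name("Ada"))` and `Greeter.greet(new Name("Ada", "Lovelace"))` both work unchanged, which is precisely the data version transparency the theorem demands.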
3. Action Version Transparency
This theorem states that action entities need to exhibit version transparency. In other words, this
means that the calling interfaces between action entities should not be impacted by a new version
of a task. This theorem implies that all action entities and task versions should be wrapped by a
separate action entity. An example of the application of this theorem is the polymorphism mechanism of Object Oriented (OO) languages. For instance, the List interface in Java hides various implementations of a list. A new implementation of a list won't have an impact on the constructs that use other versions.
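The List example can be made concrete with a small sketch: the caller below depends only on the interface, so any implementation, existing or new, can be substituted without changing it (the `Totals` class is an illustrative assumption):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

// Caller that is version transparent with respect to the list used:
// it is written against the List interface, not a concrete class.
class Totals {
    static int sum(List<Integer> values) {
        int total = 0;
        for (int v : values) total += v;
        return total;
    }
}
```

`Totals.sum` accepts an `ArrayList`, a `LinkedList`, or any future `List` implementation; introducing a new implementation has no impact on this construct.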
4. Separation of State
This theorem states that the interaction between action entities needs to exhibit state keeping. This
is achieved by keeping the state of an action entity for every call or use, linking this state to the data
entity that serves as argument. This allows the action entities to be executed in a scheduled and
sequenced manner. State keeping also allows error handling by storing the error state, and having
other action entities reacting to it, instead of propagating errors through call hierarchies. It is also
possible to persist the states of the application, in order to guarantee integrity of transactions and
operations. An example of application of this theorem can be found in asynchronous processing
mechanisms, where requests are stored in persistent data entities and are processed when the
system is available, without blocking processes in synchronous pipelines.
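A minimal sketch of this asynchronous, state-keeping style in Java follows; the `Request` and `Processor` names and the life-cycle states are illustrative assumptions, not the NS reference implementation:

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Data entity that carries its own life-cycle state for each call.
class Request {
    enum State { PENDING, DONE, ERROR }
    final String payload;
    State state = State.PENDING;
    Request(String payload) { this.payload = payload; }
}

// Action entity that stores requests instead of executing them
// synchronously; errors are recorded on the request's state rather
// than propagated through a call hierarchy.
class Processor {
    private final Queue<Request> queue = new ArrayDeque<>();

    void submit(Request r) { queue.add(r); }   // caller is never blocked

    void drain() {                             // runs when the system is available
        Request r;
        while ((r = queue.poll()) != null) {
            try {
                if (r.payload.isEmpty()) throw new IllegalArgumentException("empty");
                r.state = Request.State.DONE;
            } catch (RuntimeException e) {
                r.state = Request.State.ERROR; // stored for other entities to react to
            }
        }
    }
}
```

After `drain()`, a failed request is simply left in the `ERROR` state; another action entity can query and react to it, as the theorem prescribes.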
Chapter 3
Related Work
To achieve better evolvability in information systems, several approaches have been proposed, such as
DP, SA, and MDE. NS and their NSX tools have also been proposed in order to tackle this problem.
Hence, we search and review the current state of the art in the application of these approaches, to understand how they compare with each other, how they are used, how their impacts are measured, and what actual impact they have on the evolvability qualities of the produced systems. We also analyze
the NSX tools, by identifying what they actually do, and studies where they have been used. We then
conclude this chapter by discussing the information obtained, identifying key findings as well as gaps
and possible improvements that can be made regarding these topics.
3.1 Design And Architecture Influence On Evolvability
Design and architectural patterns have been used over the years as a means to reuse software expert
knowledge, in order to apply documented solutions to recurring problems both at the design and archi-
tectural levels. The application of these solutions is expected to improve software quality, including the
qualities associated with evolvability. Here we identify case studies and literature reviews that assess the effectiveness and the qualities improved by these knowledge reuse techniques.
3.1.1 Comparative Case Studies
Chang et al. conceived a software framework with focus on architecture in order to enable rapid develop-
ment of web applications that exhibit evolvability [23]. The framework supported a layered architecture
composed of data access layer, business logic layer, control layer, and presentation layer, together with
a Model-View-Controller (MVC) architecture for the user interface. A case study was developed that compared two information systems, one developed using the framework and the other using an ad-hoc architecture. The two implementations were then compared based on development time and questionnaires given to the developers about the qualities of the two approaches. The results obtained indicate that
the framework reduces development time and effort, while significantly increasing maintainability.
Lindvall et al. developed a case study on the comparison of the maintainability of a distributed system
using an ad-hoc client-server architecture, and the same system using better architectural and design
patterns suitable for the system at hand [51]. The new architecture was based on both client-server and
component architectural patterns, with focus on modularity. The components were defined based on
the technologies used, meaning that a given component should only deal with a specific technology,
in order to achieve better separation of concerns. To communicate between components the authors
employed the Mediator DP, so that they could increase encapsulation by defining communication inter-
faces. To compare the two architectures, coupling measures were used based on the coupling between
objects metric [24], and on observations during the implementation of the application. The authors concluded that the new architecture exhibited less coupling and better analyzability, testability, and modifiability, thus having better maintainability.
Mouratidou et al. developed a case study to assess the impact of the use of patterns on maintainability [65]. A small e-commerce Java Enterprise Edition (JEE) application was developed manually without
enforcing the application of patterns. This application was then redeveloped using a set of architectural
and design patterns, namely (i) Front Controller, (ii) MVC, (iii) Transfer Object, and (iv) Service to Worker.
The authors then analyzed the produced systems using a set of five OO code metrics derived from the Chidamber & Kemerer metric suite [24]. To complement the evaluation, the authors implemented three extensions to both versions of the application, and compared the differences between each implementation. They analyzed the results from both versions and found that the application of these patterns significantly reduced the amount of coupling and complexity of the system, while raising reusability, which in turn improved its maintainability.
Ampatzoglou et al. performed a study on the stability of systems that use individual or coupled Gang
of Four DP [8]. The study was conducted on 537 open source Java software projects, containing about 65,000 classes, in which the authors identified occurrences of DP. The identified classes' ripple
effects were then measured using a combination of code metrics related to size, polymorphism, and
class response size. The results and comparison of those measurements revealed that the application
of DP can shield certain pattern-participating classes against changes, depending on their role in the
pattern.
The work of Ampatzoglou et al. also confirmed that classes that play roles in more than one pattern
become more unstable the more DP they participate in, and that the actual instability varies significantly
over different patterns and combinations of patterns. This difference is mainly due to the various ways
classes interact in these patterns, for example, through inheritance or aggregation. Regarding pattern
coupling, the instabilities found in individual occurrences tend to be transported to the coupling, meaning
unstable patterns will generate unstable combinations.
3.1.2 Surveys And Reviews
Sharma et al. studied a variety of architectural patterns used in areas such as distributed systems,
artificial intelligence, or data mining [81]. They analyzed the architecture patterns based on the qualities
they bring to the software that uses them. More specifically regarding evolvability, they analyzed the correlation
between the patterns and complexity, maintainability, and portability. They found that component and
layered architectures are the ones that provide the best balance and values for these qualities, with the
component architecture focusing on high portability and low complexity, while the layered architecture
focuses on high maintainability and low complexity.
On a similar note, Majidi et al. surveyed existing studies of architectural patterns and evaluations of the qualities presented by those patterns, along with some of the more common combinations in practice [53]. Their findings indicate that the layered architectural pattern is the one that presents the most benefits for the qualities related to evolvability. Furthermore, they also indicate that combining the layered pattern with a client-server structured architecture favors these qualities even more, although at a cost in performance.
Budgen performed a series of studies on the effectiveness of DP regarding software qualities and knowledge reuse [22]. First, they conducted a mapping study of existing studies of DP effectiveness, and found some evidence of patterns improving the maintainability of systems [96]. In their next two works, they surveyed a large and diverse demographic of software practitioners about the usefulness of the patterns belonging to the Gang of Four set [97, 98].
In regards to qualities, Budgen concluded that in general, the benefits and use of DP application
vary widely, mainly due to the difficulty of implementation, risk of misuse, and domain of application.
Generally the more benefits a pattern brings, the higher the consequences of misuse. This holds true
especially for the Visitor and Facade patterns.
The trade-off aspect of DP was also confirmed by Ampatzoglou et al. in [7], where they reviewed available studies on the software qualities affected by DP and found opposing claims about the benefits and disadvantages of patterns.
3.2 MDE Impact On Evolvability Qualities
MDE can be considered as an approach to evolvability, one that focuses on all stages of development,
from architecture, to design, to implementation. It tries to achieve better software by relying on transfor-
mations and code generation to automatically enforce good practices and knowledge reuse efficiently,
in ways that manual development struggles to achieve. We hereby present our findings on the effectiveness and impact this development approach has demonstrated regarding evolvability qualities.
3.2.1 Comparative Case Studies
Martínez et al. developed an experiment in which they asked 26 graduate students to implement three
modules of a web application with similar implementation difficulty, where each module should be im-
plemented using a code-centric approach, a model-based approach, and a model-driven development
approach [58]. The participants were then questioned on their experiences and opinions regarding the
development and maintenance of the produced modules. The subjects identified the model-based and
model-driven approaches as the most beneficial for maintainability and development effort. However,
they also indicated that the code-centric approach provided much more room for customizations and
personalizations of the code.
Mao et al. compared a domain-specific, template-based code generation tool with traditional de-
velopment methods [57]. They used five medium-sized systems, each developed by different teams
composed of four to six individuals. Three of the systems were developed using a traditional program-
ming approach, and two using the code generation tool, utilizing templates defined by experts and tested
beforehand. They compared the systems using size metrics, development and maintenance time, and
defect density, and found that the tool improved productivity by almost 30%, while also revealing slight improvements in defect density. Thus the authors concluded that it decreased maintenance effort.
Ortiz et al. devised an MDE approach based on model-driven architecture and aspect-oriented programming to develop web service applications [73]. The case study follows the implementation of two
versions of a web-service application for a Spanish university, one using an ad-hoc approach, and
another using the developed approach. Their method consisted in building UML models and refining
them until a platform specific model was obtained, which was then used to generate code. The two
approaches were analyzed by employing code metrics that measured cohesion, coupling, size, complexity, encapsulation, and performance. The authors found that the MDE approach resulted
in better separation of concerns, encapsulation and simplicity, while reducing coupling. This translated
into an improvement in maintainability, while not losing performance, and increasing productivity due to
reduced development time as consequence of code generation.
3.2.2 Surveys And Reviews
Hutchinson et al. performed a series of questionnaires and interviews with MDE practitioners, gathering over 250 responses and 22 interviews [41]. The goal was to understand how MDE is currently applied in
industry and the factors that determine its success and failure. In terms of software qualities, the majority
of respondents considered the use of MDE beneficial not only in productivity but also in maintainability
and portability, mainly due to code generation. However, even though the majority thought MDE was
a success, a significant portion disagreed that this was the case. Overall, the authors found that most reasons for the adoption, use, and success of MDE approaches come from organizational factors such as
productivity, communication between developers and stakeholders, and educational and training factors.
Furthermore, the authors also report that there is no sign of a standard set of tools for MDE, with
developers citing more than 40 modeling languages and 100 tools as standard in their survey [95].
In general, studies on MDE are scarce, as indicated by Mohagheghi and Dehlen. The authors searched several research channels for publications over a span of seven years, and found only 25 papers, of which only five were related to software qualities [64]. Papers presenting quantitative data focus mainly on demonstrating productivity gains and reductions in development time from MDE approaches, and the few that mention qualities do not back their claims with real data.
The same conclusion was drawn from the work of Ameller et al., who conducted a larger mapping
Figure 3.1: Structure of applications deployed by NSX. They follow a component based architecture, where each component itself is composed of NSX elements. The user connector element is used to enable interaction between the user and the application.
study where they managed to find 129 papers related to MDE and the development of service-oriented
architectures, with the intent to understand how MDE supported this type of architecture, and the non-
functional requirements it enhanced or provided [6]. However, the authors still struggled to find research
in this area, with less than one fourth of the analyzed papers dealing with software qualities. Even then,
the majority of those only addressed specific qualities, mainly reliability, security, and performance, and
not qualities related to evolvability.
3.3 Normalized Systems Expanders
NSX are a set of software programs that generate code compliant with the NS theory, in the form of
structured software elements [66]. NSX take into account three main hierarchical structures, as depicted
in Figure 3.1:
• NSX Elements which are the smallest type of aggregation generated.
• Components which can be described as aggregations of various NSX elements that belong in the
same functional domain. NSX have a set of default components that are used as dependencies
by user-generated components.
• Application which is the highest hierarchical level, and is composed of one or more NSX components.
NSX tools are based on the OO paradigm, using Java as their main programming language and following
the JEE standards. They are capable of generating a fully working application, from database schema
generation to functional user interfaces, deployed using application servers. They support only relational
databases, are available for all major operating systems, and have been tested in the most popular web
browsers.
The generated applications do not contain everything required for deployment, especially concerning
graphic features and business logic. Therefore, extensions need to be made by creating or altering
generated files. NSX support this by providing well-defined zones in the source code, called anchor points
or tags, where developers can add custom code. These modifications, along with any new files that were
created, are then harvested by NSX and injected when the application is generated again, due to new
versions of the expanders or features added to the application.
NSX Elements
Elements represent modules designed with the four NS theorems in mind and implemented according
to DP, meant to be composed with each other as the building blocks of an application. To generate
these elements, the NSX take as input descriptor text files, which contain all the necessary information
about an element, namely its name, properties, and context relative to other elements and the application.
The framework generates the source code by expanding these descriptor files into the DP defined for
each element [21, 54]. In the current version of the NSX, five elements can be generated:
• Data Element which aims at separating domain functional data and technical tasks. It generates
structures concerning the definition, storage, and transaction integrity of the application’s data.
• User Connector Element which links user interaction with a data element, separating the structures that support and manipulate data from the structures that allow users to interact with the application. This element generates code that defines user views and actions, remote interactions, and user sessions.
• Task Element that represents a single, encapsulated functional task, along with supporting tasks, each with their own version, so as to obtain action version transparency.
• Workflow Element which represents a sequence of tasks performed by action elements, and is
triggered by an event. This element depends on the task element, as it requires one or more tasks
to form a sequence. It also depends on data elements as they allow the workflow to be configured
into the database.
• Trigger Element which provides a way to activate a workflow element at a specific time, and
thus, can only exist associated with a workflow element. In a similar way, it also depends on data
elements that persist its associated data.
3.3.1 Information Management System Case Study
Op’t Land et al. describe an information management system built using NSX [72]. The NS team pro-
vided training sessions to the developers on how to use the NSX, and carried out audits on the developed
application. In a first stage, the models of the application were refined and corrected, and the NS tools
implemented these models instantly. This first generated application provided support for Create, Read,
Update, Delete (CRUD) screens, and the custom screens were implemented next. Workflow functionality
was then inserted in the system, followed by an export functionality that was able to export sets of
data as Extensible Markup Language (XML). During the development cycle of this application, various
adjustments were constantly made to its models, and these changes could always be translated to the
application.
Op’t Land et al. concluded that only extensions need to be designed, constructed, and tested, since
the remaining parts of the system, including persistence, remote access, input validation, logging, user
management, and simple screens, are generated. The productivity gains for the whole development
process were estimated at a factor of 2 to 2.5. However, a significant amount of time was necessary
for developers to learn how to use the NSX, and limitations were noticed in the testing possibilities for
both the expanded and generated code.
3.3.2 Budget Management Application Study
Oorts et al. report a case study of a budget management application developed by the NS
team [71]. The initial application aimed to provide a user-friendly budget management alternative to
Excel spreadsheets, allowing budgets to be consulted through combinations of their characteristics.
Fifteen NS data element instances were identified and then expanded by the NSX.
The generated application was extended manually, and the two main types of extensions necessary
were logic extensions and graphical extensions. Graphical extensions proved to be much more intensive
than logic ones, and required more time and effort. The total development time was reduced by one third
compared to the estimated time for a traditional approach, and more than half of that period was spent
on extensions. Despite the heavy development of extensions, they only accounted for five percent of the
total source code.
3.3.3 Budget Management Application Revision Study
Oorts et al. documented a case study that took place one year after the development of the budget
management application described in the previous subsection [70]. In this case study, the requirement
changes that arose over the year were implemented. The analysis conducted determined that only four
of the original fifteen element instances would remain intact, that a total of ten new element instances
needed to be included, and that one needed to be removed.
In order to implement the modification of relationships, the declared links between element instances
were altered in the descriptors. The inclusion of new element instances was done by creating new
descriptor files for each of those entities, and altering the existing ones to contain links to the new ones.
To remove the unused element, its descriptor file was erased, and the connections to it were removed
from the descriptor files of the remaining instances. Some extensions were necessary: the first type
concerned the validation of data, and the second type concerned graphic features.
The changes were implemented by one developer and took nine man-days, eight of which were dedicated
to extensions. Oorts et al. concluded that software built using NS can be evolved efficiently, with few
resources and within a short time period.
3.4 Discussion
So far we have visited a set of approaches to evolvability, reporting on case studies, surveys, and
literature reviews on their effectiveness. These allowed us to gain better insight into the current state of
the art regarding the attainment of this quality, while also allowing us to perceive the methods most
regularly used to assess it, and which of its sub-qualities seem to be most addressed. We now discuss
several relevant findings of our research.
3.4.1 Evolvability In Architecture And Design
Regarding the more traditional methods of improving evolvability, there seems to be controversy, especially
concerning DP. Some studies on DP revealed that the success of the approach depends not only
on the pattern itself, but also on its correct implementation and application. Studies also report
that patterns can create instability when coupled to others, since classes that play more than one role
show a significant increase in complexity and coupling, and a lack of cohesion. However, most patterns
seem to be effective in protecting the classes that play their main roles from those instabilities. Some
comparative studies also suggest that the use of patterns, when applied to the correct domain, does
improve evolvability qualities, especially maintainability.
Similar conclusions can be drawn about architectural patterns, where the studies identified that well-structured
architectures do improve evolvability qualities. Regarding specific architectural patterns, there
seems to be a preference for layered and component-based applications with regard to the qualities
of evolvability, with layered architectures being preferred for maintainability, while component-based
architectures show more advantages in portability.
3.4.2 Evolvability In MDE
There is a definite lack of studies on the software quality benefits of MDE approaches. Not only
did we struggle to find case studies and surveys on this subject, but the surveys that we did identify
explicitly referred to the lack of MDE studies in the current literature, with even fewer focusing on qualities
related to evolvability. The majority of studies try to understand the impact of this approach at the
organizational and business level rather than its actual effects on software. Furthermore, we were not
able to find case studies or surveys comparing MDE approaches with each other, only comparing MDE
with traditional ones. This shows there is a definite need for these types of research.
Despite the difficulties mentioned, the studies we found seem to indicate that MDE has benefits mostly
regarding maintainability, with one study also reporting an improvement in portability. It is also worth
noticing that these improvements seem to stem mostly from the use of code generation.
3.4.3 Assessment Of Evolvability Qualities
Very few studies measure all qualities related to evolvability, and even fewer actually mention
evolvability in their papers. The quality most papers seem to study is maintainability, and
few were found presenting conclusions on portability. Furthermore, most of the studies that
address maintainability do not actually present a definition of what they consider maintainability to be,
which naturally raises doubts about the actual characteristics and qualities that are objectively
being measured, since maintainability is a broad concept.
By analyzing the studies identified, we can observe that three main evaluation methods are used:
development time, questionnaires and interviews, and code metrics. Measuring development time is
more adequate for assessing productivity than software qualities, since development time is inherently
influenced by several factors, such as organizational and business factors, and the technologies and
tools used. Questionnaires and interviews are useful to a certain extent, but they rely on developers'
perceptions, which may be influenced by subjective factors. Code metrics provide quantitative
insight into the code produced, being an evaluation approach less susceptible to cognitive biases,
and one that can be applied to determine characteristics related to evolvability; hence their more
frequent use.
Despite the regular use of code metrics, the studies that use them to evaluate their solutions often
do so with a small set of metrics, or with a set that does not measure all characteristics affecting
evolvability qualities. Furthermore, studies tend to use code metrics that measure software at just one
abstraction level. These factors might hinder the validity of results, as the qualities are not being
measured on all fronts, and useful insights could come to light if metrics that measure the system at
different levels were combined.
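As a minimal illustration of what such a code metric looks like, the following sketch computes a simple logical lines-of-code count. This particular metric and its implementation are ours, for illustration only, and not one used by the studies discussed:

```java
// Minimal illustration of a code metric: logical lines of code (LLOC),
// here counting non-blank lines that are not single-line comments.
class LlocMetric {
    static int lloc(String source) {
        int count = 0;
        for (String line : source.split("\n")) {
            String t = line.trim();
            // blank lines and // comments carry no logic, so skip them
            if (!t.isEmpty() && !t.startsWith("//")) {
                count++;
            }
        }
        return count;
    }
}
```

Even a metric this crude shows the appeal of the approach: it yields a repeatable number, independent of any developer's perception. Real studies would combine several such metrics, at class, module, and system level.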
3.4.4 NSX Considerations
Regarding NSX, we were only able to find three documented case studies on the tool's usage and
evaluations of its effectiveness. Furthermore, these studies focus simply on development time and
perceived developer effort, providing conclusions only on those two factors. None of these studies
mentions the actual software qualities of the code produced, regarding either evolvability or its
sub-qualities. Therefore, there is a need to fill this research gap by evaluating the actual software
produced by NSX with regard to evolvability.
By analyzing the functionality of the NSX tool, we can perceive some similarities between NSX and
MDE. The descriptor files used in NSX are essentially the description of a model constructed
using the NSX elements, as the elements define a language that can be used to build models of
applications. Therefore, we can consider the set of NSX elements a meta-model. The models built
using these elements are then transformed into source code, just as in MDE approaches. These facts
imply that NSX combine a meta-model, models, and transformations. Considering this,
even if the NSX authors do not explicitly state it, NSX can actually be regarded as a specific case
of an MDE approach, one that focuses on stability and the NS theory.
Chapter 4
Solution
To assess the evolvability of an information system developed using NSX, we devised a case study
based on the implementation of a web application. In this case study, we also intended to compare the
evolvability of systems generated by NSX with the evolvability of systems developed using other tools
and approaches. To accomplish this, we decided to implement a web application using three different
approaches: (i) an NS approach, (ii) a traditional approach, and (iii) an MDE approach.
The following sections describe the case study, starting with the definition of the sample system,
followed by the definition of a set of extensions to be implemented. Afterwards, the other approaches
besides the NS one are described, with the selection process of a suitable MDE tool presented
next. Finally, we describe all three implementations, starting with the NSX implementation, followed by
the traditional implementation, and ending with the MDE tool implementation.
4.1 Base System Definition
The first activity in defining the case study is to define a suitable base system, including
its domain and requirements. However, some aspects have to be taken into account in
order to develop a feasible case study:
• The system had to be complex enough to exercise a significant number of features.
• The system had to be somewhat elaborate; however, it should not be overly complicated.
The rationale for these concerns is that the system's domain and features should be easily understood,
and its implementation in the various approaches should not be too time consuming or resource
intensive, as the system and its extensions would have to be developed more than once.
4.1.1 Domain Specification
The chosen information system is an application to manage entities related to the concept of a bank.
We chose this theme because we can easily simplify the entities and features this information
system would have in a real setting. A banking environment is rich enough that it does not lack
examples of associated concepts and requirements. Furthermore, it is a very familiar concept,
and several of its main entities and abstractions are easy for most developers to understand. Thus, it
allows us to match the concerns described above.

Figure 4.1: The domain model of the bank information system, depicting its entities and their relationships. The arrows next to the relationships represent their direction.
The domain model of the system is depicted in Figure 4.1, with its entities, attributes, and relationships.
This domain comprises four entities:
• Customer, that represents a customer of the bank, and is identified by the customer number text
attribute. It also contains information about the customer in the form of its name, city, and street
text attributes. A customer can own a set of Accounts and Loans.
• Account, that represents a bank account, and is identified by its account number text attribute.
It also stores the numeric balance of the account. An account is associated with a Branch of the
bank, and it can be owned by one or more Customers.
• Loan, that represents a bank loan, and is identified by the loan number text property. It also has
a numeric amount attribute containing the amount of money owed. A loan is associated with a
Branch, and one or more Customers.
• Branch, that represents a branch of the bank, and is identified by its name text attribute. A branch
also contains a city text attribute and a numeric assets attribute. A branch can be associated with
one or more Accounts and Loans.
4.1.2 Requirements And Constraints
The case study system has a set of requirements that all implementations should abide by. This is
important so that functionality is not a variable. The requirements are essentially based on CRUD
operations over the domain entities:
• The user can create Customers, Loans, Accounts, and Branches.
• The user can delete Customers, Loans, Accounts, and Branches.
• The user can alter the attributes of Customers, Loans, Accounts, and Branches.
• The user can add or remove relationships of Customers, Loans, Accounts, and Branches.
• The user can list Customers, Loans, Accounts, and Branches.
• The user can visualize details of Customers, Loans, Accounts, and Branches.
• To access the entity operations, the user must log into the application using a combination of
username and password.
• Once logged in, the system should allow the user to log out of the application.
To simplify the system, there is only one business rule regarding the information of each entity:
• A Customer’s customer number attribute value must be unique.
• An Account’s account number attribute value must be unique.
• A Loan’s loan number attribute value must be unique.
• A Branch’s name attribute value must be unique.
These rules ensure that there can never be two entities in the system with the same value on their
identifying field, even if that value is not the one used to identify them in the database.
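These uniqueness rules can be pictured with a minimal in-memory sketch. The class and field names below are ours, purely for illustration, and are not taken from any of the implementations described later; in the actual systems the rule would typically be enforced by validation logic or a database unique constraint. The point the sketch makes is the last sentence above: the business identifier is unique even though it is not the surrogate key used by the database.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the uniqueness rule: the customer number is a
// business identifier, distinct from the surrogate id the database uses.
class Customer {
    final long id;               // surrogate database identifier
    final String customerNumber; // business identifier, must be unique
    final String name;

    Customer(long id, String customerNumber, String name) {
        this.id = id;
        this.customerNumber = customerNumber;
        this.name = name;
    }
}

class CustomerRepository {
    private final Map<String, Customer> byNumber = new HashMap<>();

    // Rejects a customer whose customer number is already taken,
    // even though the surrogate ids of the two customers differ.
    boolean add(Customer c) {
        if (byNumber.containsKey(c.customerNumber)) {
            return false; // business rule violated: duplicate customer number
        }
        byNumber.put(c.customerNumber, c);
        return true;
    }
}
```

The same shape applies to the account number, loan number, and branch name rules.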
We chose these requirements because they are the most basic and essential ones in just about all
standard web-based information systems today. They are easy to understand, and their implementation
is supported by most tools and frameworks dedicated to this type of software, including NSX. In
addition, these requirements provide enough complexity to produce a satisfactorily robust system that
can be evaluated.
Besides functional requirements, we also impose some technology rules to further level the differences
between implementations:
• The database used should be a relational database.
• All the data access, business logic, and server side software, should be implemented according
to the OO paradigm using the Java programming language, with no restrictions on frameworks or
libraries.
These rules are intended to restrict implementations to the main technologies NSX supports. As
such, they do not allow the other implementations to use fundamentally different technologies or
paradigms, which would compromise the evolvability evaluation.
Figure 4.2: The partial domain model of the ExNE version of the bank information system, depicting its new Division entity and its relationships. The arrows next to the relationships represent their direction.
4.1.3 Extensions Definition
Besides the base system described above, which we will call the Base Version (BV), we also devised a
set of three extensions to allow a more complete assessment of the evolvability of the systems. This is
a tactic employed by other comparative studies presented in Chapter 3, providing insight into the effect
of changes on an implemented system. The three extensions consist of (i) a new business rule, (ii)
a new search functionality, and (iii) a new domain entity, which we will refer to as Extension Business
Rule (ExBR), Extension Search (ExS), and Extension New Entity (ExNE), respectively.
ExBR implements a new business rule dictating that the amount property of a Loan must not be negative.
This extension intends to evaluate the impact of modifying the application's business logic.
It also reflects the new task anticipated change that is defined in the NS theory.
ExS defines the functionality of searching for a Customer by its customer number property. The search
should match the exact customer number, and not partial or similar strings. This extension intends
to evaluate the implementation impact of a new feature. It reflects the new action entity anticipated
change defined in the NS theory.
ExNE defines a new entity named Division, which represents a bank division. A bank division
has a name text attribute, and can have one or more Branches. The system should also support its
associated CRUD operations. The affected portion of the domain diagram for this version of the
system is represented in Figure 4.2. This extension intends to evaluate the evolvability impact of
a new entity in the system. It reflects the new data entity anticipated change defined in the NS
theory.
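The exact-match semantics required by ExS can be sketched as follows; the class and method names are hypothetical, not taken from any of the three implementations:

```java
import java.util.List;
import java.util.Optional;

// Illustrative sketch of the ExS search semantics: the query must match
// the customer number exactly, never partially or approximately.
class SearchCustomer {
    final String customerNumber;
    final String name;

    SearchCustomer(String customerNumber, String name) {
        this.customerNumber = customerNumber;
        this.name = name;
    }
}

class CustomerSearch {
    // Returns the customer whose number equals the query exactly, if any.
    static Optional<SearchCustomer> byCustomerNumber(
            List<SearchCustomer> customers, String query) {
        return customers.stream()
                .filter(c -> c.customerNumber.equals(query)) // exact match only
                .findFirst();
    }
}
```

A query such as "C-00" therefore returns nothing, even when "C-001" exists; this is the behavior every implementation of ExS must reproduce.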
These extensions simulate three common modifications to systems, and their implementation should
branch from the BV, generating three new separate versions of the system, so that the extensions can
be evaluated separately. From these, the Final Version (FV) of the system is built, combining the
implementations of the three extensions in order to evaluate the impact of their combination.
4.2 Additional Implementation Approaches
In order to do a more complete evaluation of the feasibility of the NSX tools, we decided to complement
the case study with the implementation of the system using two other approaches:
Figure 4.3: Diagram of the process used to select an adequate MDE tool to use in our case study. The process runs until a suitable MDE tool is found.
Using a code generation tool based on the MDE paradigm. This implies:
1. Choosing a tool with the same purpose as NSX that uses similar methods.
2. Describing and specifying the application to the tool.
3. Generating the application.
4. Altering or extending any necessary parts of the software to attain the enumerated features
and requirements.
Using the traditional application development paradigm. This implies:
1. Designing the system’s architecture.
2. Choosing a technology stack.
3. Designing the application’s modules, classes, and interactions.
4. Manually programming the application.
With these two approaches we can not only assess the evolvability of NSX, but also compare
them against other baselines and understand how well they hold up against other available methods. This
gives us a much better perspective on the quality of their generated code. The use of a second MDE
tool allows us to perceive the position of NSX in the current MDE scene, and to see how well their methods
compare with the ones used by modern tools. Thus we can draw conclusions on how relevant NSX are
and how much they contribute with regard to evolvability.
The use of a traditional approach serves as a baseline for the most commonly used method to develop
information systems today. This approach enables us to understand how an average ad-hoc implementation
done entirely by a software developer, trying to follow best practices in architecture, design, and
implementation, compares to MDE code generation built for evolvability, and what its advantages
and disadvantages are relative to this quality.
Domain:
• The tool must be oriented towards the development of information systems.
• The tool must be oriented towards the development of web applications.

Technology:
• The server side of the application must be implemented using Java.
• The application must persist its data using a relational database.

Functionality:
• The tool must be capable of generating a complete application.
• The tool must not generate code based on user-defined templates.

Relevance:
• The tool must have been updated at least once in the last three years.
• The tool must not have been discontinued.

Accessibility:
• The tool must allow full access to the generated code.
• The tool must have a free version available, or have a free trial that matches the other criteria.

Table 4.1: Criteria for choosing a suitable MDE tool for the developed case study, grouped by the motivation behind their definition. More than one criterion can be motivated by the same principle.
4.3 MDE Tool Selection
The selection process used to choose an adequate MDE tool is depicted in Figure 4.3 and consisted of: (i)
defining a set of selection criteria, (ii) finding a group of currently available MDE tools, and (iii) choosing
a tool matching the defined criteria.
4.3.1 Selection Criteria
The set of criteria defined to choose a suitable MDE tool is presented in Table 4.1. There are a total of
ten criteria, grouped by the motivation behind their formulation. Domain criteria make sure the chosen
tool is suitable for this case study and the implementation of the system described. The technological
constraints are aimed at restraining the chosen tool to the main technologies used by NSX, so it can be
fairly compared with them. The functional criteria also have the goal of leveling the differences between
NSX and the chosen tool, but from a perspective of how the tool works and its end result. There are also
criteria to ensure the tool is relevant to the current software scene, since, if we are to consider the
tool significantly modern, it has to belong to an active project and be updated with a certain
frequency. Finally, in order to actually determine the evolvability of the system, we will need to
access the source code the tool generates. The accessibility criteria ensure we can access and use the
tool and its features without financial restrictions.
4.3.2 Search and Decision
We searched the Internet for MDE application development tools using search engines and consulting
indexing websites, and found a total of 46 tools. To determine their compatibility with the defined criteria,
we analyzed all the resources made available for each of these tools, mainly their websites, documentation,
guides, white papers, and related published articles.
After applying the criteria, we were left with Generjee 1, a free tool that allows users to generate
full JEE information systems. It is an active open-source project, last updated in April 2016. The tool
is hosted as a web application and allows the creation of systems by describing the domain model of
the application using various forms. The application supports exporting a JavaScript Object
Notation (JSON) model representation of the generated application, and can import previously exported
JSON model representations in order to continue development.
4.4 NSX Implementation
The implementation of the system using the NSX tools consisted of (i) translating the defined system into
NS elements, (ii) describing the application, components, and elements, (iii) generating the application,
(iv) extending the application with business rules, and (v) implementing the system extensions. These
steps are explained in the following subsections, along with a description of the obtained system.
4.4.1 Application Generation
The first step in generating a system using NSX is to translate the existing domain model into data
elements. This process results in four data elements originating from the four domain entities defined:
(i) Customer, (ii) Loan, (iii) Account, and (iv) Branch. The only other elements needed are user connector
elements, but these are generated automatically with the data elements, even though they are
distinguished as a different element type.
The next step is to describe the application to the NSX tool using an application descriptor file, as in
Figure 4.4. In this file we sequentially define:
1. The name of the application, in this case bankApp, and its user interface name, Bank.
2. The styles of the user interface, in this case the default style nsxbootstrap.
3. The components that are part of the application, in this case the four default base NSX components
utils, validation, account, and workflow, and the new component bankcomp, which represents our
system.

1 http://www.generjee.com/
Figure 4.4: NSX application descriptor file that defines settings necessary for the generation of the application. It contains the names, styles, components, and options of the application.
Figure 4.5: NSX component descriptor file that defines settings necessary for the generation of the new component bankcomp. It contains the names and direct dependencies of the component.
4. The database options, in this case the option optionDataBaseSchema, which indicates that all
components should have their database tables in separate schemas.
Afterwards, we describe the bankcomp component in a separate component descriptor file, depicted
in Figure 4.5, where we define:
1. The name of the component, bankcomp.
2. The components on which this component directly depends, in this case the utils and validation
base components.
Finally we can describe our four data elements. Each element is described in its own data descriptor
file, as in the example of the Account element in Figure 4.6. All the remaining data element descriptors
can be found in Figures A.1, A.2, and A.3 of Annex A. The descriptions define the following aspects:
1. The name of the component, the Java package name, and the name of the data element.
2. The fields of the element, following the format of type, name, and booleans indicating (i) whether the
field should be included in partial representations of the object, (ii) whether the field should be findable
(deprecated but necessary), and (iii) whether the field is an enumeration. In the case of a relationship,
the type indicates both the code of the relationship and the other element of the relationship.
Figure 4.6: NSX data element descriptor file that defines settings necessary for the generation of the Account data element. It contains the names, fields, relationships, and options of the data element.
Figure 4.7: Architecture of the generated system using the NSX tool. The client tier contains the view and view model that compose the user interface. The application server holds the presentation tier that contains user base logic, and the domain logic tier with business logic and persistency infrastructure, accessing the database tier that holds the data.
3. Finder methods, in this case none.
4. Options over the defined element.
Having described all data elements, the application can be generated through the command line
by running a series of scripts bundled with NSX, which automatically set up the necessary environment
variables and parse the descriptor files to generate the full application.
4.4.2 Generated Architecture
The architecture of the generated application is depicted in Figure 4.7. It follows the standard JEE
architecture, and is composed of four tiers:
• Database Tier, containing the application’s database.
• Domain Logic Tier, which contains the data access and domain logic.
• Presentation Tier, responsible for mediating the interaction between the client tier and the domain
logic tier, containing user sessions, server actions, and a web service Application Programming
Interface (API).
• Client Tier, representing the web browser portion of the application.
The application server used is the default OW2 JOnAS application server 2.

2 https://jonas.ow2.org/bin/view/Main/

The database tier is implemented using HyperSQL DataBase (HSQLDB) 3, the default technology, and
the schema is generated automatically according to the domain classes defined in the domain logic tier.
The domain logic tier encompasses groups of three layered modules that support the access and
manipulation of data:
• Persistence Support Module, that defines the domain data objects and supports access to the
database using a Java Persistence API (JPA) provider, in this case Hibernate 4.
• Transaction Support Module, that defines business logic and controls transactions on the access of data using Enterprise Java Bean (EJB) 3 5.
• Remoting Support Module, that includes the infrastructure enabling the communication between
the domain logic tier and the presentation tier, using Remote Method Invocation (RMI) 6.
The presentation tier implements concerns regarding the user connector element. It is composed primarily of three modules:
• Remoting Support Module, that implements the communication with the domain logic tier, using
RMI.
• Session Support Module, containing the implementation of user communication sessions.
• User Support Module, that implements aspects related to user interaction and communication
with the client tier, using Struts 2 7.
The user support module uses an MVC pattern, where action classes serve as the model and contain actions to be performed on the server, JavaServer Pages (JSP) files serve as the views, and XML files serve
as controllers, mapping views to models. This module also provides a Representational State Transfer
(REST) API to communicate with the client.
The client tier resides on the web browser of the end user, and is implemented using a Model-
View-ViewModel (MVVM) pattern using KnockoutJS 8, by means of the user connector element. The
view provides several page fragments that compose a web page, and it interacts with the rest of the
application through the view model, implemented using KnockoutJS. The view model obtains and sends
data through the REST API defined in the presentation layer.
Each data element implies the generation of a group of these modules. All these modules make use of NSX's own internal framework to mediate between the implementation of the module and the technologies it uses, so that the classes only depend on the NSX internal framework and general specifications such
as JPA. The communication between each module is made through Data Transfer Objects (DTOs) that come in a detailed version and an info version: both represent the data of a domain entity, either complete in the case of the details object, or restricted to a group of selected fields in the case of the info object.
3. http://hsqldb.org/
4. http://hibernate.org/orm/
5. http://ejb3.jboss.org/
6. http://www.oracle.com/technetwork/java/javase/overview/index-jsp-136424.html
7. http://struts.apache.org/
8. http://knockoutjs.com/
4.4.3 Business Logic Implementation
With the application generated, we can implement the business rules defined earlier. This implementation has to be done manually, by altering the transaction modules of the domain logic tier. To achieve this, we alter the Bean classes that encompass logic, and insert code between the anchor tags that are generated along with the code. In this case, our business rules are operations that need to be performed before creating or modifying an entity object, so we insert in all Beans a check for the uniqueness of the specific attribute, between all tags referring to those operations, that returns an error in case the condition is not satisfied.
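As an illustration, the uniqueness check inserted into each Bean could take the following shape. The anchor-tag comments and the way existing values are obtained are hypothetical, since the exact NSX markers and generated API are not reproduced here; only the validation logic mirrors the rule described above.

```java
import java.util.List;
import java.util.Objects;

public class UniquenessCheck {

    // Hypothetical stand-in for a generated finder; in the real Bean the
    // existing values would come from the NSX persistence infrastructure.
    static boolean isUnique(List<String> existingValues, String candidate) {
        return existingValues.stream().noneMatch(v -> Objects.equals(v, candidate));
    }

    // Sketch of the code inserted between the (hypothetical) anchor tags:
    // @anchor:custom-before-create
    static void checkBeforeCreate(List<String> existingCustomerNumbers, String newNumber) {
        if (!isUnique(existingCustomerNumbers, newNumber)) {
            // Signals the error back to the caller, as the generated code expects.
            throw new IllegalStateException("customerNumber must be unique");
        }
    }
    // @anchor:custom-before-create-end
}
```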
4.4.4 Extensions Implementation
The ExBR extension was implemented in a similar way to the previous business rules. This means that
it was implemented by inserting a check operation for the amount attribute into the LoanBean class of
the domain logic layer. This operation was inserted in the tags concerning operations before creation
and modification of entities, and if the amount value was less than zero, an error would be thrown.
The ExS extension was implemented by adding a line containing findByCustomerNumberEq be-
tween the attributes and options lines of the Customer data element descriptor, and regenerating the
application. This created the infrastructure necessary to support the search functionality, and a user
interface fragment for interaction.
The ExNE extension involved adding a new data element to the bankcomp component, the Division
data element. To add this element, we added the descriptor file depicted in Figure A.4 of Annex A to define
the new element, and regenerated the application. In addition, the Branch data element’s descriptor was
altered in order to insert the relationship with the new element.
4.5 Manual Implementation
The manual implementation involved designing and programming the proposed system. To achieve
an evolvable application, we applied software best practices in all implementation steps, in the form of
architectural patterns, design patterns, and programming conventions and standards. The produced system is described in the following subsections.
4.5.1 Architecture Overview
The system uses a three-tiered architecture conventional in web applications [10], depicted in Figure 4.8, and is composed of:
Database Tier, which is a relational database serving as the persistent storage of data.
Server Tier, which can access, store, and manipulate data fetched from the database, and provides
web services that can be used by clients.
Figure 4.8: The three-tiered architecture of the manual implementation of the system. The client tier is composed of the view and view model that provide user interface functionality. The server tier contains the web services infrastructure, business logic, and data access logic to communicate with the database tier.
Client Tier, which consumes the web services provided by the server, and presents content to the user, allowing them to interact with the application.
The server uses a layered architecture. We chose this architecture as it has been used as an
evolvable architectural pattern [23, 53, 81]. This architecture divides the system into layers with well
defined responsibilities and communication interfaces, each serving as a base to the next layer and
abstracting from it the details of its implementation [10]. The server tier has three layers:
Data Access Layer, which communicates with the database, accessing and manipulating data. It also
maps domain object classes to their tables in the database.
Business Layer, which implements business logic and performs various operations according to the
defined business rules. It accesses data through the data access layer.
Web Service Layer, which defines a client interface by providing REST web services, while also controlling their access, and converting the data sent by the client into a format understood by the business layer.
The client tier uses a variation of the MVC pattern called MVVM, where the controller is replaced
with a view model [88]. The main components in this pattern are therefore:
View, which is the interface exposed to the user. The view gets its data and actions from the view
model under it.
View Model, that sits below the view and exposes the data and command objects that the view needs, serving as a container object and pulling its data from the model below.
Model, which in this case is the server, as it is the one that accesses and manipulates data.
Figure 4.9: UML class diagram of the data access layer setup. It is a combination of the DAO and Factory Method design patterns, which abstracts the implementation of the data access infrastructure from the business logic classes; these in turn only need to know the specific DAO interface and the Repository interface to access the data. The suspension points indicate other possible technology implementations. For simplicity, the only methods shown are the factory methods.
4.5.2 Database Tier
The database tier was developed using PostgreSQL 9, and the tables used map the domain as closely
as possible, with the Customer, Branch, Account, and Loan tables. These tables, besides the attributes
described in the domain, also have the id attribute, which is the attribute that uniquely identifies a record.
The many-to-one relationships are mapped using the id of the entity on the one side of the relationship. To map the many-to-many relationships, we use a mediator table that stores the ids of both related records.
The additional UserAccount table maps the user credentials to access the system.
4.5.3 Data Access Layer
The data access layer has the responsibility of reading and writing to the database, while also mapping
the data obtained into domain objects that can be used by the upper layers. This layer is implemented
using the Hibernate framework, a widely used implementation and extension of JPA. To perform the object-relational mapping, we use the JPA annotations provided by Hibernate in our domain object classes, to indicate which fields match columns in the database, or which tables map relationships.
To effectively access the database and manipulate data, we employ an approach depicted in Fig-
ure 4.9, which includes an implementation of the DAO pattern [11]. This pattern abstracts the access
9. https://www.postgresql.org/
to the database, so that the upper layers that need to access data do not have to know the details of the implementation of that access, and can simply invoke a method declared in a DAO interface. To
implement this pattern we defined an abstract generic DAO class and interface that take care of the
details behind the most basic operations. Then, for every domain object, we define a specific DAO for that object, which extends the generic one and implements tasks specific to that domain entity, such as entity-specific queries. These DAOs are also parameterized, so that the client class has to instantiate the specific DAO for the entity it wants to access. Next, we define DAO interfaces for each domain entity, for example, the AccountDAO interface. Besides abstraction of details, this pattern also
allows us to have different implementations of data accessing logic. For example, we could have an
implementation using Hibernate and one using another JPA provider.
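The combination of a generic DAO and entity-specific DAOs described above can be sketched as follows. The names and the in-memory implementation are illustrative stand-ins (the thesis implementation uses Hibernate), but the interface layering matches the pattern just described.

```java
import java.util.*;
import java.util.stream.Collectors;

public class DaoSketch {

    // Minimal domain object standing in for the Account entity.
    static class Account {
        final long id; final String owner;
        Account(long id, String owner) { this.id = id; this.owner = owner; }
    }

    // Generic DAO: basic operations shared by all entities.
    interface GenericDAO<T> {
        void save(long id, T entity);
        Optional<T> findById(long id);
    }

    // Entity-specific DAO adds entity-specific queries.
    interface AccountDAO extends GenericDAO<Account> {
        List<Account> findByOwner(String owner);
    }

    // An in-memory implementation stands in here for the Hibernate-backed one,
    // illustrating that upper layers only depend on the DAO interfaces.
    static class InMemoryAccountDAO implements AccountDAO {
        private final Map<Long, Account> store = new HashMap<>();
        public void save(long id, Account a) { store.put(id, a); }
        public Optional<Account> findById(long id) { return Optional.ofNullable(store.get(id)); }
        public List<Account> findByOwner(String owner) {
            return store.values().stream()
                        .filter(a -> a.owner.equals(owner))
                        .collect(Collectors.toList());
        }
    }
}
```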
To finalize the implementation of this layer, we use the Factory Method pattern [93] to delegate
the creation of DAO objects to a Repository. This is an interface implemented by an abstract base class that provides a method to create a DAO simply by receiving its associated domain class as a parameter. Thus, the business layer only needs to know the domain object it wants to access, and not the specific technology that enables that access, as the instantiation is delegated. It is also through this class that objects that want to access the database can initiate, commit, and terminate transactions, as it also abstracts the specifics of those actions.
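The delegation of DAO creation through a factory method could be sketched as below. The Repository and daoFor names are hypothetical, and an in-memory DAO stands in for the Hibernate-backed one; the point is that callers pass only the domain class and never name a concrete technology.

```java
import java.util.*;

public class RepositorySketch {

    interface Dao<T> {
        void save(T entity);
        List<T> findAll();
    }

    // Factory Method: the abstract base creates a DAO from the domain class
    // alone, hiding which persistence technology backs it.
    abstract static class Repository {
        @SuppressWarnings("unchecked")
        <T> Dao<T> daoFor(Class<T> domainClass) {
            return (Dao<T>) create(domainClass);   // instantiation is delegated
        }
        protected abstract Dao<?> create(Class<?> domainClass);
    }

    // In-memory stand-in for a Hibernate-backed repository implementation.
    static class InMemoryRepository extends Repository {
        protected Dao<?> create(Class<?> domainClass) {
            return new Dao<Object>() {
                private final List<Object> items = new ArrayList<>();
                public void save(Object e) { items.add(e); }
                public List<Object> findAll() { return items; }
            };
        }
    }
}
```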
4.5.4 Business Layer
The business layer is where the business logic lies, allowing the manipulation of domain entities while enforcing business rules. It is implemented mostly in pure Java. It receives input from the web service layer and uses it to issue database transactions and access data through the data access layer, in order to perform operations on that data. These operations involve various domain objects simultaneously, and encompass the validation of inputs and states, and the execution of business rules.
4.5.5 Web Service Layer
This layer encompasses the web services infrastructure necessary to expose the server API to the
client, and is implemented with the help of the Restlet Framework 10. It declares a series of REST web
services that the client can invoke, and each service can allow one or more of the HyperText Transfer
Protocol (HTTP) operations (i) GET, (ii) PUT, (iii) POST, and (iv) DELETE. The services are stateless,
meaning that each request from the client contains all the information necessary to service the request,
and session state is not held on the server. This implies that the client has to authenticate with every
call to a service, which is done through a verifier that filters service calls. A REST based API essentially
separates the client from the server, allowing the tiers to evolve independently.
The services are determined by the domain entity on which they are based, and by the plurality of the operations. For the operations involving relationships between the entities, as in operations over an account's owners, there is also a separate service, to facilitate both the display and manipulation of
10. https://restlet.com/projects/restlet-framework/
those relationships. Each service is implemented as a class, and its HTTP operations are mapped to
methods on the class.
The communication itself is made through JSON DTOs that are received from the client and parsed into formats that can be used by the business layer. In case exceptions are thrown by the business layer,
this layer can turn those exceptions into HTTP error codes and error messages, in order to provide the
client with more useful information.
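The translation of business-layer exceptions into HTTP error codes can be sketched in a framework-neutral way. The exception types and status-code choices below are illustrative assumptions, not the thesis's actual classes, and the Restlet-specific wiring is omitted so the sketch stays self-contained.

```java
import java.util.NoSuchElementException;

public class HttpErrorMapper {

    // Minimal pairing of a status code with a message for the client.
    static final class HttpError {
        final int status; final String message;
        HttpError(int status, String message) { this.status = status; this.message = message; }
    }

    // Translate business-layer exceptions into HTTP status codes, as the
    // web service layer does before answering the client.
    static HttpError toHttpError(RuntimeException e) {
        if (e instanceof NoSuchElementException) {
            return new HttpError(404, e.getMessage());      // entity not found
        } else if (e instanceof IllegalArgumentException) {
            return new HttpError(400, e.getMessage());      // invalid client input
        } else if (e instanceof IllegalStateException) {
            return new HttpError(409, e.getMessage());      // business rule violated
        }
        return new HttpError(500, "internal error");        // anything unexpected
    }
}
```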
4.5.6 Client Tier
The client is implemented using the Bootstrap 11 and AngularJS 12 frameworks. It is organized as a
single page application, where a main HyperText Markup Language (HTML) frame is defined and the
changing portions of the view are implemented as partial HTML files. AngularJS allows the HTML to be
augmented with data-bindings that present content dynamically based on the state of the view model,
and can also affect the state of the view model based on user inputs. The view model is implemented using AngularJS, defining operations and fetching data that can be used by, and that affect, the view.
4.5.7 Extensions Implementation
The ExBR extension was implemented by adding the business rule to the business layer methods that manipulate the loan domain entity: for every operation that involves the persistence or update of the amount attribute, the system first checks whether it is greater than or equal to zero, and in case it is not, an exception is thrown.
To implement the ExS extension, we added a search form to the customer listing view in the client; when this form is submitted, a call to the service that fetches the list of customers is issued, containing a query with the search term. We modified this service's implementation on the server to allow the use of a query in its GET method, and to parse that query in order to obtain the search term. Then we access the database to find the customer with the given name, and return it.
The implementation of the ExNE extension consisted of adding a new table to the database schema and altering the Branch table to reference it. It also implied the addition of a new domain object class and DAO in the data access layer, and a new set of web services related to the entity, along with business methods to support the new operations.
4.6 Generjee Implementation
The implementation of the system using Generjee required (i) the description of the system's domain model and project settings, (ii) generating the application, (iii) extending the generated code with the business rules, and (iv) implementing the proposed extensions. All these steps are described in the following subsections, as well as the architecture of the system obtained.
11. http://getbootstrap.com/
12. https://angularjs.org/
4.6.1 Application Generation
The first step in the implementation process is to describe the application and its domain using the
Generjee web application. This application provides a series of forms that allow users to make this description. The first required description is of the application we want to generate, by providing the
following information:
1. The name of the application and its source package.
2. User interface and deployment options such as themes, responsive support, and application con-
tainers. In this implementation we use the default values.
3. Authentication options. In this case we require user authentication.
The next step is to describe the domain model of the application. This can be done through a form
that requires the following inputs:
1. The entity's name and its interface identifier attribute.
2. Options regarding data transfer, such as image transfer support. In this application we do not select any of these options, as they are not needed.
3. Fields and relationships. For each field it is required to indicate its name and type. In case of
relationships, it requires the type of relationship and visibility options for the entities involved.
4. If authentication is necessary to access the entity, which is true in all entities.
Having described all the entities, the following action is to generate the application, which results in a
download of the application and an extensive JSON model of the application, that can be imported later
to continue development.
4.6.2 Generated Architecture
The generated application is based on JEE, uses the Payara Application Server 13, and has three main
tiers, depicted in Figure 4.10:
• Database Tier, that contains the application’s database.
• Server Tier, containing all server side implementation.
• Client Tier, containing the browser portion of the application.
The database tier is implemented using HSQLDB, and is automatically generated from the JPA
specification of the domain model. The server tier is implemented according to a JavaServer Faces (JSF) MVC architecture. The model is composed of two main modules:
• Data Module, containing data access support using EclipseLink 14.

13. http://www.payara.fish/
14. http://www.eclipse.org/eclipselink/
Figure 4.10: The architecture of the application generated by Generjee. The client tier consists of the view rendered by the server, which is organized in an MVC architecture. It contains the server view pages, the FacesServlet controller, and the model, encompassing business logic and data access modules to interact with the database tier.
• Business Module, composed of JEE bean classes containing business logic, and the actions to
be invoked by the controller, using PrimeFaces 15.
The data module allows abstracted access to the database through the use of the DAO pattern, having a generic DAO for the common operations and specific parameterized DAOs for queries specific to each data entity. The business module has a Bean class for every data entity that contains business logic
methods, and simultaneously acts as the managed bean of the JSF architecture. This means that it also contains action methods that are invoked by the controller and serve the data necessary to construct the view.
The view portion of the architecture is implemented by pages using PrimeFaces components and
themes. The mediation between the view and the model is made by the controller which is the Faces
Servlet provided by the application server.
The client tier is composed of HTML pages and jQuery 16 scripts that are served based on the parsing of the view pages, together with the data from the model. The client interacts with the server through Ajax calls to the controller, which retrieves views and issues actions on the model to retrieve data, in order to update the client.
4.6.3 Business Logic Implementation
Business logic needs to be inserted manually into the generated application source code, as in the NSX tool. In order to implement the business rules defined, we alter the Bean classes of the business module, by inserting a method that validates the rule, and a call to that method before saving or updating an entity.
15. http://www.primefaces.org/
16. https://jquery.com/
In case the rule is verified, the save or update continues; otherwise, an error is thrown. This procedure is applicable to all the domain entities' Bean classes.
4.6.4 Extension Implementation
To implement the ExBR extension, we follow a procedure similar to the one used to implement the
previous business rules. This means that we insert a method in the LoanBean that checks that the amount is not negative before saving or updating; if it is negative, an error is thrown.
The ExS extension was implemented by adding a new query to the data layer to search customers
by the customer number attribute. Then the CustomerBean is extended with a new search method that
calls the previous method in the corresponding service, and sets the customers list data to the result
obtained. Finally, we extend the interface to include a new button and associated dialog, for the search
term input and submission.
The ExNE extension was implemented by importing the exported JSON model into the Generjee
tool, which parses the model and automatically sets the description forms to reflect the information on
the model. Then we describe the new entity, and alter the Branch entity to reflect the new relationship.
Finally, the application is regenerated, resulting in a new version with the new entity added.
Chapter 5
Evaluation
To evaluate the evolvability of the produced systems, we employ methods of static code analysis, using
code metric suites capable of measuring the characteristics of evolvable systems. The purpose of this
analysis is to provide quantitative data about the qualities of the produced systems in order to validate the following hypotheses:
• A system developed using the NSX tools is highly evolvable.
• A system developed using the NSX tools is more evolvable than the systems developed using
other approaches.
The chapter begins by introducing the methodology used in the evaluation. Next, we explain the connec-
tions between the code metrics used in our evaluation and the characteristics of evolvability. Afterwards,
we present the results of the analysis and a walk-through of the values obtained. Finally, an analysis of
the results in terms of evolvability is carried out, along with conclusions about the hypotheses presented.
5.1 Methodology
The process used to evaluate the implementations and validate the hypotheses described is depicted in Figure 5.1. For each implementation and its versions, we evaluate their source code, and analyze the
data obtained, organizing it into graphs and tables that help visualize the data and get better insights
from it. Following this process, the results are used to infer the qualities of the systems, and compare
them, in order to understand if a system generated with NSX is evolvable and how that evolvability
compares with the evolvability of the remaining approaches.
5.1.1 Source Code Evaluation Approach
To evaluate the produced information systems, we use a static code analysis based approach. Static
code analysis is performed on source code without actually running it. It encompasses a family of tech-
niques to compute information about the behavior and quality of a program [67]. Static code analyzers
(a) Subprocess of the evaluation of a single implementation.
(b) Full process used to assess and compare the evolvability of the implementations.
Figure 5.1: Process diagram of the evaluation methodology used to assess and compare the evolvability of the implementations. In (a), the process to evaluate one system implementation is described, where an analysis is applied at the various milestones of the implementation. In (b), the full process is presented, with (a) depicted as a subprocess, executed for the three implementations.
can capture comprehensive and accurate models of software systems, and are used for different purposes, such as (i) detection of security vulnerabilities and bugs, (ii) verification of security properties, (iii) program understanding, and (iv) quality determination [28, 99].
Static code analysis is typically guided by rules or metrics, which represent programming conventions
or expert knowledge on how to develop software [74], and its methods and tools have been met with
significant practical success, being frequently integrated in the development of evolvable software [52],
as confirmed by the literature reviewed in Chapter 3.
Software code metrics are units of software measurement that provide a formal means to estimate
software quality and complexity [61]. Metrics possess a number of interesting characteristics: they are (i) simple, (ii) precise, (iii) versatile, and (iv) scalable to various sizes of software.
Over the years, software metrics have been a useful tool to control the development and maintenance
activities of software applications [100], and can be applied throughout the whole software development
process. When used properly, they provide multiple benefits [14]:
• Quantified and meaningful estimates.
• Support for decision making processes.
• Identification and quantification of product and process evolution, efficiency, and productivity.
A great number of code metrics have been defined throughout the years [4]. In this evaluation, we
use two metric suites designed to analyze systems both at class level and project level. The metric
suites used in this evaluation are the Chidamber & Kemerer’s class metrics [24], and Brito & Abreu’s
Metrics for Object Oriented Design (MOOD) project metrics [2, 3].
5.1.2 Selection of System Components
In this work, all of the developed systems use Java as their main language, and the bulk of their source
code is written in it. Hence, we considered that applying OO source code metrics to those parts of the
systems would allow us to assess most of their evolvability. This approach allowed us to simplify the
analysis process, since to analyze the remaining code used in the applications, we would have to use
metrics that would not abide by the OO paradigm. These metrics would not measure the same aspects
in the same way as the metrics used for the Java code, which would compromise the evaluation, since
to effectively assess evolvability, the results would have to be merged. For this reason, we decided to focus our evaluation on the Java portions of the systems.
Regarding the NSX implementation, the bankcomp component is the only component that is actually
defined during the development process, and the only component where modifications were made in
order to implement the proposed extensions. The remaining components are automatically generated
by the NSX tool, and although they are part of the application, their purpose is more similar to that of
a library or framework. Therefore, adding to the decisions stated above, we also decided to focus our
evaluation on the bankcomp component only, treating the remaining components as dependencies.
5.1.3 Class Level Metrics Definition
Chidamber & Kemerer proposed a metric suite to evaluate the quality of OO software at class level [24].
Since then, these metrics have been extensively validated and used throughout case studies and analyses [31, 68, 86, 87]. This suite is composed of the following six metrics:
Weighted Methods per Class (WMC) measures the total complexity of all methods of a given class. It
can be defined by the expression:
WMC = \sum_{i=1}^{n} C_i
where n is the number of methods of the class and Ci is the complexity of method i. The orig-
inal authors of this metric did not define a function to calculate the complexity of a method, in
order to improve the flexibility of the metric. In practice, this metric can be paired with McCabe’s
Cyclomatic Complexity [59]. This metric measures complexity by counting the number of distinct
execution paths in a method [94]. The different paths of execution can be identified by keywords
that denote conditional or looping actions. Therefore, the cyclomatic complexity of a method starts
with the value of one, and is incremented for every if, while, for, switch, catch, or other conditional
expressions found in the method.
Depth of Inheritance Tree (DIT) measures the number of inheritance steps between the class and
the root class. Therefore, it is a measure of how many ancestor classes can potentially affect the
class. In Java, it is possible to determine the DIT by counting the number of superclasses until the
java.lang.Object class. As every Java class must be a subclass of java.lang.Object, the DIT metric
will always have a minimum value of one.
Number Of Children (NOC) measures the total number of direct subclasses of a class. It is a measure
of how many subclasses are going to inherit the methods of the parent class. In Java, it can be
determined by counting the number of classes that extend the class, and therefore, its minimum
value is zero if the class is a leaf class.
Coupling Between Objects (CBO) measures the number of classes to which a class is coupled. A
class is declared to be coupled with another if it depends on that class or is depended on by that
class. A class depends on another when it uses methods or instance variables defined by that
class. Dependencies due to inheritance are not counted for purposes of this metric.
Response For Class (RFC) measures the response set of a class. The response set of a class is a
set of methods that can potentially be executed in response to a message received by an object
of that class. It can be expressed as:
RS = \{M_i\} \cup_{\text{all } i} \{R_i\}
where {Mi} is the set of methods in the class and {Ri} is the set of methods called by method i.
In Java, this can be considered the sum of the number of methods and constructors in the class,
plus the number of methods and constructors that the class may directly call. Methods defined in
superclasses or called by superclasses are not counted by this metric.
Lack of Cohesion of Methods (LCOM) measures the cohesion of a class by determining the relative
disparate nature of methods in the class. This metric was originally defined as the number of
method pairs whose similarity is zero, minus the number of method pairs whose similarity is not
zero. A method pair was declared to be similar if it shared at least one instance variable. This
metric has since been reviewed [5, 40, 44, 100], and more mature versions were defined. The
version known as LCOM4, proposed by Hitz & Montazeri [40], is the one used in this evaluation.
LCOM4 considers two methods of a class related if they share a variable use, or one method calls
another. Furthermore, methods such as constructors, equals(), hashCode(), toString(), clone(),
finalize(), readObject(), and writeObject() are not considered for purposes of this metric, as these
Java scaffolding methods often touch all variables in a class, and would thus result in values
indicating more cohesiveness than is actually apparent in the design.
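Two of these metrics are mechanical enough to sketch directly in Java: DIT by walking getSuperclass() up to java.lang.Object, and cyclomatic complexity approximated by counting the branching keywords listed above in a method's source text. The keyword-counting approach is a deliberate simplification of what real analyzers do, not a full parser.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CkMetricsSketch {

    // DIT: number of inheritance steps from the class up to java.lang.Object.
    static int depthOfInheritanceTree(Class<?> c) {
        int depth = 0;
        while (c != Object.class) {
            c = c.getSuperclass();
            depth++;
        }
        return depth;
    }

    // Cyclomatic complexity approximation: 1 plus one per branching keyword.
    static int cyclomaticComplexity(String methodSource) {
        int complexity = 1;
        for (String kw : new String[] {"if", "while", "for", "switch", "catch"}) {
            // \b ensures whole keywords are matched, not substrings of names.
            Matcher m = Pattern.compile("\\b" + kw + "\\b").matcher(methodSource);
            while (m.find()) complexity++;
        }
        return complexity;
    }
}
```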
5.1.4 Project Level Metrics Definition
The MOOD metric suite was proposed by Brito & Abreu, in an effort to provide size-independent project
level metrics for OO software [2, 3]. These metrics are expressed as percentages, ranging from 0%
to 100%, and have since been subjected to various case studies, being proven to be useful in the
determination of software quality [31, 38, 46, 78]. The metrics as used in this evaluation are defined as
follows:
Method Hiding Factor (MHF) measures the degree of method encapsulation in a project. Essentially,
it gives the ratio of how many classes an average method is visible from, other than the defining
class. It is formally defined by the expression:
MHF = \frac{\sum_{i=1}^{TC} \sum_{m=1}^{M_d(C_i)} (1 - V(M_{mi}))}{\sum_{i=1}^{TC} M_d(C_i)}

where:

V(M_{mi}) = \frac{\sum_{j=1}^{TC} is\_visible(M_{mi}, C_j)}{TC - 1}

is\_visible(M_{mi}, C_j) = \begin{cases} 1 & \text{iff } j \neq i \text{ and } C_j \text{ may call } M_{mi} \\ 0 & \text{otherwise} \end{cases}

TC is the total number of classes in the system, and M_d(C_i) is the number of constructors and methods defined with any access modifier, excluding abstract and inherited methods.
Attribute Hiding Factor (AHF) measures the degree of attribute encapsulation in a project. Essentially,
it gives the ratio of how many classes an average field is visible from, other than the defining class.
It is calculated using the expression:
AHF = \frac{\sum_{i=1}^{TC} \sum_{m=1}^{A_d(C_i)} (1 - V(A_{mi}))}{\sum_{i=1}^{TC} A_d(C_i)}

where:

V(A_{mi}) = \frac{\sum_{j=1}^{TC} is\_visible(A_{mi}, C_j)}{TC - 1}

is\_visible(A_{mi}, C_j) = \begin{cases} 1 & \text{iff } j \neq i \text{ and } C_j \text{ may reference } A_{mi} \\ 0 & \text{otherwise} \end{cases}

TC is the total number of classes in the system, and A_d(C_i) is the number of all attributes with any access modifier, excluding inherited ones.
Method Inheritance Factor (MIF) measures the degree of method inheritance in a project. It provides
insight on how many methods available on an average class are due to inheritance. Its defining
expression is:
MIF = \frac{\sum_{i=1}^{TC} M_i(C_i)}{\sum_{i=1}^{TC} [M_d(C_i) + M_i(C_i)]}

where TC is the total number of classes in the system, M_i(C_i) is the number of non-overridden inherited methods, and M_d(C_i) is the number of non-abstract methods defined.
Attribute Inheritance Factor (AIF) measures the degree of attribute inheritance in a project. It pro-
vides the ratio of how many fields available in a class are due to inheritance. It is defined by the
expression:
AIF =∑
TCi=1 Ai (Ci)
∑TCi=1 [Ad (Ci) +Ai (Ci)]
where TC is the total number of classes in the system, Ai (Ci) is the number of inherited attributes,
and Ad (Ci) is the number of defined attributes.
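MIF and AIF share the same shape, inherited members divided by all available members, so a single helper can illustrate both. A minimal sketch with hypothetical per-class counts, not values taken from the study:

```python
# Shared shape of MIF and AIF: inherited members divided by all available
# members (defined + inherited), summed over every class in the system.
def inheritance_factor(defined_counts, inherited_counts):
    inherited = sum(inherited_counts)
    available = sum(d + h for d, h in zip(defined_counts, inherited_counts))
    return inherited / available

# Toy system of three classes: Md(C_i) = [5, 3, 2], Mi(C_i) = [0, 5, 8],
# i.e. 13 of the 23 available methods are inherited.
print(f"MIF = {inheritance_factor([5, 3, 2], [0, 5, 8]):.1%}")  # 56.5%
```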
Coupling Factor (COF) measures the degree of coupling in a project as a whole. It calculates the
proportion of the classes in a project that are coupled to an average class in the project, without
taking inheritance into account. It is calculated with the expression:
$$\mathrm{COF} = \frac{\sum_{i=1}^{TC} \sum_{j=1}^{TC} \mathit{is\_client}(C_i, C_j)}{TC^2 - TC}$$

where:

$$\mathit{is\_client}(C_i, C_j) = \begin{cases} 1 & \text{iff } C_i \Rightarrow C_j \text{ and } C_i \neq C_j \\ 0 & \text{otherwise} \end{cases}$$
TC is the total number of classes in the system, and Ci ⇒ Cj represents at least one reference
from class Ci to class Cj .
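A minimal sketch of this computation, with a hypothetical reference map as input (inheritance relations are assumed to have been excluded from the map beforehand, as the definition requires):

```python
# Minimal sketch of COF: the fraction of the TC^2 - TC possible ordered
# client relations that actually occur. `references` maps each class to
# the set of classes it references.
def coupling_factor(references):
    tc = len(references)
    actual = sum(len(targets - {c}) for c, targets in references.items())
    return actual / (tc * tc - tc)

# Toy system: A uses B and C, B uses C, C uses nothing.
print(coupling_factor({"A": {"B", "C"}, "B": {"C"}, "C": set()}))  # 0.5
```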
Polymorphism Factor (POF) measures the degree of polymorphism in a project. Essentially, it calcu-
lates the probability that a given method will be overridden by a subclass, and is thus an indirect
measure of the relative amount of dynamic binding in a system. It is formally defined by the ex-
pression:
$$\mathrm{POF} = \frac{\sum_{i=1}^{TC} M_o(C_i)}{\sum_{i=1}^{TC} \bigl[M_n(C_i) \times DC(C_i)\bigr]}$$
where TC is the total number of classes in the system, Mo (Ci) is the number of overridden meth-
ods, Mn (Ci) is the number of newly defined non-overriding methods, and DC (Ci) is the number
of direct and indirect subclasses of class Ci.
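A minimal sketch of the POF computation, with Mo, Mn, and DC supplied as parallel per-class lists; the counts are hypothetical, chosen only to illustrate the formula:

```python
# Minimal sketch of POF: overridden methods over override opportunities,
# where each newly defined method could be overridden once per descendant.
def polymorphism_factor(mo, mn, dc):
    opportunities = sum(n * d for n, d in zip(mn, dc))
    return sum(mo) / opportunities

# Toy hierarchy: the root defines 4 new methods and has 2 descendants,
# which define nothing new but override 3 methods between them.
print(f"POF = {polymorphism_factor([0, 2, 1], [4, 0, 0], [2, 0, 0]):.1%}")
```

With 3 overrides out of 8 opportunities, the toy system scores 37.5%.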
5.2 Connection Between Metrics And Characteristics
The metrics defined earlier have a series of correlations with the characteristics that affect the evolvability
of systems. Table 5.1 summarizes those relationships, while also indicating the recommended interval where the metric values should lie, and the end of the interval toward which an implementation should aim.
Suite  Metric                                    Acceptable Interval  Preference
C&K    Weighted Methods per Class [13, 39, 69]   [0, 30]              lower
C&K    Depth of Inheritance Tree [13, 33]        [1, 4]               middle
C&K    Number Of Children [13, 89]               [0, 4]               lower
C&K    Coupling Between Objects [82, 89]         [0, 9]               lower
C&K    Response For a Class [33, 89]             [0, 40]              lower
C&K    Lack of Cohesion Of Methods [33, 89]      [0, 20]              lower
MOOD   Method Hiding Factor [1, 29]              [12.7%, 21.8%]       higher
MOOD   Attribute Hiding Factor [29]              [75.2%, 100%]        higher
MOOD   Method Inheritance Factor [29]            [20%, 80%]           lower
MOOD   Attribute Inheritance Factor [29, 37]     [52.7%, 66.3%]       lower
MOOD   Coupling Factor [29]                      [4%, 11.2%]          lower
MOOD   Polymorphism Factor [1, 29]               [2.7%, 9.6%]         lower
[The per-metric markers against the six evolvability-characteristic columns (Complexity, Cohesion, Modularity, Reusability, Coupling, Encapsulation) are not recoverable from this extraction.]
Table 5.1: Relationship between the chosen metrics and the characteristics of evolvability, along with the recommended intervals for each metric, and the preferred tendency of the values inside the interval. The references are mostly relative to the range of values and preferences. ● Increased value of the metric increases the effect of the characteristic. ○ Increased value of the metric decreases the effect of the characteristic.
WMC is primarily focused on the complexity of a class, and a high measure of WMC indicates that the
methods in the class are overly complex, making them harder to understand, alter, and test [24, 31, 75].
Furthermore, WMC is also a measure of size, as its value increases with the number of methods.
This aspect of a class is also measured by the RFC metric, as the response of a class increases with
its methods. A class with a large number of methods might indicate problems with modularity and encapsulation, besides being more complex and less reusable. Classes with more methods also tend to be very technology- or application-specific [79].
RFC is also a measure of class relationships, as it also takes into account the methods a class can
invoke. The more methods a class can invoke, the more it is coupled with other classes [40, 75]. The
CBO and COF metrics focus on the coupling aspect, as they effectively measure how many classes are
coupled. A high value of these three metrics indicates that the classes are highly dependent on each
other, with complex relationships, and any change can have numerous impacts on other modules.
Inheritance is a form of relationship between classes, and it has various advantages as it promotes
reuse, and helps reduce complexity, but when used incorrectly it promotes coupling and can seriously
affect evolvability. The metrics proposed help understand how well it is being used. High and frequent
DIT values may indicate that the system has deep class hierarchies, where classes deep in the hierarchy
inherit many methods, making their behavior and design more complex and hard to predict [33, 46]. NOC
measures how many classes inherit from the measured class. A high NOC value indicates more reuse,
but in turn can also indicate poor use of polymorphism [24, 75], as a class might have many children
due to having relations of inheritance in cases where other types of relationships would be more correct.
MIF and AIF essentially determine how many methods and attributes in the system’s classes origi-
nate from inheritance. This means that a high value of MIF or AIF can represent both more reusability,
Metric   BV                  ExBR                ExS                 ExNE                FV
         ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ
WMC      1   93   23.0 20.9  1   93   23.0 20.9  1   93   22.9 20.9  1   101  22.8 20.8  1   101  22.7 20.8
DIT      1   2    1.4  0.5   1   2    1.4  0.5   1   2    1.4  0.5   1   2    1.4  0.5   1   2    1.4  0.5
NOC      0   0    0    0     0   0    0    0     0   0    0    0     0   0    0    0     0   0    0    0
CBO      0   25   5.3  5.5   0   25   5.3  5.5   0   25   5.3  5.5   0   25   5.3  5.5   0   25   5.3  5.5
RFC      1   122  35.1 26.8  1   122  35.1 26.9  1   122  34.9 26.9  1   122  34.9 26.8  1   122  34.7 26.8
LCOM     0   9    2.7  1.9   0   9    2.7  1.9   0   9    2.7  1.9   0   9    2.7  1.8   0   9    2.7  1.8
Table 5.2: Descriptive statistics of the Chidamber & Kemerer metric suite measurements for the NSXsystem versions. The descriptive statistics calculated are the ∧ minimum, ∨ maximum, x average, andσ standard deviation.
or a system where inheritance is used superfluously [78]. A very low value of these metrics might in-
dicate lack of inheritance, or that the subclasses are overriding most methods in the case of the MIF
metric. The POF metric complements the MIF metric, by measuring the percentage of methods that
originate from overrides. If the POF value is high it means that the system is not taking advantage of the
reusability aspect of inheritance [3].
Interfaces should be clearly defined, hiding the detailed implementation methods from the calling
classes. The level of encapsulation of the system can be primarily measured using the MHF and AHF
metrics. A low value of MHF indicates improper encapsulation of classes, meaning that most methods
can be accessed by other classes [78]. AHF measures should be as high as possible, since attributes
should always be encapsulated in their defining class, and accessed through methods.
The LCOM metric measures lack of cohesion, thus its values should be kept as low as possible.
High values of LCOM indicate that a class has many responsibilities, increasing its complexity, and
contributing to a less modular system [30, 87].
5.3 Evaluation Results
For each version of each implementation, we measured the various metrics proposed. The data obtained
by these measurements is depicted in the form of tables and graphs in the following subsections, for both
the Chidamber & Kemerer metrics, and the MOOD metrics.
5.3.1 Class Level Metrics Measurements
We calculated a set of descriptive statistics for the Chidamber & Kemerer metrics. These statistics
are the minimum, maximum, average, and standard deviation, and are represented in Tables 5.2, 5.3,
and 5.4, for the NSX, manual, and Generjee implementations, respectively. We also constructed a set of Tukey box plots that compare the values of each system's versions side by side. The box plots allow us to visualize the distribution of the data, to detect outlier classes, and to see their evolution.
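As an illustration of how such plots flag outliers, Tukey's rule marks values beyond 1.5 interquartile ranges from the quartiles. A minimal sketch with a hypothetical WMC sample, not measurements from this study:

```python
import statistics

# Tukey's rule: values beyond Q1 - 1.5*IQR or Q3 + 1.5*IQR count as
# outliers, which is how the box-plot whisker bounds are drawn.
def tukey_outliers(values):
    q1, _, q3 = statistics.quantiles(values, n=4)  # quartiles
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < low or v > high]

# Hypothetical WMC sample with one monolithic class.
print(tukey_outliers([3, 5, 7, 8, 9, 10, 12, 14, 93]))  # [93]
```

The extreme value is flagged even though the mean and standard deviation of the sample are inflated by it, which is why box plots complement the descriptive statistics in the tables.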
Among the fluctuations in the values between the NSX versions, the WMC, RFC, and CBO maximum
values are largely above the acceptable maximum, and the addition of a new entity seems to spark a
significant increase. The NOC metric always has a value of zero, and the DIT metric also has very
low values, only ranging between one and two. This shows that the system barely uses inheritance.
Metric   BV                  ExBR                ExS                 ExNE                FV
         ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ
WMC      0   25   5.9  5.9   0   25   5.9  5.9   0   25   5.8  5.9   0   25   5.6  5.7   0   25   5.6  5.8
DIT      1   6    3.5  2.1   1   6    3.5  2.1   1   6    3.5  2.1   1   6    3.7  2.1   1   6    3.7  2.1
NOC      0   21   0.6  2.9   0   21   0.6  2.9   0   21   0.6  2.9   0   25   0.7  3.3   0   25   0.7  3.3
CBO      1   28   8.6  5.3   1   28   8.6  5.3   1   28   8.7  5.3   1   28   9.0  5.8   1   28   9.0  5.8
RFC      0   48   13.1 11.1  0   48   13.1 11.1  0   48   13.1 11.2  0   48   12.9 10.9  0   48   12.9 11.0
LCOM     0   6    2.0  1.4   0   6    2.0  1.4   0   6    2.0  1.4   0   6    2.0  1.4   0   6    2.0  1.4
Table 5.3: Descriptive statistics of the Chidamber & Kemerer metric suite measurements for the manualsystem versions. The descriptive statistics calculated are the ∧ minimum, ∨ maximum, x average, andσ standard deviation.
Metric   BV                  ExBR                ExS                 ExNE                FV
         ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ     ∧   ∨    x    σ
WMC      0   50   11.9 13.5  0   50   11.9 13.6  0   52   11.9 13.6  0   53   12.1 13.6  0   53   12.3 13.9
DIT      1   5    1.6  0.9   1   5    1.6  0.9   1   5    1.6  0.9   1   5    1.6  0.8   1   5    1.6  0.8
NOC      0   6    0.4  1.5   0   6    0.4  1.5   0   6    0.4  1.5   0   7    0.4  1.7   0   7    0.4  1.7
CBO      0   17   6.3  4.1   0   17   6.3  4.1   0   17   6.3  4.1   0   19   6.6  4.4   0   19   6.6  4.4
RFC      0   61   18.9 17.3  0   61   19.0 17.6  0   63   19.3 17.6  0   63   19.3 17.6  0   63   19.5 18.0
LCOM     0   7    2.4  1.9   0   7    2.4  1.9   0   7    2.4  1.9   0   7    2.3  1.9   0   7    2.3  1.9
Table 5.4: Descriptive statistics of the Chidamber & Kemerer metric suite measurements for the Gener-jee system versions. The descriptive statistics calculated are the ∧ minimum, ∨ maximum, x average,and σ standard deviation.
The DIT, NOC, and LCOM averages are in a good range, with tolerable WMC, RFC, and CBO averages. When the standard deviation is added to the average, however, the WMC, RFC, and CBO metrics fall into the bad range of values.
The value that changes the most between the manual versions is the NOC maximum value, in-
creasing by four, with the addition of a new entity. Another maximum that is also too large is the CBO
maximum, which is well above the recommended interval. The average values of the metrics are satisfactory, with only the DIT and CBO values lying closer to the maximum. If we consider the average plus standard deviation, the WMC metric remains in a good range, but DIT and CBO exceed the acceptable maximum.
In the Generjee implementation, some maximums also change, namely WMC and RFC, but the variations are not dramatic, never exceeding three. The only maximums well above the acceptable threshold are the RFC and CBO values. The average values of the metrics fall within the good range, with only WMC and CBO drawing somewhat nearer the maximum. Considering the average plus standard deviation, most metrics fall into the tolerable range, except the CBO metric, which falls into the bad range.
Figure 5.2 shows that the manual and Generjee implementations have a reasonable distribution of WMC values; however, the NSX implementation clearly stands out due to the very high values of its outliers. This becomes even more pronounced in the ExNE and FV versions. Even so, at least 75% of its classes are in the tolerable range. The manual version has the lowest range of values, and remains constant throughout all versions. Its outliers do not go beyond the bad threshold, and most of its classes have low values. The Generjee version also has a good distribution of values, with few outliers
Figure 5.2: Tukey box plots of WMC metric measurements. The plots compare the metric measurements(left axis) for each version of each implementation (horizontal axis), in relation to their quartiles andinterquartile ranges. The values outside the whisker bounds are outliers.
Figure 5.3: Tukey box plots of DIT metric measurements. The plots compare the metric measurements(left axis) for each version of each implementation (horizontal axis), in relation to their quartiles andinterquartile ranges. The values outside the whisker bounds are outliers.
and a majority of classes with tolerable WMC.
The DIT box plots in Figure 5.3 show that the manual system has the largest range of values. The distributions are fairly even across all the systems, and the Generjee system's distribution stays inside the recommended interval, as does NSX's.
In Figure 5.4, the plots show no boxes. This means that the vast majority of classes have NOC
values of zero. The ones that have measurements greater than zero are represented as outliers. The
NSX system has no outliers which means that no classes in the application have children. The Generjee
system only has one class with NOC greater than zero, while the manual implementation has three
outliers, with one of them having a value far beyond the acceptable maximum. The ExNE and FV versions sparked an increase in the Generjee outlier and in two of the manual implementation's outliers.
Figure 5.4: Tukey box plots of NOC metric measurements. The plots compare the metric measurements(left axis) for each version of each implementation (horizontal axis), in relation to their quartiles andinterquartile ranges. The values outside the whisker bounds are outliers.
Figure 5.5: Tukey box plots of CBO metric measurements. The plots compare the metric measurements(left axis) for each version of each implementation (horizontal axis), in relation to their quartiles andinterquartile ranges. The values outside the whisker bounds are outliers.
The CBO plots are displayed in Figure 5.5. The manual implementation has the highest range of values, and outliers with very high values. The Generjee boxes fall closely behind, but they have no outliers, and at least 75% of their classes are within the recommended interval. The NSX boxes have a very contained distribution of values, but a significant number of outliers with elevated values. The ExNE and FV versions cause more outliers and elevated values in all the systems.
Figure 5.6 shows the RFC distributions, where the NSX system stands out as the one with the highest distribution. The values of its outliers are extremely high, going over one hundred, and the situation becomes even more accentuated with the addition of the new entity. The manual system shows that the majority of its classes have low values, with outliers at almost tolerable values, while the Generjee range of values is also mostly tolerable.
Figure 5.6: Tukey box plots of RFC metric measurements. The plots compare the metric measurements(left axis) for each version of each implementation (horizontal axis), in relation to their quartiles andinterquartile ranges. The values outside the whisker bounds are outliers.
Figure 5.7: Tukey box plots of LCOM metric measurements. The plots compare the metric measure-ments (left axis) for each version of each implementation (horizontal axis), in relation to their quartilesand interquartile ranges. The values outside the whisker bounds are outliers.
Finally, the LCOM box plots can be seen in Figure 5.7. The plots of all versions remain constant, with the NSX system having the highest range of values, and the Generjee system having a distribution close to the manual one. There are almost no outliers, which means all systems have healthy values of LCOM.
5.3.2 Project Level Metrics Measurements
The values obtained for the MOOD metrics can be seen in Table 5.5. A graph with the measurement values for the final version of each system is also presented in Figure 5.8. We decided not to represent all versions graphically because, as can be seen in the tables, the differences between versions are not significant enough to be easily perceptible in the bars.
         NSX                              Manual                           Generjee
Metric   BV    ExBR  ExS   ExNE  FV      BV    ExBR  ExS   ExNE  FV      BV    ExBR  ExS   ExNE  FV
MHF      17.7  17.7  17.7  17.8  17.7    15.1  15.1  15.1  14.1  14.1    4.5   4.9   4.5   4.7   5.0
AHF      99.5  99.5  99.5  99.5  99.5    70.0  70.0  70.0  71.0  71.0    95.9  95.9  95.9  96.3  96.3
MIF      1.7   1.7   1.7   1.7   1.7     63.7  63.7  63.7  66.1  66.1    26.6  26.5  26.5  27.5  27.3
AIF      29.2  29.1  29.1  29.3  29.3    33.3  33.3  33.3  32.6  32.6    11.7  11.7  11.7  11.2  11.2
COF      4.3   4.3   4.3   3.4   3.4     14.5  14.5  14.6  13.7  13.8    20.3  20.3  20.3  19.4  19.4
POF      2.7   2.7   2.7   2.8   2.7     17.8  17.8  17.8  18.4  18.4    6.2   6.1   6.2   6.0   6.0
Table 5.5: MOOD metric suite measurements for each version of each system implementation, repre-sented as percentages.
Figure 5.8: Bar chart of the MOOD metric suite measurements. The chart compares the metric measurements in percentages (left axis) for the Final Version of each implementation (horizontal axis). Only the Final Version is represented, since the variations between individual versions are not significant enough to justify a chart for each version.
Regarding the MHF metric, we can see that its values are low for all the systems. Both the NSX
and the manual system fall between the recommended values, but the Generjee system is below the
minimum threshold.
The AHF metric obtained very high measurements for all systems. The NSX and Generjee imple-
mentations have almost 100% attribute hiding, which is the ideal value. The manual system however is
below the recommended range.
The NSX application obtained near zero percent values for the MIF metric, due to its lack of in-
heritance, while the other systems obtained much higher values, especially the manual implementation.
Nevertheless, the manual implementation still falls between the acceptable values, as does the Generjee
system.
The NSX system has a considerable percentage in the AIF metric. An explanation for this value is that the NSX classes declare few attributes, so the fields transferred from the superclasses in its dependencies have a significant impact on the number of fields. Nonetheless, all systems are below the recommended range of values.
The COF metric has the lowest value in the NSX implementation, which is close to zero, while the
highest value belongs to the Generjee implementation. Regarding the recommended range of values,
both the manual and Generjee systems are above the acceptable maximum, while the NSX system is
below the acceptable minimum.
Finally the POF metric presents values coherent with the measurements obtained for other inher-
itance based metrics. The highest value belongs to the manual implementation, while both the NSX
and Generjee measurements fall between the recommended bounds, with NSX being almost below the
minimum.
5.4 Results Analysis And Discussion
The data obtained through the measurements of the proposed metrics, presented in the previous section,
reveals some key findings about the implementations. These findings give a more objective understand-
ing of the characteristics and qualities of the developed systems, and give us enough evidence to draw
some conclusions about their evolvability.
5.4.1 NSX Implementation Key Findings
The data sheds light on some important aspects of the NSX generated code. The most evident and relevant findings are the following:
Existence of a set of monolithic classes. The WMC, RFC, and CBO metrics show that the system has a set of monolithic classes that display very high complexity and coupling. This decreases the evolvability of the system, and any manual change involving those classes will most likely require considerable effort.
Almost complete lack of inheritance use. The constant NOC value of zero indicates that there are no inheritance relations between the classes of the application. By not using this mechanism, the system forgoes one of the most important mechanisms of OO programming and does not reap its reusability benefits. However, this decision ensures that the system does not suffer from the consequences of deep, hard-to-understand class hierarchies. It also avoids change propagation through inheritance relationships, which greatly increases the stability of the system.
Low coupling values. The NSX system shows the lowest distribution of coupling values for the majority of its classes, meaning that most of its classes do not have many potentially crippling dependencies. Its distribution is also the tightest, so the coupling values do not vary much between most classes.
Good encapsulation of operations and attributes. This system has the highest MHF value while staying within the recommended interval, meaning the encapsulation of its operations is good and its interfaces are well defined. Furthermore, its AHF value is practically perfect, indicating exceptional encapsulation of its fields.
Overall, the NSX system has good values for all the sub-qualities of evolvability, and apart from the monolithic classes, it shows good levels of all the characteristics, especially encapsulation, cohesion, coupling, and modularity. However, it seems that in order to give most classes good qualities, and to avoid inheritance, it relies on a set of massive central classes, heavy on complexity, response set, and coupling, to hold the system together. Thus any evolvability operation that does not involve these classes should run smoothly without much impact. However, if these classes need to be heavily manipulated, the changes will be high-effort tasks, and all sorts of evolvability problems may occur.
5.4.2 Manual Implementation Key Findings
The acquired results show that the manual implementation contrasts strongly with the NSX one. The main conclusions obtained from the presented data are as follows:
Low values on complexity metrics. The system obtained by far the lowest WMC and RFC values, much lower than NSX. Not only are its values the lowest, its distributions are also the tightest, showing that this applies to the majority of its classes.
Abundant use of inheritance. All the metrics related to inheritance have their highest values in this system, meaning it clearly relies on inheritance as one of its main mechanisms to increase reusability. However, the data suggests that in some cases it goes too far, as can be seen from the extremely high NOC values of some classes, the high distribution of DIT, and a POF value above the recommended interval. This means that inheritance is, in some instances, overused and misused, raising complexity and coupling.
High coupling values. The manual implementation has the highest coupling values of all the systems.
Its CBO box plots show that quite a few classes have values of CBO above the bad threshold. Its
COF value is also above the recommended maximum. Therefore, the system seems to suffer from
coupling problems, which can seriously compromise its evolvability.
Insufficient encapsulation of fields. The system seems to have good encapsulation regarding methods, but the same cannot be said for attributes. It has the lowest AHF value of all the systems by a significant margin, showing that the definition of the fields of its classes should be much more carefully planned, as poor encapsulation can take its toll on modifiability.
Overall, the system's main concern is reusability. Nonetheless, problems with coupling and encapsulation are evident, and its extensive use of inheritance might be increasing its complexity, even though it shows low values on complexity-based metrics.
5.4.3 Generjee Implementation Key Findings
The Generjee implementation analysis data shows an interesting set of values, mainly when compared
with the values from the other implementations. The main findings obtained from that data are the
following:
Good balance of inheritance use. There seems to be only one class with a high NOC value, and all inheritance metrics show healthy values. Of all the systems, it is the one that strikes the best balance between the advantages provided by inheritance and its disadvantages.
Most metrics in recommended ranges. Most of this system's values fall within a tolerable range. It does have its share of outliers, but none of them are as extreme as those of the other systems. Its worst outliers appear in the CBO measurements, although with values not as high as the other systems'.
Middle ground between NSX and manual systems. The Generjee implementation shows values that lie between the other two implementations, even though its values are closer to those of the manual implementation. This means that it tries to achieve equilibrium between qualities, not expressly favoring some characteristics over others.
In general, the Generjee system has a good balance of qualities, with most distributions containing tolerable values, and not many outliers or high maximums. This is also reflected in its use of inheritance, which the metrics show is applied carefully.
5.4.4 Evolvability Impact Of Extensions
Unfortunately, we cannot draw many significant conclusions about the impact of the implemented extensions on the systems. In most cases, the measurement variations were nonexistent, or did not produce enough alterations to support strong conclusions. The only extension that caused more impact was the ExNE extension. The final version of each system showed values identical to the combined effect of the three extensions, meaning that the extensions do not affect each other in any of the systems.
Some interesting remarks can be made about the ExNE extension. The metrics that saw the most differences in their values were WMC, NOC, CBO, and RFC. In all these metrics except NOC, we could see an increase in the number of outlier classes in the Tukey box plots. We can also see that the values of existing outliers increase. These two facts are more pronounced in the NSX and manual implementations, with NSX seeing more impact in the WMC and RFC metrics, and the manual implementation in the NOC and CBO metrics.
The extension shows that a new entity creates more monolithic classes and increases the complexity
of the existing ones in the NSX system. It also shows that a new entity in the manual system creates
even more coupling and expands its inheritance use.
5.4.5 Results Comparisons And Conclusions
The results obtained allow us to infer the characteristics of the general system, and compare them, as
summarized in Table 5.6. Using these characteristics and their relationships to the evolvability qualities,
we can compare the values of the various qualities for each system using Table 5.7.
This evaluation allows us to conclude that NSX generated code does indeed have good evolvability. It also shows that, as stated by the NS Theory, stability is a core evolvability sub-quality for NS, since
[Table 5.6 rates each implementation (NSX, Manual, Generjee) on the six evolvability characteristics: Complexity, Cohesion, Modularity, Reusability, Coupling, and Encapsulation. The per-cell ratings are not recoverable from this extraction; see the caption for the rating legend.]
Table 5.6: Comparison between the characteristics of evolvability of each system implementation. ● Good values. ◐ Tolerable values. ○ Dangerous values.
[Table 5.7 rates each implementation (NSX, Manual, Generjee) on the five sub-qualities of evolvability: Analyzability, Testability, Portability, Changeability, and Stability. The per-cell ratings are not recoverable from this extraction; see the caption for the rating legend.]
Table 5.7: Comparison between the sub-qualities of evolvability of each system implementation, derived from the characteristics estimation. ● High. ◐ Tolerable. ○ Low.
they clearly try to keep coupling to a minimum in the majority of the system, and put great effort into the encapsulation of their classes, so as to have clearly defined and stable interfaces. They even go so far as to avoid inheritance, one of the main mechanisms of OO programming, almost completely, in order to prevent combinatorial effects through inheritance. However, they are not perfect, and there is clear evidence that they rely on a handful of large, centralized classes with high complexity and coupling. Therefore, evolutions that involve these classes have to be taken very seriously, as they can compromise a significant part of the system.
Regarding the other systems, the manual implementation contrasts strongly with NSX, as it focuses on other methods to achieve evolvability. Essentially, the manual version suffers mostly from problems related to encapsulation and coupling, and shows less stability than the NSX version. However, it does not rely on monolithic classes, and is more concerned with reusability. Still, we can conclude that the manual implementation is not as evolvable as the NSX implementation.
Generjee follows an approach more akin to the manual one, but is more careful, mainly regarding inheritance and coupling relationships. It achieves satisfactory values in most characteristics, and tries to balance the various qualities. This implementation has good evolvability, but considering the majority of the system, the NSX implementation is still more evolvable than the Generjee one. However, if the intent is to make many broad, heavy, or detailed customizations to the generated system, Generjee may take the upper hand, since it does not show signs of very large and complex classes, where mistakes can have serious consequences, and its proximity to the manual
method might make it more familiar to developers.
Chapter 6
Conclusions
Evolvability is a software quality that plays a major role in a system's success, but current approaches to achieving it are still far from ideal. NS Theory and its NSX tools were proposed, arguing that the solution lay in four well-defined theorems and in knowledge reuse through code generation. Although promising, NS lack a proper evaluation of their capabilities and applicability. With a concrete and comparative validation of the evolvability of applications generated with NSX, both developers and organizations could make more informed decisions about the practices they employ to produce long-lasting information systems. With this in mind, we set out to properly define evolvability, and to devise a case study and evaluation methodology that could effectively measure the evolvability of information systems built using NS, and to compare that evolvability with that resulting from other approaches, namely a traditional approach and an MDE approach using modern tools.
As our literature review revealed, little work has been done in this field, and the current research scene lacks studies on evolvability and its assessment. Moreover, no studies proving the capability of NSX to produce evolvable systems could be found, nor were there studies comparing MDE tools with respect to this quality. Furthermore, there is no systematic and complete way to assess the evolvability of a system, regardless of its implementation approach. To address this problem, we defined a case study using a sample, controlled system adequate for this task. We clearly defined its domain, requirements, and constraints, taking into account the inner workings and purpose of the NSX tools, in order to maximize its validity. As a complement to the base system, a set of extensions was devised to observe their impact on the systems' evolvability. The selection of an adequate MDE tool to use as one of the approaches was also taken seriously, by carefully reviewing existing tools according to a robust set of criteria, motivated by (i) accessibility, (ii) relevance, (iii) technology, (iv) functionality, and (v) domain, in order to achieve a fair comparison. The implementations of the system were carefully carried out, adhering to best practices, conventions, and development guides so as to attain good evolvability.
To evaluate the implementations, we employed static code analysis, defining a robust set of code metrics based on existing literature, and identifying the specifics of their measurement, the characteristics they measure, and the values recommended by researchers. We then applied that methodology, obtaining a set of results indicating that NSX are indeed capable of generating evolvable
61
applications, with focus on stability. By comparing the results, we also perceived that the traditional
approach focuses on different characteristics to achieve evolvability, especially reusability, and employs
strong use of inheritance mechanisms. The utilized MDE tool, Generjee, was revealed to also produce
satisfiable levels of evolvability, residing between the other two approaches, achieving a solid balance
of characteristics. Nonetheless, NSX were revealed to not be perfect, with the results revealing the
existence of a handful of monolithic classes, heavy on complexity and coupling. A surprising, almost
complete, lack of inheritance use was also perceivable from our evaluation. We can conclude that
although the majority of the NSX generated system displays high levels of evolvability, mainly due to its
stability, customizations that can affect the monolithic classes can turn out to be significantly difficult and
dangerous, possibly hampering the stability on which NS are founded.
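The actual measurements in this work were taken over the Java implementations with dedicated analysis tooling. Purely as an illustration of the kind of class-level static analysis involved, the following Python sketch computes two classic object-oriented metrics, number of methods (NOM) and weighted methods per class (WMC, here approximated by a crude cyclomatic-complexity count), for a toy `Account` class that stands in for real production code; the class and the thresholds implied are illustrative assumptions, not the metric suite used in this study.

```python
import ast

# Toy stand-in for a class under analysis (not from the case study).
SOURCE = """
class Account:
    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        self.balance += amount

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
"""

def class_metrics(source):
    """Return {class_name: {"NOM": ..., "WMC": ...}}.

    WMC is approximated as the sum of per-method cyclomatic
    complexities, each counted as 1 + number of branching nodes.
    """
    tree = ast.parse(source)
    metrics = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            nom, wmc = 0, 0
            for item in node.body:
                if isinstance(item, ast.FunctionDef):
                    nom += 1
                    cc = 1 + sum(
                        isinstance(n, (ast.If, ast.For, ast.While,
                                       ast.BoolOp, ast.ExceptHandler))
                        for n in ast.walk(item))
                    wmc += cc
            metrics[node.name] = {"NOM": nom, "WMC": wmc}
    return metrics

print(class_metrics(SOURCE))  # → {'Account': {'NOM': 2, 'WMC': 4}}
```

A full methodology would compute such values for every class, compare them against recommended thresholds from the literature, and aggregate them at project level, which is the overall shape of the evaluation described above.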
6.1 Impact
Our literature review of the identified approaches to evolvability revealed that no satisfactory evaluation
of these approaches exists. In fact, there are few studies of their practical effectiveness regarding
evolvability, and the methods employed are often immature or yield incomplete results. Furthermore,
there was no quantitative evidence that NSX actually produce evolvable code, with only three documented
applications. This work provides a complete definition of evolvability, and an evaluation method that
takes that definition into account. It is therefore the first recorded work that quantitatively evaluates
the evolvability of an application produced with NSX. Given the scarcity of studies on NSX applicability,
it also helps disseminate NSX as a viable development tool, having shown that they are capable of
producing evolvable code.
NS are not the only approach evaluated: a traditional approach and an MDE approach were also
assessed. We thus also contribute to the software community by providing proper evaluations of the
evolvability of these approaches, especially regarding MDE since, as identified, very few studies on MDE
share goals similar to ours. In this respect, our work is also unique in that it is the first concerning the
Generjee tool, and the only one we know of that compares two different MDE tools.
Another important contribution of our work is the definition of a case study and evaluation method
capable of evaluating evolvability using a robust set of metrics that assess code at both class and
project level, associating each metric with software characteristics and recommended values. The
guidelines of this work can help other researchers evaluate their own approaches and tools.
As a final note, this study can help developers and organizations decide on the adoption of NS
or other approaches, based on the facts presented. We thus contribute to a more knowledgeable and
informed software community.
6.2 Future Work
Although we managed to achieve the goals we set out to accomplish, there are some enhancements
and branches of research that spawn from our work, namely:
• Analyze NS and their tools from other perspectives. Even though evolvability is the focus of NS
and NSX, for them to be a fully feasible solution for the software market they need to prove
themselves on other fronts. Case studies could be defined to analyze the applications produced
by NS with regard to other non-functional qualities such as performance, security, reliability,
or scalability, to name a few.
• Expand the evaluation methodology. Our evaluation methodology, although complete, could
incorporate additional metric suites or apply evaluation methods other than those we used.
This could allow for even stronger confirmation of the results, or provide insights into other
characteristics or abstraction levels, depending on the specific goals of future research.
• Adapt the case study to a real-life setting. This work used a sample system in order to reduce
the variables and complexity that could bias the results. However, it would be interesting to apply
the case study to a real, complex, production software system, which could exercise the approaches
in ways a sample system cannot. This could include involving more developers in the
experiment, or using a larger system. Another set of extensions could also be used, ones that
exercise even more aspects of the system or provide longer-term insight into its continuous
evolution.
• Compare NSX with more tools and paradigms. This work focuses on evaluating NSX against
tools and paradigms that share the same basic principles. Nonetheless, it would be interesting
to compare NSX with a wider range of tools offering a diverse set of characteristics, or
even against programming paradigms other than OO programming, such as aspect-oriented
programming or functional programming.
Appendix A
NSX Element Descriptors
Here we present the data element descriptor files necessary to generate the application. We do not
show the Account descriptor file as it is already presented in Section 4.6.1, Figure 4.6.
Figure A.1: NSX data element descriptor file that defines the settings necessary for the generation of the Loan data element. It contains the names, fields, relationships, and options of the element.
Figure A.2: NSX data element descriptor file that defines the settings necessary for the generation of the Customer data element. It contains the names, fields, relationships, and options of the element.
Figure A.3: NSX data element descriptor file that defines the settings necessary for the generation of the Branch data element. It contains the names, fields, and relationships of the element.
Figure A.4: NSX data element descriptor file that defines the settings necessary for the generation of the additional Division entity's data element, belonging to the ExNE extension. It contains the names, fields, and relationships of the element.