7/31/2019 Eight Steps to Measure ADM Vendor Deliverables
White Paper
8 Steps to Measure ADM Vendor Deliverables
Ensure Structural Quality with Software Analysis & Measurement
As enterprise IT departments increasingly move towards multi-sourcing environments, it is more important than ever to measure ADM deliverables: not only to manage risk by ensuring the overall structural quality of systems, but also to objectively evaluate vendors and make smarter sourcing decisions. This paper describes the eight steps for integrating Software Analysis & Measurement (SAM) into your outsourcing relationship lifecycle, from RFP preparation to contract development, team transition and benchmarking, to objectively evaluate the reliability, security, efficiency, maintainability, and size of software deliverables. This measurement can greatly improve the maturity of your outsourcing relationships to enhance performance and reduce risk.
I. Introduction
To meet high demands from the business, systems are becoming increasingly complex and the frequency of change is growing exponentially. As a result, tradeoffs are made when it comes to application structural quality, and the inherent risk built into these systems accumulates. And since software is at the core of virtually every business, any breakdown in mission-critical applications can potentially result in hundreds of millions of dollars in losses, not to mention the hit to the company's reputation, goodwill, and credibility with customers and investors. A review of the most recent high-profile software failures indicates that the root cause of a majority of these failures was poor quality of code.
These pressures are further exacerbated by the growing complexity of outsourcing, which is not just about cost savings anymore. Outsourcing partners can bring increased flexibility and expertise on demand, and by building strategic relationships you can respond to the business faster. However, outsourcing also means less technical expertise in-house, and a loss of control over the quality of the code being developed and the resources developing it. This is especially acute in an offshore outsourcing scenario, where lower experience levels combined with high attrition rates can have a compounding effect on the inherent risk that accumulates in systems. If left unchecked, applications can become ticking time bombs.
Most vendor management organizations are becoming more mature and sophisticated in managing outsourcing engagements, and they are looking for guidance on measuring vendors in an objective way. Despite many resources detailing how to structure outsourcing SLAs and the related metrics, there is a dearth of information on how to assess and measure the deliverables agreed upon in the SLAs. This paper offers eight ways that Software Analysis & Measurement (SAM) can help to mitigate and manage risk in outsourced applications by measuring the structural quality of vendor deliverables.
II. Transform Structural Quality Review from an Art to a Science
Source code review comes in two forms: manual and automated analysis. Manual source code review is labor-intensive and subjective, and requires highly skilled software experts. Moreover, it is not possible for a single individual to have the kind of expertise needed to review an application across multiple technologies.

Measuring the structural quality of software applications is evolving from an art to a science with the availability of solutions that automate the process of code analysis. Automated analysis provides an objective, in-depth review of the entire codebase, including source code, scripting and interface languages across all layers of an application, against hundreds of best practices, in a fraction of the time it would take for manual analysis.
Contents
I. Introduction
II. Transform Structural Quality Review from an Art to a Science
III. Leveraging SAM in Outsourcing - 8 Steps
IV. Identify the ideal operating scenario
V. Select a SAM solution that meets business needs
VI. Conclusion
SAM focuses on the structural quality of the entire application, rather than the individual components that are typically evaluated by unit tests and code analyzers, evaluating how its architecture adheres to sound principles of software engineering.

The Consortium for IT Software Quality (CISQ) has defined the four major structural quality characteristics and a size measure [1] that are needed to evaluate the overall health of an application, and consequently its business value: Reliability, Efficiency, Security, Maintainability, and (adequate) Size. These characteristics are the primary pillars of evaluation in SAM and can be computed through a qualitative or quantitative scoring scheme, or a mix of both, and then a weighting system reflecting the priorities of a business.
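As an illustration, such a weighted combination of characteristic scores can be sketched as follows; the scores, the 1-4 scale, and the weights are hypothetical examples, not CISQ-defined values:

```python
# Illustrative weighted scoring across the CISQ quality characteristics.
# Scores (1-4 scale) and business-priority weights are invented examples.

def weighted_health_score(scores, weights):
    """Combine per-characteristic scores into one application health score."""
    assert set(scores) == set(weights), "every characteristic needs a weight"
    total_weight = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total_weight

scores = {"reliability": 3.2, "efficiency": 2.8,
          "security": 3.5, "maintainability": 2.6}
weights = {"reliability": 3, "efficiency": 2,
           "security": 4, "maintainability": 1}  # business priorities

print(round(weighted_health_score(scores, weights), 2))  # -> 3.18
```

A business that prizes security would raise that weight, pulling the composite score toward the security result, which is exactly the weighting flexibility described above.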
III. Leveraging SAM in Outsourcing - 8 Steps
SAM is becoming increasingly prevalent in the industry since it not only sheds light on the risks in software deliverables, but can also be used to greatly improve the maturity of an organization throughout the lifecycle of an outsourcing relationship. In this section we discuss in detail how a SAM solution can add value in each of the eight important steps of an outsourcing engagement, as noted in Figure 1.
Table 1 - Software quality characteristics defined by CISQ [1]
1. Prepare data prior to outsourcing

Before transferring applications to an outsourcing partner, it is good practice to perform SAM to:

- Ensure the availability of objective information that offers a realistic picture of the true quality of the applications, by determining baseline quality and size.
- Make decisions on the best applications for outsourcing, or shortlist the best outsourcing partners based on the risk indicators.

Often in outsourcing relationships, clients indicate that they are unhappy with the quality of the code being delivered, without realizing that the application was of poor quality to begin with. You may also want to avoid outsourcing an application with poor quality and inherently high risk, as outsourcing might increase that risk further. On the contrary, you might want to bring in an outsourcing partner precisely to address specific issues in the application.
2. Include application intelligence in RFPs

It is highly recommended that you include SAM outputs in RFP documentation during the bidding process with potential vendors. This can provide bidders accurate and objective information about:
Figure 1 - Leveraging SAM throughout the outsourcing lifecycle
- Technical size (lines of code, number of files, number of tables, etc.)
- Functional size (Function Points)
- Technology distribution (% of code that is Java, JSP, XML, SQL, .NET, COBOL, etc.)
- Complexity (cyclomatic complexity, fan-in, fan-out, etc.)
- Structural quality metrics (reliability, efficiency, security, maintainability)
- Architectural blueprint with dependencies between various modules
With this level of application intelligence, vendors will be able not only to provide more accurate bids, but also to evaluate the project critically within the context of their capabilities and resource availability. Whenever there is reluctance to share too much dirty laundry with a vendor during pursuit, it's worth pointing out that the problems being handed over are not going away. The only impact a lack of transparency has on the process is to force the vendor to put more risk padding into the proposal.
3. Get feedback on quality during vendor evaluation

As part of the evaluation process, vendors should be asked to provide an assessment of the applications based on the SAM outputs provided. This not only ensures their understanding of the scope of the work and its technical aspects, but also of the structural quality of the applications. In addition, part of the technical requirement should be to improve the overall structural quality of the existing code they adopt. If appropriate, they should provide a detailed plan and roadmap to improve the quality of the applications in future releases.
4. Reference SAM metrics in initial contract setup (SLAs and acceptance criteria)

While it might seem obvious to hold outsourced teams accountable for the intrinsic quality of the product itself, acceptance criteria tied to structural quality have only recently started to show up in SLAs; only in the last few years has there been an effective way to measure structural quality comprehensively. One of the most important benefits of using SAM in an outsourcing context is to leverage it in contract language and make it part of a Service Level Agreement or Acceptance Criteria.

There are primarily three categories of outputs, representing a combination of higher-level and lower-level structural quality metrics of software, that can be incorporated into SLAs to achieve a specific business need or objective: Quality Indices, Application-Specific Rules, and Productivity.
Quality Indices

The quality indices described in section II (Reliability, Efficiency, Security, and Maintainability) can be used to set high-level goals for overall application health. Ideally, applications should be analyzed for a minimum of two to three releases, and the average scores used as a baseline for each of these health factors. You can then set targets to monitor the overall health of the application over time.
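A minimal sketch of this baseline-and-target approach, with invented release scores and a hypothetical 5% improvement target:

```python
# Hypothetical sketch: derive a per-health-factor baseline from the scores
# of the last two to three releases, then flag a new release that misses
# its target. All figures are invented for illustration.

def baseline(release_scores):
    """Average score over prior releases per health factor."""
    return {f: sum(s) / len(s) for f, s in release_scores.items()}

history = {  # scores from the last three analyzed releases
    "reliability": [3.1, 3.0, 3.2],
    "security": [3.4, 3.5, 3.6],
}
base = baseline(history)
targets = {f: b * 1.05 for f, b in base.items()}  # e.g. ask for 5% improvement

new_release = {"reliability": 3.3, "security": 3.4}
breaches = [f for f, t in targets.items() if new_release[f] < t]
print(breaches)  # -> ['security']
```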
Application-Specific Rules

The quality indices provide a macro picture of the structural quality of an application. However, there are also often specific code patterns (rules) that you want to avoid. For example, if the application is already suffering from performance issues, you want to make sure to avoid any code that would further degrade performance. These specific rules should be incorporated into SLAs as Critical Rules with zero tolerance.
Productivity

SAM solutions provide the size of the code base that is added in a given release. Along with KLOC (kilo lines of code), some advanced solutions like the CAST Application Intelligence Platform (AIP) provide data on the number of Function Points that have been modified, added, and deleted in a release. This information can be combined with the development hours spent for a given release to generate productivity measures like KLOC/man-hour or Function Points/man-hour. This is a very relevant metric to track, especially in a multi-vendor scenario, so you can compare charges from different service providers and monitor productivity for each vendor.
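The productivity measures above can be sketched as follows; the vendor names and release figures are invented for illustration:

```python
# Sketch of the KLOC/man-hour and Function Points/man-hour measures
# described above. Release figures are invented for illustration.

def productivity(kloc_delivered, function_points, man_hours):
    """Return (KLOC per man-hour, Function Points per man-hour)."""
    return kloc_delivered / man_hours, function_points / man_hours

releases = {
    "vendor_a_r1": (12.0, 240, 1600),  # KLOC added, FP added/modified, hours
    "vendor_b_r1": (15.0, 210, 1750),
}
for name, (kloc, fp, hours) in releases.items():
    k_rate, fp_rate = productivity(kloc, fp, hours)
    print(f"{name}: {k_rate:.4f} KLOC/h, {fp_rate:.3f} FP/h")
```

With these invented numbers vendor A delivers fewer KLOC per hour but more Function Points per hour, which illustrates why the two measures should be read together rather than in isolation.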
However, care should be taken to put productivity metrics into context, since the amount of time spent on a given release can't be fully derived from the actual source code delivered alone; consider, for example, the configuration tasks related to a software package. It can also take time to understand the existing code, which can be quite different from one technology to another, from one architecture to another, or from one team to another. In addition, more often than not user requirements are rarely finalized and keep changing throughout the project, making timelines less productive. Moreover, quite often service providers have their own proprietary packages or components that software analysis solutions are not able to access and that are therefore not reflected in quantity-related outputs.

Automated Function Points are an objective, critical input that forms part of the overall productivity story. This type of productivity information can be very useful when monitoring an outsourcer, and when combined with quality outputs and other indicators,
such as the amount of hours spent, can provide insights into why a specific release took more man-hours per Function Point than other releases, and help identify sources of productivity improvement.
SLAs vs. Acceptance Criteria: It is important to determine when to use this type of data in SLAs and when to use it as acceptance criteria. The recommended best practice is to use SAM data as part of the acceptance criteria (shown in Table 2) before accepting vendor deliverables for system testing or user acceptance testing (UAT). If a deliverable does not meet the predefined criteria, it should not be accepted, to avoid wasting the time of the testing teams or users. On the other hand, data gathered from analyzing the application before it is put into production can be used to check against SLA performance.
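An illustrative acceptance gate in the spirit of this practice; the metric names, thresholds, and critical-rule count below are hypothetical, not values from Table 2:

```python
# Illustrative acceptance gate: a deliverable is rejected before system
# testing/UAT if any predefined criterion fails. Thresholds, metric names,
# and the critical-violation count are hypothetical examples.

def accept_deliverable(metrics, thresholds, critical_violations):
    """Return (accepted, reasons) for a vendor deliverable."""
    reasons = [f"{m} {metrics[m]} < {t}" for m, t in thresholds.items()
               if metrics[m] < t]
    if critical_violations:  # zero-tolerance Critical Rules
        reasons.append(f"{critical_violations} critical-rule violations")
    return (not reasons, reasons)

metrics = {"reliability": 3.1, "security": 2.7, "maintainability": 3.0}
thresholds = {"reliability": 3.0, "security": 3.0}
ok, why = accept_deliverable(metrics, thresholds, critical_violations=2)
print(ok, why)  # -> False ['security 2.7 < 3.0', '2 critical-rule violations']
```

Gating this way keeps substandard deliverables out of the testing teams' queue, as recommended above.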
Setting Targets for SLAs: When setting targets in SLAs for the structural quality of software, it is recommended to collect baseline data first. As previously mentioned, ideally this should be based on the average over two to three prior releases. In the case of greenfield projects, since there will be no source code to analyze and create a baseline from, it is recommended to use industry benchmark data and set the targets to match the scores in the top quartile for that specific technology. For example, the CAST benchmarking database has data from more than two thousand applications collected from different industries across all technologies.
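The top-quartile target setting can be sketched as follows; the peer scores are invented, not Appmarq data:

```python
# Sketch: set a greenfield SLA target at the top quartile (75th percentile)
# of industry benchmark scores for a technology. Peer scores are invented.

def top_quartile_target(benchmark_scores):
    """75th-percentile score of peer applications (linear interpolation)."""
    s = sorted(benchmark_scores)
    rank = 0.75 * (len(s) - 1)
    lo, hi = int(rank), min(int(rank) + 1, len(s) - 1)
    return s[lo] + (rank - lo) * (s[hi] - s[lo])

peer_scores = [2.4, 2.8, 3.0, 3.1, 3.3, 3.6]  # e.g. peer .NET applications
print(round(top_quartile_target(peer_scores), 2))  # -> 3.25
```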
For a more thorough discussion of this topic, please see the white paper CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs [2].
5. Use documentation created during analysis to ease transition

Transitioning code to a vendor team is one of the most difficult parts of an outsourcing engagement, since documentation is typically out of date and the original team that wrote the application may not be available for knowledge transfer. Ask your vendor how they facilitate their team's transition to and understanding of the system, identify and monitor the structural hotspots, and perform impact analysis on system changes and movement.

Software analysis, especially system-level analysis solutions, can facilitate the transition process. As the code is analyzed, the analyzers reverse-engineer the code and create a comprehensive blueprint of the entire application, producing documentation that is current and accurate. This information will not only greatly reduce transition time, but also help the vendor teams to reduce code duplication and understand system dependencies so they can test thoroughly as new additions are made.
6. Evaluate vendor performance with objective measures

Quality performance evaluation is a sensitive subject, so it's important to have an agreed-upon, independent assessment of quality that is repeatable. SAM outputs should be an important part of vendor performance scorecards. Clients can evaluate the performance of vendors against SLAs and provide specific guidance with actionable lists (shown in Table 3).
7. Incorporate benchmarking into evaluation

Benchmarking of SAM metrics can be very useful to identify opportunities for improvement, and can be done among a group of applications within the organization or against similar technology applications from industry peers.
Internal Benchmarking: Automated software analysis provides an objective and consistent measurement that can be used to benchmark different teams within the same vendor, or to benchmark the performance of different vendors in a multi-sourcing scenario.
Table 2 - Sample acceptance criteria for structural quality
Figure 2 - Sample improvement quality control chart with new targets
Table 3 - Sample performance review process
Benchmarking allows you to have a meaningful, fact-based dialogue with vendors on opportunities for improvement and to measure their progress.
External Benchmarking: In addition to internal benchmarking, companies can benchmark the structural quality of their applications against their industry peers. For example, CAST publishes benchmarking information for applications across a wide range of industries and technologies through its Appmarq database. An example of this benchmarking data is shown in Figure 3.
Leader Boards: In addition to formal benchmarking, companies can use SAM data to publish leader boards highlighting the applications with the highest structural quality or the teams that are most productive. Leader boards can be very effective in motivating teams to improve their performance.
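A minimal leader-board sketch along these lines; the application names and scores are made up:

```python
# Minimal leader-board sketch: rank applications (or teams) best-first by
# structural quality score. Names and scores are invented for illustration.

def leader_board(app_scores):
    """Applications sorted best-first by health score."""
    return sorted(app_scores.items(), key=lambda kv: kv[1], reverse=True)

apps = {"billing": 3.4, "claims": 2.9, "portal": 3.1}
for rank, (app, score) in enumerate(leader_board(apps), start=1):
    print(f"{rank}. {app}: {score}")
```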
8. Strive for continuous improvement

As well as ensuring the quality of deliverables from vendors, SAM metrics can be used to improve quality on a continuous basis. Most applications have been in existence for several years prior to being outsourced, so there may be some code of poor quality, a lot of copy-pasted code, and several security vulnerabilities. Usually vendors are not responsible for all the issues they have inherited, unless they have been hired exclusively to remedy them. However, companies can use SAM to ensure that, in addition to the new code being risk-free and of higher quality, the existing code base is improved by setting improvement targets that can be revised on an annual basis.
Figure 3 - Sample benchmarking chart for a .NET application in the financial services industry
Table 4 - Different operating scenarios for a SAM solution
IV. Identify the ideal operating scenario

There are many ways to incorporate a SAM solution into your organization for monitoring vendor deliverables. Table 4 provides various scenarios.

Figure 4 illustrates a typical integration of a SAM solution into the SDLC of a software development organization, as an example for scenarios 1 and 2 outlined in Table 4.
Figure 4 - A sample integration of a SAM solution into the SDLC and IT executive dashboards
V. Select a SAM solution that meets business needs

It is important to understand that there are two broad categories of solutions to measure software structural quality, and that SAM solutions offer a variety of capabilities, ranging from developer-centric tools to enterprise-wide solutions. The first category of solutions measures the code quality of individual components; these tools are language-specific and narrowly focused. The second category of solutions measures application quality: in addition to analyzing the code at the component level, and importantly, they analyze how these components interact with one another across multiple layers (UI, logic and data) and multiple technologies. The exact same piece of code can be safe and of excellent quality, or highly dangerous, depending on its interaction with other components. Mission-critical applications must be analyzed in the context of the numerous interconnections among code components, databases, middleware, frameworks, and APIs. This results in a holistic analysis of the structural quality of an application. Figure 5 summarizes the different types of solutions and their uses.
In the context of measuring outsourcing vendors, it is recommended to use a comprehensive system-level analysis solution that has the following key attributes:

- Proactive indication and scoring of risk
- Historical baselining and trending
- Consistent measures across teams working on diverse technologies
- Standards-based measures that can be benchmarked against industry peers
- Objective, unbiased KPIs
Figure 5 - High-level comparison of different types of software analysis solutions
VI. Conclusion

Increasing demand for complex IT projects and the constantly evolving technology landscape mean that outsourcing is no longer merely an option; it has become a requirement for any large organization. However, many organizations, in spite of implementing best practices, are struggling to achieve a mutually beneficial relationship with their outsourcing partners. Measuring vendors through automated analysis not only minimizes risks in applications, but also greatly increases the maturity of these outsourcing relationships by properly aligning measurement with the overall business and IT organization objectives.
About CAST

CAST is a pioneer and world leader in Software Analysis and Measurement, with unique technology resulting from more than $100 million in R&D investment. CAST introduces fact-based transparency into application development and sourcing to transform it into a management discipline. More than 250 companies across all industry sectors and geographies rely on CAST to prevent business disruption while reducing hard IT costs. CAST is an integral part of software delivery and maintenance at the world's leading IT service providers, such as IBM and Capgemini.

Founded in 1990, CAST is listed on NYSE-Euronext (Euronext: CAS) and serves IT-intensive enterprises worldwide with a network of offices in North America, Europe and India. For more information, visit www.castsoftware.com.
References

1. CISQ Specifications for Automated Quality Characteristic Measures. http://it-cisq.org/, CISQ Technical Work Group. (2012)
2. CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs. http://it-cisq.org/, CISQ. (2012)
3. How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations. http://www.omg.org/marketing/CISQ_compliant_IT_Systemsv.4-3.pdf, Dr. Richard Mark Soley. (2012)
4. The Importance of Application Quality and Its Difference from Code Quality. http://www.castsoftware.com/resources/document/whitepapers/the-importance-of-application-quality-and-its-difference-from-code-quality, CAST Software. (2011)
5. Sample Acceptance Criteria with Structural Quality Metrics. http://www.castsoftware.com/sample-sla
www.castsoftware.com
Europe: 3 rue Marcel Allégot, 92190 Meudon, France. Phone: +33 1 46 90 21 00
Questions?
Email us at [email protected]