
Risk and AFP Measurement in a digital transformation program


Page 1: Risk and AFP Measurement in a digital transformation program

https://isma12it.wordpress.com/

Risk and AFP Measurement in a digital transformation program, Allianz Italia use case

12th IFPUG International Software Measurement & Analysis (ISMA12) Conference

Rome (Italy) – May 3-5, 2016

Piergiacomo Ferrari

Information Security, Quality & Business Continuity

IT Quality@Allianz Italia

Page 2: Risk and AFP Measurement in a digital transformation program


Approach to IT Quality

Goals of the presentation

G1. Describe a specific use case on IT Quality

G2. Identify key points and enablers

G3. Share lessons learned

Page 3: Risk and AFP Measurement in a digital transformation program


Agenda

• IT Strategy – Industrialization and Agility

• Quality measurements, metrics and software selection

• Project Execution and run

• Suggestions for success in App Dev Management measurement

• Next Steps

• Annexes

Page 4: Risk and AFP Measurement in a digital transformation program


Capabilities and Target Coverage

Capabilities and their target coverage within the Industrialization & Agility Program:

• Estimate Methodology: automatic calculation for all applications managed inside the ALM tool
• Application Lifecycle Management: full coverage; PPM integration
• Project Portfolio Management: full PPM coverage; integration with Application Portfolio Management
• Test Factory: full integration with ALM and full coverage of development areas
• Quality Assurance: quality assurance for .NET, Java and Cobol applications
• Sourcing: target vendor consolidation
• Demand Management Process: capacity fit/gap analysis

Allianz Italy has embarked on a program of Industrialization and Agility of SW management processes involving various initiatives; Quality Assurance is one of them.

Page 5: Risk and AFP Measurement in a digital transformation program


Industrialization & Agility - Road Map

Road map (2012 Q3 through 2015 Q4), organized by workstream (Solution Design; Program & Project Management; Test Factory; Quality Assurance), with the following initiatives on the timeline:

• Fast Function Point
• Project management tools and methods standardization for the SW life-cycle
• Integration of architecture and demand
• System and application repository
• Project portfolio management
• Start-up of the Program Management Office
• E2E integrated development and test pilot
• Test environment re-engineering
• Test Factory (Claims, Prisma Life)
• Usability Lab
• Software Quality Check (focus on Digital Agency applications)
• Automated Function Point (pilot)

In 2014 a project on Software Quality Check was initiated, focusing on Digital Agency applications and critical business systems.

Page 6: Risk and AFP Measurement in a digital transformation program


Agenda

• IT Strategy – Industrialization and Agility

• Quality measurements, metrics and software selection

• Project Execution and run

• Suggestions for success in App Dev Management measurement

• Next Steps

• Annexes

Page 7: Risk and AFP Measurement in a digital transformation program


Measuring Structural Quality of Software and Functional Density

In 2015 we began to calculate measures to assess software quality, focusing on:

• Reliability/Security
• Performance/Efficiency
• Maintainability

These quality measures can be used to evaluate the risk in software-intensive systems in order to minimize continuity, integrity and security issues.

In parallel, recognition of our Function Point model and approach has reached an appropriate maturity level across our organization, allowing us to integrate and automate the following metrics:

• Lines of code
• Function point calculation
• Quality indexes
• Running cost per unit
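To ground these four metrics, here is a minimal sketch of a per-application metric record; the application name and all numbers are invented, and "running cost per unit" is interpreted here as running cost per function point:

```python
# Hypothetical per-application record combining the four automated metrics.
from dataclasses import dataclass

@dataclass
class AppMetrics:
    name: str
    lines_of_code: int
    function_points: int
    quality_index: float    # on the 1.00-4.00 grade scale used later in the deck
    running_cost_eur: float

    @property
    def cost_per_fp(self) -> float:
        """Running cost per unit, taking the unit to be one function point."""
        return self.running_cost_eur / self.function_points

app = AppMetrics("claims-portal", 182_000, 1450, 2.8, 510_000.0)
print(f"{app.name}: {app.cost_per_fp:.0f} EUR per function point")  # ~352
```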

Page 8: Risk and AFP Measurement in a digital transformation program


Fast Function Point Demand Management Process

Process flow, from START to END: on the business side, user requests are prioritized, benefits are identified and requirements are documented; on the IT side, requirements are documented, a functional solution is proposed and the effort is estimated in Function Points. Three gates follow: if the solution is accepted (Solution OK?), the cost/benefit evaluation is positive (Cost/Benefits evaluation OK?) and project governance approves (Project Governance OK?), the project is executed.

Page 9: Risk and AFP Measurement in a digital transformation program


Fast Function Point Process details

The function point counts and configuration are sent to the Competence Center for validation and baseline maintenance.

Process flow (IT analysts and the FFPA Competence Center), from START to END: user requirements are acquired and classified; an FP count is produced in the estimation phase; if the project budget exceeds 5k€, the Competence Center validates the count (FP count OK?); the project is executed; an FP count is produced at completion, together with the incurred costs; after a final validation, the results are historicized in the database and the baseline is updated.

FFP Competence Center
[email protected]

Page 10: Risk and AFP Measurement in a digital transformation program


The new Demand Management practice (processes and roles) enables integration of the efforts of the different actors and stakeholders involved in the IT solution lifecycle, including estimation activities.

Fast Function Point: the FFP Competence Center in Demand Management

Page 11: Risk and AFP Measurement in a digital transformation program


Fast Function Point Master Plan

Master plan (2011 Q2 through 2013 Q1):

• Pilot (Life and Claims)
• Training of the SW factories (90 people)
• Start-up of the Competence Center
• Sizing (optional) of new projects (17 projects in 2012)
• Baseline maintenance (recurring)
• Sizing required for 50% of new projects in 2013

Page 12: Risk and AFP Measurement in a digital transformation program


Quality metrics – The Health Factors

Quality characteristics, the measures composing them, and their business outcomes:

• Maintainability: ability to understand and modify software quickly. Business outcomes: cost of ownership, time to market.
• Reliability: ability to avoid outages and to recover operations quickly. Business outcomes: damages, lost revenue, customer loss.
• Performance/Efficiency: ability to avoid response degradation and resource overuse. Business outcomes: lost customers, operating cost.
• Security: ability to prevent unauthorized intrusions and data theft. Business outcomes: damages, customer confidence.

Page 13: Risk and AFP Measurement in a digital transformation program


Measures and Violations

Quality characteristics with example architectural and coding violations composing the measures:

• Maintainability (ability to understand and modify software quickly): excessive coupling, dead code, hard-coded literals
• Reliability (ability to avoid outages and to recover operations quickly): empty exception block, unreleased resources, circular dependency
• Performance/Efficiency (ability to avoid response degradation and resource overuse): expensive loop operation, un-indexed data access, unreleased memory
• Security (ability to prevent unauthorized intrusions and data theft): SQL injection, cross-site scripting, buffer overflow
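As an illustration of what one such check looks like in practice, here is a minimal static detector for the "empty exception block" violation from the list above. This is a hypothetical sketch in Python using the standard ast module, not CAST's implementation; CAST applies analogous rules across its .NET, Java, Cobol and other analyzers:

```python
import ast

def find_empty_handlers(source: str):
    """Report every except block whose body is nothing but `pass`."""
    violations = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler):
            if all(isinstance(stmt, ast.Pass) for stmt in node.body):
                violations.append((node.lineno, "empty exception block"))
    return violations

sample = """
try:
    risky_call()
except ValueError:
    pass
"""
print(find_empty_handlers(sample))  # [(4, 'empty exception block')]
```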

Page 14: Risk and AFP Measurement in a digital transformation program


Software selection

In Q4 2013, after a software selection, Allianz Italy chose the CAST Application Intelligence Platform (AIP) as the target solution for software quality.

CAST Application Intelligence Platform (AIP) is an enterprise-grade software quality analysis and measurement solution designed to analyze multi-tiered, multi-technology applications for technical vulnerabilities and adherence to architectural and coding standards.

AIP allows:

• Automated analysis of entire applications
• Immediate quality assessment
• Executive-level synthesis and trending reports
• Drill-down to the root cause in the source code

In addition, following a specific joint initiative, the platform enables:

• Allianz Function Point calculation

Page 15: Risk and AFP Measurement in a digital transformation program


CAST Application Intelligence Platform (AIP)

The platform is organized in three layers:

• Analyzers for Oracle PL/SQL, Sybase T-SQL, SSIS-SSRS, IBM SQLPSM, C/C++, Pro C, COBOL, Visual Basic, .NET, J2EE, PowerBuilder, Oracle Forms, PeopleSoft, ABAP, Tibco Business Works, Business Objects, Siebel, and many other languages
• An application knowledge base, where application metadata is processed by a rules engine (over 1,000 industry rules and best practices) and stored in a metrics repository covering technical quality (Robustness, Performance, Security, Transferability, Changeability) and technical size & weight
• A dashboard showing application health by health factor and compliance, with drill-down to action: from the portfolio to applications, modules and objects, and from health factors through sub-metrics down to individual rules

CAST AIP gives you the comprehensive visibility and control needed to achieve significantly more business productivity from complex applications, with an objective and repeatable way to measure and improve application software quality.

Page 16: Risk and AFP Measurement in a digital transformation program


The CAST Application Intelligence Center

The Platform and the Actors: actionable visibility across the IT organization.

• CFOs, CIOs and VPs: reports on application health, team performance, benchmarking
• PMs, QA, Architects: software characteristics, cost and risk drivers, ARB, architectural governance, root cause analysis…
• Development teams: … and feedback & advice on software quality and engineering

CAST AIP takes application source code as input and performs compliance checks plus code, architecture and data structure analysis for quality measurement & functional sizing. Source code covered includes:

• CICS, IMS, COBOL, DB2 z/OS
• J2EE, .NET and all major RDBMS
• Web apps, BI, EAI, C/C++, VB, PB
• Siebel, SAP, PSFT, OBS, Amdocs

Page 17: Risk and AFP Measurement in a digital transformation program


Enterprise-grade analysis requires a 3-tier approach to architecture compliance:

• System level: integration quality, architectural compliance, risk propagation simulation, application security, resiliency checks, transaction integrity, function point & EFP measurement, effort estimation, data access control, SDK versioning, calibration across technologies
• Module level: intra-technology architecture, intra-layer dependencies, module complexity & cohesion, design & structure, inter-program invocation, security vulnerabilities
• Program level: code style & layout, expression complexity, code documentation, class or program design, basic coding standards

Analysis at these levels tracks data flow, transaction risk and propagation risk across technologies such as Java, JSP, EJB, Hibernate, Spring, Struts, .NET (C#, VB), C++, COBOL, PL/SQL, T/SQL, ASP.NET, Oracle, SQL Server, DB2, Sybase, IMS, messaging, Java web services and APIs.

Page 18: Risk and AFP Measurement in a digital transformation program


Overview of CAST Quality Model

Over 1,000 architectural and language-specific code checks (quality rules) roll up into technical criteria, which in turn roll up into the business criteria (health factors) shown on governance dashboards as risk and cost drivers:

• Application quality / health factors: Transferability, Changeability, Robustness, Performance, Security
• Technical criteria: Size, Naming Conventions, Documentation, Architecture, Complexity, Programming Practices
• Example quality rules: package, class and interface naming; package, class and method comments; package size, class size (methods), interface size; class complexity (inheritance depth and width); artifacts having recursive calls; method complexity (control flow); file conformity; dead code; controlled data access; structuredness; modularity; encapsulation conformity; empty code; inheritance; multiple artifacts inserting data on the same SQL table; coupling distribution; SQL complexity distribution
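To make this three-level roll-up concrete, here is a hypothetical sketch of how rule grades could aggregate into technical criteria and then into a health factor via weighted averages. The rules, weights and groupings below are invented for illustration; the real CAST assessment model uses its own calibrated weights over the 1,000+ rules:

```python
# rule -> (grade on the 1.00-4.00 scale, weight within its technical criterion)
RULES = {
    "Avoid empty catch blocks":       (2.1, 2.0),
    "Avoid artifacts with recursion": (3.5, 1.0),
    "Class complexity (inh. depth)":  (3.0, 1.0),
}

CRITERIA = {  # technical criterion -> contributing rules
    "Programming Practices": ["Avoid empty catch blocks"],
    "Complexity": ["Avoid artifacts with recursion", "Class complexity (inh. depth)"],
}

HEALTH_FACTORS = {  # health factor -> (technical criterion, weight) pairs
    "Robustness": [("Programming Practices", 2.0), ("Complexity", 1.0)],
}

def weighted_mean(pairs):
    """pairs: list of (grade, weight) tuples."""
    return sum(g * w for g, w in pairs) / sum(w for _, w in pairs)

def criterion_grade(name: str) -> float:
    return weighted_mean([RULES[r] for r in CRITERIA[name]])

def health_factor_grade(name: str) -> float:
    return weighted_mean([(criterion_grade(c), w) for c, w in HEALTH_FACTORS[name]])

print(round(health_factor_grade("Robustness"), 2))  # 2.48
```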

Page 19: Risk and AFP Measurement in a digital transformation program


Computation of scores from 1 to 4

• The grade results from the Compliance Ratio
• It is computed per rule, component, module and application
• Below 3, risks and costs increase

Compliance Ratio (% of successful checks) = successful checks / (successful checks + failed checks) × 100

Four thresholds map the ratio onto the grade scale (e.g. threshold #1 = 10%, #2 = 70%, #3 = 90%, #4 = 99%), with grades 1.00 = very high risk, 2.00 = high risk, 3.00 = moderate risk, 4.00 = low risk.
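A minimal sketch of this grading scheme follows, using the example thresholds quoted above and assuming (this is not stated on the slide) linear interpolation between thresholds; real configurations vary per rule:

```python
def compliance_ratio(successful: int, failed: int) -> float:
    """Percentage of successful checks."""
    return 100.0 * successful / (successful + failed)

def grade(ratio: float, thresholds=(10.0, 70.0, 90.0, 99.0)) -> float:
    """Map a compliance ratio onto the 1.00-4.00 grade scale."""
    t1, t2, t3, t4 = thresholds
    if ratio <= t1:
        return 1.00   # very high risk
    if ratio >= t4:
        return 4.00   # low risk
    for lo, hi, g_lo in ((t1, t2, 1.0), (t2, t3, 2.0), (t3, t4, 3.0)):
        if ratio <= hi:
            return g_lo + (ratio - lo) / (hi - lo)
    return 4.00

r = compliance_ratio(successful=930, failed=70)  # 93.0 %
print(round(grade(r), 2))                         # ~3.33 -> moderate-to-low risk
```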

Page 20: Risk and AFP Measurement in a digital transformation program


CED - CAST ENGINEERING DASHBOARD

Root cause identification and action plans: CAST AIP allows you to monitor over time the drivers of risk and cost across the entire portfolio of critical applications, and it supports "root-cause" analysis and the management of improvement plans between application releases.

Page 21: Risk and AFP Measurement in a digital transformation program


AAD - CAST APPLICATION ANALYTICS DASHBOARD

The information displayed in the CAST Application Analytics Dashboard is derived from snapshots generated by the CAST AI Administrator and provides a high-level, intuitive overview of your set of applications.

Page 22: Risk and AFP Measurement in a digital transformation program


Agenda

• IT Strategy – Industrialization and Agility

• Quality measurements, metrics and software selection

• Project Execution and run

• Suggestions for success in App Dev Management measurement

• Next Steps

• Annexes

Page 23: Risk and AFP Measurement in a digital transformation program


Project – Start up CAST AIP

Objectives

Build an Application Analytics Dashboard that contains the quality values of the applications' source code, based on health factors such as Robustness, Performance, Security, Transferability and Changeability. These drivers are the starting point for producing a remediation plan, which could:

• reduce operating risk and maintenance cost
• enable functional sizing based on Automated Function Points

Macro activities

• Identification of critical business applications for the area serving the Digital Agency
• Definition of an onboarding plan

For each application the steps are:

• Source code delivery for the baseline
• Configuration of the Application Intelligence Platform (AIP) for the "baseline" and analysis run
• Sharing of the assessment results with critical violations
• Sharing of the Automated Function Points calculation
• Action plan for remediation
• Rescanning of the code
• Economic evaluation of running costs

Page 24: Risk and AFP Measurement in a digital transformation program


Create "Baseline" values of Quality

The process runs through seven stages: 00. Service engagement; 01. Kickoff; 02. Pre-configuration stage; 03. Acquire source code and verify completeness; 04. Analysis process; 05. Action plan and assessment presentation; 06. Publication.

The steps to create a "baseline" are as follows:

• Contact the IT area to be analyzed (service engagement)
• Organize a meeting with the development teams to agree on the perimeter of the application to be measured (kickoff)
• Prepare the AIP environment to contain the new applications (pre-configuration stage)
• Acquire the source code, downloading it from one (and only one) repository, and verify completeness
• Run the analysis process
• Prepare the presentation of the results for development teams and managers, formulating a remediation plan to improve overall quality
• Publish dashboards with the results of the analysis
• Share the Automated Function Points evaluation

Page 25: Risk and AFP Measurement in a digital transformation program


Automated Function Points «baseline» sample

Data functions: ILF + EIF
Transactional functions: EI + EO + EQ
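For reference, here is a minimal sketch of how such a baseline count rolls up into an unadjusted function point total. The component counts are invented, and for brevity every component is priced at the standard IFPUG average-complexity weight, whereas a real count (manual or automated) assigns low, average or high weights per function:

```python
# Standard IFPUG average-complexity weights per base functional component.
IFPUG_AVG_WEIGHTS = {
    "ILF": 10,  # Internal Logical File
    "EIF": 7,   # External Interface File
    "EI": 4,    # External Input
    "EO": 5,    # External Output
    "EQ": 4,    # External Inquiry
}

def unadjusted_fp(counts: dict) -> int:
    """Sum of component counts weighted by the average IFPUG weights."""
    return sum(IFPUG_AVG_WEIGHTS[k] * n for k, n in counts.items())

# Hypothetical baseline: data functions (ILF, EIF) + transactional (EI, EO, EQ)
counts = {"ILF": 12, "EIF": 5, "EI": 40, "EO": 25, "EQ": 18}
print(unadjusted_fp(counts))  # 512
```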

Page 26: Risk and AFP Measurement in a digital transformation program


Agenda

• IT Strategy – Industrialization and Agility

• Quality measurements, metrics and software selection

• Project Execution and run

• Lessons learned and Suggestions for success in App Dev Management measurement

• Next Steps

• Annexes

Page 27: Risk and AFP Measurement in a digital transformation program


Lessons learned

• Service availability is the IT security dimension that benefits most from a SW quality assurance process; our experimental measures show that the higher the quality indicators, the lower the number of bugs fixed at application level
• Data integrity likewise benefits from careful and precise formal code definition, with special regard to connection management and memory flushing at the end of sessions
• We see quality assurance metrics as leverage to optimize IT investment, freeing up budget for quality improvements
• Onboarding applications requires some manual effort; automation comes over time
• Expect resistance to change, and focus only on the main issues (less is more!)
• Function point adoption is about cultural change, not about process

CAST Automated Function Points (AFP) is an automatic function point counting capability based on the rules defined by the International Function Point Users Group (IFPUG). CAST automates the manual counting process by using the structural information retrieved from source code analysis, database structure and transactions. CAST AFP generates three types of counts:

• Added Function Points: function points identified from functions added by the enhancement effort
• Modified Function Points: function points counted because an existing function is changed during the enhancement project
• Deleted Function Points: function points counted for functions deleted during the project

CAST AFP supports the four primary objectives for analyzing the productivity of software development and maintenance, which require normalized units of production and effort such as function points:

• Improvement: determine the high-leverage targets and track progress
• Estimation: identify sources of variation in effort and cost that predict outcomes
• Benchmarking: assess status against competitors and internal baselines
• Vendor management: establish a fact- and performance-based relationship

CAST Automated Function Points demonstrates a 10x reduction in cost compared with manually counted function points. The CAST Application Intelligence Platform (AIP) provides precise measures of functional size as well as of changes in the complexity of all application components; with this information, IT organizations use CAST to accurately measure software productivity. The generated counts are:

• Comprehensive: counts are based on the entire application, so there is no need to extrapolate the count from parts of an application
• Consistent: automation produces repeatable, consistent results with no human intervention, providing the objectivity and speed needed for benchmarking applications and teams
• Cost-effective: after the initial set-up time, the cost of subsequent counts is zero

"What's important is that generated counts remove the issue of inter-counter variability, the need for documentation, and the amount of time necessary to count manually, all three of which have limited function point usage. Automation may be the single key enabler necessary to drive deeper acceptance of function points..." (Matthew Hotle, Gartner Research)

CAST Makes the Invisible Visible:

• Calculate a productivity baseline: CAST AIP establishes a baseline of function points per staff-month against which improvement in team productivity can be measured
• Track productivity over time: calculate function points per staff-month for every major release of a mission-critical application and compare it with the established baseline; the difference translates into an increase or decrease in productivity
• Generate measures of quality and complexity for all application components, and use these additional measures to get a more accurate value for productivity
• Identify process inefficiencies: analyze differences in productivity to highlight points of process inefficiency; having quality and complexity information in addition to size makes it easier to find and quickly fix the root causes of inefficiency
• Measure the effectiveness of process improvements: quantify the effect of each process improvement

Data-driven guidance to improve productivity:

• Map productivity hotspots in key applications and obtain actionable advice to improve productivity
• Report on team productivity to inform portfolio prioritization, resource allocation and vendor management
• Measure and communicate improvements in operational efficiency to business partners
• Improve estimation and resource allocation by measuring productivity based on historical performance
• Measure and improve the effectiveness of operational processes and controls
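As a worked example of the productivity baseline described above (function points delivered per staff-month, compared release over release), here is a minimal sketch; all numbers are invented:

```python
def fp_per_staff_month(delivered_fp: float, staff_months: float) -> float:
    """Normalized productivity: function points delivered per staff-month."""
    return delivered_fp / staff_months

baseline = fp_per_staff_month(delivered_fp=480, staff_months=60)  # 8.0
release2 = fp_per_staff_month(delivered_fp=520, staff_months=58)  # ~8.97

# The difference translates into an increase or decrease in productivity.
change_pct = 100.0 * (release2 - baseline) / baseline
print(f"productivity change vs. baseline: {change_pct:+.1f}%")    # +12.1%
```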

Page 28: Risk and AFP Measurement in a digital transformation program


Suggestions for success in ADM measurement (ADM = Application Development Management)

• Software quality control activities should be promoted and sponsored, because they pose additional effort for developers
• From the very beginning, a well-defined application boundary for the analysis should be agreed, clearly identifying all the involved architectural layers (less is more!)
• Availability of a software repository is a key enabler of continuous improvement
• The remediation plan must be reasonable and feasible in order to improve the most critical indicators: select only a limited set of remediations, using a risk-based approach
• Static code analysis won't tell us anything about... runtime!
• "Smart start" with focus on the areas of greatest impact


Page 29: Risk and AFP Measurement in a digital transformation program


Agenda

• IT Strategy – Industrialization and Agility

• Quality measurements, metrics and software selection

• Project Execution and run

• Suggestions for success in App Dev Management measurement

• Next Steps

• Annexes

Page 30: Risk and AFP Measurement in a digital transformation program


Next Steps

• Continue with progressive onboarding of applications based on risk measurement
• Automate and optimize the rescanning process for continuous improvement
• Model an Application Analytics Dashboard for rolling out the consumption scenarios
• Create an «efficiency / value» dataset using:
  - internal data: AIP quality (Total Quality Index and Health Factors) and quantitative measures (technical size in kLOCs, functional weight, complexity level)
  - external data: number of defects and budget
• Think about how to apply code quality and functional sizing values in supplier contracts, to evaluate the code released
• Check the conformity of the software against an architectural blueprint: the code shall be compared with an architectural model, and the individual architectural blocks should then be certified with a manual software scan


Page 31: Risk and AFP Measurement in a digital transformation program


Agenda

• IT Strategy – Industrialization and Agility

• Quality measurements, metrics and software selection

• Project Execution and run

• Suggestions for success in App Dev Management measurement

• Next Steps

• Annexes

Page 32: Risk and AFP Measurement in a digital transformation program


CAST: Verification of CISQ's Standards

CISQ characteristics with the coding practices checked at each level:

• RELIABILITY
  - System level & architecture: multi-layer design compliance; software manages data integrity and consistency; exception handling through transactions; class architecture compliance
  - Component level: protecting state in multi-threaded environments; safe use of inheritance and polymorphism; patterns that lead to unexpected behaviors; resource bounds management; complex code; managing allocated resources; timeouts; built-in remote addresses

• PERFORMANCE EFFICIENCY
  - System level & architecture: appropriate interactions with expensive and/or remote resources; data access performance and data management; memory, network and disk space management; centralized handling of client requests; use of middle-tier components versus stored procedures and database functions
  - Component level: compliance with object-oriented best practices; compliance with SQL best practices; expensive computations in loops; static connections versus connection pools; compliance with garbage collection best practices

• SECURITY
  - System level & architecture: input validation; SQL injection; cross-site scripting; failure to use vetted libraries or frameworks; secure architecture design compliance
  - Component level: error and exception handling; use of hard-coded credentials; buffer overflows; broken/risky cryptographic algorithms; missing initialization; improper validation of array index; improper locking; references to released resources; uncontrolled format string

• MAINTAINABILITY
  - System level & architecture: strict hierarchy of calling between architectural layers; excessive horizontal layers
  - Component level: tightly coupled modules; unstructured and duplicated code; cyclomatic complexity; controlled level of dynamic coding; encapsulated data access; over-parameterization of methods; hard-coding of literals; commented-out instructions; excessive component size; compliance with OO best practices

Page 33: Risk and AFP Measurement in a digital transformation program


Quality indexes - Examples

Performance: determines the risk of performance issues of an application.

Technical criteria (grade, evolution):
• Complexity - SQL Queries: 3.99 (+8.72%)
• Efficiency - Expensive Calls in Loops: 1.00 (0%)
• Efficiency - Memory, Network and Disk Space Management: 2.49 (-5.68%)
• Efficiency - SQL and Data Handling Performance: 3.56 (-1.11%)

Critical rules violated (count):
• Avoid instantiations inside loops: 208
• Avoid String concatenation in loops: 21

Robustness: determines the risk of failures or defects that could occur in production.

Technical criteria (grade, evolution):
• Architecture - Multi-Layers and Data Access: 1.00 (0%)
• Architecture - Object-level Dependencies: 3.40 (+4.62%)
• Architecture - Reuse: 2.05 (+13.3%)
• Complexity - Algorithmic and Control Structure Complexity: 3.40 (+19.3%)
• Complexity - OO Inheritance and Polymorphism: 3.44 (+1.47%)
• Complexity - SQL Queries: 3.99 (+8.72%)
• Complexity - Technical Complexity: 2.46 (+54.7%)

Critical rules violated (count):
• The exception Exception should never be thrown; always subclass Exception and throw the subclassed classes: 87
• Avoid empty catch blocks: 13
• Avoid cyclical calls and inheritances between namespaces content: 2

Security: determines the risk of security breaches for an application.

Technical criteria (grade, evolution):
• Architecture - Multi-Layers and Data Access: 1.00 (0%)
• Efficiency - Memory, Network and Disk Space Management: 2.49 (-5.68%)
• Programming Practices - Error and Exception Handling: 2.04 (-16.7%)
• Secure Coding - Encapsulation: 1.74 (0%)
• Secure Coding - Input Validation: 4.00 (0%)

Critical rules violated (count):
• The exception Exception should never be thrown; always subclass Exception and throw the subclassed classes: 87
• Avoid empty catch blocks: 13
• Avoid cyclical calls and inheritances between namespaces content: 2

Overall index grades: 2.56, 2.83, 2.36

Page 34: Risk and AFP Measurement in a digital transformation program


Quality indexes - Examples

Transferability: evaluates how easily a new team or team member can become productive when assigned to work on the application (grade: 3.18).

Technical criteria (grade, evolution):
• Architecture - Object-level Dependencies: 3.40 (+4.62%)
• Complexity - Algorithmic and Control Structure Complexity: 3.40 (+19.3%)
• Complexity - OO Inheritance and Polymorphism: 3.44 (+1.47%)
• Complexity - SQL Queries: 3.99 (+8.72%)
• Dead code (static): 2.04 (+39.7%)
• Documentation - Bad Comments: 3.70 (+0.82%)
• Documentation - Naming Convention Conformity: 3.64 (0%)

Changeability: evaluates how easily and quickly an application can be modified (grade: 2.52).

Technical criteria (grade, evolution):
• Architecture - Multi-Layers and Data Access: 1.00 (0%)
• Architecture - Object-level Dependencies: 3.40 (+4.62%)
• Complexity - Algorithmic and Control Structure Complexity: 3.40 (+19.3%)
• Complexity - OO Inheritance and Polymorphism: 3.44 (+1.47%)
• Complexity - SQL Queries: 3.99 (+8.72%)
• Dead code (static): 2.04 (+39.7%)
• Documentation - Naming Convention Conformity: 3.64 (0%)

CAST Complexity Distribution

CAST provides a distribution of objects based on several dimensions:
• algorithmic complexity (based on cyclomatic complexity)
• SQL complexity
• coupling (fan-in, fan-out)
• ratio of documentation
• size of components

Complexity distribution (current totals):
• Low complexity: 8747
• Average complexity: 801
• High complexity: 123
• Very high complexity: 85
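To illustrate how a distribution like this could be produced, here is a minimal sketch that buckets artifacts by cyclomatic complexity. The bucket boundaries and the sample data are invented; CAST uses its own calibrated thresholds:

```python
from collections import Counter

def bucket(cyclomatic: int) -> str:
    """Assign an artifact to a complexity band by its cyclomatic complexity."""
    if cyclomatic <= 10:
        return "Low Complexity"
    if cyclomatic <= 20:
        return "Average Complexity"
    if cyclomatic <= 30:
        return "High Complexity"
    return "Very High Complexity"

# one cyclomatic-complexity value per analyzed object (hypothetical data)
complexities = [3, 5, 12, 7, 25, 41, 9, 18, 2, 33]
print(Counter(bucket(c) for c in complexities))
# Counter({'Low Complexity': 5, 'Average Complexity': 2,
#          'Very High Complexity': 2, 'High Complexity': 1})
```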