Agenda
• Background
• Set-up of the study
• Data collection
• Analysis of the core metrics
• Analysis of the performance indicators
• Characteristics of best practices
• Characteristics of worst practices
• Supplementary study on striking factors
• Index numbers
Background
• Hennie Huijgens
• Working as a measurement & analysis expert for 20 years
• Clients: large information-intensive organizations (e.g. banking, insurance, pension funds, government, telco)
• Specialism: Measurement & Analysis, Information Management, Risk Management
• NESMA board member from 1999 to 2007
• MSc in Information Management (University of Amsterdam) in 2010
Set-up of the study
[Study overview diagram]
Core metrics: Size, Duration, Effort / Cost, Defects & Incidents
Performance indicators: Time-to-market, Productivity, Process Quality & Product Quality
Data analysis: Best Practices & Worst Practices (success factors, fail factors)
Supplementary study on striking factors: Predictability, Project planning, Scalability, Delivery model
Index numbers
Data collection
• 2 comparable information-intensive organizations
• 278 finalized IT projects; 23 running IT projects
• 249 waterfall projects; 29 agile projects (Scrum)
• For all projects, size was measured according to the ISO/IEC 24570 (NESMA) counting practice (function points)
• All assessed IT projects were about solution delivery, with a focus on software development (new builds or enhancements); in some cases hardware or middleware implementations were within the project scope
• The investigated population was diverse in subject (e.g. internet, mobile apps, call centre solutions, marketing & sales, products, client-based systems, transactional services, business intelligence)
• Both organizations started off with a process improvement program (CMMI) and (at a later stage) moved from waterfall towards Scrum
Analysis of the Core Metrics
• Size (function points)
• Duration (months)
• Effort / Cost (hours / euros)
• Quality (defects / incidents)
Size
• Size measured in function points
• ISO/IEC 24570 (NESMA) counting practice
• Smallest project was 9 FP; largest was 4,600 FP
• 60% of the projects (representing 32% of the project cost) were smaller than 200 FP (small)
• 31% of the projects (42% of the project cost) were between 200 and 600 FP (medium)
• 9% of the projects were larger than 600 FP, representing 27% of the project cost (large)
• Medium and large projects deliver most end-user functionality: 41% and 37% of the function points respectively are delivered by these projects; small projects delivered 21% of the function points
[Bar chart: scalability based on number of projects; size classes < 200 FP, 200 - 600 FP, > 600 FP; series: Agile, Waterfall, Total]
[Scatter plot: Duration (months) versus Size (function points), log-log scale; series: Agile, Waterfall, with average trend line]
Duration
• Duration measured in months, from start-up phase to aftercare
• Mean duration of waterfall projects: 9.25 months
• Mean duration of agile projects: 7.94 months
[Scatter plot: Project cost (EUR thousands) versus Size (function points), log-log scale; series: Agile, Waterfall, with average trend line]
Project Cost
• Project cost measured in euros, from start-up phase to aftercare
• Including supplier cost; excluding investment cost (e.g. software licences, hardware / middleware investments)
• Mean cost of waterfall projects: EUR 781 K
• Mean cost of agile projects: EUR 834 K
[Scatter plot: Process quality (defects) versus Size (function points), log-log scale; y-axis: Errors (SysInt-Del); series: Agile, Waterfall, with average trend line]
Quality (Defects)
• Quality measured in the number of defects (findings) during development (unit test to go-live)
• Mean number of defects in waterfall projects: 80
• Mean number of defects in agile projects: 128
Analysis of the Performance Indicators
[Diagram: core metrics and the performance indicators derived from them]
Core metrics: Size (function points), Duration (months), Effort / Cost (hours / euros), Quality (defects / incidents)
Performance indicators: Time-to-market (days / FP), Productivity (cost / FP), Process Quality (defects / FP), Product Quality (incidents / FP; not in the study)
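As a minimal sketch of how the three studied indicators follow from the core metrics (the record layout and names below are illustrative assumptions, not the study's tooling):

```python
from dataclasses import dataclass

@dataclass
class Project:
    # Illustrative core-metric fields, not the study's actual data model
    size_fp: float        # size in function points (ISO/IEC 24570)
    duration_days: float  # calendar days from start-up phase to aftercare
    cost_eur: float       # project cost in euros
    defects: int          # defects found between unit test and go-live

def indicators(p: Project) -> dict:
    """Derive the three studied performance indicators from the core metrics."""
    return {
        "time_to_market_days_per_fp": p.duration_days / p.size_fp,
        "productivity_eur_per_fp": p.cost_eur / p.size_fp,
        "process_quality_defects_per_fp": p.defects / p.size_fp,
    }

# Example: a 250 FP project delivered in 240 days for EUR 900,000 with 75 defects
print(indicators(Project(250, 240, 900_000, 75)))
# -> time-to-market 0.96 days/FP, productivity EUR 3,600/FP, 0.30 defects/FP
```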
[Scatter plot: Time-to-market (calendar days per FP) versus Size (FP), log-log scale; series: Agile, Waterfall, with average trend line]
Time-to-market
• Time-to-market is expressed in (calendar) days per function point ('how fast is a function point delivered?')
• Mean time-to-market of waterfall projects: 2.91 days / FP
• Mean time-to-market of agile projects: 1.93 days / FP
• Remark: an alternative measure for time-to-market is a weighted average of days per FP, with size as the weighting factor (see the sketch below)
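A small sketch of the difference between the two measures; with size as the weight, the weighted mean algebraically reduces to total days over total FP, so large projects dominate it (all numbers below are made up):

```python
def simple_mean_ttm(projects):
    """Unweighted mean of days/FP: every project counts equally."""
    return sum(days / fp for days, fp in projects) / len(projects)

def weighted_mean_ttm(projects):
    """Size-weighted mean of days/FP, with FP as the weight; this reduces
    to total calendar days divided by total function points."""
    return sum(days for days, _ in projects) / sum(fp for _, fp in projects)

# One slow small project and one fast large project: (calendar days, FP)
sample = [(300, 100), (600, 1_000)]
print(simple_mean_ttm(sample))    # 1.8 days/FP
print(weighted_mean_ttm(sample))  # ~0.82 days/FP
```

This difference is also why the size-weighted figures in the index-number table at the end of the deck are lower than the unweighted means.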
[Scatter plot: Productivity (EUR thousands per FP) versus Size (FP), log-log scale; series: Agile, Waterfall, with average trend line]
Productivity
• Productivity is expressed in project cost per function point (the price of one function point)
• Mean productivity of waterfall projects: EUR 4,613 / FP
• Mean productivity of agile projects: EUR 3,360 / FP
• Not in scope of the study: net versus gross productivity
• Remark: an alternative measure for productivity is a weighted average of cost per FP, with size as the weighting factor
[Scatter plot: Process quality (defects per FP) versus Size (FP), log-log scale; series: Agile, Waterfall, with average trend line]
Process Quality
• Process quality (in-process product quality) is expressed in the number of defects per function point
• Coherence with product quality (before versus after go-live)
• Mean process quality of waterfall projects: 0.38 defects / FP
• Mean process quality of agile projects: 0.30 defects / FP
• Remark: an alternative measure for process quality is a weighted average of defects per FP, with size as the weighting factor
Best Practices & Worst Practices
Analysis of the performance scores is based on a star rating:
• For every performance indicator that scores better than average (under the trend line in the figures) a project gains a star; for every performance indicator that scores above sigma+1 (above the highest dotted line in the figures) a project loses a star (a scoring sketch follows below)
• 3-star projects performed better than average on all 3 performance indicators: these yield the characteristics of best practices
• 0-star projects performed worse than average on all 3 performance indicators (or no data was measured): these yield the characteristics of worst practices
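A minimal sketch of the star scoring described above; the trend and sigma+1 values per indicator would come from the size-based regression lines in the scatter plots, and every name and number here is hypothetical:

```python
def star_rating(actuals, trend, sigma1):
    """Count stars over the three indicators (lower is better for all):
    a star is gained for each indicator under the trend value and one is
    lost for each indicator above the sigma+1 band."""
    stars = 0
    for key, value in actuals.items():
        if value < trend[key]:
            stars += 1
        elif value > sigma1[key]:
            stars -= 1
    return max(stars, 0)

# Hypothetical project scored against illustrative trend / sigma+1 values
actuals = {"ttm_days_fp": 1.5, "cost_eur_fp": 2_900, "defects_fp": 0.25}
trend   = {"ttm_days_fp": 2.2, "cost_eur_fp": 3_800, "defects_fp": 0.33}
sigma1  = {"ttm_days_fp": 4.0, "cost_eur_fp": 7_000, "defects_fp": 0.70}
print(star_rating(actuals, trend, sigma1))  # 3 -> a best-practice candidate
```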
Success factors for IT-projects
The 7 reasons behind Best Practices*:
1. Single application (no application clustering)
2. Working in releases
3. Fixed and experienced project team
4. Scrum
5. Close cooperation with external party (same party mentioned several times)
6. Dedicated test resources (e.g. test environment, deployment tools)
7. Project type was Business Intelligence
*) Based on 30 measured projects that scored better than average on productivity, time-to-market and process quality.
[Pie chart: Single application (no application clustering) 26%, Working in releases 20%, Fixed and experienced project team 20%, Scrum 15%, Close cooperation with external party 7%, Dedicated test resources 7%, Project type was Business Intelligence 5%]
Fail factors for IT-projects
[Pie chart: Complex environment 18%, New technology 14%, Dependencies with other projects 13%, Preconditioned or 'technical' projects 11%, Pilot or PoC 11%, Complex legacy environment 9%, Scope changes 7%, Dependencies with other domains 7%, Bad performing external supplier 5%, Package with customization 5%]
The 10 reasons behind Worst Practices*:

1. Complex environment (e.g. back offices, infrastructure, many stakeholders)

2. New technology (causing technical problems)

3. Dependencies with other projects

4. Preconditioned or 'technical' projects

5. Pilot or PoC in project (incl. complex RFP/RFI)

6. Complex legacy environment (e.g. bad documentation)

7. Scope changes during project (exceptions)

8. Dependencies with other domains

9. Badly performing external supplier

10. Package with a high amount of customization

*) Based on 15 measured projects that scored worse than average on productivity, time-to-market and process quality.
Supplementary study on 4 striking factors
Factors that patently influenced the performance:
• Predictability
• Project planning
• Scalability
• Delivery model
Predictability
Cost predictability
• On average, an almost perfect match between planned and realised project cost
• However: good steering on cost expenditure is not the same as aiming for good performance…

Schedule predictability
• On average, the planned go-live date was 3 months too early
• The measure shows a bias towards underestimating the delivery date
[Chart: F/A Plot (Cost); Forecast / Actual ratio versus project completion]
[Chart: F/A Plot (Schedule); Forecast / Actual ratio versus project completion]
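A sketch of how the points of such an F/A plot can be built: each forecast made during the project is divided by the final actual, so a run of values below 1.0 shows the underestimation bias described above (all numbers are hypothetical):

```python
def fa_points(estimates, actual_at_completion):
    """Turn (fraction complete, forecast) pairs taken at successive
    re-planning moments into F/A plot points; a well-calibrated project
    hovers around 1.0 and converges to 1.0 at completion."""
    return [(done, forecast / actual_at_completion)
            for done, forecast in estimates]

# Project that kept forecasting an earlier go-live than was realised (months)
schedule_estimates = [(0.2, 6.0), (0.5, 7.0), (0.8, 8.5), (1.0, 10.0)]
print(fa_points(schedule_estimates, actual_at_completion=10.0))
# [(0.2, 0.6), (0.5, 0.7), (0.8, 0.85), (1.0, 1.0)] -> systematic underestimation
```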
Project planning
In the period preceding the big bang towards Scrum, three dilemmas occurred:
1. Experts do not plan for improvement
2. Managing uncertainties is not on the agenda
3. Managers steer on cost expenditure: 'Flying an airplane with only one instrument… a fuel meter…'

The result: planning realised; performance declined.
Scalability: the effect of size
Number of projects
More than half of the projects are small (less than 200 FP). Slightly less than one third of the projects are medium-sized (between 200 and 600 FP).

Added value expressed in function points
However, medium and large-sized projects deliver the greater part of the value, measured in end-user functionality (the number of function points). Large agile projects (> 600 FP) even deliver more than 60% of the value.

Lesson: The majority of finalized projects are small in size, while conversely medium and large-sized projects deliver the greater part of the value for the end user.
[Bar chart: scale by number of projects ('Schaalgrootte op basis van aantal projecten'); size classes < 200 FP, 200 - 600 FP, > 600 FP; series: Agile, Waterfall, Total]
[Bar chart: scale by size in FP ('Schaalgrootte op basis van omvang (FP)'); size classes < 200 FP, 200 - 600 FP, > 600 FP; series: Agile, Waterfall, Total]
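A sketch of how both bar charts can be derived from a list of project sizes, using the study's size classes; the toy portfolio below is invented:

```python
from collections import Counter

def size_class(fp):
    """The study's size classes."""
    if fp < 200:
        return "< 200 FP"
    return "200 - 600 FP" if fp <= 600 else "> 600 FP"

def shares(sizes_fp):
    """Share of project count and share of delivered FP per size class."""
    counts, fps = Counter(), Counter()
    for fp in sizes_fp:
        counts[size_class(fp)] += 1
        fps[size_class(fp)] += fp
    return ({k: v / len(sizes_fp) for k, v in counts.items()},
            {k: v / sum(sizes_fp) for k, v in fps.items()})

# Toy portfolio: many small projects, few large ones
by_count, by_fp = shares([50, 90, 120, 150, 180, 250, 400, 550, 900, 1_800])
print(by_count)  # {'< 200 FP': 0.5, '200 - 600 FP': 0.3, '> 600 FP': 0.2}
print(by_fp)     # {'< 200 FP': ~0.13, '200 - 600 FP': ~0.27, '> 600 FP': ~0.60}
```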
Delivery model: agile wins
Time-to-market
The delivery of a function point in both waterfall and agile projects takes more time in small projects than in medium-sized projects, and large projects in turn deliver faster than medium-sized projects. Most striking is that agile projects deliver a function point faster than waterfall projects in all size classes.

Lesson: Larger projects deliver a function point faster than small projects.

Lesson: Agile projects show a better time-to-market than waterfall projects.
[Bar chart: calendar days per FP by size class ('Schaalgrootte op basis van dagen per FP'); size classes < 200 FP, 200 - 600 FP, > 600 FP; series: Agile, Waterfall, Total]
Productivity
The production of a function point in small waterfall projects costs on average more (EUR 4,728) than in medium-sized (EUR 3,344) and large projects (EUR 2,423). Agile projects show a different pattern: one function point in a small project costs on average EUR 3,787, while the production cost of a function point in medium and large-sized projects is almost equal (EUR 1,515 and EUR 1,475 respectively).

Lesson: Large projects are cheaper than small projects.

Lesson: Agile projects show a better productivity than waterfall projects.
[Bar chart: cost per FP by size class ('Schaalgrootte op basis van kosten per FP'), EUR 0 to EUR 5,000; size classes < 200 FP, 200 - 600 FP, > 600 FP; series: Agile, Waterfall, Total]
Some index numbers

                                        Waterfall    Agile    Improvement
Time-to-Market (Calendar Days / FP)          2.91     1.93           34%
Time-to-Market (Calendar Days / FP)*         1.13     0.56           51%
Productivity (Cost in Euros / FP)           4,613    3,360           27%
Productivity (Cost in Euros / FP)*          3,374    1,814           46%
Process Quality (Defects / FP)               0.38     0.30           21%
Standard Error (avg. of 3 indicators)        0.78     0.63           19%

* Weighted average with size as weighting factor
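If the improvement column is computed as 1 - agile / waterfall, a quick check reproduces the unweighted rows (the formula is an assumption inferred from the table, not stated in the source):

```python
def improvement(waterfall, agile):
    # Relative improvement of agile over waterfall; lower values are
    # better for all indicators in the table
    return 1 - agile / waterfall

print(f"{improvement(2.91, 1.93):.0%}")    # 34% time-to-market
print(f"{improvement(4_613, 3_360):.0%}")  # 27% productivity
print(f"{improvement(0.38, 0.30):.0%}")    # 21% process quality
```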
Summary
Lessons from the study:
• The majority of finalized projects are small in size, while conversely medium and large-sized projects deliver the greater part of the value for the end user
• Large projects are cheaper, deliver a function point faster, and show fewer defects per function point than small projects
• Agile projects show better productivity, time-to-market and process quality than waterfall projects: agile teams work faster, cheaper and deliver better quality
• Standard error (r²): agile trends are more reliable as a source for project estimates
Agile werkt - © Goverdson 2012