Performance Measurement of Software Application Development & Maintenance
2004 IFPUG Conference
The David Consulting Group, www.davidconsultinggroup.com
Copyright © 2004. The David Consulting Group, Inc., IFPUG


Page 1

Performance Measurement of Software Application Development & Maintenance

2004 IFPUG Conference

The David Consulting Group
www.davidconsultinggroup.com

Page 2

Measurement Must Consider

Strategic Positioning (Business & Technical):
- Improve Productivity
- Reduce Time to Market
- Increase Quality
- Lower Costs
- Continuous Process Improvement
- Deliver Value
- Satisfy Customer
- Improve Competitive Position
- Increase Market Share
- Improve Margins
- Increase Revenues
- Shareholder Value

Page 3

Presentation Topics

Measurement for Process Improvement

Baseline your Performance

Model Performance Improvements

Page 4

Measurement for Process Improvement

QUANTITATIVE (measures how you are doing):
- Deliverable Size, Effort/Cost, Duration, Quality
- Yields Measured Performance

QUALITATIVE (identifies what you are doing):
- Process, Methods, Skills, Tools, Environment
- Yields Capability Maturity

Together these establish a Baseline of Performance: a standard of performance.

Page 5

What Gets Measured?

Business Related Measures (Unit Delivery Cost, Time To Market, Customer Satisfaction):
measure Contribution, the impact of IT on the business.

Process Related Measures (Effectiveness, Integration, Compliance):
identify trends and help to monitor progress.

Project Related Measures (Project Tracking, Estimating, Change Management):
support effective utilization of measures in a pro-active format.

Page 6

Keys to Measurement Program

- Measure From User Viewpoint
- Base Metrics on the Goals of the Process Being Measured
- Keep Measures Simple & Flexible

Page 7

Utilize Results in Decision Making

Improvements resulting from current and future initiatives must be measured.

The basis for measuring improvements may include:
- Industry data
- Organizational baseline data

To improve its development practices, the organization must put a "stake in the ground" that marks its current performance level.

Page 8

Presentation Topics

Measurement for Process Improvement

Baseline your Performance

Model Performance Improvements

Page 9

Baseline Activities

- Identify sample set (typically project oriented)
- Collect baseline data:
  - Project measures (e.g., effort, size, cost, duration, defects)
  - Project attributes (e.g., skill levels, tools, process, etc.)
- Analyze data:
  - Performance comparisons (identification of process strengths and weaknesses)
  - Industry averages and best practices
  - Performance modeling (identify high-impact areas)
- Report results

Page 10

Identify Sample Set & Collect Data

Research collects two kinds of data:

MEASURES (establish performance levels): Software Size, Level of Effort, Time to Market, Delivered Defects, Cost

CHARACTERISTICS (establish profiles): Skill Levels, Automation, Process, Management, User Involvement, Environment

Analysis results:
- Correlate performance levels to characteristics
- Substantiate impact of characteristics
- Identify best practices

Page 11

Quantitative Data

Project                         Start Date  End Date  FP   Effort (Mths)  Schedule (Months)  Cost      Delivered Defects
Project abc                     8/18/03     1/9/04    122  24.08          4.75               $375,600  17
Project xyz                     3/15/03     12/10/03  111  8.63           8.75               $134,640  1
Project 123                     5/27/02     5/9/03    83   25.77          11.50              $401,958  3
Project 890                     8/15/03     10/19/03  52   5.50           2.25               $85,800   0
Capstone                        12/1/02     4/1/03    20   1.34           4.00               $20,880   6
CRS ROCI Release 4.09.02        2/19/04     3/4/04    19   1.23           0.50               $19,200   0
S-Tracker Upgrade               8/1/03      8/22/03   19   0.92           0.75               $14,400   1
Low and Grow CLIP Enhancement   3/12/03     5/28/03   12   1.27           2.50               $19,800   2
Account Status Change           9/15/03     2/9/04    12   6.15           4.75               $95,880   0
Reversal Automation             8/27/03     1/5/04    9    1.64           4.25               $25,560   4
NBR Cash Differentiation        8/5/03      1/5/04    8    3.43           4.50               $53,460   1
CROI Multi-Accounts             8/1/03      9/15/03   7    0.77           1.50               $12,000   0
Low and Grow CLIP Delay         11/13/02    1/19/03   7    0.35           2.00               $5,400    1
Dialer Redesign                 12/22/02    5/18/03   6    0.69           5.00               $10,800   2
CROI Enhancements (Feb 03)      2/2/03      2/13/03   3    0.25           0.40               $3,960    0

Project data is collected for:
- Start and end date (duration)
- Size, expressed in function points
- Effort (labor)
- Defects, pre- and post-implementation
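As a quick sanity check on the collected data, the schedule column can be re-derived from the start and end dates. This is a hedged sketch: the 30.4-day average month is an assumption of mine, not a convention stated in the presentation.

```python
# Re-derive calendar duration (the "Schedule (Months)" column) from dates.
from datetime import date

def schedule_months(start, end, days_per_month=30.4):
    """Calendar duration between two dates, in assumed 30.4-day months."""
    return (end - start).days / days_per_month

# "Project abc" from the table above: 8/18/03 to 1/9/04, listed as 4.75 months
print(round(schedule_months(date(2003, 8, 18), date(2004, 1, 9)), 2))  # 4.74
```

The computed 4.74 months agrees with the listed 4.75 to within rounding, which is what such a cross-check is meant to confirm.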

Page 12

Analyze Project Attributes

MANAGEMENT: Team Dynamics, High Morale, Project Tracking, Project Planning, Automation, Management Skills

DEFINITION: Clearly Stated Requirements, Formal Process, Customer Involvement, Experience Levels, Business Impact

DESIGN: Formal Process, Rigorous Reviews, Design Reuse, Customer Involvement, Experienced Development Staff, Automation

BUILD: Code Reviews, Source Code Tracking, Code Reuse, Data Administration, Computer Availability, Experienced Staff, Automation

TEST: Formal Testing Methods, Test Plans, Development Staff Experience, Effective Test Tools, Customer Involvement

ENVIRONMENT: New Technology, Automated Process, Adequate Training, Organizational Dynamics, Certification

Page 13

Software Practices Profile

Profile scores reflect the "goodness" of the development practices for a given project. Six categories are evaluated and scored. The higher the score, the higher the probability of a successful delivery.

Project    Profile Score  Mgmt   Def    Des    Build  Test   Env
Project 1  75.9           84.09  76.92  77.27  65.38  81.25  65.38
Project 2  60.3           68.18  64.10  56.82  61.54  59.38  50.00
Project 3  60.0           61.36  33.33  81.82  57.69  65.63  65.38
Project 4  54.5           61.36  64.10  45.45  61.54  50.00  42.31
Project 5  39.4           68.18  20.51  50.00  53.85  25.00  46.15
Project 6  35.3           38.64  20.51  15.91  69.23  43.75  38.46
Project 7  31.4           36.36  30.77  4.55   42.31  43.75  46.15

Page 14

Strengths And Opportunities (An Example): Definition

Strengths:
- Requirements are clearly stated and stable
- Development and customers are experienced in the applications

Opportunities for Improvement:
- A more formal requirements gathering process on larger projects
- More consistent use of prototyping on larger projects
- A formal review process

Page 15

Establish A Baseline

[Chart: Organizational Baseline. Rate of delivery (function points per person month, 0 to 36) plotted against software size (0 to 2,200 FP); each point is a measured project.]

- A representative selection of projects is measured
- Size is expressed in terms of functionality delivered to the user
- Rate of delivery is a measure of productivity

Page 16

Baseline Data Measures Performance

Baseline data connects the customer view and the IT view:

- Product Deliverable Performance Indicators: Functional Size, Duration, Cost, Quality
- Industry Baselines: Average and Best-in-Class Performance Levels
- Productivity Measures: Delivery Rate, Time to Market, Cost per Unit of Work, Defect Density
- Functionality Delivered: Business Value, Operational Efficiency, Ability to Compete, Customer Satisfaction
- Comparative Analysis: Performance Comparison, Identification of Risks, Profile of Performance, Best Practices Identification

Page 17

Evaluating Performance Without Size

Project       Cost ($000s)  Quality (Defects Released)
PO Special    $500          12
Vendor Mods   $760          18
Pricing Adj.  $40           5
Store Sys.    $990          22

Which project produced the best results?

Page 18

Evaluating Performance With Size

Project       Size  Cost ($000s)  Rate    Quality (Defects Released)  Density
PO Special    250   $500          $2,000  12                          .048
Vendor Mods   765   $760          $993    18                          .023
Pricing Adj.  100   $80           $800    5                           .050
Store Sys.    1498  $990          $660    22                          .014

Size: Function Points
Rate: Cost per Function Point
Density: Defects per FP
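The size normalization in the table above is simple arithmetic: divide cost and defects by functional size. A minimal sketch, using the figures from the table (the function names are illustrative, not an IFPUG standard):

```python
# Normalize project cost and quality by functional size (function points).

def unit_cost(cost_thousands, size_fp):
    """Rate: dollars per function point."""
    return cost_thousands * 1000 / size_fp

def defect_density(defects, size_fp):
    """Density: released defects per function point."""
    return defects / size_fp

projects = {
    # name: (size in FP, cost in $000s, released defects)
    "PO Special":  (250,  500, 12),
    "Vendor Mods": (765,  760, 18),
    "Pricing Adj.": (100,  80,  5),
    "Store Sys.":  (1498, 990, 22),
}

for name, (size, cost, defects) in projects.items():
    print(f"{name}: ${unit_cost(cost, size):,.0f}/FP, "
          f"{defect_density(defects, size):.3f} defects/FP")
```

Store Sys., the costliest project in raw dollars, turns out to have the lowest cost per FP and the lowest defect density, which is the point of the two slides: without size, the question "which project produced the best results?" cannot be answered.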

Page 19

Compare to Industry Benchmarks

[Chart: the organizational baseline scatter (rate of delivery in function points per person month vs. software size) overlaid with industry baseline performance.]

Page 20

Average of Recent Projects Across Different Platforms
(Function Points per Person Month)

Web               25
e-business Web    15
Client Server     17
Main Frame        13
Vendor Packages   18
Data Warehouse    9

Page 21

Average of Support Provided for Corrective Maintenance by One FTE
(Function Points Supported by One FTE)

Web               756
e-business Web    438
Client Server     642
Main Frame        978
Vendor Packages   740
Data Warehouse    392

Page 22

Track Improvements

[Chart: the baseline rate-of-delivery scatter (function points per person month vs. software size), with the Year 2 grouping of projects highlighted to track progress.]

Page 23

Presentation Topics

Measurement for Process Improvement

Baseline your Performance

Model Performance Improvements

Page 24

Model Improvements

- Model the impact of implementing selected process improvements
- Evaluate the impact on productivity
- Modeling is performed from several perspectives: Management Improvements, Design Improvements, Definition Improvements, Build Improvements, Test Improvements, Environment Improvements, and SEI CMM Specific Improvements

EXAMPLE: Improvements are measured from the following baseline:

- Average Project Size: 133 Function Points (FPs)
- Average Productivity: 10.7 FP/Effort Month (EM)
- Average Time-to-Market: 7.3 Months
- Average Cost/FP: $934.58
- Projected Delivered Defects/FP: .0301
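The modeling step can be sketched numerically from the baseline figures quoted above. This is a minimal sketch, not the David Consulting Group's model: the 20% productivity-gain scenario is an invented example, and the assumption that cost scales with effort is mine.

```python
# Project effort and unit cost forward under a relative productivity gain,
# seeded with the baseline figures from the slide above.

BASELINE = {
    "size_fp": 133,         # average project size (function points)
    "fp_per_em": 10.7,      # productivity: FP per effort month
    "cost_per_fp": 934.58,  # dollars per function point
}

def model_scenario(productivity_gain):
    """Return (effort months, cost/FP) under a relative productivity gain.

    Assumes cost scales with effort, so cost/FP falls by the same ratio.
    """
    fp_per_em = BASELINE["fp_per_em"] * (1 + productivity_gain)
    effort_months = BASELINE["size_fp"] / fp_per_em
    cost_per_fp = BASELINE["cost_per_fp"] / (1 + productivity_gain)
    return round(effort_months, 1), round(cost_per_fp, 2)

print(model_scenario(0.0))   # baseline: (12.4, 934.58)
print(model_scenario(0.20))  # +20% productivity: (10.4, 778.82)
```

Running each candidate improvement (management, definition, design, build, test, environment) through such a model is what lets the high-impact areas be identified before any initiative is funded.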

Page 25

Modeled Improvements

Current improvement initiatives (SEI) are appropriately targeted at the majority of "weak spots" revealed by the baseline results.

Page 26

Strive for Continuous Improvement

[Diagram: software process improvement links the measured baseline, capabilities (skill levels, process, technology, deliverables), and performance/productivity (time to market, resources, defects, costs, risks) to business value through improvement initiatives and best practices.]

[Chart: delivery rate (0 to 35) vs. software size (0 to 6,400 FP), with bands for sub-performance, the organization baseline, industry averages, and best practices.]