Carnegie Mellon Software Engineering Institute

Software Process Performance Measures
2006.02.01

James Over
Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213-3890

© 2006 by Carnegie Mellon University
Purpose

Why are you interested in process improvement? Hopefully for the process performance benefits. If so, process performance measurement is a key concern.

Many of the examples in this presentation are from the Team Software ProcessSM; however, the concepts are broadly applicable.

SM Team Software Process is a registered service mark of Carnegie Mellon University.
Team Software Process

The Team Software Process (TSP) is an integrated set of practices for developing software.

TSP is a process-based solution to common software engineering and management issues:
• cost and schedule predictability
• productivity and product quality
• process improvement

Unlike other methods, TSP
• relies on self-directed teams
• emphasizes measurement and quality management
• provides immediate and measurable benefits
• accelerates CMMI-based improvement
TSP Performance Summary - 1

Performance Category     TSP Impact Study (2003)*   Typical Industry Performance (Standish Group)**
Schedule error average   6%                         Cancelled: 29%
Schedule error range     -20% to +27%               On-time: 26%
                                                    Less than 20% late: 6%
                                                    21%-50% late: 8%
                                                    51%-100% late: 9%
                                                    101%-200% late: 16%
                                                    More than 200% late: 6%

* From a study of 20 projects in 13 organizations conducted in 2003
** Of the unsuccessful projects, average schedule error was 222%
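The schedule-error figures above are derived measures. A minimal sketch of the computation, assuming the conventional definition (actual minus planned, over planned); the project numbers are illustrative, not from the study:

```python
# Schedule error as a derived measure. Assumed definition:
# (actual - planned) / planned; positive means late.

def schedule_error(planned_weeks: float, actual_weeks: float) -> float:
    """Return schedule error as a fraction of the planned duration."""
    return (actual_weeks - planned_weeks) / planned_weeks

# Hypothetical project data: (planned weeks, actual weeks)
projects = [(20, 21), (30, 27), (16, 20)]
errors = [schedule_error(p, a) for p, a in projects]
avg_error = sum(errors) / len(errors)
# schedule_error(20, 21) -> 0.05 (5% late)
# schedule_error(30, 27) -> -0.10 (10% early)
```

Averaging such errors across a portfolio of projects yields summary figures like the 6% average reported for the TSP study.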
TSP Performance Summary - 2

Performance Category                            TSP Impact Study (2003)*   Typical Industry Performance
System test defects per thousand instructions   0.4 avg., 0.0 to 0.9       2 to 14
Released defects per thousand instructions      0.06 avg., 0.0 to 0.2      1 to 7
System test effort (% of total effort)          4% avg., 2% to 7%          40%

* From a study of 20 projects in 13 organizations conducted in 2003
TSP Performance Summary - 3

An analysis of 20 projects in 13 organizations showed TSP teams averaged 0.06 defects per thousand lines of new or modified code. Approximately 1/3 of these projects were defect-free. These results are substantially better than those achieved in high maturity organizations.

Defects/KLOC by CMMI maturity level:
Level 1: 7.5   Level 2: 6.24   Level 3: 4.73   Level 4: 2.28   Level 5: 1.05   TSP: 0.06

Source: CMU/SEI-2003-TR-014
TSP-CMMI Overall Coverage

[Chart: for each CMMI maturity level (2 through 5), the percentage of specific practices that are Directly Addressed, Supported, Partially Addressed, Not Addressed, or Unrated by TSP.]
Topics

Process management concepts
TSP measurement framework
Performance measures
SEI Process Management Premise
“The quality of a software system is governed by the quality of the process used to develop and evolve it.”
- Watts Humphrey
Managed Process

The CMMI defines a managed process as a process with the following characteristics:
• a performed process that is planned and executed in accordance with policy
• it employs skilled people who have adequate resources to produce controlled outputs
• it involves relevant stakeholders
• it is monitored, controlled, and reviewed
• it is evaluated for adherence to its process description
Process Management

Process ownership: key responsibilities for designing, establishing, and implementing the process, and the mechanisms for measurement and corrective action, are assigned.

Process definition: the design and formal documentation of the components of the process and their relationships.

Process control: the function of ensuring that the process output meets specifications, including
• measurement
• control variable(s)
• feedback loop(s)
• defect detection, correction, and prevention
Source: Quality Process Management by Gabriel Pall
Process Management Concept

[Diagram: a work process transforms input into output, with a control element regulating the process.]
Example

[Diagram: an inspection process transforms input into output. Review rate is the control variable; process yield and system test yield are the measured outputs that close the feedback loop.]
Process Management Conclusions

A defined process is a prerequisite for process management.

The enactment of the process should not differ from the defined process in any substantive way.

The key determinants of process performance must be instrumented and measured. Failure to measure, or limited measurement scope, can lead to sub-optimization or “process tampering.”

The process and measures should be designed to support process management from the start.
Topics

Process management concepts
TSP measurement framework
Performance measures
Process Measurement Issues

Some common process measurement issues:
• substantial variation in measurement reporting requirements across development groups and suppliers
• few measures of quality
• standards emphasize derived measures instead of common base measures
• inability to summarize, aggregate, drill down, or extend
• cannot benchmark or make comparisons
• limited use as a management indicator
• lack of accountability
• measurement framework “literally” tied to CMMI process areas and the examples of derived measures from CMMI
Measurement System

Design and a “systems” approach solve many measurement issues.
• Define a few common base measurement categories and establish standards for the most-used instances.
• Develop a measurement framework that relates the base measures to the key elements of software process work.
• Create derived measures from the standard base measures.
• Identify process performance models and benchmarks that predict future performance.
• Integrate measurement into monitoring and decision-making processes.
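The base-to-derived idea can be sketched in a few lines. The measure names and numbers here are illustrative assumptions, not part of the TSP specification:

```python
# A few standard base measures feed many derived measures.
# Base measures for one hypothetical component:
base = {"size_loc": 4000, "effort_task_hours": 320, "defects_found": 24}

def productivity(b):
    """Derived measure: lines of code per task hour."""
    return b["size_loc"] / b["effort_task_hours"]

def defect_density(b):
    """Derived measure: defects per thousand lines of code (KLOC)."""
    return b["defects_found"] / (b["size_loc"] / 1000)

# productivity(base) -> 12.5 LOC/task hour
# defect_density(base) -> 6.0 defects/KLOC
```

Because every derived measure is computed from the same few base categories, adding a new derived measure never requires new data collection.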
TSP Measurement Framework - 1

Base measurement categories: Size, Effort, Defects, Schedule

Example derived measures:
• Estimation accuracy
• Prediction intervals
• Productivity
• Cost performance index
• Planned value
• Earned value
• Predicted earned value
• Defect density
• Defect density by phase
• Defect removal rate by phase
• Defect removal leverage
• Review rates
• Process yield
• Phase yield
• Failure cost of quality
• Appraisal cost of quality
• Appraisal/Failure COQ ratio
• Percent defect free
• Defect removal profiles
• Quality profile
• Quality profile index
• …
TSP Measurement Framework - 2

A model of key process elements and their relations provides a context for the base measures:
• processes and phases
• projects and sub-projects
• products and parts
• teams and team members
• tasks
• period (week, month, etc.)

The model facilitates
• analysis
• aggregation and drill-down
• queries and views
• scalability

[Diagram: Process, Project, Team, Product, Tasks, and Period as related model elements.]
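One way to realize this model is to tag every base measurement with the key elements above, so aggregation and drill-down become simple group-bys. The record layout here is an illustrative assumption:

```python
from collections import defaultdict

# Each base measurement tagged with the framework's key elements:
# (phase, project, product part, team member, week, task hours)
records = [
    ("design", "proj-A", "comp-1", "alice", 1, 6.0),
    ("design", "proj-A", "comp-2", "bob",   1, 4.0),
    ("code",   "proj-A", "comp-1", "alice", 2, 5.0),
]

def total_by(records, key_index):
    """Roll up the last field (task hours) by any one dimension."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[key_index]] += rec[-1]
    return dict(totals)

# total_by(records, 0) -> {"design": 10.0, "code": 5.0}   (by phase)
# total_by(records, 3) -> {"alice": 11.0, "bob": 4.0}     (by member)
```

The same records roll up by phase, component, member, or week without any change to data collection, which is what makes the framework scalable.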
Estimated and Actual Size

Size is a measure of the magnitude of the software deliverable, e.g., lines of code or function points.

Size is estimated, and actual size is measured, for each component.

Five size accounting categories are used:
• Base
• Modifications to the base
• Deletions from the base
• Added or new
• Reused

Size data are used to
• estimate effort
• track progress
• normalize other measures
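The five categories combine into totals. The arithmetic below follows the usual PSP/TSP size-accounting conventions; this is an assumption, since the slide lists the categories but not the formulas:

```python
# Size accounting from the five categories on the slide.
# Modified lines are counted within the base for the total, but count
# toward "new and changed" size (the usual normalization basis).

def total_size(base, deleted, modified, added, reused):
    """Total size of the finished component."""
    return base - deleted + added + reused

def new_and_changed(modified, added):
    """Size that required development work; used to normalize
    effort and defect measures."""
    return modified + added

# total_size(1000, 100, 50, 400, 200) -> 1500 LOC
# new_and_changed(50, 400) -> 450 LOC
```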
Estimated and Actual Effort

Effort is a measure of time on task.

The TSP effort measure is called a task hour.

Task hours are estimated and measured by
• process phase
• task
• day or week

How many task hours are there in a 40-hour week? About 15 to 20.
Estimated and Actual Schedule

Schedule has two components:
• resource availability
• task completion dates

Planned task dates are calculated from estimates of resource availability and planned task hours.

Actual date completed is recorded as tasks are finished.

Actual resource availability is derived from actual task hours.
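The planned-date calculation can be sketched as follows; the scheduling scheme and numbers are illustrative assumptions, not the TSP tool's algorithm:

```python
import math

def planned_completion_weeks(task_hours, hours_per_week):
    """Given planned task hours per task and estimated resource
    availability (task hours per week), return the 1-based week in
    which each task is planned to finish, working tasks in order."""
    weeks, elapsed = [], 0.0
    for h in task_hours:
        elapsed += h
        weeks.append(math.ceil(elapsed / hours_per_week))
    return weeks

# Three tasks of 10, 12, and 8 task hours, at 15 task hours/week:
# planned_completion_weeks([10, 12, 8], 15) -> [1, 2, 2]
```

When actual task hours per week fall below the estimate, recomputing with the actual availability immediately shows the schedule slip.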
Estimated and Actual Defects

Defects are the measure of quality.

The number of defects injected and removed is estimated, and the actual number of defects injected and removed is counted.

Defect data include
• component
• phase injected
• phase removed

Definition: a defect is a work product element that must be changed, after the work product was completed, in order to ensure proper design, implementation, test, use, or maintenance.
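With phase-injected and phase-removed counts recorded, phase yield falls out directly. The formula below is the commonly used TSP definition; treat it as an assumption, since the slide lists the data but not the formula:

```python
def phase_yield(removed_in_phase, escaped_from_phase):
    """Percent of the defects present in (entering or injected during)
    a phase that the phase removes. Defects that escape are found in
    later phases, which is how the escape count becomes known."""
    present = removed_in_phase + escaped_from_phase
    return 100.0 * removed_in_phase / present if present else 100.0

# A review that finds 18 defects while 2 escape to later phases:
# phase_yield(18, 2) -> 90.0
```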
Topics

Process management concepts
TSP measurement framework
Performance measures
TSP Performance Measures

The most often used TSP performance measures are:
• Planned value, earned value, predicted earned value
• Planned and actual task hours
• Estimation error
• Growth
• Defect density
• Percent defect-free
• Quality profile and index
These measures support planning and tracking.
Combined with historical data and/or benchmarks, these measures also support process performance modeling.
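Planned and earned value, the first measures on the list, can be sketched as follows. The no-partial-credit rule reflects common TSP practice, and the numbers are illustrative:

```python
def planned_values(task_hours):
    """Each task's planned value is its share of total planned task
    hours, expressed as a percentage of the whole plan."""
    total = sum(task_hours)
    return [h * 100.0 / total for h in task_hours]

def earned_value(task_hours, completed_flags):
    """A task earns its planned value only when 100% complete
    (no partial credit), so earned value cannot be inflated by
    many half-finished tasks."""
    pv = planned_values(task_hours)
    return sum(v for v, done in zip(pv, completed_flags) if done)

# Four tasks of 10, 20, 10, and 10 hours; the first two complete:
# earned_value([10, 20, 10, 10], [True, True, False, False]) -> 60.0
```

Comparing earned value against planned value week by week gives the schedule-tracking signal, and projecting the recent earning rate forward gives predicted earned value.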
Process Performance Models

[Diagram: project data, together with historical data and benchmarks, feed a process performance model, which produces predicted project performance.]
Example: Quality Profile

[Diagram: the quality profile as a process performance model.]
Project data: time in design, design review, coding, and code review; defects found in compile and unit test; product size.
Benchmarks: development time ratio criteria; defect density criteria.
Predicted value: likelihood of post-system-test defects.
Quality Profile Benchmarks

These software quality benchmarks predict post-development defects. Modules that meet these criteria were found to be largely defect-free in system test and after deployment.

Software Quality Benchmarks
Derived Measure                         Desired Value
Design time vs. code time ratio         1 to 1
Design vs. design review time ratio     2 to 1
Code vs. code review time ratio         2 to 1
Compile defect density                  < 10 per KLOC
Unit test defect density                < 5 per KLOC
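Checking a component against these five benchmarks can be sketched directly from the table; the field names are illustrative assumptions:

```python
# The five software quality benchmarks from the table, as predicates.
BENCHMARKS = {
    "design_vs_code_time":      lambda c: c["design_time"] >= c["code_time"],                # 1 to 1
    "design_vs_design_review":  lambda c: c["design_review_time"] >= c["design_time"] / 2,   # 2 to 1
    "code_vs_code_review":      lambda c: c["code_review_time"] >= c["code_time"] / 2,       # 2 to 1
    "compile_defect_density":   lambda c: c["compile_defects_per_kloc"] < 10,
    "unit_test_defect_density": lambda c: c["unit_test_defects_per_kloc"] < 5,
}

def failed_criteria(component):
    """Return the names of the benchmarks this component misses."""
    return [name for name, ok in BENCHMARKS.items() if not ok(component)]

# A hypothetical component: too little code review, too many unit test defects.
component = {"design_time": 8, "code_time": 8, "design_review_time": 4,
             "code_review_time": 3, "compile_defects_per_kloc": 6,
             "unit_test_defects_per_kloc": 7}
# failed_criteria(component) -> ["code_vs_code_review", "unit_test_defect_density"]
```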
Quality Profile

The quality profile is a process performance model that provides an early warning indicator for post-development defects.

The quality profile uses the five software quality benchmarks. Satisfied criteria are plotted at the outside edge of the chart.

[Charts: radar plots of the five risk factors (design/code time, design review time, code review time, compile defects/KLOC, unit test defects/KLOC) for a high-quality component (Component 2) and a poor-quality component (Component 5). In the poor-quality component, inadequate design review time results in design defects escaping to test and production.]
Using the Quality Profile

[Charts: planned vs. actual quality profiles (design/code time, design review time, code review time, compile defects/KLOC, unit test defects/KLOC) for four assemblies: Common Query Changes (BE), BOM Query Sproc Changes (BE), User Report Settings (BE), and OEMMOO Delivery.aspx (FE-Server).]
Quality Performance Index

The Quality Performance Index (QPI) is the product of the five parameters in the quality profile. QPI predicts the likelihood of post-development defects in a system.

[Chart: Quality Performance Index vs. post-development defect density (defects/KLOC).]

Interpreting the Quality Performance Index
Range        Interpretation
0.0 to 0.2   Re-inspect; test and post-development defects likely
0.2 to 0.4   Re-inspect if test defects are found
0.4 to 1.0   Component is of high quality
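A minimal sketch of the QPI computation and the interpretation table above. The five profile values are taken as already-normalized inputs in [0, 1], since the slide does not give each parameter's normalization:

```python
def quality_performance_index(profile_values):
    """QPI is the product of the five quality profile parameters,
    each normalized to the range 0..1."""
    qpi = 1.0
    for v in profile_values:
        qpi *= v
    return qpi

def interpret_qpi(qpi):
    """Interpretation ranges from the slide's table."""
    if qpi < 0.2:
        return "Re-inspect; test and post-development defects likely"
    if qpi < 0.4:
        return "Re-inspect if test defects are found"
    return "Component is of high quality"

# One weak parameter drags the whole index down, which is the point:
# quality_performance_index([1.0, 0.9, 1.0, 0.8, 0.5]) -> about 0.36,
# interpreted as "Re-inspect if test defects are found".
```

Because QPI is a product rather than an average, a component cannot compensate for skipping design review by doing extra code review; every parameter must be healthy for the index to stay high.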
Conclusion

Measurement and process management are inseparable; you should incorporate measurement in your initial processes.

A common problem with software process measurement is the lack of an integrated, well-designed measurement system, resulting in unnecessary complexity and usability issues such as a lack of scalability and extensibility.

Process management can be successfully applied to the software process with a few simple derived measures that are integrated into a measurement framework.