1
Testing Metrics: Software Reliability
CSSE 376, Software Quality Assurance
Rose-Hulman Institute of Technology
April 5, 2007
2
Outline
Testing Metrics
An Important Metric: Reliability
3
Common Metrics
Product
  KLOC: thousands of lines of code (need to remove comment lines?)
  Function points
  #Bugs
Process
  Staff hours
  Tests planned
  Tests passed
4
Bug Density
Measure #Bugs/KLOC
Expect different densities at different stages of a project
May categorize bugs by severity
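The density calculation is straightforward; a minimal Python sketch, using made-up bug counts and a hypothetical code size (none of these numbers come from the slides):

```python
# Bug-density sketch with illustrative counts only.
loc = 48_000  # total lines of code, comments already stripped
phase_bugs = {"Code": 120, "UnitTest": 75, "SysTest": 30}

# Bugs per KLOC for each phase.
densities = {phase: bugs / (loc / 1000) for phase, bugs in phase_bugs.items()}
print(densities)  # → {'Code': 2.5, 'UnitTest': 1.5625, 'SysTest': 0.625}
```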
5
Example Bug Density
[Chart: Bugs/KLOC (0-30) by phase -- Reqts, HLD, LLD, Code, Unit Test, Sys Test -- broken down by severity: Catastrophic, Major, Minor]
6
Cartoon of the Day (1/3)
7
Using Bug Metrics
Count bugs discovered during each phase of a project
Compare to previous projects
  provides estimates of expected values at each phase -- could use to set milestones
  deviation of more than 20% from expected indicates need for investigation
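The 20% rule above can be sketched in Python; the per-phase expected and observed counts are hypothetical, as is the `needs_investigation` helper:

```python
# Made-up bug counts per phase; expectations would come from previous projects.
expected = {"Reqts": 40, "HLD": 55, "LLD": 60, "Code": 120}
observed = {"Reqts": 38, "HLD": 70, "LLD": 59, "Code": 118}

def needs_investigation(expected_bugs, observed_bugs, tolerance=0.20):
    """Flag a phase when observed bugs deviate more than 20% from expected."""
    return abs(observed_bugs - expected_bugs) / expected_bugs > tolerance

flagged = [p for p in expected if needs_investigation(expected[p], observed[p])]
print(flagged)  # → ['HLD']  (deviates by ~27%)
```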
8
Analysis of Bug Data
Root cause analysis
  Search for explanations
  Might look at other process data (effort, experience of team, etc.)
Trend analysis
  Make predictions from current data
9
Reliability
10
Failures vs. Faults
Fault: developer-oriented
  e.g., 6 faults/1000 source lines
Failure: customer-oriented
  e.g., 3 failures/1000 CPU hours
11
Calculating Reliability
Reliability
  probability of failure-free operation for a specified time interval
  e.g., 0.82 for 8 CPU hours
Failure intensity
  number of observed failures within a specified time interval
  e.g., 3 failures/1000 CPU hours
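The two quantities are related if one assumes a constant failure intensity over the interval, in which case reliability follows the standard exponential relation R(t) = exp(-lambda * t). The slides do not state this model, so treat the sketch below as an assumption:

```python
import math

def reliability(failure_intensity, interval):
    """Probability of failure-free operation over `interval`, assuming a
    constant failure intensity (exponential model -- an assumption, not
    something stated on the slide)."""
    return math.exp(-failure_intensity * interval)

# 3 failures/1000 CPU-hr over 8 CPU hours:
print(round(reliability(3 / 1000, 8), 3))

# Conversely, the intensity implied by R = 0.82 over 8 CPU hours:
print(round(-math.log(0.82) / 8, 4))
```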
12
Factors Influencing Reliability
Fault removal
  by error correction (debugging)
Fault introduction
  by error correction (unintended)
  by new feature development
Operational profile
13
Operational Profile
[Chart: probability of use (0 to 0.05) for functions 1 through 20]
14
Example
Function   Usage Probability   Distribution Interval
Change     32%                 0-31
Delete     14%                 32-45
Insert     46%                 46-91
Print      8%                  92-99
15
Test Generation
Test   Random Numbers            Test Cases
1      29, 11, 47, 52, 26, 94    C, C, I, I, C, P
2      62, 98, 39, 78, 82, 65    I, P, D, I, I, I
3      83, 32, 58, 41, 36, 17    I, D, I, D, D, C
4      36, 49, 96, 82, 20, 77    D, I, P, I, C, I
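The mapping from random numbers to operations can be sketched in Python using the intervals from the operational-profile example; the `to_function` and `generate_test` names are mine, not the slides':

```python
import random

# Each function owns a band of 0-99 proportional to its usage probability.
intervals = [
    ("Change", 0, 31),   # 32%
    ("Delete", 32, 45),  # 14%
    ("Insert", 46, 91),  # 46%
    ("Print", 92, 99),   # 8%
]

def to_function(n):
    """Map a random number in 0-99 to the function whose interval contains it."""
    for name, lo, hi in intervals:
        if lo <= n <= hi:
            return name
    raise ValueError(n)

def generate_test(length, rng=random):
    """Draw `length` operations according to the operational profile."""
    return tuple(to_function(rng.randrange(100)) for _ in range(length))

# Reproducing test 1 from the table:
print([to_function(n) for n in (29, 11, 47, 52, 26, 94)])
# → ['Change', 'Change', 'Insert', 'Insert', 'Change', 'Print']
```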
16
Test Compression
Real use of a product involves repetitive operations
  different users invoke the same operations
  same user invokes the same operations on different days
Redundant tests waste computer and personnel time
Compression: when generating random tests, do not duplicate previous tests
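One way to realize compression is to keep generated tests in a set and skip duplicates; a sketch under that assumption, with a hypothetical generator over the four editor operations:

```python
import random

def compressed_tests(generate, count, max_attempts=10_000):
    """Generate `count` distinct random tests, discarding any test that
    duplicates a previously generated one (compression)."""
    seen = set()
    attempts = 0
    while len(seen) < count and attempts < max_attempts:
        seen.add(generate())  # sets silently drop duplicates
        attempts += 1
    return seen

rng = random.Random(0)
# Hypothetical generator: length-3 sequences over the four operations.
gen = lambda: tuple(rng.choice("CDIP") for _ in range(3))
tests = compressed_tests(gen, 10)
print(len(tests))  # → 10 distinct tests
```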
17
Cartoon of the Day (2/3)
18
Cartoon of the Day (3/3)
19
Curve Fitting
Reliability models focus on failure removal
Use a random process to model the failure removal process
20
Execution Time Model
[Chart: failure intensity decreasing with execution time toward a goal]
21
Resource Constraints
Early phase of a project
  constrained by availability of developers (debuggers)
Middle phase
  constrained by availability of testers
Late phase
  constrained by availability of machines
  may run tests in parallel to increase number of tests per CPU hour
22
Adjusting for Calendar Time
Estimate resource usage during each phase of the project
Model calculates failure intensity in terms of execution time
Model adjusts fault removal rate according to resource constraints
23
Calendar Time Component
[Chart: failure intensity vs. execution time, declining toward the goal through regions constrained by debuggers, then testers, then machines]
24
Calculating the Calendar-Time/Execution-Time Ratio
10 staff-hours to fix each failure
2 failures/CPU-hr
That means it will take 10 * 2 = 20 staff-hrs per CPU-hr
Suppose you have 5 developers
Then you have 20 / 5 = 4 hrs per CPU-hr
Each CPU-hr will take 4 calendar hours
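The arithmetic above can be packaged as a small function; the name and parameters are mine:

```python
def calendar_hours_per_cpu_hour(staff_hours_per_failure,
                                failures_per_cpu_hour,
                                developers):
    """Calendar time consumed per CPU-hour of testing when the phase is
    constrained by the debugging staff."""
    staff_hours = staff_hours_per_failure * failures_per_cpu_hour
    return staff_hours / developers

# The slide's numbers: 10 staff-hrs/failure, 2 failures/CPU-hr, 5 developers.
print(calendar_hours_per_cpu_hour(10, 2, 5))  # → 4.0
```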
25
Estimating Completion
1. Establish a failure intensity objective
2. Record execution times of failures
3. Run model to estimate reliability
4. Model reports estimated completion date
Values are not absolute: they fall within confidence bounds
26
Estimating Completion
[Chart: failure intensity vs. calendar time, declining to the goal at the projected ship date]
27
Acceptance Charts
[Chart: bugs vs. time, divided into Reject, Continue Testing, and Accept regions]