SOFTWARE ENGINEERING
Mihaela Dinsoreanu, PhD
Computer Science Department
Fall 2011
SOFTWARE QUALITY METRICS OVERVIEW
Software metrics:
- Product
  - Size
  - Complexity
  - Design features
  - Performance
- Process
  - Effectiveness of defect removal during development
  - Pattern of testing defect arrival
  - Response time of the fix process
- Project
  - Number of software developers
  - Staffing pattern over the life cycle of the software
  - Cost
  - Schedule
  - Productivity
SOFTWARE QUALITY METRICS
A subset of software metrics that focuses on the quality aspects of the product, process, and project.
Divided into:
- end-product quality metrics
- in-process quality metrics
END-PRODUCT QUALITY METRICS
Mean time to failure (MTTF)
- Safety-critical systems (e.g., the US air traffic control system may not be unavailable for more than 3 seconds per year)
Defect density
- Many commercial software systems
Customer problems
Customer satisfaction
DEFECTS
According to the IEEE/American National Standards Institute (ANSI) standard (982.2):
- An error is a human mistake that results in incorrect software.
- A fault is an accidental condition that causes a unit of the system to fail to function as required.
- A defect is an anomaly in a product.
- A failure occurs when a functional unit of a software-related system can no longer perform its required function or cannot perform it within specified limits.
DEFECTS
Errors (made during the development process) => faults and defects in the system
Faults/defects => failures (at run time)
1 fault => 0..* failures (a single fault may cause zero, one, or many failures)
"Defect size" = the probability of failure associated with a latent defect
THE DEFECT DENSITY METRIC
Numerator?
Denominator?
Time frame?
Conceptual definition:
Defect rate = number of defects / opportunities for error (OFE) in a given time frame
PRACTICAL DEFINITION
Defect rate = number of unique causes of observed failures / size of the software (KLOC or FP)
LOC counting variations:
- Count only executable lines.
- Count executable lines plus data definitions.
- Count executable lines, data definitions, and comments.
- Count executable lines, data definitions, comments, and job control language.
- Count lines as physical lines on an input screen.
- Count lines as terminated by logical delimiters.
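The computation itself is trivial once a counting convention has been fixed; the hard part is agreeing on the convention. A minimal sketch in Python (the defect and LOC figures are invented for illustration):

# Defect rate per KLOC, following the practical definition above.
# Inputs are assumptions: a unique-defect count from a defect tracker
# and a LOC count produced under one agreed counting convention.
def defect_rate_per_kloc(unique_defects, loc):
    # unique_defects = number of unique causes of observed failures
    return unique_defects / (loc / 1000)

# Example: 120 unique defects against 60,000 logical LOC
print(defect_rate_per_kloc(120, 60_000))  # -> 2.0 defects/KLOC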
LOC
Straight LOC count
- Comparisons across languages?
Normalized to Assembler-equivalent LOC
What if?
- First release (50 KLOC; latent defect rate = 2.0 defects/KLOC during the next four years)
- Next releases?
  - Old code
  - New/changed code
DEFECT RATE FOR NEW/CHANGED CODE
LOC count:
- For the entire software product
- For the new and changed code of the release
Defect tracking: defects must be tracked to the release origin:
- the portion of the code that contains the defects
- at what release that portion was added, changed, or enhanced
CHANGE FLAGGING
For a new function, the new and changed lines of code are flagged with a specific identification (ID) number.
The ID is linked to the requirements number.
If the change-flagging IDs and requirements IDs are further linked to the release number of the product, LOC counting tools can use the linkages to count the new and changed code in new releases.
EXAMPLE (IBM ROCHESTER)
LOC = instruction statements (logical LOC)
Shipped Source Instructions (SSI): the total product
Changed Source Instructions (CSI): the new and changed code of the new release
POST-RELEASE DEFECT RATE METRICS
- Total defects per KSSI (a measure of code quality of the total product): process metric
- Field defects per KSSI (a measure of defect rate in the field): customer's perspective
- Release-origin defects (field and internal) per KCSI (a measure of development quality): process metric
- Release-origin field defects per KCSI (a measure of development quality per defects found by customers): customer's perspective
CUSTOMER'S PERSPECTIVE
Initial release of Product Y:
- KCSI = KSSI = 50 KLOC
- Defects/KCSI = 2.0
- Total number of defects = 2.0 x 50 = 100
Second release:
- KCSI = 20
- KSSI = 50 + 20 (new and changed lines of code) - 4 (assuming 20% of the new and changed code are changed lines, already counted in the existing SSI) = 66
- Defects/KCSI = 1.8 (assuming a 10% improvement over the first release)
- Total number of additional defects = 1.8 x 20 = 36
CUSTOMER'S PERSPECTIVE
Third release:
- KCSI = 30
- KSSI = 66 + 30 (new and changed lines of code) - 6 (assuming the same 20% rate of changed lines) = 90
- Targeted number of additional defects (no more than the previous release) = 36
- Defect rate target for the new and changed lines of code: 36/30 = 1.2 defects/KCSI or lower
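The bookkeeping in this two-slide example is mechanical enough to script. A minimal sketch in Python (the 20% changed-line ratio and the per-release defect rates are the example's assumptions, not fixed constants):

# KSSI/KCSI bookkeeping from the three-release example above.
CHANGED_RATIO = 0.20  # assumed share of CSI that replaces existing lines

def next_kssi(kssi, kcsi):
    # Changed lines are already counted in the shipped base, so subtract them.
    return kssi + kcsi - CHANGED_RATIO * kcsi

kssi = 50.0                    # release 1: KCSI = KSSI = 50, 2.0 defects/KCSI
kssi = next_kssi(kssi, 20)     # release 2: 50 + 20 - 4 = 66
defects_r2 = 1.8 * 20          # 36 additional defects
kssi = next_kssi(kssi, 30)     # release 3: 66 + 30 - 6 = 90
target_rate = defects_r2 / 30  # rate needed to stay at 36 new defects
print(kssi, defects_r2, target_rate)  # 90.0 36.0 1.2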
CONCLUSION
Release 1 -> Release 2:
- 64% reduction [(100 - 36)/100] in the number of new defects perceived by the client
Release 2 -> Release 3:
- The defect rate has to improve (from 1.8 to 1.2 defects/KCSI) just to hold the number of new defects at 36
FP
Definition:
"A function can be defined as a collection of executable statements that performs a certain task, together with declarations of the formal parameters and local variables manipulated by those statements." (Conte et al., 1986)
FP COMPONENTS - AVERAGE WEIGHTING FACTORS
- Number of external inputs (e.g., transaction types) x 4
- Number of external outputs (e.g., report types) x 5
- Number of logical internal files (files as the user might conceive them, not physical files) x 10
- Number of external interface files (files accessed by the application but not maintained by it) x 7
- Number of external inquiries (types of online inquiries supported) x 4
FP - LOW AND HIGH WEIGHTING FACTORS
- External input: low complexity 3; high complexity 6
- External output: low complexity 4; high complexity 7
- Logical internal file: low complexity 7; high complexity 15
- External interface file: low complexity 5; high complexity 10
- External inquiry: low complexity 3; high complexity 6
COMPLEXITY
Classified based on a set of standards.
Example for the external output component:
- if ((number of data element types <= 5) && (number of file types referenced <= 3)) then complexity := "low";
- if ((number of data element types >= 20) && (number of file types referenced >= 2)) then complexity := "high";
FP STEP 1: FUNCTION COUNTS
FC = sum over i = 1..5 (components) and j = 1..3 (complexity levels) of w_ij * x_ij, where:
- w_ij are the weighting factors of the five components by complexity level (low, average, high)
- x_ij are the numbers of each component in the application
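A minimal sketch of this weighted sum in Python, using the low/average/high weights from the preceding slides (the component counts in the example are invented):

# Unadjusted function count FC = sum of w_ij * x_ij over the five
# components (i) at three complexity levels (j): (low, average, high).
WEIGHTS = {
    "external_input":          (3, 4, 6),
    "external_output":         (4, 5, 7),
    "logical_internal_file":   (7, 10, 15),
    "external_interface_file": (5, 7, 10),
    "external_inquiry":        (3, 4, 6),
}

def function_count(counts):
    # counts maps component name -> (n_low, n_average, n_high)
    return sum(w * n
               for name, ns in counts.items()
               for w, n in zip(WEIGHTS[name], ns))

# Example: 5 low-complexity inputs, 2 average outputs, 1 high internal file
fc = function_count({
    "external_input":          (5, 0, 0),
    "external_output":         (0, 2, 0),
    "logical_internal_file":   (0, 0, 1),
    "external_interface_file": (0, 0, 0),
    "external_inquiry":        (0, 0, 0),
})
print(fc)  # 5*3 + 2*5 + 1*15 = 40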
FP STEP 2: IMPACT OF GENERAL SYSTEM CHARACTERISTICS
Assign a score in [0..5] (c_i) to each of the 14 characteristics:
1. Data communications
2. Distributed functions
3. Performance
4. Heavily used configuration
5. Transaction rate
6. Online data entry
7. End-user efficiency
8. Online update
9. Complex processing
10. Reusability
11. Installation ease
12. Operational ease
13. Multiple sites
14. Facilitation of change
FP STEP 3: VALUE ADJUSTMENT FACTOR
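The formula on the original slide is an image. In standard function point analysis, the value adjustment factor combines the 14 characteristic scores c_i as:

VAF = 0.65 + 0.01 x (c_1 + c_2 + ... + c_14)
FP = FC x VAF

Since the sum ranges from 0 to 70, the adjustment can move the function count by at most +/- 35%.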
EXAMPLE
Estimated defect rates per function point:
- SEI CMM Level 1: 0.75
- SEI CMM Level 2: 0.44
- SEI CMM Level 3: 0.27
- SEI CMM Level 4: 0.14
- SEI CMM Level 5: 0.05
[Jones, Software Assessments, Benchmarks, and Best Practices, 2000]
CUSTOMER PROBLEMS METRIC
Customer problems include:
- valid defects
- usability problems
- unclear documentation or information
- duplicates of valid defects (defects already reported by other customers, for which fixes were available, but the current customers did not know of them)
- user errors
PROBLEMS PER USER MONTH (PUM)
PUM = Total problems that customers reported (true defects + non-defect-oriented problems) for a time period / Total license-months of the software during the period
Number of license-months = Number of installed licenses of the software x Number of months in the calculation period
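A minimal sketch of the PUM computation in Python (the license and problem counts are invented for illustration):

# Problems per user month, following the definition above.
def pum(total_problems, installed_licenses, months):
    license_months = installed_licenses * months
    return total_problems / license_months

# Example: 480 reported problems over 12 months across 10,000 licenses
print(pum(480, 10_000, 12))  # -> 0.004 problems per user-month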
SUMMARY
Defect rate vs. PUM:
- Numerator: defect rate uses valid and unique product defects; PUM uses all customer problems (defects and nondefects, first time and repeated)
- Denominator: defect rate uses the size of the product (KLOC or function points); PUM uses customer usage of the product (user-months)
- Measurement perspective: defect rate takes the producer's view (the software development organization); PUM takes the customer's view
- Scope: defect rate covers intrinsic product quality; PUM covers intrinsic product quality plus other factors
CUSTOMER SATISFACTION METRICS
Customer survey on a five-point scale:
- Very satisfied
- Satisfied
- Neutral
- Dissatisfied
- Very dissatisfied
Several metrics can be derived:
- Percent of completely satisfied customers
- Percent of satisfied customers (satisfied and completely satisfied)
- Percent of dissatisfied customers (dissatisfied and completely dissatisfied)
- Percent of nonsatisfied customers (neutral, dissatisfied, and completely dissatisfied)
IN-PROCESS QUALITY METRICS
- Defect density during machine testing
- Defect arrival pattern during machine testing
- Phase-based defect removal pattern
- Defect removal effectiveness
DEFECT DENSITY DURING MACHINE TESTING
"The more defects found during testing, the more defects will be found later."
Useful for release-to-release comparisons.
Current defect rate <= previous release defect rate. Did the testing for the current release deteriorate?
- No => the quality perspective is positive.
- Yes => extra testing is needed (e.g., add test cases to increase coverage, blitz test, customer testing, stress testing, etc.).
Current defect rate > previous release defect rate. Did we plan for and actually improve testing effectiveness?
- No => the quality perspective is negative.
- Yes => the quality perspective is the same or positive.
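This release-to-release reading is a small decision table, written out below as a sketch in Python (the two boolean inputs are assumed to come from a review of the test plan and its execution):

# Quality reading of test defect density versus the previous release,
# following the decision logic above.
def quality_outlook(current_rate, previous_rate,
                    testing_deteriorated, effectiveness_improved):
    if current_rate <= previous_rate:
        # A lower rate is good news only if testing did not deteriorate.
        return "extra testing needed" if testing_deteriorated else "positive"
    # A higher rate is acceptable only if testing was deliberately improved.
    return "same or positive" if effectiveness_improved else "negative"

print(quality_outlook(1.8, 2.0, testing_deteriorated=False,
                      effectiveness_improved=False))  # -> positive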
DEFECT ARRIVAL PATTERN DURING MACHINE TESTING
[Chart: defect arrivals during machine testing, by time interval]
DEFECT ARRIVAL PATTERN DURING TESTING
- The defect arrivals (defects reported) during the testing phase, by time interval (e.g., week).
- The pattern of valid defect arrivals, once problem determination is done on the reported problems. This is the true defect pattern.
- The pattern of defect backlog over time. This metric is a workload statement as well as a quality statement.
PHASE-BASED DEFECT REMOVAL PATTERN
[Chart: defects removed at each development phase]
PHASE-BASED DEFECT REMOVAL PATTERN
Phases:
- high-level design review (I0)
- low-level design review (I1)
- code inspection (I2)
- unit test (UT)
- component test (CT)
- system test (ST)
DEFECT REMOVAL EFFECTIVENESS
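The formula on the original slide is an image. The usual formulation (following Kan, whose book this material draws on) is:

DRE = (defects removed during a development phase / defects latent in the product at that phase) x 100%

where the latent defects are estimated as the defects removed during the phase plus the defects found later.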
METRICS FOR SOFTWARE MAINTENANCE
- Fix backlog and backlog management index
- Fix response time and fix responsiveness
- Percent delinquent fixes
- Fix quality
FIX BACKLOG (FB) AND BACKLOG MANAGEMENT INDEX (BMI)
FB = a workload statement for software maintenance = the count of reported problems that remain open at the end of each month or each week.
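The BMI formula itself appears as an image on the original slide; its standard formulation is:

BMI = (number of problems closed during the month / number of problem arrivals during the month) x 100%

A BMI above 100 means the backlog is shrinking; below 100, it is growing.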
FIX RESPONSE TIME AND FIX RESPONSIVENESS
Fix response time = mean time of all problems from open to closed
Fix responsiveness = f(customer expectations, the agreed-to fix time, the ability to meet one's commitment to the customer)
PERCENT DELINQUENT FIXES
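The formula on the original slide is an image; the usual formulation is:

Percent delinquent fixes = (number of fixes that exceeded the fix response time criteria by severity level / total number of fixes delivered in a specified time) x 100%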
EXAMPLES OF METRIC PROGRAMS - MOTOROLA
Motorola Quality Policy for Software Development (QPSD)
Goals:
- Goal 1: Improve project planning.
- Goal 2: Increase defect containment.
- Goal 3: Increase software reliability.
- Goal 4: Decrease software defect density.
- Goal 5: Improve customer service.
- Goal 6: Reduce the cost of nonconformance.
- Goal 7: Increase software productivity.
MOTOROLA QUALITY POLICY FOR SOFTWARE DEVELOPMENT (QPSD)
Measurement areas:
- Delivered defects and delivered defects per size
- Total effectiveness throughout the process
- Adherence to schedule
- Accuracy of estimates
- Number of open customer problems
- Time that problems remain open
- Cost of nonconformance
- Software reliability
GOAL 5: IMPROVE CUSTOMER SERVICE
Question 5.1: What is the number of new problems opened during the month?
Metric 5.1: New Open Problems (NOP)
NOP = total new postrelease problems opened during the month
Question 5.2: What is the total number of open problems at the end of the month?
Metric 5.2: Total Open Problems (TOP)
TOP = total postrelease problems that remained open at the end of the month
GOAL 5: IMPROVE CUSTOMER SERVICE
Question 5.3: What is the mean age of open problems at the end of the month?
Metric 5.3: Mean Age of Open Problems (AOP)
AOP = (total time that postrelease problems remaining open at the end of the month have been open) / (number of postrelease problems remaining open at the end of the month)
Question 5.4: What is the mean age of the problems that were closed during the month?
Metric 5.4: Mean Age of Closed Problems (ACP)
ACP = (total time that postrelease problems closed within the month were open) / (number of postrelease problems closed within the month)
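All four Goal 5 metrics fall out of a simple problem log. A minimal sketch in Python (the record layout and the dates are assumptions for illustration):

from datetime import date

# Each problem record: (opened, closed); closed is None while still open.
problems = [
    (date(2011, 9, 2),  date(2011, 9, 20)),
    (date(2011, 9, 10), None),
    (date(2011, 8, 15), None),
]
month_start, month_end = date(2011, 9, 1), date(2011, 9, 30)

# Metric 5.1: new problems opened during the month
nop = sum(1 for opened, _ in problems if month_start <= opened <= month_end)

# Metric 5.2: problems still open at month end
still_open = [(o, c) for o, c in problems if c is None or c > month_end]
top = len(still_open)

# Metric 5.3: mean age (in days) of problems open at month end
aop = sum((month_end - o).days for o, _ in still_open) / top

# Metric 5.4: mean age (in days) of problems closed during the month
closed = [(o, c) for o, c in problems if c and month_start <= c <= month_end]
acp = sum((c - o).days for o, c in closed) / len(closed)

print(nop, top, aop, acp)  # 2 2 33.0 18.0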
COLLECTING SOFTWARE ENGINEERING DATA
Data collection:
- must be based on well-defined metrics and models
- the data classification schemes to be used must be specified
- the level of precision must be specified
- the collection form should be pretested
- the information extracted from the data should be focused, accurate, and useful
DATA COLLECTION METHODOLOGY
Basili and Weiss (1984):
- Establish the goal of the data collection.
- Develop a list of questions of interest.
- Establish data categories.
- Design and test data collection forms.
- Collect and validate data.
- Analyze data.
EXAMPLE
Inspection defect: a problem found during the inspection process which, if not fixed, would cause one or more of the following to occur:
- A defect condition in a later inspection phase
- A defect condition during testing
- A field defect
- Nonconformance to requirements and specifications
- Nonconformance to established standards such as performance, national language translation, and usability
INSPECTION SUMMARY FORM
[Figure: sample inspection summary form]
INTERFACE DEFECTS
An interface defect is a defect in the way two separate pieces of logic communicate. These are errors in communication between:
- Components
- Products
- Modules and subroutines of a component
- User interface (e.g., messages, panels)
EXAMPLES OF INTERFACE DEFECTS PER DEVELOPMENT PHASE
High-Level Design (I0)
- Use of wrong parameter
- Inconsistent use of function keys on user interface (e.g., screen)
- Incorrect message used
- Presentation of information on screen not usable
Low-Level Design (I1)
- Missing required parameters (e.g., missing parameter on module)
- Wrong parameters (e.g., specified incorrect parameter on module)
- Intermodule interfaces: input not there, input in wrong order
- Intramodule interfaces: passing values/data to subroutines
- Incorrect use of common data structures
- Misusing data passed to code
Code (I2)
- Passing wrong values for parameters on macros, application program interfaces (APIs), modules
- Setting up a common control block/area used by another piece of code incorrectly
- Not issuing correct exception to caller of code
LOGIC DEFECT
A logic defect is one that would cause incorrect results in the function to be performed by the logic. High-level categories of this type of defect are as follows:
- Function: capability not implemented or implemented incorrectly
- Assignment: initialization
- Checking: validate data/values before use
- Timing: management of shared/real-time resources
- Data structures: static and dynamic definition of data
EXAMPLES OF LOGIC DEFECTS PER DEVELOPMENT PHASE
High-Level Design (I0)
- Invalid or incorrect screen flow
- High-level flow through component missing or incorrect in the review package
- Function missing from macros you are implementing
- Using a wrong macro to do a function that will not work (e.g., using XXXMSG to receive a message from a program message queue, instead of YYYMSG)
- Missing requirements
- Missing parameter/field on command/in database structure/on screen you are implementing
- Wrong value on keyword (e.g., macro, command)
- Wrong keyword (e.g., macro, command)
Low-Level Design (I1)
- Logic does not implement I0 design
- Missing or excessive function
- Values in common structure not set
- Propagation of authority and adoption of authority (lack of or too much)
- Lack of code page conversion
- Incorrect initialization
- Not handling abnormal termination (conditions, cleanup, exit routines)
- Lack of normal termination cleanup
EXAMPLES OF LOGIC DEFECTS PER DEVELOPMENT PHASE
Code (I2)
- Code does not implement I1 design
- Lack of initialization
- Variables initialized incorrectly
- Missing exception monitors
- Exception monitors in wrong order
- Exception monitors not active
- Exception monitors active at the wrong time
- Exception monitors set up wrong
- Truncating double-byte character set data incorrectly (e.g., truncating before the shift-in character)
- Incorrect code page conversion
- Lack of code page conversion
- Not handling exceptions/return codes correctly
WRAP-UP
Software quality metrics focus on the quality aspects of the product, process, and project.
They can be grouped into three categories in accordance with the software life cycle:
- end-product quality metrics
- in-process quality metrics
- maintenance quality metrics
Example of a metrics program at Motorola.