
Software Quality Assurance – Lec13 1 © Dr. A. Williams, Fall 2008

Process Maturity Models

COMP 4004 – Fall 2008

Notes Adapted from Dr. A. Williams

Process Maturity Models

• Evaluation of the maturity of an organization's development process
  – Benchmark for SQA practice
• Breakdown of the software process into process areas
• Provides a means of measuring maturity, of the complete process and/or of separate process areas
• Examples:
  – Capability Maturity Model (CMM), from the Software Engineering Institute (SEI) at Carnegie Mellon University
  – ISO 9000 quality standard series, from the International Organization for Standardization


CMM Maturity levels

1. Initial – chaotic; unpredictable (cost, schedule, quality)

2. Repeatable – intuitive; cost/quality highly variable, some control of schedule, informal/ad hoc procedures

3. Defined – qualitative; reliable costs and schedules, improving but unpredictable quality performance

4. Managed – quantitative; reasonable statistical control over product quality

5. Optimizing – quantitative basis for continuous improvement



CMM Key Process Areas

• Functions that must be present at a particular level

• Level 1: Initial
  – Any organization that does not meet the requirements of the higher levels falls into this classification.

• Level 2: Repeatable
  – Requirements management
  – Software project planning and oversight
  – Software subcontract management
  – Software quality assurance
  – Software configuration management


CMM Key Process Areas

• Level 3: Defined
  – Organizational process improvement
  – Organizational process definition
  – Training program
  – Integrated software management
  – Software product engineering
  – Intergroup coordination
  – Peer reviews


CMM Key Process Areas

• Level 4: Managed
  – Process measurement and analysis
    – Statistics on software design/code/test defects
    – Defect projection
    – Measurement of test coverage
    – Analysis of process-related causes of defects
    – Analysis of review efficiency for each project
  – Quality management


CMM Key Process Areas

• Level 5: Optimizing
  – Defect prevention
    – Mechanism for defect cause analysis, to determine the process changes required for prevention
    – Mechanism for initiating error prevention actions
  – Technology innovation
  – Process change management


ISO 9000

• Set of standards and guidelines for a quality management system

• Registration involves passing a third-party audit, and passing regular audits to ensure continued compliance

• The ISO 9001 standard applies to software engineering – 20 areas


ISO 9000

1. Management responsibility

2. Quality system

3. Contract review

4. Design control

5. Document control

6. Purchasing

7. Purchaser-supplied product

8. Product identification and traceability

9. Process control

10. Inspection and testing


ISO 9000

11. Inspection, measuring, and test equipment

12. Inspection and test status

13. Control of nonconforming product

14. Corrective action

15. Handling, storage, packaging, and delivery

16. Quality records

17. Internal quality audits

18. Training

19. Servicing

20. Statistical techniques


Software Quality Metrics

• Measure
  – quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process
• Measurement
  – act of determining a measure
• Metric (IEEE Std. 610.12)
  – quantitative measure of the degree to which a system, component, or process possesses a given attribute
  – may relate individual measures
• Indicator
  – metric(s) that provide insight into the software process, a project, or a product


Objective

• Use of metrics is vital for:
  – Estimates of development effort:
    – Schedule
    – Cost
    – Size of team
  – Progress tracking:
    – Deviations from schedule, cost
    – Determining when corrective action is needed.

• A history of project metrics can help an organization improve its process for future product releases, and improve software quality.


Types of Metrics

• Product metrics:
  – Measurements of the size / complexity of a product.
  – Help to provide an estimate of development effort.
• Process metrics:
  – Progress tracking.
  – Staffing pattern over the life cycle.
  – Productivity.
  – Determine the effectiveness of various process stages, and provide targets for improvement.
• Operational metrics:
  – Collection of data after product release.


Problems of Measurement

• Data collection:
  – How to collect data without impacting productivity?
  – Usually need to collect data via tools, for accuracy and currency.

• Inappropriate focus:
  – Developers target "ideal" metric values instead of focusing on what is appropriate for the situation.
  – Perception that metrics will be used to rank employees.


Product metrics

• Objective: obtain a sense of the size or complexity of a product for the purpose of:
  – Scheduling:
    – How long will it take to build?
    – How much test effort will be needed?
    – When should the product be released?
  – Reliability:
    – How well will it work when released?


Lines of Code

• This is the "classic" metric for the size of a product.
  – Often expressed as KLOC (thousand lines of code).
  – Sometimes counted as (new + modified) lines of code for subsequent releases, when determining development effort.
• Advantages:
  – Easy to measure with tools.
  – Appears to have a correlation with effort.
• Disadvantages:
  – "What is a line of code?"
    – Comments? Declarations? Braces?
  – Not as well correlated with functionality.
  – Complexity of statements differs between programming languages.
  – Automated code generation from GUI builders, parser generators, UML, etc.
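The "what is a line of code?" ambiguity can be made concrete with a minimal counting sketch: a purely physical count includes everything, while a stricter count skips blank and comment-only lines. The comment markers handled here (# and //) are an illustrative assumption, not a complete tokenizer.

```python
# Minimal sketch of a LOC counter. "physical" counts every line;
# "logical" skips blank lines and comment-only lines, so the two
# counts can differ for the same file.

def count_loc(source: str) -> dict:
    physical = 0
    logical = 0
    for line in source.splitlines():
        physical += 1
        stripped = line.strip()
        # Skip blank lines and comment-only lines for the stricter count.
        if stripped and not stripped.startswith(("#", "//")):
            logical += 1
    return {"physical": physical, "logical": logical}

example = """// print a greeting
int main(void)
{
    printf("Hello World");
}
"""
print(count_loc(example))  # the comment line inflates physical but not logical
```

Even this toy counter shows why tool-reported KLOC figures are only comparable when the counting convention is fixed in advance.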


“Hello, world” in C and COBOL

#include <stdio.h>
int main(void)
{
    printf("Hello World");
}

• 5 lines of C, versus 17 lines of COBOL

000100 IDENTIFICATION DIVISION.
000200 PROGRAM-ID. HELLOWORLD.
000300
000400*
000500 ENVIRONMENT DIVISION.
000600 CONFIGURATION SECTION.
000700 SOURCE-COMPUTER. RM-COBOL.
000800 OBJECT-COMPUTER. RM-COBOL.
000900
001000 DATA DIVISION.
001100 FILE SECTION.
001200
100000 PROCEDURE DIVISION.
100100
100200 MAIN-LOGIC SECTION.
100300 BEGIN.
100400     DISPLAY " " LINE 1 POSITION 1 ERASE EOS.
100500     DISPLAY "Hello world!" LINE 15 POSITION 10.
100600     STOP RUN.
100700 MAIN-LOGIC-EXIT.
100800     EXIT.


Function Point Analysis

• Defined in 1977 by Alan Albrecht of IBM

• Details: http://www.nesma.nl/english/menu/frsfpa.htm

• Attempts to determine a number that increases for a system with:
  – More external interfaces
  – More complexity


Function Point Analysis

• Steps to performing function point analysis:
  1. Identify user functions of the system (5 types of functions are specified).
  2. Determine the complexity of each function (low, medium, high).
  3. Use the function point matrix to determine a value for each function in step 1, based on its complexity assigned in step 2.
  4. Rate the system on a scale of 0 to 5 for each of 14 specified system characteristics.
     – Examples: communications, performance.
  5. Combine the values determined in steps 3 and 4 based on specified weightings.
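The five steps above can be sketched in a few lines. The weight matrix and the 0.65 / 0.01 adjustment constants follow the commonly published IFPUG-style values, but treat the exact numbers here as illustrative rather than authoritative; the function counts and ratings are invented for the example.

```python
# Sketch of a function point calculation following the five steps above.

WEIGHTS = {  # function type -> (low, medium, high) weights (illustrative)
    "external_input":     (3, 4, 6),
    "external_output":    (4, 5, 7),
    "external_inquiry":   (3, 4, 6),
    "internal_file":      (7, 10, 15),
    "external_interface": (5, 7, 10),
}
LEVEL = {"low": 0, "medium": 1, "high": 2}

def function_points(functions, characteristic_ratings):
    # Steps 1-3: sum the weighted counts (the "unadjusted" function points).
    ufp = sum(WEIGHTS[ftype][LEVEL[cx]] for ftype, cx in functions)
    # Step 4: 14 ratings, each 0..5, combined into a value adjustment factor.
    vaf = 0.65 + 0.01 * sum(characteristic_ratings)
    # Step 5: combine.
    return ufp * vaf

funcs = [("external_input", "low"), ("external_output", "medium"),
         ("internal_file", "high")]
ratings = [3] * 14                      # all 14 characteristics rated 3
print(function_points(funcs, ratings))  # (3 + 5 + 15) * (0.65 + 0.42)
```

The point of the structure, not the constants, is what matters: size-driven counts and environment-driven adjustments are kept separate until the final step.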


Using Function Points Analysis

• To use function point analysis for estimation, the following information is needed:
  – Standard rate of development per function point.
    – Collected from previous projects.
  – Special modifications (plus or minus) to the standard rate for the project to be estimated.
    – Factors: new staff, new development tools, etc.
    – Include additional factors not based on project size.


McCabe’s Cyclomatic Complexity

• Based on a flow graph model of software.

• Measures the number of linearly independent paths through a program.

• Uses graph components:
  – E: number of edges in the flow graph
  – N: number of nodes in the flow graph

• Cyclomatic complexity: M = E – N + 2

• Claim: if M > 10 for a module, it is “likely” to be error prone.
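The formula M = E – N + 2 is mechanical once the flow graph is in hand; as a minimal sketch (assuming a single connected flow graph with one entry and one exit, and made-up node names):

```python
# Compute cyclomatic complexity M = E - N + 2 from an edge list.

def cyclomatic_complexity(edges):
    # Derive the node set from the edges, then apply the formula.
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# A graph shaped like an if/else followed by a loop:
edges = [("start", "then"), ("start", "else"),
         ("then", "join"), ("else", "join"),
         ("join", "loop"), ("loop", "join"), ("loop", "end")]
print(cyclomatic_complexity(edges))  # 7 edges - 6 nodes + 2 = 3
```

Each decision (the if/else branch and the loop's back edge) adds one to M, which is why M counts linearly independent paths.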

Calculating Cyclomatic Complexity

(Figure: flow graph with 11 nodes N1–N11 and 15 edges E1–E15.)

M = 15 – 11 + 2 = 6


Cyclomatic Complexity

• With a cyclomatic complexity of 6, there are 6 independent paths through the flow graph:
  1. E1-E2-E15
  2. E1-E5-E3-E14-E15
  3. E1-E5-E6-E4-E13-E14-E15
  4. E1-E5-E6-E7-E12-E13-E14-E15
  5. E1-E5-E6-E7-E8-E10-E11-E12-E13-E14-E15
  6. E1-E5-E6-E7-E8-E9-E11-E12-E13-E14-E15


Software Package Metrics

• From R.C. Martin, Agile Software Development: Principles, Patterns, and Practices

• Metrics determined from a package of classes in an object-oriented language:
  – Number of classes and interfaces
  – Coupling:
    – Number of other packages that depend on classes in this package (CA, afferent coupling)
    – Number of packages depended on by classes in this package (CE, efferent coupling)
  – Ratio of abstract classes to total classes (0 to 1)
  – "Instability": the ratio CE / (CA + CE)
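The two ratios above reduce to one-liners once the counts are available; a sketch, with the example counts invented for illustration:

```python
# Martin's package-coupling ratios. Instability I = CE / (CA + CE)
# ranges from 0 (maximally stable: many dependents, no dependencies)
# to 1 (maximally unstable: the reverse).

def instability(ca: int, ce: int) -> float:
    # CA: incoming dependencies; CE: outgoing dependencies.
    return ce / (ca + ce) if (ca + ce) else 0.0

def abstractness(abstract_classes: int, total_classes: int) -> float:
    return abstract_classes / total_classes if total_classes else 0.0

# A package used by 3 other packages that itself depends on 1 package:
print(instability(ca=3, ce=1))   # 1 / (3 + 1) = 0.25 -> relatively stable
print(abstractness(2, 8))        # 2 of 8 classes are abstract: 0.25
```

A low instability means many packages depend on this one, so changes to it are expensive; that is the sense in which it is "stable".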


Additional OO metrics

• Class dependencies: classes used, classes used by

• Degree of inheritance:
  – Number of subclasses for a class
  – Maximum depth of inheritance from the root of the tree to a class.

• Number of method calls invoked to respond to a message.


Process Metrics

• Uses:
  – Keep the current project on track.
  – Accumulate data for future projects.
  – Provide data for process improvements.


Defect Density

• Measurements of the number of defects detected, relative to a measurement of the code size.
  – Defects found per line of code
  – Defects found per function point

• Can be used to determine the effectiveness of process stages:
  – Density of defects removed by code inspection
  – Density of defects removed by testing
  – Total defect density.
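As a sketch of the per-stage breakdown, using defects per KLOC as the size measure (function points would work the same way; the stage names and counts are invented for illustration):

```python
# Defect density per process stage: defects found / code size (KLOC).

def defect_density(defects: int, kloc: float) -> float:
    return defects / kloc

size_kloc = 40.0
found = {"inspection": 120, "testing": 80}   # hypothetical per-stage counts

for stage, defects in found.items():
    print(stage, defect_density(defects, size_kloc))   # per-stage density
print("total", defect_density(sum(found.values()), size_kloc))  # 200 / 40
```

Comparing the per-stage densities against historical values is what reveals whether, say, inspections are finding their usual share of defects.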


More defect metrics

• Average severity of defects:
  – Classify defects by severity (e.g. on a scale of 1 to 5), and then determine the average severity of all defects.

• Cost of defect removal:
  – Estimated $ cost of removing a defect at each stage of the development process.
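Both metrics are simple averages; a sketch, with the stage names and dollar figures invented for illustration:

```python
# Average severity over classified defects, and average removal cost
# per development stage.

def average_severity(severities):
    return sum(severities) / len(severities)

def cost_per_stage(costs_by_stage):
    # costs_by_stage: stage name -> list of per-defect removal costs ($).
    return {stage: sum(c) / len(c) for stage, c in costs_by_stage.items()}

print(average_severity([1, 3, 3, 5]))                            # 3.0
print(cost_per_stage({"review": [50, 70], "test": [300, 500]}))
```

The usual finding such data supports is that removal cost rises steeply with how late in the process a defect is caught.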


Timetable / cost metrics

• Timetable metrics:
  – Ratio of milestones that were achieved on schedule, to the total number of milestones.
  – Average delay of milestone completion.

• Cost metrics:
  – Development cost per KLOC.
  – Development cost per function point.
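The two timetable metrics can be sketched from (planned, actual) completion dates, held here as day offsets for simplicity; the milestone data is invented for illustration, and an early finish is counted as on schedule:

```python
# On-schedule ratio and average delay over a project's milestones.

def timetable_metrics(milestones):
    # milestones: list of (planned_day, actual_day) pairs.
    delays = [max(0, actual - planned) for planned, actual in milestones]
    on_schedule = sum(1 for d in delays if d == 0)
    return {
        "on_schedule_ratio": on_schedule / len(milestones),
        "average_delay_days": sum(delays) / len(milestones),
    }

# Four milestones: on time, 5 days late, 2 days early, 7 days late.
print(timetable_metrics([(10, 10), (20, 25), (30, 28), (40, 47)]))
```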


Productivity Metrics

• Number of development hours per KLOC

• Number of development hours per function point.

• Number of development hours per test case.


In-Process Metrics for software testing

• Metrics that are effective for managing software testing:
  – They are used to indicate "good" or "bad" in terms of quality or schedule, and thus to drive improvements.
  – A baseline for each metric should always be established for comparison.

• Helps to answer the difficult question: How do you know your product is good enough to ship?


S curve

• This metric tracks the progress of testing over time.
  – Is the testing behind schedule?
  – If it falls behind, actions need to be taken.

• Planned curves can be used for release-to-release comparison.

• The planned curve needs to be constructed carefully.


S curve

• Test progress S curve (planned, attempted, actual):
  – The X-axis of the S curve is the time unit (e.g. weeks).
  – The Y-axis is the number of test cases.
  – The S curve depicts the cumulative:
    – Planned test cases for each time unit
    – Test cases attempted
    – Test cases completed successfully
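The data behind the three curves is just a running total of per-week counts; a sketch, with the weekly numbers invented for illustration:

```python
# Turn per-week test-case counts into the cumulative totals that the
# planned / attempted / successful S curves plot.

from itertools import accumulate

planned_per_week    = [5, 10, 20, 30, 20, 10, 5]
attempted_per_week  = [3, 8, 18, 28, 22, 12, 6]
successful_per_week = [3, 7, 15, 25, 20, 11, 6]

curves = {name: list(accumulate(counts)) for name, counts in [
    ("planned", planned_per_week),
    ("attempted", attempted_per_week),
    ("successful", successful_per_week),
]}
print(curves["planned"])   # cumulative planned test cases per week

# Testing is behind schedule wherever attempted trails planned:
behind = [p - a for p, a in zip(curves["planned"], curves["attempted"])]
print(behind)
```

The gap list makes the "is testing behind schedule?" question quantitative: a growing gap is the signal to act.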


S Curve

(Figure: test progress S curve; X-axis is the week (1–17), Y-axis is test points (0–120), with curves for planned, attempted, and successful test cases.)


Defect arrivals over time

• This metric is concerned with the pattern of defect arrivals. More specifically:
  – The X-axis is the time unit (e.g. weeks) before product release.
  – The Y-axis is the number of defects reported during each time unit.
  – Include data for a comparable baseline whenever possible.
  – A comparison of the patterns of several releases is useful.


Defect arrival metric

(Figure: defect arrivals in the weeks before product ship; Y-axis is the number of defects reported (0–80), with one curve for each of Release A, Release B, and Release C.)


Defect arrival metric

• A positive pattern of defect arrivals is:
  – Higher arrivals earlier,
  – An earlier peak (relative to the baseline), and
  – A decline to a lower level earlier, before the product ship date.

• The tail end is especially important, as it indicates the quality of the product in the field.


Defect backlog

• Defect backlog: the number of testing defects remaining open at any given time.

• Like defect arrivals, the defect backlog can be graphed to track test progress.
  – The X-axis is the time unit (weeks) before product ship.
  – The Y-axis is the number of defects in the backlog.

• Unlike defect arrivals, the defect backlog is under the control of development.
  – Comparison of actual data to a target suggests actions for defect removal.

• The defect backlog should be kept at a reasonable level.
  – Release-to-release comparison is useful.
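The backlog curve follows directly from arrivals and closures: backlog(t) = backlog(t-1) + arrivals(t) - closures(t). A sketch, with the weekly numbers invented for illustration:

```python
# Track the defect backlog week by week from arrivals and closures.

arrivals = [30, 40, 35, 20, 10]   # defects reported each week
closures = [20, 30, 40, 30, 15]   # defects fixed/closed each week

backlog = []
open_defects = 0
for a, c in zip(arrivals, closures):
    open_defects += a - c
    backlog.append(open_defects)
print(backlog)  # a healthy pattern trends toward zero before ship
```

Because closures are under development's control, a backlog curve that is not converging is an actionable signal, unlike the arrival curve itself.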


When is the product ready to release?

• This is not an easy question, and there is no general criterion.

• Applying these in-process metrics, together with their baselines, can provide good insight for answering the question.

• Recommended:
  – Weekly defect arrivals
  – Defects in backlog


Operational Metrics

• Collection of data after the release of a product.

• Used for:
  – Reliability estimation
  – Customer satisfaction
  – Feedback for enhancements

• Data can be obtained from:
  – Product logs / monitors, if available
  – Support line usage
  – Customer problem reports
  – Customer surveys
  – Maintenance effort


Support line metrics

• Number of support calls, based on the type of issue:
  – Defect not previously known
  – Duplicate of a defect already known
  – Installation problem
  – Usability problems
  – Unclear documentation
  – User errors

• Call rate per time period.

• Average severity of issues.


Customer Satisfaction

• Using customer surveys – e.g. customer rating:
  – Very satisfied
  – Satisfied
  – Neutral
  – Dissatisfied
  – Very dissatisfied

• Examples of metrics:
  – Percent of completely satisfied customers
  – Percent of satisfied customers (satisfied and completely satisfied)
  – Percent of dissatisfied customers (dissatisfied and completely dissatisfied)
  – Percent of non-satisfied customers (neutral, dissatisfied, and completely dissatisfied)
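Each of the example metrics is the share of responses falling into a chosen group of rating categories; a sketch, with the response counts invented for illustration:

```python
# Roll survey responses up into the percentage metrics listed above.

from collections import Counter

def percent(counts: Counter, categories) -> float:
    total = sum(counts.values())
    return 100.0 * sum(counts[c] for c in categories) / total

responses = Counter({"very satisfied": 30, "satisfied": 40,
                     "neutral": 15, "dissatisfied": 10,
                     "very dissatisfied": 5})

print(percent(responses, ["very satisfied"]))                     # 30.0
print(percent(responses, ["satisfied", "very satisfied"]))        # 70.0
print(percent(responses, ["dissatisfied", "very dissatisfied"]))  # 15.0
print(percent(responses, ["neutral", "dissatisfied", "very dissatisfied"]))
```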


Reliability

• Measures the availability of the product for service.

• Possible measures:
  – Percentage of time available, out of total time.
  – Expected amount of outage time over a standard time interval.
  – Mean time to failure (MTTF): average amount of operational time before a failure occurs.
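The first and last of these measures reduce to simple ratios; a sketch, with the uptime figures invented for illustration:

```python
# Availability as uptime over total time, and MTTF as the mean of
# observed operational periods between failures.

def availability(uptime_hours: float, total_hours: float) -> float:
    return 100.0 * uptime_hours / total_hours

def mttf(operational_hours_before_failure) -> float:
    periods = list(operational_hours_before_failure)
    return sum(periods) / len(periods)

# A system up 719 of 720 hours in a 30-day month:
print(availability(719, 720))        # just under 99.9% available
print(mttf([200, 350, 170, 280]))    # mean of four observed periods
```

Note the direction of each measure: availability improves toward 100%, while MTTF improves as it grows without bound.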