
EXAMPLES OF METRICS PROGRAMS

MOTOROLA

Software metrics program articulated by Daskalantonakis (1992), who followed the Goal/Question/Metric paradigm of Basili and Weiss.

Goals

Goal 1: Improve project planning
Goal 2: Increase defect containment
Goal 3: Increase software reliability
Goal 4: Decrease software defect density
Goal 5: Improve customer service
Goal 6: Reduce the cost of nonconformance
Goal 7: Increase software productivity

Measurement Areas

Delivered defects and delivered defects per size
Total effectiveness throughout the process
Adherence to schedule
Estimation accuracy
Number of open customer problems
Time that problems remain open
Cost of nonconformance

For each goal, the questions to be asked and the corresponding metrics were also formulated.

Goal 1: Improve project planning

Question 1.1: What was the accuracy of estimating the actual value of project schedule?

Metric 1.1: Schedule Estimation Accuracy (SEA)

SEA = Actual project duration / Estimated project duration

Question 1.2: What was the accuracy of estimating the actual value of project effort?

Metric 1.2: Effort Estimation Accuracy (EEA)

EEA = Actual project effort / Estimated project effort
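Both metrics are simple actual-to-estimate ratios, so they share one computation. A minimal sketch in Python; the function name and sample numbers are illustrative, not part of the Motorola program:

```python
# Sketch of Metrics 1.1 and 1.2 (SEA and EEA).

def estimation_accuracy(actual, estimated):
    """Ratio of actual to estimated value; 1.0 means a perfect estimate."""
    if estimated <= 0:
        raise ValueError("estimated value must be positive")
    return actual / estimated

# SEA: Schedule Estimation Accuracy (durations in weeks, illustrative)
sea = estimation_accuracy(actual=13.0, estimated=10.0)
# EEA: Effort Estimation Accuracy (effort in person-weeks, illustrative)
eea = estimation_accuracy(actual=95.0, estimated=100.0)

print(f"SEA = {sea:.2f}")  # > 1 means the project ran longer than estimated
print(f"EEA = {eea:.2f}")  # < 1 means less effort was spent than estimated
```

A value near 1.0 on either metric indicates accurate planning; tracking the ratio over successive projects shows whether estimation is improving.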

Goal 2: Increase defect containment

Question 2.1: What is the currently known effectiveness of the defect detection process prior to release?

Metric 2.1: Total Defect Containment Effectiveness (TDCE)

TDCE = Number of prerelease defects / (Number of prerelease defects + Number of postrelease defects)

Question 2.2: What is the currently known containment effectiveness of faults introduced during each constructive phase of software development for a particular software product?

Metric 2.2: Phase Containment Effectiveness for phase i (PCEi)

PCEi = Number of phase i errors / (Number of phase i errors + Number of phase i defects)

Goal 2: Increase defect containment (Contd.)

Error: A problem found during the review of the phase where it was introduced.

Defect: A problem found later than the review of the phase where it was introduced.

Fault: Both errors and defects are considered faults.
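With those definitions in hand, the two containment metrics can be sketched directly; the function names and sample counts are illustrative:

```python
# Sketch of Metrics 2.1 and 2.2. Per the definitions above, a fault is
# an "error" if found in the review of the phase that introduced it,
# otherwise a "defect".

def tdce(prerelease_defects, postrelease_defects):
    """Total Defect Containment Effectiveness: share of all known
    defects that were caught before release."""
    return prerelease_defects / (prerelease_defects + postrelease_defects)

def pce(phase_errors, phase_defects):
    """Phase Containment Effectiveness for one phase i: share of the
    faults introduced in phase i that were caught in that phase's review."""
    return phase_errors / (phase_errors + phase_defects)

print(f"TDCE = {tdce(95, 5):.2f}")         # 95% of defects caught before release
print(f"PCE(design) = {pce(40, 10):.2f}")  # 80% of design faults caught in design review
```

Both ratios lie in [0, 1]; higher is better, and a low PCE for a phase points at the review practice of that specific phase.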

Goal 3: Increase software reliability

Question 3.1: What is the rate of software failures, and how does it change over time?

Metric 3.1: Failure Rate (FR)

FR = Number of failures / Execution time
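A minimal sketch of the failure-rate computation; the choice of hours as the execution-time unit and the sample numbers are assumptions for illustration:

```python
# Sketch of Metric 3.1. Execution time could be CPU-hours in test or
# in field operation; the unit chosen determines the unit of the rate.

def failure_rate(failures, execution_hours):
    """Failures per unit of execution time."""
    if execution_hours <= 0:
        raise ValueError("execution time must be positive")
    return failures / execution_hours

fr = failure_rate(failures=6, execution_hours=1200.0)
print(f"FR = {fr:.4f} failures/hour")
```

Computing FR over successive time windows (rather than cumulatively) is what reveals the trend the question asks about.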

Goal 4: Decrease software defect density

Question 4.1: What is the normalized number of in-process faults, and how does it compare with the number of in-process defects?

Metric 4.1a: In-process Faults (IPF)

IPF = In-process faults caused by incremental software development / Assembly-equivalent delta source size

Metric 4.1b: In-process Defects (IPD)

IPD = In-process defects caused by incremental software development / Assembly-equivalent delta source size

Goal 4: Decrease software defect density (Contd.)

Question 4.2: What is the currently known defect content of software delivered to customers, normalized by assembly-equivalent size?

Metric 4.2a: Total Released Defects total (TRD total)

TRD total = Number of released defects / Assembly-equivalent total source size

Metric 4.2b: Total Released Defects delta (TRD delta)

TRD delta = Number of released defects caused by incremental software development / Assembly-equivalent total source size
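All of the Goal 4 metrics are defect counts divided by an assembly-equivalent size. A minimal sketch; expressing size in thousands of statements, and the sample counts, are assumptions for readability:

```python
# Sketch of Metrics 4.2a/4.2b. Size is in thousands of
# assembly-equivalent source statements (an assumed unit).

def defect_density(defects, size_k):
    """Defects per thousand assembly-equivalent source statements."""
    if size_k <= 0:
        raise ValueError("size must be positive")
    return defects / size_k

trd_total = defect_density(defects=30, size_k=150.0)  # all released defects
trd_delta = defect_density(defects=12, size_k=150.0)  # only defects from incremental development

print(f"TRD total = {trd_total:.2f} defects per K statements")
print(f"TRD delta = {trd_delta:.2f} defects per K statements")
```

The same division serves IPF/IPD and the customer-found-defect metrics; only the numerator's defect population and the denominator's size base change.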

Goal 4: Decrease software defect density (Contd.)

Question 4.3: What is the currently known customer-found defect content of software delivered to customers, normalized by assembly-equivalent source size?

Metric 4.3a: Customer-Found Defects total (CFD total)

CFD total = Number of customer-found defects / Assembly-equivalent total source size

Metric 4.3b: Customer-Found Defects delta (CFD delta)

CFD delta = Number of customer-found defects caused by incremental software development / Assembly-equivalent total source size

Goal 5: Improve customer service

Question 5.1: What is the number of new problems that were opened during the month?

Metric 5.1: New Open Problems (NOP)

NOP = Total new postrelease problems opened during the month

Question 5.2: What is the number of open problems at the end of the month?

Metric 5.2: Total Open Problems (TOP)

TOP = Total postrelease problems that remain open at the end of the month

Goal 5: Improve customer service (Contd.)

Question 5.3: What is the mean age of open problems at the end of the month?

Metric 5.3: Mean Age of Open Problems (AOP)

AOP = (Total time postrelease problems remaining open at the end of the month have been open) / (Number of postrelease problems remaining open at the end of the month)

Question 5.4: What is the mean age of problems that were closed during the month?

Metric 5.4: Mean Age of Closed Problems (ACP)

ACP = (Total time postrelease problems closed within the month were open) / (Number of postrelease problems closed within the month)
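Both aging metrics fall out of a list of problem records with open and close dates. A minimal sketch; the record layout and sample dates are assumptions for illustration:

```python
# Sketch of Metrics 5.3 and 5.4 from (opened, closed) date pairs,
# where closed is None for problems still open at month end.

from datetime import date

month_end = date(2024, 6, 30)

problems = [
    (date(2024, 5, 1),  None),               # still open: 60 days old
    (date(2024, 6, 10), None),               # still open: 20 days old
    (date(2024, 5, 20), date(2024, 6, 15)),  # closed this month: open 26 days
]

open_ages = [(month_end - opened).days
             for opened, closed in problems if closed is None]
closed_ages = [(closed - opened).days
               for opened, closed in problems
               if closed is not None and closed.month == month_end.month]

aop = sum(open_ages) / len(open_ages)      # Mean Age of Open Problems
acp = sum(closed_ages) / len(closed_ages)  # Mean Age of Closed Problems

print(f"AOP = {aop:.1f} days")  # (60 + 20) / 2 = 40.0
print(f"ACP = {acp:.1f} days")  # 26 / 1 = 26.0
```

A rising AOP with a stable ACP suggests the oldest problems are being left behind while easy ones get closed.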

Goal 6: Reduce the cost of nonconformance

Question 6.1: What was the cost to fix postrelease problems during the month?

Metric 6.1: Cost of Fixing Problems (CFP)

CFP = Dollar cost associated with fixing postrelease problems within the month

Goal 7: Increase software productivity

Question 7.1: What was the productivity of software development projects (based on source size)?

Metric 7.1a: Software Productivity total (SP total)

SP total = Assembly-equivalent total source size / Software development effort

Metric 7.1b: Software Productivity delta (SP delta)

SP delta = Assembly-equivalent delta source size / Software development effort
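A minimal sketch of the two productivity metrics; the sizes, effort figures, and unit of engineering months are assumptions for illustration:

```python
# Sketch of Metrics 7.1a/7.1b. Size is in assembly-equivalent source
# statements; effort is in engineering months (illustrative numbers).

def software_productivity(source_size, effort_months):
    """Source statements produced per engineering month."""
    if effort_months <= 0:
        raise ValueError("effort must be positive")
    return source_size / effort_months

sp_total = software_productivity(source_size=120_000, effort_months=200.0)  # whole product
sp_delta = software_productivity(source_size=30_000, effort_months=200.0)   # new/changed code only

print(f"SP total = {sp_total:.0f} statements/engineering-month")
print(f"SP delta = {sp_delta:.0f} statements/engineering-month")
```

Note the familiar caveat with size-based productivity: reusing code inflates SP total relative to SP delta, which is precisely why both are tracked.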

Daskalantonakis also described additional in-process metrics that can be used for schedule, project, and quality control.

Life-cycle phase and schedule tracking metric: Track schedule based on life-cycle phase, comparing actual to plan.

Cost/earned value tracking metric: Track actual cumulative cost of the project versus budgeted cost, and actual cost of the project so far, with continuous update throughout the project.

Requirements tracking metric: Track the number of requirements changes at the project level.

Design tracking metric: Track the number of requirements implemented in design versus the number of requirements written.

Fault-type tracking metric: Track fault cause.

Remaining defect metrics: Track faults per month for the project and use a Rayleigh curve to project the number of faults expected to be found in the months ahead.

Review effectiveness metric: Track error density by stages of review and use control chart methods to flag the exceptionally high or low data points.
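The review effectiveness idea can be sketched with standard 3-sigma control limits; the baseline densities, the new data points, and the use of historical reviews to set the limits are all assumptions for illustration:

```python
# Sketch of control-chart flagging for review error densities
# (errors found per KLOC reviewed). Limits come from a historical
# baseline; new review stages are then checked against them.

import statistics

baseline = [12.0, 14.5, 11.0, 13.2, 12.8, 13.9, 12.2, 13.0]

mean = statistics.mean(baseline)
sigma = statistics.pstdev(baseline)
upper = mean + 3 * sigma
lower = mean - 3 * sigma

new_points = {"design review": 13.1, "code review": 25.0}

for stage, density in new_points.items():
    status = "FLAG" if not (lower <= density <= upper) else "ok"
    print(f"{stage}: {density} errors/KLOC [{status}] "
          f"(limits {lower:.1f}..{upper:.1f})")
```

A flagged high density may mean a fault-prone work product; a flagged low one may mean a superficial review. The chart only says the point is exceptional, not which.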

HEWLETT-PACKARD

Software Metrics: Establishing a Company-Wide Program by Grady and Caswell (1986).

The book lists primitive and computed metrics widely used at HP.

Primitive metrics are those that are directly measurable and countable, such as token, data token, defect, total operands, LOC, and so forth.

Computed metrics are mathematical combinations of two or more primitive metrics, e.g.:

Average fixed defects/working day

Average engineering hours/fixed defect

Average reported defects/working day
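A minimal sketch of how such computed metrics are derived from primitive counts; the variable names and figures are illustrative, not HP's actual data:

```python
# Sketch of HP-style computed metrics built from primitive counts
# collected over one reporting period.

fixed_defects = 46         # primitive: defects fixed during the period
working_days = 20          # primitive: working days in the period
engineering_hours = 322.0  # primitive: hours spent on the fixes

avg_fixed_per_day = fixed_defects / working_days
avg_hours_per_fix = engineering_hours / fixed_defects

print(f"Average fixed defects/working day = {avg_fixed_per_day:.1f}")
print(f"Average engineering hours/fixed defect = {avg_hours_per_fix:.1f}")
```

The point of the primitive/computed split is that only the primitives need to be collected consistently; any number of computed metrics can then be derived after the fact.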

Bang: "a quantitative indicator of net usable function from the user's point of view."

Two methods for computing Bang:

i. For function-strong systems, count the tokens entering and leaving the function, multiplied by the weight of the function.

ii. For data-strong systems, count the objects in the database, weighted by the number of relationships of which each object is a member.

Branches covered/total branches: when running a program, this metric indicates what percentage of the decision points were actually executed.

Defects/KNCSS -- end-product quality metric

Defects/LOD -- end-product quality metric

where LOD is lines of documentation not included in program source code, and KNCSS is thousand non-comment source statements.

Defects/testing time

Design weight: "Design weight is a simple sum of the module weights over the set of all modules in the design."

NCSS/engineering month -- productivity measure

Percent overtime: average overtime/40 hours per week

(Phase) engineering months/total engineering months

IBM Rochester

IBM Rochester uses the corporate 5-UP software quality metrics defined by the "IBM corporate software measurement council":

Overall customer satisfaction, as well as satisfaction with the CUPRIMDS parameters

Postrelease defect rate for three-year LOP tracking: TVUA/MSST based on the release against which the defects are reported

Customer problem calls

Fix response time

Number of defective fixes

In addition to the 5-UP metrics, other product quality metrics and many in-process metrics are also used:

TVUA/KSSI based on whether the release contains the defects or not (regardless of the release against which they are reported), for four-year LOP tracking

TVUA/KSSI based on release origin, for four-year LOP tracking

Customer-reported problems per user month

Backlog management index

Postrelease arrival patterns for defects and problems (both defects and non-defect-oriented problems)

Defect removal model for the software development process, with a target defect removal rate for each phase
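One of the in-process metrics above, the backlog management index, is commonly defined as problems closed during the period divided by problems arriving, expressed as a percentage; that definition is assumed here, and the sample counts are illustrative:

```python
# Sketch of a backlog management index (BMI). Values over 100 mean the
# problem backlog shrank during the period; under 100, it grew.

def backlog_management_index(closed, arrived):
    """Problems closed as a percentage of problems arriving."""
    if arrived == 0:
        raise ValueError("no arrivals in the period")
    return 100.0 * closed / arrived

bmi = backlog_management_index(closed=44, arrived=40)
print(f"BMI = {bmi:.0f}%")  # 110%: 44 closed against 40 new arrivals
```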

Phase effectiveness (for each phase of inspection and test)

Inspection coverage, effort, and defect rates

In-process inspection escape rate

Compile failures and build/integration defects

Driver stability index

Weekly defect arrivals and backlog during testing

Defect severity

Defect cause

Reliability: mean time to initial program loading (IPL) during test

Models for postrelease defect estimation

S curves for project progress, comparing actual to plan for each phase of development, such as number of inspections conducted by week, LOC integrated by week, number of test cases attempted and succeeded by week, and so forth.