www.construx.com Measuring Software Development Productivity @stevemconstrux

Source: cs.uni.edu › ~east › teaching › se › spr16 › notes › SteveMcConnell... · 2016-03-31


Page 1

www.construx.com

Measuring Software Development Productivity

@stevemconstrux

Page 2

ACM Highlights

• Learning Center tools for professional development: http://learning.acm.org
  • 4,500+ trusted technical books and videos by O’Reilly, Morgan Kaufmann, etc.
  • 1,000+ courses, virtual labs, test preps, and live mentoring for software professionals, covering programming, data management, cybersecurity, networking, project management, and more
  • Training toward top vendor certifications (CEH, Cisco, CISSP, CompTIA, ITIL, PMI, etc.)
  • Learning Webinars from thought leaders and top practitioners
  • Podcast interviews with innovators, entrepreneurs, and award winners
• Popular publications:
  • Flagship Communications of the ACM (CACM) magazine: http://cacm.acm.org/
  • ACM Queue magazine for practitioners: http://queue.acm.org/
• ACM Digital Library, the world’s most comprehensive database of computing literature: http://dl.acm.org
• International conferences that draw leading experts on a broad spectrum of computing topics: http://www.acm.org/conferences
• Prestigious awards, including the ACM A.M. Turing and Infosys awards: http://awards.acm.org
• And much more… http://www.acm.org

Page 3

www.construx.com

Measuring Software Development Productivity

@stevemconstrux

Page 4

Copyright Notice

These materials are © 2013-2016 Construx Software Builders, Inc. All Rights Reserved. No part of the contents of this presentation may be reproduced or transmitted in any form or by any means without the written permission of Construx Software Builders, Inc.

Page 5

I Invite You to Join Me on a Journey

Page 6

A Journey to Measure Productivity?

Page 7

A Journey to Measure

Productivity!

Page 8

Why Measure Productivity?

Page 9

Levels of Productivity Measurement

• Organization/Company
• Team/Workgroup (5-9 people)
• Individual Contributor

Page 10

Why Measure Productivity? (Organizational Level)

• Assess competitiveness with other organizations
• Track and evaluate progress over time
• Support performance evaluation of software executives
• Support bonus allocation among software executives
• Decide allocation of resources to onshore / offshore / outsourced

These are not my reasons; these are reasons given by clients who state they want to measure productivity.

Page 11

Why Measure Productivity? (Team/Workgroup Level)

• Compare teams to see who should be learning from whom
• Support performance evaluations of team managers
• Support allocation of bonuses across teams
• Support allocation of work onshore/offshore

These are not my reasons; these are reasons given by clients who state they want to measure productivity.

Page 12

Why Measure Productivity? (Individual Level)

• Support allocation of resources (people) across teams
• Contribute to the individual performance review process
• Support allocation of bonuses among individual contributors

These are not my reasons; these are reasons given by clients who state they want to measure productivity.

Page 13

Potential Issues

Two issues that are potentially problematic in measuring productivity

Page 14

Two Potential Issues

(1) Measuring (2) Productivity

Page 15

What is Productivity?

Page 16

Simple Definition

Productivity = Output / Input

This is simple, but many issues related to productivity can be resolved by referring back to this definition.
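As a minimal sketch of this definition (the proxy names `output` and `input_` are my own illustration; the talk deliberately leaves the choice of proxies open):

```python
def productivity(output: float, input_: float) -> float:
    """Productivity = Output / Input, for whatever proxies you choose."""
    if input_ <= 0:
        raise ValueError("input must be positive")
    return output / input_

# e.g., treating delivered function points as the output proxy and
# staff-months as the input proxy: 120 FP over 10 staff-months
print(productivity(120, 10))  # 12.0 FP per staff-month
```

Everything that follows in the talk is about how hard it is to pick defensible values for those two arguments.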

Page 17

What is “Output”?

THIS IS the key question!

• Is “lines of code” an “output” in economic terms?
• Are “function points” an “output” in economic terms?
• Is work on a project that gets cancelled “output”?
• Is work on a project that is delivered successfully but ultimately fails in the marketplace “output”?

Page 18

Candidate Outputs

• Number of product releases
• Number of products
• Revenue
• Profit
• Bug fixes
• Closed change requests
• Hours of up time / service level attained
• Support for company strategy
• Score on balanced score card

Page 19

Candidate Outputs

Most of these measures of output are:

• Impossible to measure at the individual level
• Extremely difficult to measure at the team level
• Problematic to measure even at the company level

Page 20

So What Outputs do we Measure?

We measure proxies, which makes the measures of output approximations at best.

Page 21

Candidate Inputs

• Technical staff hours
• Technical staff cost (which is not exactly the same as staff hours)
• Business staff hours needed to support technical staff, e.g., in defining requirements *
• Business staff hours needed to support technical staff, e.g., manager travel time to India, etc. *
• Investment in travel *
• Investment in hardware and other infrastructure, especially communications infrastructure *
• Delta in technical debt pre-project vs. post-project *

* Difficult to measure above the individual contributor level

Page 22

Even the Easy Inputs are More Difficult to Measure Than You Think

For example: Technical staff hours

Page 23

Observations about the Simple Definition of Productivity

• The definitions of Input and Output change from the individual level to the workgroup level, to the team level, to the business-unit level, to the company level
• Attaining business buy-in to the definitions of Input and Output at any of these levels is typically a challenging task

Page 24

Observations about the Simple Definition of Productivity

Note that we have not found any great solutions even to the problem of defining what input or output we want to measure, much less actually measuring it.

Page 25

Measurement

Page 26

Underlying 10x Differences in Productivity

Page 27

Origin of 10x

Fittingly for the purposes of this talk, “10x” originated in an attempt to measure productivity. We have to talk about “10x” first, because everything else in measuring software productivity depends on understanding it.

Page 28

Sackman, Erickson, Grant, 1968

• The purpose of their study was to obtain data on differences between online and offline programming performance
• The original goal of their research was thwarted: individual productivity differences drowned out any differences attributable to online vs. offline performance

Page 29

Sackman, Erickson, Grant, 1968

• All programmers had at least 7 years of experience
• Range of initial coding times: 20:1
• Range of “debugging” times: 25:1
• Range of program sizes produced: 5:1
• Range of program execution speeds: 10:1

These differences have continued to plague software engineering research since 1968.

Page 30

Differences in Productivity

[Chart: bar graph comparing the productivity of Team A and Team B]

Page 31

Differences in Methods

[Chart: same productivity comparison; Team A used pair programming, Team B used formal inspections]

Which method is better?

Page 32

Differences in Capability

[Chart: same comparison; Team A had star performers, Team B had average performers]

Now which method is better?

Page 33

Differences in Capability

[Chart: Team A’s normal range of productivity vs. Team B’s normal range]

Now which method is better?

Page 34

Effect of Variations in Human Productivity on Measuring Productivity

[Chart: typical variation in method productivity (~20%) is dwarfed by typical variation in individual productivity (20:1) and team productivity (3-10:1)]
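A quick Monte Carlo sketch (my illustration; the log-uniform distribution is an assumption, not data from the talk) shows why this gap matters: when individual productivity spans a 20:1 range, a method worth ~20% barely shifts head-to-head comparisons between teams.

```python
import random

random.seed(0)

def output_with(method_boost: float) -> float:
    # Draw individual productivity log-uniformly across a 20:1 range,
    # then apply the method's multiplier.
    return (20 ** random.random()) * method_boost

# Team A's method is worth +20%; Team B's method is worth nothing.
trials = 10_000
a_wins = sum(output_with(1.2) > output_with(1.0) for _ in range(trials))
print(a_wins / trials)  # only slightly above 0.5: individual variation dominates
```

In a single A-vs-B comparison, the “better” method loses nearly half the time, which is exactly the confounding problem the next slides describe.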

Page 35

Selected Other Research

| Study | Observed Ratio |
|---|---|
| Sackman, Erickson, & Grant, 1968 | 5:1 to 28:1* |
| Daly, Brooks, et al., 1996 | 3.2:1 to 7.3:1* |
| Cartwright & Sheppard, 1998 | 1.9:1 to 2.2:1* |
| Curtis, 1981 | 7.8:1 to 22.3:1* |
| DeMarco & Lister, 1985 | 5.6:1* |
| Humphrey, 1995 | 20.4:1* |
| Boehm, 1981 | 4.2:1† |
| Boehm, 2000 | 5.3:1† |
| Card, 1987 | 8.8:1 to 21.6:1* |

* Individual productivity † Team productivity

Page 36

Why Does This Matter?

Why do you want to measure productivity?

If you’re measuring to assess the impact of a process, practice, or environmental factor:

• The measurement will be subject to the confounding factor of 10x variation
• The measurement is not likely to be valid

Page 37

Why Does This Matter?

Why do you want to measure productivity?

Even if you’re measuring to assess individual or team productivity per se, the measurement will be confounded by process, practice, and environmental differences between projects

The measurement is not likely to be accurate

Page 38

Measurement in Research vs. Commercial Settings

Research (academia) vs. commercial settings:

• The purposes of measurement are different
• The possible ways to measure are different
• The problem just described is a problem even in academic research
• The problem becomes much more significant in commercial settings, because of additional confounding factors

This presentation focuses on measuring productivity in a commercial setting.

Page 39

Evaluating Measures of Individual Productivity

Page 40

Evaluating Productivity Measures

• We’ll look at specific measures
• We’ll define criteria for evaluating (scoring) the measures
• I’ll present a score card

Page 41

Common Individual Productivity Measures (for Developers)

• Lines of Code / staff month (LOC/SM)
• Function Points / staff month (FP/SM)
• Story points / staff month
• 360 degree peer evaluations
• Manager evaluation
• Task-completion predictability
• Test cases passed
• Defect counts

Page 42

Criteria for a Good Individual Productivity Measurement

• Measurement truly reflects “productivity”
• Directly or indirectly accounts for all work output
• Useful for measuring work of non-programmers (e.g., testers), directly or indirectly
• Resists “gaming” by individual contributors
• Strongly correlated with business value created
• Objective, independently verifiable
• Measures "output" the same, regardless of programming language used
• Supports cross-project comparisons
• Accounts for best people getting most difficult assignments
• Data can be collected easily and cheaply

Page 43

Measurement Considerations

Measurement truly reflects “productivity”

Page 44

Measurement Considerations

Directly or indirectly accounts for most or all work output

Page 45

Measurement Considerations

Useful for measuring work of non-programmers (e.g., testers, documenters, scrum masters, business analysts, etc.), directly or indirectly

Page 46

Measurement Considerations

Resists “gaming” by Individual Contributors

[Cartoon: “I hope this drives the right behavior.” / “I’m going to code myself a new minivan this afternoon.”]

Page 47

Measurement Considerations

Resists “gaming” by individual contributors:

• This is a big, big, big deal with simplistic measurement approaches
• “What gets measured gets done” (what doesn’t get measured doesn’t get done)
• Count on work sliding from measured activities into unmeasured activities
• Avoid single-dimension productivity measures

Page 49

Measurement Considerations

Objective, repeatable, independently verifiable

Page 50

Measurement Considerations

Measures "output" the same, regardless of programming language used

| Language | LOC/FP |
|---|---|
| C | 128 |
| C++ | 53 |
| Cobol | 91 |
| Java | 53 |
| VB | 29 |
| ... | |
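To see why this criterion matters, here is a small sketch (mine, not the talk’s) built on the slide’s LOC-per-FP figures: the same functional size implies wildly different LOC “output” depending on language, so LOC-based measures are not language-neutral.

```python
# Average LOC needed per function point, per the slide
LOC_PER_FP = {"C": 128, "C++": 53, "Cobol": 91, "Java": 53, "VB": 29}

def estimated_loc(function_points: int, language: str) -> int:
    """LOC 'output' implied by the same functional size in a given language."""
    return function_points * LOC_PER_FP[language]

# The same 100 function points of functionality, measured in LOC:
for lang in ("C", "Cobol", "Java", "VB"):
    print(f"{lang:5s} {estimated_loc(100, lang):6d}")
# C implies more than 4x the LOC of VB for identical functional output
```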

Page 51

Measurement Considerations

Supports cross-project comparisons

Page 52

Measurement Considerations

Accounts for the best people getting the most difficult assignments

Page 53

Measurement Considerations

Data can be collected easily and cheaply

Page 54

Evaluating the Measures: Scale

Excellent / 5

Good / 4

Neither Good nor Bad / 3

Bad / 2

Terrible / 1

Page 55

Evaluating the Measures

The following scoring is subjective, but:

• It is explicit
• It is structured

Page 56

Comparison of Individual (Developer) Productivity Measures

| Criterion | LOC/SM | FP/SM | Story points / staff month | 360-degree peer reviews | Manager evaluation | Task-completion predict’y | Test cases passed | Defect rates |
|---|---|---|---|---|---|---|---|---|
| Measurement truly reflects “productivity” | 3 | 4 | 4 | 3 | 2 | 1 | 4 | 3 |
| Directly or indirectly accounts for all or most work output | 4 | 4 | 5 | 5 | 5 | 2 | 4 | 3 |
| Useful for measuring work of non-programmers (e.g., testers) | 1 | 4 | 4 | 5 | 5 | 5 | 4 | 4 |
| Resists “gaming” by individual contributors | 2 | 5 | 4 | 3 | 3 | 4 | 4 | 5 |
| Strongly correlated with business value created | 2 | 3 | 4 | 3 | 4 | 2 | 4 | 3 |
| Objective, independently verifiable | 4 | 4 | 3 | 2 | 2 | 4 | 4 | 4 |
| Measures "output" the same, regardless of programming language used | 1 | 5 | 3 | 5 | 5 | 5 | 5 | 5 |
| Supports person-to-person comparisons within a team | 3 | 4 | 4 | 5 | 4 | 4 | 4 | 4 |
| Accounts for best people getting most difficult assignments | 2 | 4 | 4 | 5 | 5 | 1 | 5 | 2 |
| Data can be collected easily and cheaply | 3 | 1 | 3 | 3 | 4 | 3 | 3 | 4 |
| Average | 2.5 | 3.8 | 3.8 | 3.9 | 3.9 | 3.1 | 4.1 | 3.7 |
| Rank | 8 | 4 | 4 | 2 | 2 | 7 | 1 | 6 |
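The table’s arithmetic can be replayed directly: each measure’s overall score is the unweighted mean of its ten criterion ratings (this sketch just re-enters the table’s numbers, column by column).

```python
# Criterion ratings (1-5) per measure, in the table's row order
ratings = {
    "LOC/SM":                  [3, 4, 1, 2, 2, 4, 1, 3, 2, 3],
    "FP/SM":                   [4, 4, 4, 5, 3, 4, 5, 4, 4, 1],
    "Story points/SM":         [4, 5, 4, 4, 4, 3, 3, 4, 4, 3],
    "360-degree peer reviews": [3, 5, 5, 3, 3, 2, 5, 5, 5, 3],
    "Manager evaluation":      [2, 5, 5, 3, 4, 2, 5, 4, 5, 4],
    "Task-completion pred.":   [1, 2, 5, 4, 2, 4, 5, 4, 1, 3],
    "Test cases passed":       [4, 4, 4, 4, 4, 4, 5, 4, 5, 3],
    "Defect rates":            [3, 3, 4, 5, 3, 4, 5, 4, 2, 4],
}
averages = {m: sum(r) / len(r) for m, r in ratings.items()}
for measure, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{measure:24s} {avg:.1f}")
# "Test cases passed" leads at 4.1; LOC/SM trails at 2.5
```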


Page 58

Observations about Individual Productivity Measures

LOC/SM is easily the worst measure (score of 2.5)

Page 59

Observations about Individual Productivity Measures

No measure does better than 4.1 on a 5-point scale

Page 60

Observations about Individual Productivity Measures

The top 6 measures are closely ranked (3.7-4.1)

Page 61

Observations about Individual Productivity Measures

“Test cases passed” resists gaming only if you have independent testing; without independent testing, its overall score drops to 3.9, i.e., it moves into a six-way tie for first place (3.7-3.9)

Page 62

Observations about Individual Productivity Measures

“Manager evaluation” does better than you might expect as a “measure”: it has the best “effort” rating, and it is normally the most readily available

Page 63

Steve’s Conclusion

The business problems that individual productivity “measurement” needs to address can be addressed more effectively by a non-measurement technique

Page 64

Evaluating Measures of Team Productivity

Page 65

Possible Team Productivity Measures

• LOC / staff month
• FP / staff month
• Story points / staff month
• 360 degree peer reviews
• Manager stack-rank evaluations *
• Project-completion predictability *
• Test cases passed
• Defect rates
• Score card *

* Different from individual measures

Page 66

Team-Level Measurement Considerations

• Measurement truly reflects “productivity”
• Directly or indirectly accounts for all work output
• Useful for measuring work of the whole team, directly or indirectly *
• Resists gaming by the team *
• Strongly correlated with business value created *
• Objective, independently verifiable
• Measures "output" the same, regardless of programming language used
• Accurately reflects output of teams working on diverse kinds of projects *
• Accounts for best teams getting most difficult assignments *
• Data can be collected easily and cheaply

* Different from individual measures

Page 67

Comparison of Team-Level Productivity Measures

| Criterion | LOC/SM | FP/SM | Story points / staff month | Manager stack-rank evals | Project-completion predict’y | Test cases passed | Defect rates | Score card |
|---|---|---|---|---|---|---|---|---|
| Measurement truly reflects “productivity” | 3 | 4 | 2 | 3 | 1 | 3 | 3 | 5 |
| Directly or indirectly accounts for all work output | 2 | 4 | 5 | 4 | 2 | 4 | 3 | 5 |
| Useful for measuring work of whole team | 1 | 4 | 4 | 3 | 5 | 4 | 4 | 5 |
| Resists gaming by the team | 2 | 5 | 1 | 4 | 5 | 2 | 5 | 5 |
| Strongly correlated with business value created | 2 | 3 | 3 | 4 | 3 | 4 | 4 | 5 |
| Objective, independently verifiable | 5 | 5 | 1 | 3 | 5 | 4 | 4 | 3 |
| Measures "output" the same, regardless of programming language | 1 | 5 | 5 | 5 | 5 | 5 | 5 | 5 |
| Accurately reflects output of teams working on diverse projects | 2 | 3 | 4 | 4 | 4 | 4 | 4 | 4 |
| Accounts for best teams getting most difficult assignments | 3 | 4 | 3 | 4 | 3 | 4 | 3 | 3 |
| Data can be collected easily and cheaply | 3 | 1 | 3 | 4 | 5 | 3 | 4 | 5 |
| Average | 2.4 | 3.8 | 3.1 | 3.8 | 3.8 | 3.7 | 3.9 | 4.5 |
| Rank | 8 | 3 | 7 | 3 | 3 | 6 | 2 | 1 |


Page 69

Observations about Measures of Team Productivity

• Again, LOC/SM is easily the worst measure (score of 2.4)
• Score card is the clear #1 (score of 4.5)
• The next 5 measures are closely ranked (3.7-3.9)
• “Test cases passed” is more susceptible to gaming at the team level than at the individual level

Page 70

Scorecard Approach: What Output do we want from the Team?

| Category | Possible Score | Actual Score |
|---|---|---|
| On-Time Delivery | 15 | 10 |
| Initial Defect Rate | 5 | 5 |
| +90 Day Defect Rate | 10 | 10 |
| % of Planned Functionality Delivered | 15 | 15 |
| Early Notification of Problems | 5 | 4 |
| Executive’s Satisfaction with Project Execution and Delivery | 5 | 5 |
| Sales’ Satisfaction with Project Execution and Delivery | 5 | 4 |
| Total | 60 | 53 (88%) |
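The tally is just a weighted checklist; a sketch of the same arithmetic, using the slide’s categories and points:

```python
# (possible, actual) points per category, from the scorecard above
scorecard = {
    "On-time delivery":                     (15, 10),
    "Initial defect rate":                  (5, 5),
    "+90 day defect rate":                  (10, 10),
    "% of planned functionality delivered": (15, 15),
    "Early notification of problems":       (5, 4),
    "Executive satisfaction":               (5, 5),
    "Sales satisfaction":                   (5, 4),
}
possible = sum(p for p, _ in scorecard.values())
actual = sum(a for _, a in scorecard.values())
print(f"{actual}/{possible} = {actual / possible:.0%}")  # 53/60 = 88%
```

Changing the "Possible Score" weights is how the scorecard gets tailored to organizational goals, as the next slide notes.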

Page 71

Comments about the Scorecard

• The only significant weakness of the score card is its independent verifiability
• It is not “measurement” per se, but:
  • It is structured
  • It avoids over-optimization for one measure
  • It can provide for cross-project comparisons
  • It can be made public and can be reviewed
• The score card can and should be tailored to support organizational goals

Page 72

Conclusions

Page 73

Conclusions

True productivity measures in software are significantly limited:

• Agreeing on a definition of productivity is significantly problematic
• Meaningful outputs are difficult or impossible to measure
• Real business outputs are rarely measured; instead, proxies for the real output are measured, and those proxies are subject to significant measurement error
• Inputs are subject to significant measurement error
• The whole measurement exercise is subject to massive measurement error because of the “10x variation” phenomenon

Page 74

The End of Our Journey

The questions that businesses want to address through measuring productivity can be addressed effectively through non-measurement or quasi-measurement approaches.

These alternative approaches stack up very favorably vs. measurement, especially when you account fully for the limitations involved in true measurement.

Page 75

Construx 10900 NE 8th Street, Suite 1350 Bellevue, WA 98004 +1 (866) 296-6300 www.construx.com

Construx Software is committed to helping individuals and organizations improve their software development practices. For information about our training and consulting services, contact [email protected].

Page 76

ACM: The Learning Continues…

Questions about this webcast? [email protected]

• ACM Learning Webinars (on-demand archive): http://learning.acm.org/webinar
• ACM Learning Center: http://learning.acm.org
• ACM SIGSOFT: http://sigsoft.org/
• ACM Queue: http://queue.acm.org/