
Page 1

CMMI®--Version 1.2 and Beyond!

Mike Phillips
CMMI Program Manager
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890

March 15, 2005

Sponsored by the U.S. Department of Defense
© 2005 by Carnegie Mellon University

® CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

Page 2

CMMI Today

Version 1.1 CMMI Product Suite in use for over 3 years.

Errata sheets cover known errors and changes with book publication; FAQs cover broader issues.

CMMI Web page hits:

• 1.5M/month

• 50K/day

Page 3

CMMI Transition Status as of 2-28-05

Training
• Introduction to CMMI – 30,004 trained
• Intermediate CMMI – 1,375 trained
• Introduction to CMMI Instructors – 318 trained
• SCAMPI Lead Appraisers – 514 trained

Authorized
• Introduction to CMMI V1.1 Instructors – 253
• SCAMPI V1.1 Lead Appraisers – 364

Page 4

Intro to the CMM and CMMI Attendees (Cumulative)

[Chart: cumulative course attendees, 1995–2004 (data as of 8-31-04); series: CMM Intro, CMMI Intro, CMMI Intermediate; vertical axis 0–30,000 attendees]

Page 5

Adoption—What’s Available?

Publication of SEI Series Book with Addison-Wesley; others include:
• CMMI Distilled: Second Edition
• Systematic Process Improvement Using ISO 9001:2000 and CMMI
• Balancing Agility and Discipline
• Practical Insight into CMMI
• Interpreting the CMMI
• Real Process Improvement Using the CMMI
• Making Process Improvement Work
• CMMI: Un Itinéraire Fléché
• A Guide to the CMMI
• CMMI: A Framework…

Page 6

How about SEI Publications?

Technical notes and special reports:
• Interpretive Guidance Project: Preliminary and Final Reports
• CMMI and Product Line Practices
• CMMI and Earned Value Management
• Interpreting CMMI for Operational Organizations
• Interpreting CMMI for COTS Based Systems
• Interpreting CMMI for Service Organizations
• CMMI Acquisition Module (CMMI-AM)
• Interpreting CMMI for Marketing (in progress)
• Providing Safety and Security Assurance (in progress)
• Demonstrating the Impact and Benefits of CMMI

Page 7

Examples of Impact: Schedule

• 50% reduction in release turnaround time (Boeing, Australia)

• Increased the percentage of milestones met from approximately 50 percent to approximately 95 percent (General Motors)

• Decreased the average number of days late from approximately 50 to fewer than 10 (General Motors)

• Increased throughput, resulting in more releases per year (JP Morgan Chase)

• Met every milestone (25 in a row) on time, with high quality and customer satisfaction (Northrop Grumman Defense Enterprise Systems)

Page 8

Examples of Impact: Productivity

• Improved productivity substantially, with “significantly more rigorous engineering practices” due to CMMI (Fort Sill Fire Support Software Engineering Center)

• Improved software productivity (including reuse) from a 1992 baseline by approximately 80 percent at SW-CMM maturity level 5 in 1997 to over 140 percent at CMMI ML 5 in 2001 (Lockheed Martin Systems Integration)

• 25 percent productivity improvement in 3 years (Siemens Information Systems Ltd, India)

• Used Measurement & Analysis to realize an 11 percent increase in productivity, corresponding to $4.4M in additional value (reported under non-disclosure)

Page 9

Examples of Impact: Quality

• Reduced software defects per million delivered SLOC by over 50 percent compared to defects prior to CMMI (Lockheed Martin Systems Integration)

• Reduced defect rate at CMMI ML5 approximately one third compared to performance at SW-CMM ML5 (Lockheed Martin Maritime Systems & Sensors – Undersea Systems)

• Improved defect removal before test from 50 percent to 70 percent, leaving 0.35 post-release defects per KLOC (Siemens Information Systems Ltd, India)

• Only 2 percent of all defects found in the fielded system (Northrop Grumman Defense Enterprise Systems)

• 44 percent defect reduction following causal analysis cycle at maturity level 2 (reported under non-disclosure)
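Figures like the Siemens item above combine a pre-test removal percentage with a residual defect density; the minimal sketch below shows that arithmetic, using hypothetical inputs rather than the reported data.

```python
# Minimal sketch (hypothetical numbers, not the Siemens data): how pre-test
# defect removal and test effectiveness translate into post-release defect density.
def post_release_defects_per_kloc(injected_per_kloc: float,
                                  pre_test_removal: float,
                                  test_removal: float) -> float:
    """Defects per KLOC that escape both pre-test removal and testing."""
    escaping_pre_test = injected_per_kloc * (1.0 - pre_test_removal)
    return escaping_pre_test * (1.0 - test_removal)

# Example: 10 defects/KLOC injected, 70% removed before test,
# 88% of the remainder removed in test -> 0.36 defects/KLOC post release.
print(post_release_defects_per_kloc(10.0, 0.70, 0.88))
```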

Page 10

Examples of Impact: Return on Investment

• 5:1 ROI for quality activities (Accenture)

• 13:1 ROI calculated as defects avoided per hour spent in training and defect prevention (Northrop Grumman Defense Enterprise Systems)

• Avoided $3.72M in costs due to better cost performance (Raytheon North Texas Software Engineering)
  - As the organization improved from SW-CMM level 4 to CMMI level 5

• 2:1 ROI over 3 years (Siemens Information Systems Ltd, India)

• 2.5:1 ROI over 1st year, with benefits amortized over less than 6 months (reported under non-disclosure)
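For orientation, a ratio like the 13:1 figure (defects avoided per hour of training and defect prevention) can be sketched as benefit divided by investment; every number in the sketch below is a hypothetical placeholder, not the Northrop Grumman data.

```python
# Minimal sketch (hypothetical numbers): an ROI-style ratio computed as the
# value of defects avoided divided by the cost of training and defect prevention.
def prevention_roi(defects_avoided: int, cost_per_defect: float,
                   prevention_hours: float, cost_per_hour: float) -> float:
    benefit = defects_avoided * cost_per_defect      # rework cost not incurred
    investment = prevention_hours * cost_per_hour    # training + prevention effort
    return benefit / investment

# Example: 650 avoided defects at $2,000 each vs. 1,000 hours at $100/hour -> 13.0
print(prevention_roi(650, 2000.0, 1000.0, 100.0))
```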

Page 11

Countries where Appraisals have been Performed and Reported to the SEI

Red country names indicate new additions with this reporting.

Argentina, Australia, Belarus, Brazil, Canada, Chile, China, Colombia, Denmark, Egypt, Finland, France, Germany, Hong Kong, India, Israel, Japan, Korea (Republic of), Malaysia, New Zealand, Philippines, Russia, Singapore, South Africa, Spain, Sweden, Switzerland, Taiwan, Thailand, United Kingdom, United States

Page 12

Number of Appraisals Conducted by Year
Reported as of 5 January 2005

[Chart: appraisals conducted per year, 1987–2004; series as labeled: SPA, SEI CBA IPI, SCAMPI vX Class A; vertical axis 0–800 appraisals]

Page 13

Organization Size
Based on the total number of employees within the area of the organization that was appraised

• 25 or fewer: 10.6%
• 26 to 50: 11.4%
• 51 to 75: 9.6%
• 76 to 100: 8.4%
• 101 to 200: 16.2%
• 201 to 300: 11.1%
• 301 to 500: 10.9%
• 501 to 1000: 8.9%
• 1001 to 2000: 9.4%
• 2000+: 3.5%

1 to 100: 40.0% of organizations; 201 to 2000+: 43.8%

Based on 395 organizations reporting size data

Page 14

Maturity Profile by Organization Size
Based on the total number of employees within the area of the organization that was appraised

[Stacked bar chart: percentage of organizations at each maturity rating (Not Given, Initial, Managed, Defined, Quantitatively Managed, Optimizing) for each size band: 25 or fewer, 26 to 50, 51 to 75, 76 to 100, 101 to 200, 201 to 300, 301 to 500, 501 to 1000, 1001 to 2000, 2000+; vertical axis: % of organizations]

Based on 395 organizations reporting size data

Page 15

Disciplines Selected for Appraisals

[Bar chart: number of appraisals (0–200) by model scope: SE/SW, SW, SE/SW/IPPD/SS, SE, SE/SW/SS, SE/SW/IPPD, SW/IPPD, SE/IPPD/SS, SE/SS, SW/SS]

Based on 367 appraisals reporting coverage

SE = Systems Engineering, SW = Software Engineering, IPPD = Integrated Product and Process Development, SS = Supplier Sourcing

See http://www.sei.cmu.edu/cmmi/background/aspec.html for Allowable Models & Combinations

Page 16

Published Appraisal Results

Page 17

Aggregated Appraisal Results

Results from 18 Defence Community* appraisals conducted over the period mid 2000 – present
• *Includes Defence Industry and Department of Defence appraisal results

© Copyright Commonwealth of Australia – September 2003

Page 18

TSP CMMI Practice Profile

[Matrix: specific-practice and goal ratings (SP1.x–SP4.x, Generic Goals 2–3) for each process area: RM, RD, TS, PI, VE, VAL, CM, PPQA, MA, CAR, DAR, OEI, OPD, OPF, OID, OT, OPP, PP, PMC, IPM, QPM, SAM, RSKM, IT]

Legend
Practices: FI = Fully Implemented, LI = Largely Implemented, PI = Partially Implemented, NI = Not Implemented, NR = Not Rated
Goals: S = Satisfied, U = Unsatisfied, NR = Not Rated

Page 19

CMMI Version 1.2 Plan

“Single book, single course” strategy begun
• V1.2, like the Addison-Wesley book, will consolidate both staged and continuous representations
• Single course for “Intro to CMMI” has been created
  - First public offering in Denver prior to CMMI Conference
  - New instructors will only be trained in single course
  - Existing instructors are receiving upgrade training
  - Staged and Continuous courses will be available until planned upgrade training is complete
• Phased SCAMPI refinements will complement strategy

Pilot proposed V1.2 changes in Fall 2005
Release V1.2 Summer 2006

Page 20

Current planned enhancements

To address size and complexity:
• “Single book” approach
• Eliminate concept of advanced practices
• Eliminate concept of common features
• Improve model-method interactions for artifacts
• Clarify material based on 1000+ CRs

To enhance coverage:
• Add “hardware” amplifications to assure all of development is covered
• Add baseline coverage of “Work Environment”

Page 21

Beyond V1.2 - 1

Improved architecture will allow post-V1.2 expansion

• Extensions of the life cycle (Services, Outsourcing) could expand use of a common organizational framework
  - Allows coverage of more of the enterprise, or potential partnering organizations
  - Adapts model features to fit non-developmental efforts (e.g., CMMI-Services)

Page 22

Beyond V1.2 - 2

First such constellation, CMMI-Services, has been “commissioned” by CMMI Steering Group

• Northrop Grumman has committed to lead industry group

• Initial focus will be for organizations providing “DoD services” as well as internal IT

• Development will be in parallel with V1.2 effort

• Publication will be sequenced after V1.2 rollout

Page 23

The possible options for assessment and surveillance

[Diagram comparing three options:
• Current ISO 9001: ISO 9001 IA assessment producing a visit report; surveillance continues to demonstrate compliance with ISO 9001:2000
• Current CMMI: SCAMPI ‘A’ appraisal producing a rating letter indicating the level achieved; surveillance (Cat ‘C’ appraisal) shows no behaviours inconsistent with operating at level X
• SCAMPI ‘A’ & ISO 9001: combined appraisal producing a rating letter and/or certificate with scope indicating “… in accordance with Level X”; combined ISO surveillance using a Cat ‘C’ appraisal]

Page 24

For More Information About CMMI

• Go to CMMI Website
  - http://sei.cmu.edu/cmmi
  - http://seir.sei.cmu.edu/seir/
  - https://bscw.sei.cmu.edu/pub/bscw.cgi/0/79783
  - http://dtic.mil/ndia (first, second, and third annual CMMI Conferences)
  - http://seir.sei.cmu.edu/pars (publicly released SCAMPI summaries)

Page 25

CMMI Staged and Six Sigma

Maturity levels (staged representation):
1 Initial – process unpredictable and poorly controlled
2 Managed – process characterized for projects and is often reactive
3 Defined – process characterized for the organization and is proactive
4 Quantitatively Managed – process measured and controlled
5 Optimizing – focus on process improvement

• 6σ philosophy & method focus
• 6σ “drilldown” drives local (but threaded) improvements
• 6σ may drive toward and accelerate CMMI solution
• Infrastructure in place
• Defined processes feed 6σ
• Organization-wide 6σ improvements and control
• Correlation between process areas & 6σ methods
• 6σ used within CMMI efforts

Six Sigma is enterprise wide.
Six Sigma addresses product and process.
Six Sigma focuses on “critical to quality” factors.

Page 26

Six Sigma and CMMI Continuous

Achieve high capability in PAs that build Six Sigma skills.
• MA, QPM, CAR, OPP

Use capability to help prioritize remaining PAs

[Bar chart: capability level (0–5) by process area, with MA, QPM, CAR, OPP leading, followed by DAR, PP, PMC, SAM, IPM, RSKM, REQM, RD, TS, PI, VER, VAL, CM, PPQA, OPF, OPD, OT, OID] [Vickroy 03]

Foundational PAs

Remaining PAs ordered by business factors, improvement opportunity, etc., which are better understood using foundational capabilities. CMMI Staged groupings and DMAIC vs. DMADV are also factors that may drive the remaining order.

Page 27

Examples of Impact: Return on Investment

• 5:1 ROI for quality activities (Accenture)

• 13:1 ROI calculated as defects avoided per hour spent in training and defect prevention (Northrop Grumman Defense Enterprise Systems)

• Avoided $3.72M in costs due to better cost performance (Raytheon North Texas Software Engineering)
  - As the organization improved from SW-CMM level 4 to CMMI level 5

• 2:1 ROI over 3 years (Siemens Information Systems Ltd, India)

• 2.5:1 ROI over 1st year, with benefits amortized over less than 6 months (reported under non-disclosure)

Page 28

Improved Product Quality With Real Cost Savings

[Chart: defect find & fix effort (hours/KLOC and dollars/KLOC, 75%–105% of baseline) by phase – Architecture Design, Software Design, Code & Unit Test, Product Integration & Verification, System Integration & Verification, Deployment – comparing a SW CMM ML3 program with a CMMI ML5 program]

Improved defect find & fix: 15% decrease in defect find & fix costs

Lockheed Martin IS&S

Page 29

Higher Product Quality

Defects/KLOC by maturity level:
• Level 1: 7.5
• Level 2: 6.24
• Level 3: 4.73
• Level 4: 2.28
• Level 5: 1.05
• TSP: 0.06

Page 30

NAVAIR AV-8B

March 2000: Began current CMM-based improvement effort
Oct. 2000: Began PSP/TSP introduction sequence
Jan. 2001: First TSP team launched
May 2001: CBA-IPI: CMM level 2; 3 KPAs satisfied at level 3; level 4/5 observations on TSP
June 2001: Received draft of CMM-TSP gap analysis (levels 2 and 3 only, minus SSM and TP) to help guide improvement efforts
Feb. 2002: Received late-model gap analysis (including TP at level 3 and levels 4 and 5)
June 2002: Launched second TSP team
Sep. 2002: CBA-IPI: CMM level 4 (16 months from L2!)

See Crosstalk, Sep. 2002, “AV-8B’s Experiences Using the TSP to Accelerate SW-CMM Adoption,” Dr. Bill Hefley, Jeff Schwalb, and Lisa Pracchia.

Page 31

Overhead Rates: LM IS&S

[Chart: overhead rate as a percentage of the SW CMM L3 rate (93%–101%) for SW CMM L3, SW CMM L4, SW CMM L5, and CMMI L5]

Near the end of the SW CMM L2 period, the overhead pools were changed. A SW CMM L2 overhead is therefore not included.

CMMI Does Not Come with Overhead Baggage

Page 32

Progress during PI Effort at CMS

[Chart: work products completion (0%–100%) at milestones M1 through M3.5 for Early Planning, PP, PMC, Engineering, and Support work products]

Work product completion improved dramatically. CMS Information Services, Inc. – ML3

Page 33

One Example – Productivity

Source: Lockheed Martin SEPG Presentation 2003

Page 34

Process Improvement Pay-Off

Data source: Software Engineering Institute
Process improvement: Twelve medium to large-scale industrial, commercial, or defense industry organizations examined with regard to improvement efforts; implementation of CMMI or SW-CMM
ROI/benefit conclusions (samples across 12 organizations):
• 4.5% decline in overhead rate
• 20% reduction in average cost variance
• Increased % of milestones met from ~50% to ~95%
• 30% increase in software productivity
• 5:1 ROI for quality activities
• 13:1 ROI calculated as defects avoided per hour spent in training and defect prevention

Data source: Space and Naval Warfare Systems Center (SSC-SD)
Process improvement: Achieve SW-CMM level 3 for the SmartNet scheduling tool for High Performance Computing environments; produce a high-quality, high-reliability product while maintaining a high level of control in configuration management
ROI/benefit conclusions:
• 45% reduction in software change requests over 18 months
• Better overall performance of the software, better documentation, reduced schedule variance, higher quality, higher customer satisfaction, improved employee morale, better communication among the team

Data source: Software Productivity Research
Process improvement: Four development projects using SW-CMM in the Test Software Branches of the Oklahoma City Air Logistics Center (OC-ALC), Directorate of Aircraft, Software Division (LAS)
ROI/benefit conclusions:
• 7:1 ROI and savings of $11M over eight years
• 90% reduction in defect rates compared to baseline project
• 26% reduction in average cost of maintenance actions over 24 months

Data source: NASA Goddard Space Flight Center
Process improvement: Improvements in staff training, reduced cycle time, defect prevention, requirements definition
ROI/benefit conclusions:
• 7:1 ROI over 17 years (Project Overhead, Data Processing, Model Development)

Source: Benefits of Improvement Efforts, CMU/SEI-2004-SR, Apr 2004

Page 35

Boeing, Australia

Focus areas: quality, schedule/cycle time, product cost

Making transition to CMMI from SW-CMM and EIA 731; early CMMI pilot in Australia

RESULTS on One Project
• 33% decrease in the average cost to fix a defect
• Turnaround time for releases cut in half
• 60% reduction in work from Pre-Test and Post-Test Audits; passed with few outstanding actions
• Increased focus on product quality
• Increased focus on eliminating defects
• Developers seeking improvement opportunities

Source: “In Processes is there a Pay-Off?” Terry Stevenson, Boeing Australia, Software Engineering Australia 2003 conference.