University of Southern California
Center for Software Engineering - Tutorial:
Software Cost Estimation with COCOMO 2.0
Barry Boehm, USC
COCOMO/SCM Forum 1996
October 9, 1996
Outline
+ Steps in Software Cost Estimation - 10 years of TRW experience
+ Integrated Estimation, Measurement, and Management
  - Getting to CMM levels 4 and 5
  - Role of COCOMO 2.0
+ COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans
+ Information Sources
EXAMPLE SOFTWARE PROJECT COST HISTORY - DEVENNY, 1976
[Chart: cumulative cost vs. number of months after contract award]
Steps in Software Cost Estimation
+ Establish Objectives
  - Rough sizing
  - Make-or-Buy
  - Detailed Planning
+ Allocate Enough Time, Dollars, Talent
+ Pin Down Software Requirements
  - Documents Definition, Assumptions
+ Work out as much detail as Objectives permit
+ Use several independent techniques + sources
  - Top-Down vs. Bottom-Up
  - Algorithm vs. Expert Judgment
+ Compare and iterate estimates
  - Pin down and resolve inconsistencies
  - Be Conservative
+ Follow Up
Cost Estimating Objectives
+ Relative vs. Absolute estimates
  - Selection vs. Resource Allocation Decisions
+ Balanced Estimates
  - Degree of uncertainty vs. Size
+ Generous vs. Conservative estimates
* Key objectives to decision needs
* Re-examine and iterate objectives as appropriate
SOFTWARE ESTIMATION ACCURACY VS. PHASE
[Chart: relative cost range for size (DSI) and cost ($) estimates, narrowing from 4x/0.25x through 2x, 1.5x, and 1.25x toward x across phases and milestones: Feasibility, Plans and Rqts., Product Design, Detail Design, Devel. and Test]
Table 21-1  Software Cost-Estimating Miniproject Plan
1. Purpose. Why is the estimate being made?
2. Products and Schedules. What is going to be furnished by when?
3. Responsibilities. Who is responsible for each product? Where are they going to do the job (organizationally? geographically?)
4. Procedures. How is the job going to be done? Which cost-estimation tools and techniques will be used (see Chapter 22)?
5. Required Resources. How much data, time, money, effort, etc. is needed to do the job?
6. Assumptions. Under what conditions are we promising to deliver the above estimates, given the above resources (availability of key personnel, computer time, user data)?
Software Cost Estimation Plan: Zenith Rapid Transit System
+ Purpose. To help determine the feasibility of a computer-controlled rapid-transit system for the Zenith metropolitan area.
+ Products and Schedules.
  2/1/97  -- Cost Estimation Plan
  2/15/97 -- First cost model run
  2/22/97 -- Definitive cost model run; Expert estimates complete
  2/29/97 -- Final cost estimate report, incorporating model/expert iterations. Accuracy to within factor of 2.
+ Responsibilities.
  Cost estimation study: Z.B. Zimmerman
  Cost model support: Applications Software Department
  Expert estimators (2): Systems Analysis Department
Software Cost Estimation Plan: Zenith Rapid Transit System (Contd.)
+ Procedures. Project will use SOFCOST model, with sensitivity analysis on high-leverage cost driver attributes. Experts will contact BART personnel in San Francisco and Metro personnel in Washington, D.C. for comparative data.
+ Required Resources.
  Z.B. Zimmerman: 2 man-weeks
  Expert estimators: 3 man-days each
  Computer: $200
+ Assumptions. No major changes to system specification dated 15 January 1997. Authors of specification available to answer sizing questions.
WHAT DOES A SOFTWARE PRODUCT DO?
WHAT DOES A SOFTWARE PROJECT DO?
[Chart: summary of two teams developing a product to the same basic specification (small, interactive software cost model); effort categories include meetings and miscellaneous]
WHAT DO PROGRAMMERS DO? - BELL LABS TIME AND MOTION STUDY
Strengths and Weaknesses of Software Cost-Estimation Methods

Method: Algorithmic model
  Strengths: Objective, repeatable, analyzable formula; Efficient, good for sensitivity analysis; Objectively calibrated to experience
  Weaknesses: Subjective inputs; Assessment of exceptional circumstances; Calibrated to past, not future
Method: Expert judgment
  Strengths: Assessment of representativeness, interactions, exceptional circumstances
  Weaknesses: No better than participants; Biases, incomplete recall
Method: Analogy
  Strengths: Based on representative experience
  Weaknesses: Representativeness of experience
Method: Parkinson
  Strengths: Correlates with some experience
  Weaknesses: Reinforces poor practice
Method: Price to win
  Strengths: Often wins
  Weaknesses: Generally produces large overruns
Method: Top-down
  Strengths: System level focus; Efficient
  Weaknesses: Less detailed basis; Less stable
Method: Bottom-up
  Strengths: More detailed basis; More stable; Fosters individual commitment
  Weaknesses: May overlook system level costs; Requires more effort
Comparing and Iterating Estimates
+ Complementarity of techniques
+ Missing items
+ The "tall pole in the tent" - Pareto (80-20) principle
+ The optimist/pessimist phenomenon
SOFTWARE COST ESTIMATION ACCURACY VS. PHASE
[Chart repeated: relative cost range narrowing from 4x toward x across Feasibility, Plans and Rqts., Product Design, Detail Design, Devel. and Test]
EXAMPLE SOURCES OF UNCERTAINTY: MAN-MACHINE INTERFACE SOFTWARE
[Chart: classes of people and data sources that resolve uncertainty across milestones (Concept of Operation, Rqts. Spec., Product Design Spec., Detail Design Spec., Accepted Software); example sources of uncertainty include response times, internal data structure, buffer handling techniques, detailed scheduling algorithms, error handling, and programmer spec understanding]
Judging the Soundness of Cost Estimates: Some Review Questions
+ How would the estimated cost change if:
  - There was a big change in the interface?
  - We had to supply a more extensive capability?
  - Our workload estimates were off by 50%?
  - We had to do the job on a different hardware configuration?
+ Suppose our budget was 20% (less, greater). How would this affect the product?
+ How does the cost of this item break down by component?
Judging the Optimism of Cost Estimates: Some Review Questions
+ How do the following compare with previous experience:
  - Sizing of subsystems?
  - Cost driver ratings?
  - Cost per instruction?
  - Assumptions about hardware? GFE? Subcontractors? Vendor software?
+ Have you done a cost-risk analysis?
  - What are the big-risk items?
+ Suppose we had enough money to really do this job right
  - How much would we need?
Software Cost Estimation: Need for
+ Estimating inputs are imperfect
+ Estimating techniques are imperfect
+ Project may not fit model
  - Incremental development
+ Software projects are volatile
+ Software is an evolving field
  - COTS, objects, cyberspace, new processes
Management-Relevant, Hypothesis-Driven Data: TRW Data Collection Policy
+ Identify major project milestones
+ At each milestone:
  - Re-do cost model inputs
  - Re-run cost model
  - Collect actuals corresponding to cost model outputs
  - Compare results
Outline
+ Steps in Software Cost Estimation - 10 years of TRW experience
* + Integrated Estimation, Measurement, and Management
  - Getting to CMM levels 4 and 5
  - Role of COCOMO 2.0
+ COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans
+ Information Sources
Key Process Areas by Level: Level 4 (Managed)
+ Process Measurement and Analysis
  - Statistical process control
  - Cost/schedule/quality tradeoffs understood
  - Causes of variation identified and controlled
+ Quality Management
  - Measurable quality goals and priorities
  - Quality goals drive product and process plans
  - Measurements used to manage project
Key Process Areas by Level: Level 5 (Optimizing)
+ Defect Prevention
  - Defect sources identified and eliminated
+ Technology Innovation
  - Processes for technology tracking, evaluation, assimilation
  - Assimilation tied to process quality and productivity improvements
+ Process Change Management
  - Commitment to process improvement goals
  - Evidence of measured improvements
  - Improvements propagate across the organization
Role of COCOMO 2.0
+ Supports modern application of TRW data collection policy
  - Objects, GUI builders, reuse, new processes
  - COTS integration, quality estimation underway
+ Use data to recalibrate model
+ Use model to optimize new technology investments
COCOMO 2.0 Long Term Vision
[Flowchart: system objectives (functionality, performance, quality) and corporate parameters (tools, processes, reuse) feed COCOMO 2.0, which produces cost, schedule, and risk estimates and milestone expectations; each project is rescoped or executed against those milestones; milestone results feed back to revise milestones, plans, and resources, accumulate COCOMO 2.0 calibration data, recalibrate COCOMO 2.0, and evaluate corporate software improvement strategies]
Outline
+ Steps in Software Cost Estimation - 10 years of TRW experience
+ Integrated Estimation, Measurement, and Management
  - Getting to CMM levels 4 and 5
  - Role of COCOMO 2.0
* + COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans
+ Information Sources
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
+ Modes Replaced by Exponent Drivers
+ Stronger Reuse/Reengineering Model
+ Other Model Improvements
+ Activity Sequence
COCOMO Baseline Overview I
[Diagram: COCOMO inputs -- software product size estimate; software product, process, computer, and personnel attributes; software reuse, maintenance, and increment parameters; software organization's project data. COCOMO outputs -- software development, maintenance cost and schedule estimates; cost, schedule distribution by phase, activity, increment; COCOMO recalibrated to organization's data]
COCOMO Baseline Overview II
+ Open interfaces and internals
  - Published in Software Engineering Economics, Boehm, 1981
+ Numerous implementations, calibrations, extensions
  - Incremental development, Ada, new environment technology
  - Arguably the most frequently-used software cost model worldwide
Partial List of COCOMO Packages -From STSC Report, Mar. 1993
+ CB COCOMO + COCOMOID + COCOMO1 + CoCoPro + COSTAR + COSTMODL
+ GECOMO Plus + GHL COCOMO + REVIC + SECOMO + SWAN
COCOMO 2.0 Objectives
+ Retain COCOMO internal, external openness
+ Develop a 1990-2000's software cost model
  - Addressing full range of market sectors
  - Addressing new processes and practices
+ Develop database, tool support for continuous model improvement
+ Develop analytic framework for economic analysis
  - E.g., software product line return-on-investment analysis
+ Integrate with process, architecture research
The Future of the Software Practices Marketplace
[Diagram: User programming (55M performers in US in year 2005); Application generators (0.6M); Application composition (0.7M); System integration (0.7M); Infrastructure (0.75M)]
COCOMO 2.0 Coverage of Future SW Practices Sectors
+ User Programming: No need for cost model
+ Applications Composition: Use object counts or object points
  - Count (weight) screens, reports, 3GL routines
+ System Integration; development of applications generators and infrastructure software
  - Prototyping: Applications composition model
  - Early design: Function Points and 7 cost drivers
  - Post-architecture: Source Statements or Function Points and 17 cost drivers
  - Stronger reuse/reengineering model
Applications Composition
+ Challenge:
  - Modeling rapid applications composition with graphical user interface (GUI) builders, client-server support, object libraries, etc.
+ Response:
  - Object-Point sizing and costing model
  - Reporting estimate ranges rather than point estimates
Step 1: Assess Object-Counts: estimate the number of screens, reports, and 3GL components that will comprise this application. Assume the standard definitions of these objects in your ICASE environment.

Step 2: Classify each object instance into simple, medium, and difficult complexity levels depending on values of characteristic dimensions. Use the following scheme:

For Screens (columns: # and source of data tables -- Total <4 (<2 srvr, <3 clnt); Total <8 (<3 srvr, 3-5 clnt); Total 8+ (>3 srvr, >5 clnt)):
  Number of views contained <3:   simple | simple | medium
  3-7:                            simple | medium | difficult
  >8:                             medium | difficult | difficult

For Reports (columns: # and source of data tables -- Total <4 (<2 srvr, <3 clnt); Total <8 (<3 srvr, 3-5 clnt); Total 8+ (>3 srvr, >5 clnt)):
  Number of sections contained 0 or 1:  simple | simple | medium
  2 or 3:                               simple | medium | difficult
  4+:                                   medium | difficult | difficult

Step 3: Weight the number in each cell using the following scheme. The weights reflect the relative effort required to implement an instance of that complexity level.

  Object Type      Simple  Medium  Difficult
  Screen              1       2        3
  Report              2       5        8
  3GL Component       -       -       10

Step 4: Determine Object-Points: add all the weighted object instances to get one number, the Object-Point count.

Step 5: Estimate the percentage of reuse you expect to be achieved in this project. Compute the New Object Points to be developed: NOP = (Object-Points) x (100 - %reuse) / 100.

Step 6: Determine a productivity rate, PROD = NOP/person-month, from the following scheme:

  Developer's experience and capability   Very Low  Low  Nominal  High  Very High
  ICASE maturity and capability           Very Low  Low  Nominal  High  Very High
  PROD                                        4      7     13      25      50

Step 7: Compute the estimated person-months: PM = NOP/PROD.
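The seven-step Object-Point procedure above can be sketched in code. This is a minimal illustration, not part of the tutorial: the weight and PROD tables mirror the slide, while the function names and the sample counts are invented for the example.

```python
# Step 3 weights: relative effort per object instance by complexity level.
WEIGHTS = {
    "screen":        {"simple": 1, "medium": 2, "difficult": 3},
    "report":        {"simple": 2, "medium": 5, "difficult": 8},
    "3gl_component": {"difficult": 10},  # 3GL components are rated difficult only
}

# Step 6 productivity rates (NOP per person-month).
PROD = {"very_low": 4, "low": 7, "nominal": 13, "high": 25, "very_high": 50}

def object_point_estimate(counts, reuse_pct, rating):
    """counts: {(object_type, complexity): number_of_instances}."""
    # Step 4: Object-Point count = weighted sum of classified instances.
    op = sum(n * WEIGHTS[typ][cplx] for (typ, cplx), n in counts.items())
    # Step 5: New Object Points after expected reuse.
    nop = op * (100 - reuse_pct) / 100
    # Step 7: estimated person-months.
    return op, nop, nop / PROD[rating]

# Invented example: a small screens/reports application.
counts = {("screen", "simple"): 4, ("screen", "medium"): 2,
          ("report", "medium"): 2, ("3gl_component", "difficult"): 1}
op, nop, pm = object_point_estimate(counts, reuse_pct=20, rating="nominal")
# op == 28, nop == 22.4, pm == 22.4 / 13
```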
SW Costing and Sizing Accuracy vs. Phase
[Chart: relative size (DSI) and cost ($) ranges narrowing by phase, with data points from USAF/ESD proposals and completed programs; milestones: Concept of Operation, Rqts. Spec., Product Design Spec., Detail Design Spec., Accepted Software; phases: Feasibility, Plans and Rqts., Product Design, Detail Design, Devel. and Test]
New Processes
+ Challenges
  - Cost and schedule estimation of composable mixes of prototyping, spiral, evolutionary, incremental development, other models
+ Responses
  - New milestone definitions
    >> Basic requirements consensus, life-cycle architecture
  - Replace development modes by exponent drivers
  - Post-Initial Operational Capability (IOC) evolution
    >> Drop maintenance/reuse distinction
Process/Anchor Point Examples
[Table: anchor point milestones LCO, LCA, and IOC mapped to example processes -- rapid application development, waterfall, incremental development, evolutionary development, spiral, design-to-cost, etc.]
Life-Cycle Architecture as Critical Milestone

Candidate Critical Milestone | Concerns
- Complete Requirements Specifications: Inflexible point solutions; Premature decisions, e.g. user interface features
- Beta-Test Code: Gold plating; High-risk downstream functions; Capabilities too far from user needs
- Life-Cycle Architecture: Provides flexibility to address concerns above. Further concerns: Predicting directions of growth, evolution; High-risk architectural elements
Early Design and Post-Arch Model
+ Effort: PM = A x (Size)^B x Product of EM(i)
+ Size
  - KSLOC (Thousands of Source Lines of Code)
  - UFP (Unadjusted Function Points)
  - EKSLOC (Equivalent KSLOC), used for adaptation
+ SF: Scale Factors (5)
+ EM: Effort Multipliers (7 for ED, 17 for PA)
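The Early Design / Post-Architecture effort form above can be sketched as follows. This is an illustrative skeleton only: the coefficient A, the sample scale-factor ratings, and the sample multiplier values are placeholders, not calibrated COCOMO 2.0 values.

```python
def cocomo2_effort(a, ksloc, scale_factor_ratings, effort_multipliers):
    """PM = A * (Size)^B * product(EM_i), B from the five scale factors."""
    # B = 1.01 + 0.01 * sum of the five scale-factor ratings (each 0..5).
    b = 1.01 + 0.01 * sum(scale_factor_ratings)
    pm = a * ksloc ** b
    for em in effort_multipliers:  # 7 EMs for Early Design, 17 for Post-Arch
        pm *= em
    return b, pm

# Placeholder inputs: 50 KSLOC, mid-range scale factors, two sample EMs.
b, pm = cocomo2_effort(a=3.0, ksloc=50,
                       scale_factor_ratings=[3, 2, 4, 3, 3],
                       effort_multipliers=[1.10, 0.88])
```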
IMPORTANCE OF CLEAR DEFINITIONS
SCOPE         - Include Conversion? Training? Documentation?
ENDPOINTS     - Include requirements analysis? Full validation?
INSTRUCTIONS  - Source? Object? Executable? Comments?
PRODUCT       - Application? Support S/W? Test Drivers?
"DEVELOPMENT" - Include reused software? Furnished Utility S/W?
"MAN-MONTHS"  - Include Clerical and Support Personnel? Vacation?
These can bias results by factors of 2-10
Unadjusted Function Points

Step 1: Determine function counts by type. The unadjusted function counts should be counted by a lead technical person based on information in the software requirements and design documents. The number of each of the five user function types should be counted: Internal Logical File (ILF), External Interface File (EIF), External Input (EI), External Output (EO), and External Inquiry (EQ).

Step 2: Determine complexity-level function counts. Classify each function count into Low, Average, and High complexity levels depending on the number of data element types contained and the number of file types referenced. Use the following scheme:

For ILF and EIF (columns: data elements 1-19 | 20-50 | 51+):
  Record elements 1:    Low | Low | Average
  2-5:                  Low | Average | High
  6+:                   Average | High | High

For EO and EQ (columns: data elements 1-5 | 6-19 | 20+):
  File types 0 or 1:    Low | Low | Average
  2-3:                  Low | Average | High
  4+:                   Average | High | High

For EI (columns: data elements 1-4 | 5-15 | 16+):
  File types 0 or 1:    Low | Low | Average
  2-3:                  Low | Average | High
  3+:                   Average | High | High

Step 3: Apply complexity weights. Weight the number in each cell using the following scheme. The weights reflect the relative value of the function to the user.

  Function Type              Low  Average  High
  Internal Logical Files      7      10     15
  External Interface Files    5       7     10
  External Inputs             3       4      6
  External Outputs            4       5      7
  External Inquiries          3       4      6

Step 4: Compute Unadjusted Function Points. Add all the weighted function counts to get one number, the Unadjusted Function Points.
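Steps 3-4 of the function-point procedure above reduce to a weighted sum. A minimal sketch, with the weight table from the slide and invented example counts:

```python
# Step 3 weights: relative value of each function type by complexity level.
UFP_WEIGHTS = {
    "ILF": {"low": 7, "average": 10, "high": 15},
    "EIF": {"low": 5, "average": 7,  "high": 10},
    "EI":  {"low": 3, "average": 4,  "high": 6},
    "EO":  {"low": 4, "average": 5,  "high": 7},
    "EQ":  {"low": 3, "average": 4,  "high": 6},
}

def unadjusted_function_points(counts):
    """counts: {(function_type, complexity): count} -> UFP total (Step 4)."""
    return sum(n * UFP_WEIGHTS[ftype][cplx]
               for (ftype, cplx), n in counts.items())

# Invented example counts from Steps 1-2.
ufp = unadjusted_function_points({
    ("ILF", "average"): 2,   # 2 * 10
    ("EI",  "low"):     5,   # 5 * 3
    ("EO",  "average"): 3,   # 3 * 5
    ("EQ",  "high"):    1,   # 1 * 6
})
# ufp == 20 + 15 + 15 + 6 == 56
```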
Source Lines of Code - Measurement Unit: Physical source lines
[SEI source-statement definition checklist. When a line or statement contains more than one type, classify it as the type with the highest precedence. Statement Type (in order of precedence): 1. Executable; 2. Nonexecutable; 3. Declarations; 4. Compiler directives; 5. Comments; 6. comments on their own lines; 7. comments on lines with source code; 8. banners and nonblank spacers; 9. blank (empty) comments; 10. blank lines. How Produced: programmed; generated with source code generators; converted with automated translators; copied or reused without change; modified. Origin: new work (no prior existence); prior work taken or adapted from a previous version, build, or release; COTS other than libraries; government furnished software (GFS) other than reuse libraries; another product; vendor-supplied language support library or operating system (unmodified); local or modified language support library or operating system; other commercial library; a reuse library (software designed for reuse); other software component or library. Check marks in the original indicate which categories are included or excluded from the count.]
Size Adjustment
+ Breakage
+ Nonlinear effects of reuse:
  AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
  ESLOC = ASLOC [AA + AAF(1 + 0.02(SU)(UNFM))] / 100 , AAF <= 0.5
  ESLOC = ASLOC [AA + AAF + (SU)(UNFM)] / 100 , AAF > 0.5
+ Automatic translation of code:
  PM = ASLOC (AT/100) / ATPROD
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
* + Modes Replaced by Exponent Drivers
+ Stronger Reuse/Reengineering Model
+ Other Model Improvements
+ Activity Sequence
Project Scale Factors (in B for PM(estimated) = A x (Size)^B):

PREC (Precedentedness): thoroughly unprecedented / largely unprecedented / somewhat unprecedented / generally familiar / largely familiar / thoroughly familiar
FLEX (Development Flexibility): rigorous / occasional relaxation / some relaxation / general conformity / some conformity / general goals
RESL (Architecture/Risk Resolution): little (20%) / some (40%) / often (60%) / generally (75%) / mostly (90%) / full (100%)
TEAM (Team Cohesion): very difficult interactions / some difficult interactions / basically cooperative interactions / largely cooperative / highly cooperative / seamless interactions
PMAT (Process Maturity): weighted sum of KPA compliance levels
Nature of Life-Cycle Architecture
+ Definition of life-cycle requirements envelope
+ Definition of software components and relationships
  - physical and logical
  - data, control, timing relationships
  - shared assumptions
  - developer, operator, user, manager views
+ Evidence that architecture will accommodate requirements envelope
+ Resolution of all life-cycle-critical COTS, reuse, and risk issues
New Scaling Exponent Approach
+ Nominal person-months = A*(size)**B
  - B = 1.01 + 0.01 x Sum (exponent driver ratings)
  - B ranges from 1.01 to 1.26
  - 5 drivers; ratings from 0 to 5
+ Exponent drivers:
  - Precedentedness
  - Development flexibility
  - Architecture/risk resolution
  - Team cohesion
  - Process maturity (being derived from SEI CMM)
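A quick numeric check of the exponent range stated above, under the slide's formula with five drivers each rated 0 to 5 (the function name is ours):

```python
def scaling_exponent(ratings):
    """B = 1.01 + 0.01 * sum of the five exponent driver ratings (0..5 each)."""
    assert len(ratings) == 5 and all(0 <= r <= 5 for r in ratings)
    return 1.01 + 0.01 * sum(ratings)

b_best = scaling_exponent([0, 0, 0, 0, 0])   # 1.01: best-case ratings
b_worst = scaling_exponent([5, 5, 5, 5, 5])  # 1.26: worst-case diseconomy of scale

# At 100 KSLOC the exponent spread alone changes nominal effort by
# 100**0.25, i.e. roughly a factor of 3.16.
ratio = (100 ** b_worst) / (100 ** b_best)
```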
Reuse and Product Line Management
+ Challenges
  - Estimate costs of both reusing software and developing software for future reuse
  - Estimate extra effects on schedule (if any)
+ Responses
  - New nonlinear reuse model
  - Cost of developing reusable software expanded from Ada COCOMO
  - Gathering schedule data
Nonlinear Reuse Effects
[Chart: relative cost vs. amount of software modified, based on data on 2954 NASA modules; observed relative costs rise nonlinearly, well above the usual linear assumption]
Reuse and Reengineering Effects
+ Add Assessment & Assimilation increment (AA)
  - Similar to conversion planning increment
+ Add software understanding increment (SU)
  - To cover nonlinear software understanding effects
  - Apply only if reused software is modified
+ Results in revised Adaptation Adjustment Factor (AAF)
  - Equivalent DSI = (Adapted DSI) * (AAF/100)
  - AAF = AA + SU + 0.4(DM) + 0.3(CM) + 0.3(IM)
Prototype of Software Understanding Rating Increment

Very Low (SU increment to ESLOC: 50)
  Structure: Very low cohesion, high coupling, spaghetti code.
  Application Clarity: No match between program and application world views.
  Self-Descriptiveness: Obscure code; documentation missing, obscure or obsolete.
Low (SU increment: 40)
  Structure: Moderately low cohesion, high coupling.
  Application Clarity: Some correlation between program and application.
  Self-Descriptiveness: Some code commentary and headers; some useful documentation.
Nominal (SU increment: 30)
  Structure: Reasonably well-structured; some weak areas.
  Application Clarity: Moderate correlation between program and application.
  Self-Descriptiveness: Moderate level of code commentary, headers, documentation.
High (SU increment: 20)
  Structure: High cohesion, low coupling.
  Application Clarity: Good correlation between program and application.
  Self-Descriptiveness: Good code commentary and headers; useful documentation; some weak areas.
Very High (SU increment: 10)
  Structure: Strong modularity, information hiding in data/control structures.
  Application Clarity: Clear match between program and application world views.
  Self-Descriptiveness: Self-descriptive code; documentation up-to-date, well-organized, with design rationale.
Proposed Reuse / Maintenance Model (Change <= 20%)

AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
ESLOC = ASLOC [AA + AAF(1 + 0.02(SU)(UNFM))] / 100 , AAF <= 0.5
ESLOC = ASLOC [AA + AAF + (SU)(UNFM)] / 100 , AAF > 0.5

UNFM: Unfamiliarity with Software modifier
  0.0 - Completely familiar
  0.2 - Mostly familiar
  0.4 - Somewhat unfamiliar
  0.6 - Considerably unfamiliar
  0.8 - Mostly unfamiliar
  1.0 - Completely unfamiliar
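The two-branch reuse sizing formula above can be sketched as follows. Assumptions: AA, DM, CM, IM, and SU are taken as percentages (so the AAF branch threshold of 0.5 becomes 50 in percent terms), UNFM is the 0.0-1.0 modifier from the scale above, and the example inputs are invented.

```python
def esloc(asloc, aa, dm, cm, im, su, unfm):
    """Equivalent SLOC for adapted code; percentage inputs except unfm (0..1)."""
    # Adaptation Adjustment Factor, in percent.
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im
    if aaf <= 50:  # AAF <= 0.5 when expressed as a fraction
        factor = aa + aaf * (1 + 0.02 * su * unfm)
    else:
        factor = aa + aaf + su * unfm
    return asloc * factor / 100

# Invented example: 10 KSLOC adapted, modest modification, some unfamiliarity.
e = esloc(asloc=10000, aa=2, dm=20, cm=10, im=30, su=30, unfm=0.4)
# aaf = 20; factor = 2 + 20 * 1.24 = 26.8; e == 2680.0
```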
Maintenance Model (Change > 20%)

Obtained in one of two ways depending on what is known:
  ESLOC = (BaseCodeSize x MCF) x MAF
  ESLOC = (SizeAdded + SizeModified) x MAF
where
  MCF = (SizeAdded + SizeModified) / BaseCodeSize
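The two maintenance sizing forms above are algebraically identical once MCF is substituted; a quick sketch with invented sizes makes that concrete (MAF here is just a sample multiplier, not a calibrated value):

```python
def maintenance_esloc_from_base(base_code_size, mcf, maf):
    # Form 1: Maintenance Change Factor applied to the base code size.
    return (base_code_size * mcf) * maf

def maintenance_esloc_from_deltas(size_added, size_modified, maf):
    # Form 2: direct use of the added and modified sizes.
    return (size_added + size_modified) * maf

base, added, modified, maf = 50_000, 4_000, 6_000, 1.1
mcf = (added + modified) / base                        # 0.2
e1 = maintenance_esloc_from_base(base, mcf, maf)       # 11000.0
e2 = maintenance_esloc_from_deltas(added, modified, maf)
# e1 == e2: both forms agree
```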
Legacy Software Reengineering
+ Challenge:
  - Varying reengineering efficiencies based on tool applicability, source-target system differences
+ Response:
  - Exploratory model based on NIST/IRS data
NIST Reengineering Case Study [Ruhl-Gunn, 1991]
+ IRS Centralized Scheduling Program
+ Mix of batch, database query/update, on-line processing
+ 50 KSLOC of COBOL 74, 2.4 KSLOC of assembly code
+ Unisys 1100 with DMS 1100 network database
+ Reengineer to SQL database, open systems
+ Tools for program and data reengineering
NIST Reengineering Automation Data
[Table: automated vs. manual reengineering percentages by program group -- batch; batch with SORT; batch with DBMS; batch, SORT, DBMS; interactive]
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
+ Modes Replaced by Exponent Drivers
+ Stronger Reuse/Reengineering Model
* + Other Model Improvements
+ Activity Sequence
Other Major COCOMO 2.0 Changes
+ Range versus point estimates
+ Requirements Volatility replaced by Breakage %
+ Multiplicative cost driver changes
  - Product CD's
  - Platform CD's
  - Personnel CD's
  - Project CD's
+ Maintenance model and reuse model identical
Post-Architecture EMs - Product:

RELY (Required Reliability): slight inconvenience / low, easily recoverable losses / moderate, easily recoverable losses / high financial loss / risk to human life
DATA (Database Size): DB bytes/Pgm SLOC < 10 / 10 <= D/P < 100 / 100 <= D/P < 1000 / D/P > 1000
CPLX (Complexity): see Complexity Table
RUSE (Required Reuse): none / across project / across program / across product line / across multiple product lines
DOCU (Documentation Match to Life-Cycle Needs): many life-cycle needs uncovered / some life-cycle needs uncovered / right-sized to life-cycle needs / excessive for life-cycle needs / very excessive for life-cycle needs
Post-Architecture Complexity:
[Excerpt from the complexity (CPLX) rating table --
Control operations: mostly simple nesting; some intermodule control; decision tables; simple callbacks or message passing, including middleware-supported distributed processing.
Computational operations: use of standard math and statistical routines; basic matrix/vector operations.
Device-dependent operations: I/O processing includes device selection, status checking and error processing.
Data management operations: multi-file input and single file output; simple structural changes, simple edits; complex COTS-DB queries, updates.
User interface management operations: simple use of widget set.]
Post-Architecture EMs - Platform:

TIME (Execution Time Constraint): <= 50% use of available execution time / 70% / 85% / 95%
STOR (Main Storage Constraint): <= 50% use of available storage / 70% / 85% / 95%
PVOL (Platform Volatility): major change every 12 mo., minor change every 1 mo. / major: 6 mo., minor: 2 wk. / major: 2 mo., minor: 1 wk. / major: 2 wk., minor: 2 days
Post-Architecture EMs - Personnel:

ACAP (Analyst Capability): 15th / 35th / 55th / 75th / 90th percentile
PCAP (Programmer Capability): 15th / 35th / 55th / 75th / 90th percentile
PCON (Personnel Continuity)
AEXP (Application Experience): <= 2 months / 6 months / 1 year / 3 years / 6 years
PEXP (Platform Experience): <= 2 months / 6 months / 1 year / 3 years / 6 years
LTEX (Language and Tool Experience): <= 2 months / 6 months / 1 year / 3 years / 6 years
Post-Architecture EMs - Project:

TOOL (Use of Software Tools): edit, code, debug / simple, frontend, backend CASE, little integration / basic lifecycle tools, moderately integrated / strong, mature lifecycle tools, moderately integrated / strong, mature, proactive lifecycle tools, well integrated with processes, methods, reuse
SITE (Multisite Development: Collocation): international / multi-city and multi-company / multi-city or multi-company / same city or metro area / same building or complex / fully collocated
SITE (Multisite Development: Communications): some phone, mail / individual phone, FAX / narrowband email / wideband electronic communication / wideband elect. comm., occasional video conf. / interactive multimedia
SCED (Required Development Schedule): 75% of nominal / 85% / 100% / 130% / 160%
COCOMO Model Comparisons

COCOMO 81
  Size: Delivered Source Instructions (DSI) or Source Lines of Code (SLOC)
  Reuse: Equivalent SLOC = linear f(DM, CM, IM)
  Breakage: Requirements Volatility rating (RVOL)
  Maintenance: Annual Change Traffic (ACT) = %added + %modified
  Scale (b) in PM(nom) = a(Size)^b: Organic: 1.05; Semidetached: 1.12; Embedded: 1.20
  Product Cost Drivers: RELY, DATA, CPLX
  Platform Cost Drivers: TIME, STOR, VIRT, TURN
  Personnel Cost Drivers: ACAP, AEXP, PCAP, VEXP, LEXP
  Project Cost Drivers: MODP, TOOL, SCED

Ada COCOMO
  Size: DSI or SLOC
  Reuse: Equivalent SLOC = linear f(DM, CM, IM)
  Breakage: RVOL rating
  Maintenance: ACT
  Scale: Embedded: 1.04 - 1.24, depending on degree of: early risk elimination; solid architecture; stable requirements; Ada process maturity
  Product Cost Drivers: RELY*, DATA, CPLX*†, RUSE
  Platform Cost Drivers: TIME, STOR, VMVH, VMVT, TURN
  Personnel Cost Drivers: ACAP*, AEXP, PCAP*, VEXP, LEXP*
  Project Cost Drivers: MODP*, TOOL*, SCED, SECU

COCOMO 2.0: Application Composition
  Size: Object Points
  Reuse: Implicit in model
  Breakage: Implicit in model
  Maintenance: Object Point ACT
  Scale: Implicit in model
  Cost Drivers: None

COCOMO 2.0: Early Design
  Size: Function Points (FP) and Language
  Reuse: % unmodified reuse (SR); % modified reuse: Equivalent SLOC = nonlinear f(AA, SU, DM, CM, IM)
  Breakage %: BRAK
  Maintenance: Reuse model
  Scale: 1.01 - 1.26, depending on the degree of: precedentedness; conformity; early architecture, risk resolution; team cohesion; process maturity (SEI)
  Product Cost Drivers: RCPX*†, RUSE*†
  Platform Cost Drivers: Platform difficulty: PDIF*†
  Personnel Cost Drivers: Personnel capability and experience: PERS*†, PREX*†
  Project Cost Drivers: SCED, FCIL*†

COCOMO 2.0: Post-Architecture
  Size: FP and Language or SLOC
  Reuse: Equivalent SLOC = nonlinear f(AA, SU, DM, CM, IM)
  Breakage %: BRAK
  Maintenance: Reuse model
  Scale: 1.01 - 1.26, depending on the degree of: precedentedness; conformity; early architecture, risk resolution; team cohesion; process maturity (SEI)
  Product Cost Drivers: RELY, DATA, DOCU*†, CPLX†, RUSE*†
  Platform Cost Drivers: TIME, STOR, PVOL (=VIRT)
  Personnel Cost Drivers: ACAP*†, AEXP†, PCAP*†, PEXP*†, LTEX*†, PCON*†
  Project Cost Drivers: TOOL*†, SCED, SITE*†

* Different multipliers    † Different rating scale
Other Model Refinements

+ Initial Schedule Estimation
  TDEV = 3.0 x (PM)^(0.33 + 0.2(B - 1.01))
  where PM = estimated person-months excluding SCED multiplier effects

+ Output Ranges
  - One-standard-deviation optimistic and pessimistic estimates
  - Reflect sources of uncertainty in model inputs

  Model                     Optimistic   Pessimistic
  Application Composition   0.50 E       2.0 E
  Early Design              0.67 E       1.5 E
  Post-Architecture         0.80 E       1.25 E
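The schedule equation and the output-range table can be sketched together in Python. The TDEV form is taken from published COCOMO 2.0 material; treat it and the stage names below as assumptions of this sketch.

```python
# Sketch of COCOMO 2.0 initial schedule estimation and output ranges.
# PM excludes the SCED (schedule) multiplier; b is the scale exponent.

def tdev_months(pm, b):
    """Initial calendar-schedule estimate: TDEV = 3.0 * PM^(0.33 + 0.2(b - 1.01))."""
    return 3.0 * pm ** (0.33 + 0.2 * (b - 1.01))

# One-standard-deviation optimistic/pessimistic factors per model stage.
OUTPUT_RANGES = {
    "application_composition": (0.50, 2.0),
    "early_design": (0.67, 1.5),
    "post_architecture": (0.80, 1.25),
}

def estimate_range(point_estimate, stage):
    """Return (optimistic, pessimistic) bounds around a point estimate E."""
    lo, hi = OUTPUT_RANGES[stage]
    return lo * point_estimate, hi * point_estimate
```

Note how the uncertainty band narrows from 4x (Application Composition) to about 1.6x (Post-Architecture) as the project is better defined, mirroring the cost-range cone earlier in the tutorial.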
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
+ Modes Replaced by Exponent Drivers
+ Stronger Reuse/Reengineering Model
+ Other Model Improvements
+ Activity Sequence
COCOMO 2.0 Activity Sequence
+ Cost model review
+ Hypothesis formulation (prepared, iterated)
+ Data definitions (prepared, iterated)
+ Monthly electronic data collection
  - Amadeus data collection support
+ Data archiving
+ Incremental model development and refinement
  - New USC COCOMO (baseline version prepared)
+ Related research
  - COTS integration cost/risk estimation
Data Collection
+ Quantitative collection
  - Project sizing
  - Software metrics
+ Qualitative collection
  - Scale Factors
  - Effort Multipliers
Amadeus System
+ Automated system for software
  - metric collection
  - metric analysis
  - metric reporting, graphing
  - metric prediction
+ Open architecture for extensibility
+ Supported metrics
Preliminary Analysis
+ Calibration to original COCOMO project database
  - Trend analysis is positive
  - Coefficient, A, calibrated
  - Correlation between Cost Drivers
+ Continuing analysis from other sources
Results of COCOMO 81 Database Calibration
(Figure: histogram of C2-C1 estimation error)
Outline
+ Steps in Software Cost Estimation - 10 years of TRW experience
+ Integrated Estimation, Measurement, and Management - Getting to CMM levels 4 and 5 - Role of COCOMO 2.0
+ COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans + Information Sources
COCOMO 2.0 Status and Plans
+ Current Status - Affiliates' program
+ Future plans - Public and commercial versions - Model extensions
+ Long term vision
COCOMO 2.0 Current Status
+ Model structure iterated with Affiliates
  - Largely converged
  - Recent function point backfiring refinements being considered
+ Model being calibrated to Affiliates', other data
  - Currently around 65 solid data points
+ Model being beta-tested by Affiliates
  - Reasonable results so far
(Figure: calibration error histograms)
- Error before regression
- Error after stratification: Std. Dev. = .37, Mean = .00, N = 65
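The coefficient calibration summarized above can be sketched as a log-space fit. This is a generic least-squares illustration, not the actual regression applied to the 65-project database; the fixed exponent b and the relative-error measure are assumptions of the sketch.

```python
import math

# Illustrative calibration of the multiplicative coefficient A in
# PM = A * (KSLOC)^b against actual project data. In log space,
# log PM - b * log KSLOC should be constant (= log A), so the
# least-squares fit for A is the mean of those residuals.

def calibrate_coefficient(projects, b=1.12):
    """projects: iterable of (ksloc, actual_person_months) pairs."""
    residuals = [math.log(pm) - b * math.log(ksloc) for ksloc, pm in projects]
    return math.exp(sum(residuals) / len(residuals))

def relative_error(estimated, actual):
    """(Estimate - Actual) / Actual, one common relative-error measure
    (assumed here; not necessarily the quantity histogrammed above)."""
    return (estimated - actual) / actual
```

On synthetic data generated with A = 2.5, the fit recovers the coefficient exactly; on real project data the spread of the residuals gives the kind of standard-deviation statistic reported above.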
Current COCOMO 2.0 Affiliates (27)
+ Commercial Industry (9)
  - AT&T, Bellcore, CSC, EDS, Motorola, Rational, Sun, TI, Xerox
+ Aerospace Industry (10)
  - E-Systems, Hughes, Litton, Lockheed Martin, Loral, MDAC, Northrop Grumman, Rockwell, SAIC, TRW
+ Government (3)
  - AFCAA, USAF Rome Lab, US Army Research Labs
+ FFRDC's and Consortia (5)
  - Aerospace, IDA, MCC, SEI, SPC
COCOMO 2.0 Affiliates' Program

(Diagram, rendered as a list:)
+ SEI, others: funding support
+ Affiliates provide:
  - sanitized project data
  - feedback on models, tools
  - special project definition, support
  - visiting researchers
+ USC-CSE provides:
  - COCOMO 2.0 models, tools
  - Amadeus metrics package
  - Tutorials, reports, related research
  - Special project results
  - Graduating students
+ Supporting elements: metric and process definitions, COCOMO 2.0 database, research workshops and joint projects with sponsors (USC-CSE, USC-IOM, UCI-ICS)
COCOMO 2.0: Public and Commercial Versions

+ Available once Affiliates' calibration and beta-testing stabilizes
  - Final beta-test version by COCOMO/SCM Forum
  - Full public version January 1997
+ Public version: USC COCOMO 2.0 tool
  - will do basic estimation and calibration functions
  - won't include extensive amenities, planning aids
+ Commercial COCOMO vendors being provided with pre-release COCOMO 2.0 parameters, source code
COCOMO 2.0 Plans: Model Extensions
+ COTS integration costs
+ Cost/schedule/quality tradeoffs
+ Activity distribution
+ Domain-tailored models
+ Specialized life-cycle stage estimation
  - Hardware-software integration, operational testing
+ Sizing improvements
+ Project dynamics modeling
+ Reuse, tool, process return-on-investment models
Long Term Vision:
+ Produce model covering trade-offs among software cost, schedule, functionality, performance, quality
+ Enable the use of the model to scope projects and monitor project progress
+ Use the data from COCOMO-assisted management to calibrate improved versions of COCOMO
+ Use the data and improved COCOMO cost drivers to enable affiliates to formulate and manage software improvement investments
Information Sources:
+ Anonymous FTP: usc.edu, in directory /pub/soft_engineering/cocomo2
+ WWW: http://sunset.usc.edu/COCOMO2.0/Cocomo.html
  - Upcoming COCOMO events, Research Group
  - COCOMO 81 calculator
  - COCOMO 2.0 User's Manual