An Acquirer's Guide to Navigating Contractor Data
Jeannine Siviy, Wolf Goethert, Robert Ferguson
Software Engineering Measurement & Analysis Initiative
Carnegie Mellon Software Engineering Institute, Pittsburgh, PA 15213-3890
Sponsored by the U.S. Department of Defense
© 2004 by Carnegie Mellon University, Version 1.0
Trademarks and Service Marks
® Capability Maturity Model, Capability Maturity Modeling, Carnegie Mellon, CERT, CERT Coordination Center, CMM, and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
SM Architecture Tradeoff Analysis Method; ATAM; CMM Integration; CURE; IDEAL; Interim Profile; OCTAVE; Operationally Critical Threat, Asset, and Vulnerability Evaluation; Personal Software Process; PSP; SCAMPI; SCAMPI Lead Assessor; SCAMPI Lead Appraiser; SCE; SEI; SEPG; Team Software Process; and TSP are service marks of Carnegie Mellon University.
Objectives
• Establish a view of the acquirer and supplier/contractor roles and responsibilities.
• Show how measurement and analysis skills for internal development can be recast for acquisition and contracting environments.
• Address a prevalent question in the acquisition community: How can we conduct causal analysis when we no longer control the collection processes and/or data?
Outline
• Acquisition roles & responsibilities
• Measurement & analysis methods
• Illustration
  - background
  - monitoring and oversight: progress analysis
• Summary
• References
Responsibility and Authority
Measuring project and product success is the same whether the project is internal or contracted:
• on schedule
• at cost
• with required functionality
• without defects

The acquiring program manager's "circle of influence" and "circle of control" are different from the development project manager's.
• the development project manager addresses project execution
• the acquisition program manager executes a new set of processes
• the acquisition program manager should leverage development knowledge to manage the contract methodically, rationally, and knowledgeably
Roles and Information Exchange
[Diagram: acquirer and supplier/developer information exchange. Pre-award activities include RFP preparation and contract award (the contractual handshake). Post-award, functional and quality requirements flow to the supplier/developer (and its subcontractors), who develops, customizes, and integrates systems, software, and COTS. Status information, interim documents and tangibles, and deliverables flow back to the acquirer, who monitors project progress, evaluates deliverables, and returns directions and corrections. Each organization maintains its own SEPG / SAPG.]
Measuring Project, Product, Process
[Diagram: the measurement landscape. Across the contractual handshake, the acquirer (pre-award and post-award activities) and the supplier (develop, customize, integrate: systems, software, COTS) exchange indicators and information for tracking, monitoring, and direction. What to measure:
• Project: schedule, cost, requirements satisfaction (status, projection, trend), quality (amount of rework)
• Products: supplier produced; acquisition organization produced
• Processes
• Relationship: roles (changes), invoicing (payment)]
Responsibilities After Contract Award
[Diagram: acquirer responsibilities (post-contract award). The contractor develops the system and provides deliverables:
• Documents: SRD, SDP, Measurement Plan, SDD
• Status reports: schedule, cost, testing
• Final product
The acquirer:
• evaluates the quality of deliverables
• monitors and oversees schedule & progress, resources & costs, and the developer's processes]
Monitoring & Oversight
[Diagram: status information feeds the acquirer's analysis & review process, which applies the acquirer's evaluation criteria to produce indicators.]
Status Information
- schedule progress
- budget status
- test results
- process results, such as inspections
- process compliance
Measurable Results (Examples)
• contractor effort actual vs. plan
• contractor schedule actual vs. plan
• defects reported
  - description, severity, class, type
• size, complexity of the work product
Evaluating Quality of Deliverables
[Diagram: documents to review and final deliverables feed the acquirer's inspection or review process, which applies the acquirer's evaluation criteria to produce indicators for both process and products.]
Documents to review
- SRD
- SDP
- Measurement Plan
- SDD
Measurable Results (Examples)
• defects discovered
  - description, severity, class, type
• size of the work product
• effort invested in the inspection process
• time spent during the inspection activities
Outline
• Acquisition roles & responsibilities
• Measurement & analysis methods
• Illustration
  - background
  - monitoring and oversight: progress analysis
• Summary
• References
Sources for Measures
Goal-Driven (Software) Measurement (GDM / GQIM): Goals -> Questions -> Indicators -> Measures. The user defines the indicators and measures, based on:
• what is needed to manage the user's goals
• decisions and decision criteria related to managing the user's goals

Practical Software & Systems Measurement (PSM): Common Issue Area -> Measurement Category -> Measures, all predefined.
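To make the GQIM derivation concrete, here is a minimal sketch in Python; the goal, questions, indicators, and measures are hypothetical examples, not items defined by the tutorial.

    # Hypothetical GQIM derivation for an acquirer's schedule goal.
    # Everything in this mapping is illustrative only.
    gqim = {
        "goal": "Deliver Build 1 on the contracted schedule",
        "questions": [
            "Is earned value tracking planned value?",
            "Are components completing assembly & test as planned?",
        ],
        "indicators": [
            "EV / PV / AC over time",
            "components completing A&T, planned vs. actual",
        ],
        "measures": ["planned value", "actual cost", "earned value",
                     "components completing A&T per month"],
    }

    for question in gqim["questions"]:
        print(f"{gqim['goal']} -> {question}")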
Data Analysis Dynamics
Getting Started
• Identify the goals
• Black box process view
• Is the data right?
• Do I have the right data?
Decision point:
• If the data is not perfect, do I move forward or obtain better data?

Initial Evaluation
• What should the data look like?
• What does the data look like?
• Can I characterize the process, product, problem?
Decision point:
• Can I address my goals right now?
• Or is additional analysis necessary, at the same or deeper level of detail?
• Can I move forward?

Moving Forward
• Further evaluation
• Decompose data, process
Decision point:
• Do I take action?
• What action do I take?

Repeat until the root cause is found, at target with desired variation.
[DAD 03]
Performance Analysis Model
[Diagram: the performance analysis model relates the indicator categories Schedule and Progress, Resources and Cost, Growth and Stability, Product Quality, Development Performance, Technical Adequacy, and Customer Satisfaction.]
[PSM 00]
Performance Analysis Checklist 1
Single indicator issues:
• Do actual trends correspond to planned trends, such as progress, growth, and expenditures? How big is the variance?
• Does the variance appear to be gradually growing each month?
• Are actual values exceeding planned limits, such as open defects, changes, and resource utilization?
[PSM 00]
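A minimal sketch of these single-indicator checks, assuming monthly planned and actual series and a hypothetical 10% variance threshold (the numbers are illustrative, not project data):

    # Flags actuals that deviate from plan beyond a threshold, and variance
    # that grows month over month. Data and threshold are hypothetical.
    planned = [100, 120, 140, 160, 180]
    actual = [105, 130, 155, 185, 215]
    threshold = 0.10  # flag variances larger than 10% of plan

    previous_variance = 0
    for month, (p, a) in enumerate(zip(planned, actual), start=1):
        variance = a - p
        if abs(variance) > threshold * p:
            print(f"month {month}: variance {variance} exceeds {threshold:.0%} of plan")
        if abs(variance) > abs(previous_variance):
            print(f"month {month}: variance is growing")
        previous_variance = variance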
Performance Analysis Checklist 2
Integrated indicator issues:
• Is the source of the problem evident?
  - change in functionality, unplanned rework, etc.
• Are growing problems in one area a leading indicator of other problems later in the project?
  - requirements creep impact on schedule
• Do multiple indicators lead to similar conclusions?
  - lack of progress correlates with low staffing
• Does other project information contradict performance results?
  - milestones being met, but open defect counts are increasing
[PSM 00]
Outline
• Acquisition roles & responsibilities
• Measurement & analysis methods
• Illustration
  - background
  - monitoring and oversight: progress analysis
• Summary
• References
Illustration
This illustration is based on an organization that is
• maintaining an existing product, a blend of COTS and internally developed code
• pursuing the acquisition of a replacement product
The acquisition includes two contracts:
• Contract 1: requirements development
• Contract 2: product design, code, and test
This illustration focuses on analyzing project execution data (Contract 2).
Monitoring & Oversight
Contract 2 has been awarded.
• The supplier is developing the product in two builds.
The contractor has just notified you that the project has both cost and schedule slippage.
What do you do?
Contractor Information
The contractor:
• provides data per your contractual agreement
• also provides additional data if you ask*
• uses measurement data to help monitor development
• has defined processes and monitors compliance
• analyzes software trouble reports to identify process improvements
• has average software process maturity

* Typically, if the RFP does not require the data, the contractor is not obligated to provide it. In this case study, the acquirer and contractor have a good working relationship, and the contractor is willing to share data beyond what the RFP specifies.
Action: Management Review
Meet with the development contractor to:
• find out what is happening
• find out why it is happening
What decisions or actions could be taken based upon the data presented?
• to correct the slippage
• to prevent further slippage
Postulate future consequences of these decisions.
Identify actions that could have been taken earlier to prevent the slip.
Performance Analysis Model
[Diagram: the performance analysis model categories (Schedule and Progress, Resources and Cost, Growth and Stability, Product Quality, Development Performance, Technical Adequacy, Customer Satisfaction).]
Use the model to guide the analysis.
• Step 1: Confirm the problem (cost and schedule slippage).
Schedule & Progress Indicators
[Chart: cumulative cost, Jan 2002 through Dec 2003, plotting Planned Value (PV), Actual Cost (AC), and Earned Value (EV). Cost variance = EV - AC; schedule variance = EV - PV.]
[Gantt chart: life cycle activities across 2002-2003: Est. Requirements; Architecture; Build 1 (Design, Assembly & Test, Integration & Verification); Build 2 (Design, Assembly & Test, Integration & Verification); Product Integration; Formal Qualification Test.]
[Chart: components completing Assembly & Test (A&T), planned vs. actual, spanning Build 1 A&T and Build 2 A&T, as of Jul 03.]
Tool tips: The top two charts were made in Excel and manually manipulated. The Gantt chart can be generated using any scheduling software.
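The earned-value arithmetic behind the first chart is simple; a small sketch with hypothetical monthly figures (not the tutorial's data):

    # Cost variance (CV) and schedule variance (SV) from earned-value data.
    planned_value = [10, 25, 45, 70, 100]  # PV: budgeted cost of work scheduled
    actual_cost = [12, 30, 55, 85, 120]    # AC: actual cost of work performed
    earned_value = [9, 22, 40, 60, 85]     # EV: budgeted cost of work performed

    for month, (pv, ac, ev) in enumerate(
            zip(planned_value, actual_cost, earned_value), start=1):
        cost_variance = ev - ac       # negative means over cost
        schedule_variance = ev - pv   # negative means behind schedule
        print(f"month {month}: CV = {cost_variance}, SV = {schedule_variance}")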
What We Learned
From Schedule and Progress indicators:
• cost and schedule slippage (EV chart)
• activities taking longer than planned (Gantt chart)
• assembly and test behind schedule (components completion chart)
What does this mean?
• confirms we have a problem
Resources and Cost Indicators
Analysis/Probing Questions
• Is the staff allocation contributing to the problem (too many, too few, wrong time frame)?
• What is the rate of staff turnover?
• How does actual staff compare to planned staff allocation?
[Diagram: the performance analysis model categories.]
Resources and Cost Indicators
[Chart: number of staff, planned vs. actual, Jan 2002 through Nov 2003, with turnover shown for key personnel and others, as of Jul 03.]
Tool tip: This chart was made in Excel and manually manipulated.

Monthly staffing, plan vs. actual:
Programmers - Plan:   2 3 3 3 9 9 11 12 22 23 23 22 22 22 22 15
Programmers - Actual: 3 3 4 5 5 5 6 9 9 11 11 15 18 19 19 20 20 30 30
Testers - Plan:       1 1 1 3 3 2 2 5 5 7 8 8 8 8 5
Testers - Actual:     3 3 3 3 3 3 3 3 5 5 5 5 5 5 5 5 5 5 5
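A small sketch of how a plan-versus-actual staffing comparison like this can be flagged automatically; the series below reuse the first twelve months of the programmer rows above, and the tolerance is a hypothetical choice:

    # Flags months whose actual headcount deviates from plan by more than
    # a chosen tolerance.
    planned = [2, 3, 3, 3, 9, 9, 11, 12, 22, 23, 23, 22]
    actual = [3, 3, 4, 5, 5, 5, 6, 9, 9, 11, 11, 15]
    tolerance = 2  # acceptable headcount difference

    for month, (p, a) in enumerate(zip(planned, actual), start=1):
        delta = a - p
        if abs(delta) > tolerance:
            direction = "over" if delta > 0 else "under"
            print(f"month {month}: {direction} plan by {abs(delta)} staff")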
What We Learned
From Resources and Cost Indicators:
• staffing did not follow the planned level
  - too many at the beginning of the project
  - testers and programmers used to fill in for analysts and designers => high re-training costs
  - high turnover rate => training and getting-up-to-speed costs
What does this mean?
• cost overrun due partly to staffing problems
Growth and Stability Indicators
Analysis/Probing Questions
• Are the requirements stable?
• What is the code growth?
• Is functionality being transferred from build 1 to build 2? If so, how does this affect the delivery date?
[Diagram: the performance analysis model categories.]
Requirement Changes Information (as of Jul 03)

Month (2002-2003):      Feb  Mar  Apr  May  Jun  Jul  Aug  Sep  Dec  Mar  Jul
Req. changes:            10   11    6   14   10    8    4   14    4    1    5
Complexity:             Low  Low  Low  Low  Low  Low  Low  Low  Low  Low  Low
Resources (staff-days):   4    5    3    2    4    2    3    3    2    1    2

[Chart: number of requirements, added and total, Jan 2002 through Jul 2003, for Build 1 and Build 2, as of Jul 03.]
Tool tip: This chart can be generated in Excel followed by manual editing using the drawing toolbar.
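The table lends itself to a simple stability summary; a sketch using the monthly figures above (the summary statistics are just arithmetic on those values):

    # Tallies requirement changes and the staff-days spent analyzing them.
    req_changes = [10, 11, 6, 14, 10, 8, 4, 14, 4, 1, 5]
    staff_days = [4, 5, 3, 2, 4, 2, 3, 3, 2, 1, 2]

    total_changes = sum(req_changes)   # 87 changes
    total_effort = sum(staff_days)     # 31 staff-days
    print(f"total requirement changes: {total_changes}")
    print(f"total analysis effort: {total_effort} staff-days")
    print(f"average effort per change: {total_effort / total_changes:.2f} staff-days")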
Growth and Stability Indicators
[Chart: Size Growth, lines of code (in 100K), plan vs. actuals, Jan 2002 through Sep 2003, for Build 1 and Build 2, as of Jul 03.]
[Chart: Requirements per Build, number of requirements for Build 1 and Build 2, comparing original plan, replan 1, actual, and not done, as of Jul 03.]
Contractor's explanation: functions were deferred to a later build because of unanticipated complexity.
Tool tip: This chart was made in Excel and manually manipulated.
What We Learned
From Growth and Stability Indicators:
• requirement changes are of low complexity but will have some ripple effect
• code production is below the planned value
• functionality is being deferred from build 1 to build 2, attributed by the contractor to unanticipated complexity
What does this mean?
• expect further cost and schedule growth due to low code production and the increased number of functions to be implemented in Build 2
• expect an impact on the completion date due to functions deferred to Build 2
• expect the possibility of a "Build 3" proposal
Product Quality Indicators
[Diagram: the performance analysis model categories.]
Analysis/Probing Questions
• Are the defined processes being followed?
• What is the rate of closure for trouble reports?
• What type of trouble reports are being detected? In what phase?
Product Quality Indicators
[Chart: STR Status, Open vs. Closed. Cumulative counts (0 to 900) of software trouble reports opened and closed, and the delta between them, Jan 2002 through Dec 2003, spanning Build 1 and Build 2, as of Jul 03.]
[Chart: Open and Closed STRs by priority. Tallies on Jul 03 of open (200) and closed (400) STRs across Priority 1 through Priority 5, ranging from "showstopper" to "nit".]
Tool tip: This chart was made in Excel and manually manipulated.
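A sketch of the open-versus-closed comparison the first chart makes; the monthly counts are hypothetical, and a growing backlog means STRs are being opened faster than they are closed:

    # Cumulative opened vs. closed software trouble reports (STRs) and the
    # resulting backlog. Counts are hypothetical.
    opened_per_month = [20, 35, 50, 60, 80, 90]
    closed_per_month = [10, 20, 30, 40, 45, 50]

    cumulative_open = cumulative_closed = 0
    for month, (o, c) in enumerate(zip(opened_per_month, closed_per_month), start=1):
        cumulative_open += o
        cumulative_closed += c
        backlog = cumulative_open - cumulative_closed
        print(f"month {month}: opened={cumulative_open}, "
              f"closed={cumulative_closed}, backlog={backlog}")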
Classifying Trouble Report Defects
[Chart: percent of defects (0 to 25%) by defect type: Documentation, Syntax, Build/package, Assignment, Interface, Checking, Data, Function, System, Environment. An annotation marks the types that code inspections would have been expected to catch.]
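A sketch of the tally behind such a classification; the type labels come from the chart above, but the individual defect records and the choice of "inspection-catchable" types are hypothetical:

    # Tallies trouble-report defects by type and estimates the share that
    # code inspections could have been expected to catch.
    from collections import Counter

    defect_types = ["Interface", "Assignment", "Function", "Documentation",
                    "Assignment", "Checking", "Interface", "Syntax", "Data"]
    inspection_catchable = {"Assignment", "Checking", "Interface", "Syntax"}

    counts = Counter(defect_types)
    total = sum(counts.values())
    for dtype, n in counts.most_common():
        print(f"{dtype}: {100 * n / total:.0f}% of defects")

    catchable = sum(n for dtype, n in counts.items() if dtype in inspection_catchable)
    print(f"share potentially catchable by inspection: {100 * catchable / total:.0f}%")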
What We Learned
From Product Quality Indicators:
• STRs are being opened faster than they are being closed
• many of the reported defects are of types that code inspections should have found
What does this mean?
• the code inspection process allowed a large number of defects to slip through
Development Performance Indicators
[Diagram: the performance analysis model categories.]
Analysis/Probing Questions
• Are the defined processes being followed?
• Are any defined processes being skipped?
Development Performance Indicators
[Chart: Process Compliance, percentage (50 to 100) by month, Jan 2002 through Dec 2003, for Build 1 and Build 2, as of Jul 03.]
[Chart: Process Compliance by activity, % completion (0 to 100) as of Jul 03, for Req. Analysis, Prelim. Design, Detailed Design, Code & Unit Testing, Inspections, Funct. Int. & Testing, CSCI Int. & Testing, Config. Mgt, SQA, Documentation, and Rework.]
Tool tip: This chart was made in Excel and manually manipulated.
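A sketch of how a monthly compliance percentage like the one charted can be derived from per-activity audit results; the activities and pass/fail values here are hypothetical:

    # Monthly process-compliance percentage from per-activity audit results.
    monthly_audits = {
        "Jan-03": {"Req. Analysis": True, "Detailed Design": True,
                   "Code & Unit Testing": True, "Inspections": True},
        "Jul-03": {"Req. Analysis": True, "Detailed Design": True,
                   "Code & Unit Testing": True, "Inspections": False},
    }

    for month, results in monthly_audits.items():
        compliance = 100 * sum(results.values()) / len(results)
        print(f"{month}: {compliance:.0f}% of audited activities followed the defined process")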
What We Learned
From Development Performance Indicators:
• adherence to the defined process decreased over time
• the contractor stopped doing inspections
What does this mean?
• defects usually detected during code inspections were allowed to slip through
• impact on cost and schedule due to rework
Reasons for Slippage
Staffing problems:
• too many staff at the beginning of the project
• staffing below the planned level during most of development
  - noting that productivity increased dramatically
• high turnover rate
Process compliance:
• stopped doing inspections
• allowed errors to leak to later phases
Requirements changes after Build 2 code and unit test.
Conclusion:
• expect further cost and schedule growth due to low code production and the increased number of functions to be implemented in Build 2
Possible Actions
Developer actions:
• replan based on current performance
• get staffing under control
  - verify the skills balance of resources
  - do not decrease staffing to conform to "planned" staffing, particularly if that would decrease the number of programmers
• restart inspections
  - code
  - test cases
Acquirer decision options:
• use contract labor (additional costs)
• deliver a smaller size, with less functionality
• accept the schedule slip
What if Data is Not Available?
Data may not be available because the
• contractor does not collect this level of data
• contractor is not required by contract to report it
Using process compliance data as an example, how might missing this data affect conclusions, actions, and project results?
• corrective action might have adjusted staffing
• it would not have addressed the skipped inspections, which allowed errors to leak to later phases, resulting in increased cost and schedule
A possible action to infer process compliance:
• check data on the results of code inspections (if such data is specified in the contract)
Lesson learned: specify in the contract what type of data is to be reported in status reports.
Prevention
[Diagram: cause chain overlaid on the performance analysis model categories. Cost and schedule slippage is the visible indicator of underlying problems; root causes of problems include staffing issues and requirement changes; direct causes of problems include unanticipated complexity and inability to process STRs; other labels include functionality being deferred, decreased process compliance (skipped inspections), and the developers' "corrective" actions.]
The performance analysis model is a guide to root cause. Understanding root cause leads to prevention.
Outline
• Acquisition roles & responsibilities
• Measurement & analysis methods
• Illustration
  - background
  - monitoring and oversight: progress analysis
• Summary
• References
Roles and Information Exchange
[Recap of the acquirer and supplier/developer information exchange diagram shown earlier.]
Measuring Project, Product, Process
[Recap of the measurement landscape diagram shown earlier.]
Summary
Key acquisition responsibilities (after contract award):
• monitoring and oversight
• inspecting, reviewing, and understanding documents and other work products
Post-contract award success depends on pre-contract award activities:
• building measurement expectations into contracts
• establishing good partnerships and working relationships with contractors
Measures and indicators across the landscape are interrelated:
• use the Performance Analysis Model as your navigation guide
• always use multiple indicators
Measure products, processes, projects, and relationships.
Contact Information
Wolf Goethert, Software Engineering Institute, Measurement & Analysis Initiative, Email: [email protected]
Jeannine Siviy, Software Engineering Institute, Measurement & Analysis Initiative, Email: [email protected]
Robert Ferguson, Software Engineering Institute, Measurement & Analysis Initiative, Email: [email protected]
References 1
Note: URLs valid as of tutorial delivery date.
[DAD 03] Siviy, Jeannine & Florac, William. "Data Analysis Dynamics." Half-day tutorial delivered at SEPG 2003, Boston, MA.
[GQIM 96] Goal-Driven Software Measurement: A Guidebook. http://www.sei.cmu.edu/publications/documents/96.reports/96.hb.002.html
[PSM 00] Practical Software and Systems Measurement: A Foundation for Objective Project Management, Guidebook, version 4.0b. Practical Software and Systems Measurement Support Center, U.S. TACOM-ARDEC, AMSTA-AR-QA-A, Picatinny Arsenal, NJ. www.psmsc.com, October 2000.
Reading & Resources 1
Note: URLs valid as of tutorial delivery date.
Practical Software and Systems Measurement (PSM)
• reference for the Performance Analysis Model
• reference lists of measures to consider
• http://www.psmsc.com
Goal Driven Measurement (GDM) and Goal-Question-Indicator-Metric (GQIM)
• front end for selecting the most relevant PSM measures
• used for developing context-specific indicators, particularly "success indicators"
• "Goal-Driven Software Measurement--A Guidebook", http://www.sei.cmu.edu/publications/documents/96.reports/96.hb.002.html
Reading & Resources 2
Note: URLs valid as of tutorial delivery date.
Defense Acquisition University (DAU) Deskbook
• http://deskbook.dau.mil/jsp/default.jsp
• provides information about regulatory references, mandatory and discretionary references by service branch, and several knowledge repositories
Guidelines for Successful Acquisition and Management of Software-Intensive Systems
• http://www.stsc.hill.af.mil/resources/tech_docs/index.html
Acquisition Centers of Excellence
• Air Force, for instance ESC Hanscom
  - http://esc.hanscom.af.mil/ESC-BP/
• Navy
  - http://www.ace.navy.mil/public/html/
Reading & Resources 3
Note: URLs valid as of tutorial delivery date.
Project Management Body of Knowledge (PMBOK®)
• proven, traditional project management practices and innovative, advanced practices with more limited use
• the Project Management Institute's Guide to the PMBOK contains the generally accepted subset of knowledge and practices that are applicable to most projects most of the time
  - http://www.pmi.org/info/PP_StandardsExcerpts.asp
  - http://www.pmi.org/info/PP_PMBOK2000Excerpts.asp