
University of Southern California Center for Software Engineering

Reliable Software Research and Technology Transition

Barry Boehm, USC

NASA IT Workshop

July 31, 2001


Outline

• Reliable Software Strategic Context
  – Opportunity Tree
• Reliable Software Technology Prospects
• Reliable Software Technology Transition
  – Software Technology Transition Challenges
  – Technology Transition Rate Drivers
  – Technology Transition Acceleration
  – Staged Testbed Experimentation


Software Dependability Opportunity Tree

Decrease Defect Risk Exposure
• Decrease Defect Prob (Loss)
  – Defect Prevention
  – Defect Detection and Removal
• Decrease Defect Impact, Size (Loss)
  – Value/Risk-Based Defect Reduction
  – Graceful Degradation
• Continuous Improvement
  – CI Methods and Metrics
  – Process, Product, People
  – Technology
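The root of the tree, defect risk exposure, can be read as probability of loss times size of loss: the left branches shrink the probability, the Graceful Degradation branch shrinks the size. A minimal sketch of that reading (the function and the numbers are assumptions for illustration, not from the slides):

```python
def risk_exposure(p_loss: float, size_of_loss: float) -> float:
    """Defect risk exposure: RE = P(loss) * S(loss)."""
    return p_loss * size_of_loss

# Illustrative, assumed numbers: prevention/detection lowers P(loss),
# graceful degradation lowers S(loss); either branch reduces RE.
baseline          = risk_exposure(0.10, 1_000_000)  # 100000.0
after_prevention  = risk_exposure(0.02, 1_000_000)  #  20000.0
after_degradation = risk_exposure(0.10, 200_000)    #  20000.0
print(baseline, after_prevention, after_degradation)
```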


Software Defect Prevention Opportunity Tree

Defect Prevention
• People practices
  – IPT, JAD, WinWin, …
  – PSP, Cleanroom, Dual development, …
  – Manual execution, scenarios, …
  – Staffing for dependability
• Standards
  – Rqts., Design, Code, …
  – Interfaces, traceability, …
  – Checklists
• Languages
• Prototyping
• Modeling & Simulation
• Reuse
• Root cause analysis


People Practices: Some Empirical Data

• Cleanroom: Software Engineering Lab
  – 25-75% reduction in failure rates
  – 5% vs. 60% of fix efforts over 1 hour
• Personal Software Process / Team Software Process
  – 50-75% defect reduction in a CMM Level 5 organization
  – Even higher reductions for less mature organizations
• Staffing
  – Many experiments find factor-of-10 differences in people's defect rates


Software Defect Detection Opportunity Tree

Defect Detection and Removal (Rqts., Design, Code)
• Automated Analysis
  – Completeness checking
  – Consistency checking (views, interfaces, behavior, pre/post conditions)
  – Traceability checking
  – Compliance checking (models, assertions, standards)
• Reviewing
  – Peer reviews, inspections
  – Architecture Review Boards
  – Pair programming
• Testing
  – Requirements & design
  – Structural
  – Operational profile
  – Usage (alpha, beta)
  – Regression
  – Value/Risk-based
  – Test automation
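One of the testing leaves, operational-profile testing, spends test effort in proportion to how often each operation is expected to be exercised in the field. A minimal sketch, assuming an invented profile and test budget:

```python
# Allocate a fixed test-case budget across operations in proportion to their
# expected field usage frequencies (an assumed operational profile).
profile = {"search": 0.50, "checkout": 0.30, "admin": 0.15, "report": 0.05}
budget = 200  # total test cases (assumed)

allocation = {op: round(budget * p) for op, p in profile.items()}
print(allocation)  # {'search': 100, 'checkout': 60, 'admin': 30, 'report': 10}
```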


Orthogonal Defect Classification (Chillarege, 1996)

[Bar chart: percent of defects within each activity (Design, Code review, Function test, System test), broken down by ODC defect type (Function, Assignment, Interface, Timing).]
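A minimal sketch of the tabulation behind such a chart (the defect records are invented for illustration): count each defect by the activity that found it and its ODC type, then report each type's share within that activity.

```python
from collections import Counter

# (activity, ODC defect type) pairs -- invented sample data, for illustration only
defects = [
    ("Design", "Function"), ("Design", "Function"), ("Design", "Interface"),
    ("Code review", "Assignment"), ("Code review", "Assignment"), ("Code review", "Function"),
    ("System test", "Timing"), ("System test", "Interface"),
]

found_in = Counter(activity for activity, _ in defects)
by_pair = Counter(defects)

# Percent of each defect type within the activity that found it
for (activity, dtype), n in sorted(by_pair.items()):
    print(f"{activity:12} {dtype:10} {100 * n / found_in[activity]:.0f}%")
```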


UMD-USC CeBASE Experience Comparisons (http://www.cebase.org)

“Under specified conditions, …”

Technique Selection Guidance (UMD, USC)
• Peer reviews are more effective than functional testing for faults of omission and incorrect specification
  – Peer reviews catch 60% of the defects
• Functional testing is more effective than reviews for faults concerning numerical approximations and control flow

Technique Definition Guidance (UMD)
• For a reviewer with an average experience level, a procedural approach to defect detection is more effective than a less procedural one
• Readers of a software artifact are more effective in uncovering defects when each uses a different and specific focus
  – Perspective-based reviews catch 35% more defects



Defect Impact Reduction Opportunity Tree

Decrease Defect Impact, Size (Loss)
• Value/Risk-Based Defect Reduction
  – Business case analysis
  – Pareto (80-20) analysis
  – V/R-based reviews
  – V/R-based testing
  – Cost/schedule/quality as independent variable
• Graceful Degradation
  – Fault tolerance
  – Self-stabilizing SW
  – Reduced-capability modes
  – Manual backup
  – Rapid recovery
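A minimal sketch of one graceful-degradation leaf, falling back to a reduced-capability mode; the query functions and the simulated fault are assumptions for illustration, not from the slides:

```python
def full_capability_query(q: str) -> str:
    """Primary path; assume it can fail at run time."""
    raise RuntimeError("ranking service unavailable")  # simulated fault

def reduced_capability_query(q: str) -> str:
    """Degraded but dependable fallback: simpler, unranked results."""
    return f"unranked results for {q!r}"

def query(q: str) -> str:
    # Graceful degradation: trap the fault and drop to a reduced-capability
    # mode instead of failing outright; the size of the loss shrinks even
    # though the defect still occurred.
    try:
        return full_capability_query(q)
    except RuntimeError:
        return reduced_capability_query(q)

print(query("telemetry archive"))
```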


C, S, Q as Independent Variable
• Determine Desired Delivered Defect Density (D4)
  – Or a value-based equivalent
• Prioritize desired features
  – Via QFD, IPT, stakeholder win-win
• Determine core capability
  – 90% confidence of D4 within cost and schedule
  – Balance parametric models and expert judgment
• Architect for ease of adding next-priority features
  – Hide sources of change within modules (Parnas)
• Develop core capability to D4 quality level
  – Usually in less than available cost and schedule
• Add next-priority features as resources permit
• Used successfully on 24 of 26 USC digital library projects
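A minimal sketch of the "determine core capability" step, assuming an invented feature list, effort estimates, and budget: take the longest prefix of the prioritized features whose estimates, inflated toward a 90%-confidence figure, still fit the budget, and defer the rest to be added as resources permit.

```python
# Prioritized features with assumed nominal effort estimates (person-weeks)
features = [("login", 4), ("catalog search", 6), ("checkout", 8), ("recommendations", 10)]
budget_pw = 20      # available effort (assumed)
risk_factor = 1.3   # crude inflation of nominal estimates toward a 90%-confidence figure (assumed)

core, deferred, used = [], [], 0.0
for i, (name, nominal) in enumerate(features):
    conservative = nominal * risk_factor
    if used + conservative > budget_pw:
        deferred = [n for n, _ in features[i:]]  # next-priority features, added as resources permit
        break
    core.append(name)
    used += conservative

print("core capability:", core)  # ['login', 'catalog search']
print("deferred:", deferred)     # ['checkout', 'recommendations']
```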


HDC Technology Prospects
• High-dependability change
  – Architecture-based analysis and composition
  – Lightweight formal methods
• Model-based analysis and assertion checking
  – Domain models, stochastic models
• High-dependability test data generation
• Dependability attribute tradeoff analysis
  – Dependable systems from less-dependable components
  – Exploiting cheap, powerful hardware
• Self-stabilizing software
• Complementary empirical methods
  – Experiments, testbeds, models, experience factory
  – Accelerated technology transition, education, training


Outline

• Reliable Software Strategic Context
  – Opportunity Tree
• Reliable Software Technology Prospects
• Reliable Software Technology Transition
  – Software Technology Transition Challenges
  – Technology Transition Rate Drivers
  – Technology Transition Acceleration
  – Staged Testbed Experimentation


Software Engineering Technology Transition Challenges
• Adoption requires behavioral change
• Payoffs take a long time to demonstrate
  – And are hard to trace back to particular technology insertions
• Marketplace often makes "fixing it later" more attractive than "doing it right the first time"
  – "The IT industry expends the bulk of its resources, both financial and human, on rapidly bringing products to market." (PITAC Report, p. 8)
• Strong coupling among technologies, processes, acquisition practices, cultures
• Rapidly evolving commercial technology
• Slowly evolving Government acquisition practices
• Risk-averse program managers
• Leaky intellectual property


Technology Transition Rate Drivers
1. Competition-criticality
2. Impact on current operations, power bases
3. Number of concurrent innovations involved
4. Adopter change/risk-aversion
5. Payback realization speed
6. Regulatory obstacles or incentives
7. Degree of executive support
8. Factor endowment (human, knowledge, capital, infrastructure)


Technology Transition Acceleration: Staged Testbed Experimentation
• Assess technology readiness, scalability, and generality by experimental use in staged testbeds
  – Stage I: small representative subsystems
  – Stage II: larger subsystems with representative mission components
  – Stage III: part-to-full systems in full mission simulator
  – Stage IV: test-mode application to full mission systems
• Use Goal-Question-Metric approach
  – Goals: mission objectives and priorities
  – Questions: content and hypotheses to test
  – Metrics: data to collect and analyze
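A minimal sketch of how a Goal-Question-Metric plan for one testbed stage might be recorded; the goal, questions, and metrics shown are assumptions for illustration, not from the workshop material:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list[str] = field(default_factory=list)  # data to collect and analyze

@dataclass
class Goal:
    objective: str                                     # mission objective / priority
    questions: list[Question] = field(default_factory=list)

# Illustrative GQM plan for a Stage I testbed experiment (assumed content)
plan = Goal(
    objective="Reduce delivered defect density on a small representative subsystem",
    questions=[
        Question("Does assertion checking find defects that peer reviews miss?",
                 metrics=["defects found per technique", "effort hours per defect"]),
        Question("Will the technique scale to Stage II subsystems?",
                 metrics=["analysis run time vs. subsystem size"]),
    ],
)

for q in plan.questions:
    print(q.text, "->", ", ".join(q.metrics))
```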