© 2009 Fraunhofer Center - Maryland
Dr. Forrest Shull, FCMD
Verification of Software Architectures via Reviews
Lessons Learned about Software Reviews at NASA
Tailoring Lessons Learned for IV&V Architecture Verification
Mechanisms for Continuous Improvement
Outline
A long history of research and application shows that structured human inspection is one of the most cost-effective practices for achieving quality software:
"Cost savings rule": Finding and fixing software defects is about 100x more expensive after delivery than in early lifecycle phases, for certain types of defects.
IBM: 117:1 between code and use
Toshiba: 137:1 between pre- and post-shipment
Data Analysis Center for Software: 100:1
"Inspection effectiveness rule": Reviews and inspections find over 50% of the defects in an artifact, regardless of the lifecycle phase in which they are applied.
50-70% across many companies (Laitenberger)
64% on large projects at Harris GCSD (Elliott)
60% in PSP design/code reviews (Roy)
50-95%, rising with increased discipline (O'Neill)
... many others
Software Inspection
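The two rules above can be combined into a back-of-the-envelope cost model. This is a hypothetical sketch, not data from the slides: it assumes a unit cost to fix a defect found early and a 100x multiplier for defects that slip to post-delivery.

```python
# Back-of-the-envelope model of the "cost savings rule" combined with the
# "inspection effectiveness rule". All numbers and the function interface
# are illustrative assumptions, not figures from the presentation.

def inspection_savings(defects, effectiveness, early_cost=1.0, late_multiplier=100.0):
    """Return (cost_without_inspection, cost_with_inspection).

    defects         -- total defects present in the artifact
    effectiveness   -- fraction of defects the inspection finds (e.g. 0.5)
    early_cost      -- cost to fix one defect found early (arbitrary units)
    late_multiplier -- how much more a post-delivery fix costs (~100x)
    """
    late_cost = early_cost * late_multiplier
    without = defects * late_cost  # every defect slips to post-delivery
    found = defects * effectiveness
    with_inspection = found * early_cost + (defects - found) * late_cost
    return without, with_inspection

without, with_insp = inspection_savings(defects=100, effectiveness=0.6)
print(without, with_insp)  # 10000.0 vs 4060.0: 60 defects caught early
```

Even at the low end of the effectiveness range (50%), this model cuts expected rework cost roughly in half, which is why the rules above make inspection so cost-effective.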
Key Lessons Learned
[Figure: example analysis artifact under review. Two variants of a Vehicle / Title object model (Vehicle attributes: Number, Year, Make, Model, Body Style, Gross; Registration: Date Time Started, Date Time Ended, Plate, Fee; Title: Number, Ownership Evidence, Surrendered Title, Fee; plus Passengers, Diesel, Color, Cost, Mileage), shown against the modeling guidance: "An attribute is a data element used to describe an instance of an Object or Classification Structure. 1. Identify the Attributes. 2. Position the Attributes. 3. Identify and define the instance connections. 4. Revise the Objects. 5. Specify the Attributes. 6. Specify the instance constraints." Attribute specification: name, description, and allowable values, range, limit, unit of measure, and precision; categorize each attribute as descriptive, definitional, always derivable, or occasionally derivable.]
[Figure: the inspection process. Planning (select inspection team) → Overview meeting (present background information) → Preparation (identify potential defects) → Inspection meeting (find & record defects) → Rework (fix defects) → Follow-up (verify fixes), with an optional Third Hour (open issues / solutions). Inputs: work product, resources, procedures, schedule & staff time, training. Outputs: improved work product, by-products, defects (to prevent repetition of defects), and metrics (for monitoring and controlling the process and for continuous software process improvement).]
Key Lessons Learned
Focusing the review materials
Giving each reviewer a particular and unique perspective on the document under review
Making individual review of a document an active (rather than passive) undertaking
Articulating the quality aspects of interest
…An approach known as Perspective-Based Inspection incorporates these key points.
History
2001-2003: SARP research funding
Refined basic approach, implemented on NASA projects and in classrooms, measured results
2004: Tech infusion with Flight Software Branch / GSFC. Goal: produce new Branch standards
2004: Tech infusion with United Space Alliance / JSC. Goal: reduce defect slippage over the current increment
2007-2009: SARP funding for improved general inspection planning
2008: Tech infusion with L-3 Communications / IV&V. Goal: effective approaches for inspecting system models
2008-2009: NSC funding for inspection training and workshops
Other Results and Benefits
Improved defect detection effectiveness substantially (NASA & elsewhere)
Helped re-invigorate inspection practices
Forms the basis of industrial and academic training
Perspective-Based Inspection (PBI)
Recent Application with NASA Teams
SSC: LH Barge ladder logic
Wallops: SIPS Core Slaving Computation Process
JSC: LIDS
GSFC & MSFC: NPR 7150.2 procedural requirements for software engineering
ARC: Simulink models for smallsat
Focusing the Review
Guidelines regarding the amount of technical material that can be effectively inspected:
Based on hundreds of previous NASA inspections
20 to 40 pages per hour of meeting time
This implies that choices need to be made about where to focus. Find the areas with:
High risk
Large complexity
Critical communication
Architectural "danger signs"?
Problematic development history? (e.g., author stovepipes)
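The 20-40 pages/hour guideline above translates directly into a planning calculation. A minimal sketch, assuming a helper function of our own invention:

```python
# Estimate inspection meeting time from the 20-40 pages/hour guideline
# on this slide. The function name and rounding policy are illustrative
# assumptions, not part of the presented material.
import math

def meeting_hours(pages, pages_per_hour=30):
    """Estimate inspection meeting time, rounded up to the half hour."""
    if not 20 <= pages_per_hour <= 40:
        raise ValueError("rate outside the 20-40 pages/hour guideline")
    hours = pages / pages_per_hour
    return math.ceil(hours * 2) / 2  # round up to the nearest 0.5 h

print(meeting_hours(75))                      # 2.5 hours at 30 pages/hour
print(meeting_hours(75, pages_per_hour=20))   # 4.0 hours at the cautious rate
```

A 75-page architecture document thus needs multiple meeting sessions, which is exactly why the slide argues for choosing where to focus.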
Focusing the Review
For systems with history, analysis and visualization can help focus the review on appropriate system components.
Focusing the Review
Potential "code smells" can be automatically detected for further investigation.
God Class: AND of the following conditions (ATFD = Access To Foreign Data; WMC = Weighted Methods per Class; TCC = Tight Class Cohesion):
Class directly uses more than a few attributes of other classes: ATFD > FEW (5)
Functional complexity of the class is very high: WMC ≥ VERY HIGH (47)
Class cohesion is low: TCC < ONE THIRD (0.33)
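The God Class detection strategy above is a simple conjunction over pre-computed metrics. A minimal sketch; the `ClassMetrics` container is an assumption for illustration, and the thresholds come from the slide:

```python
# God Class detection as a predicate over pre-computed class metrics.
# Threshold values (5, 47, 0.33) are taken from the slide; the
# ClassMetrics data structure is a hypothetical container.
from dataclasses import dataclass

FEW, VERY_HIGH, ONE_THIRD = 5, 47, 1 / 3

@dataclass
class ClassMetrics:
    atfd: int    # Access To Foreign Data
    wmc: int     # Weighted Methods per Class
    tcc: float   # Tight Class Cohesion

def is_god_class(m: ClassMetrics) -> bool:
    """Flag a class for human investigation when all three smells co-occur."""
    return m.atfd > FEW and m.wmc >= VERY_HIGH and m.tcc < ONE_THIRD

print(is_god_class(ClassMetrics(atfd=12, wmc=60, tcc=0.1)))  # True
print(is_god_class(ClassMetrics(atfd=12, wmc=60, tcc=0.5)))  # False: cohesive enough
```

Note the AND: any single metric in isolation produces many false positives, so a flagged class is a candidate for review, not an automatic defect.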
Improves both team and individual results by up to 33%. Rationale:
Focus the responsibilities of each inspector
Minimize overlap among inspector responsibilities
Maximize the union of defects found
Review Perspectives
[Figure: Venn diagrams comparing defects found by three perspective-based reviewers (Designer, Tester, Use-based) vs. three undifferentiated reviewers (Rev. 1, Rev. 2, Rev. 3): the perspective-based team shows less overlap among reviewers and a larger union of defects found (example counts).]
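The rationale above can be made concrete with set arithmetic. A hypothetical sketch with made-up defect IDs:

```python
# With distinct perspectives, reviewers' defect sets overlap less, so their
# union is larger for the same individual effort. Defect IDs are invented
# for illustration; only the set arithmetic is the point.

def team_coverage(found_by_reviewer):
    """Return (union size, total overlap) for per-reviewer defect sets."""
    union = set().union(*found_by_reviewer)
    overlap = sum(len(s) for s in found_by_reviewer) - len(union)
    return len(union), overlap

# Unfocused review: everyone gravitates to the same obvious defects.
unfocused = [{1, 2, 3, 4}, {1, 2, 3, 5}, {1, 2, 4, 5}]
# Perspective-based: each reviewer covers mostly distinct ground.
perspective = [{1, 2, 3, 4}, {5, 6, 7, 1}, {8, 9, 10, 2}]

print(team_coverage(unfocused))    # (5, 7) -- heavy duplication
print(team_coverage(perspective))  # (10, 2) -- larger union, less overlap
```

Each reviewer finds four defects in both cases; only the assignment of responsibilities changes the team result.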
Tailoring required for new domain, new issues, new artifacts
Review Perspectives
Needs, goals, and objectives elucidation → architecture construction → System Architecture → architecture validation and architecture verification.
[Figure: a UML sequence diagram (repeated) illustrating the System Architecture under review: a "Fanny May : Loan Arranger" object interacting with Borrower, Specified Lender, and Loan objects via operations such as new_lender(name, contact, phone_number), new_loan(lender, borrowers), look_for_a_lender(lender), look_for_a_loan(loan), update_loan(lender, borrower), update(lender), monthly_report(lender, loans, borrowers), identify_report_format(), and verify_report(), triggered by a "Receive Monthly Report" event.]
Verification questions:
Are business / user needs adequately reflected in the architecture?
Is the architecture complete, correct, consistent, and testable?
Do defined functionalities map to the architecture?
Possible perspectives for this domain:
Architect
Assesses architecture against requirements & existing models
Quality foci: appropriate level of detail; completeness and clarity; consistency and correctness of models
Domain expert
Assesses whether the architecture accurately and usefully captures the domain
Quality foci: correctness of architecture (independent of models); identification of stakeholders and evaluation of usability from their point of view; checking flow of control & use of reusable components
Quality assurance
Assesses whether the architecture and system can be validated properly
Quality foci: handling of exception cases / unexpected system conditions; robustness of system; testability
Review Perspectives
Articulating Defect Types
5-7 defect types per reviewer help them focus on the task appropriately
Sources of defect categories:
Checklists and defect categories from previous work
Experienced inspectors
What is it that the most experienced people are looking for? Can get this from:
• Asking them!
• Noting recurring questions at inspection meetings
Defect/discrepancy report databases
What types of problems seem to be common? What types of problems take the longest to close?
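Both questions about the defect database reduce to simple aggregation. A hypothetical sketch; the record fields and example data are assumptions, not a real NASA schema:

```python
# Mine a defect-report database for the two questions above: which defect
# types are most common, and which take longest to close. The (type, days)
# records are invented for illustration.
from collections import Counter, defaultdict

reports = [  # (defect_type, days_to_close) -- illustrative data only
    ("interface mismatch", 30), ("ambiguity", 5),
    ("interface mismatch", 45), ("missing error handling", 20),
    ("ambiguity", 7), ("interface mismatch", 60),
]

counts = Counter(t for t, _ in reports)          # how common is each type?
days = defaultdict(list)
for t, d in reports:
    days[t].append(d)
avg_close = {t: sum(v) / len(v) for t, v in days.items()}  # how slow to close?

print(counts.most_common(1))               # [('interface mismatch', 3)]
print(max(avg_close, key=avg_close.get))   # 'interface mismatch' (45 days avg)
```

Types that score high on either measure are strong candidates for the 5-7 defect types handed to each reviewer.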
Articulating Defect Types
Review has proven effective at finding architectural defects of:
Ambiguity
Formatting / notation / identifier problems
Incomplete architecture
Critical functional definition or detail missing
Missing analysis of possible states / allowable parameters / other info for coders and testers
Extraneous / superfluous components
Problem with supporting or reference documents
Reference to outdated supporting document
Reference to wrong supporting document
Incomplete list of reference documents
Articulating Defect Types
Review has proven effective... (cont.)
Incorrect architecture
Conflicting elements
Incorrect as per domain knowledge or parent requirements
Unsuitability to the current problem
Performance could be improved
Problems with error handling
Error conditions not listed or referenced
Missing error handling requirements
Inappropriate functions may be allowed during safe mode
Too many / unnecessary error checks are included, which will affect performance
Interface problems
Interfaces not specified
Interfaces do not match
Reading through a document one page at a time is not the most effective approach.
Improved results come from using scenarios to guide reviewers. Scenarios:
Should be appropriate to the perspective
Should be appropriate to the reviewer's expertise, e.g.:
Creating models relevant to the analysis
Ordering components by complexity or importance
Reviewers look for relevant defect types as they follow the scenario
Active Review
Active Review
Some examples:
Identify use cases, and trace control flow through the architecture from the start conditions onward
Identify scenarios based on business drivers, analyze how the architecture will handle each (SEI’s ATAM method)
Identify potential failure modes, analyze where the functionality resides for handling each
Conduct a separate domain analysis, map the results into the architecture
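The first example above (tracing control flow through the architecture from a use case's start conditions) can be sketched as graph reachability over a component call graph. All component names below are hypothetical:

```python
# Trace a use case from its entry component through an architecture modeled
# as a call graph; components the trace never reaches become review
# questions ("why is this here, and what exercises it?"). Names are invented.

def trace(call_graph, entry):
    """Return the set of components reachable from the use case's entry."""
    seen, stack = set(), [entry]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(call_graph.get(node, []))
    return seen

arch = {
    "UI": ["LoanManager"],
    "LoanManager": ["LenderStore", "ReportWriter"],
    "LenderStore": [],
    "ReportWriter": [],
    "LegacyImport": [],   # no use case reaches this -- flag for the review
}

reached = trace(arch, "UI")
print(sorted(set(arch) - reached))  # ['LegacyImport']
```

The point of the active-review scenario is the reviewer's questions along the trace, not the traversal itself; the code only shows why unreached components surface naturally.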
Continuous Improvement
Review materials as an "Experience Base"
Defect taxonomy, perspectives, scenarios represent best knowledge about how to effectively find defects
Can be improved over time as we learn from outcomes
[Figure: example bar chart of defect dispositions per document section (1.9, 1.1.2, 1.1.4, 1.1.6.2, 1.1.7, 1.4.4), counting issues as Closed, Closed Before Submitted, Draft, Not an Issue, Not To Be Verified, Submitted, To Be Verified, or Withdrawn (scale 0-70; example data). Annotations contrast Defect Type X ("should be better defined or removed from checklist?") with Defect Type Y ("seems to be on-target and effective?").]
Continuous Improvement
Using metrics to focus defect types:
Materials for focusing the review
Analysis of past defect history
Analysis of found defects
Analysis of defect disposition: issues detected vs. closed vs. withdrawn / not an issue
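The disposition analysis above boils down to a confirmation rate per defect type. A minimal sketch with invented counts; a high withdrawal rate suggests the checklist item needs sharpening:

```python
# For each defect type, compare issues confirmed (closed) against issues
# withdrawn or judged "not an issue". Counts are illustrative assumptions.

def disposition_ratio(closed, withdrawn, not_an_issue):
    """Fraction of raised issues that turned out to be real defects."""
    raised = closed + withdrawn + not_an_issue
    return closed / raised if raised else 0.0

defect_types = {
    "Defect Type X": disposition_ratio(closed=3, withdrawn=6, not_an_issue=6),
    "Defect Type Y": disposition_ratio(closed=18, withdrawn=1, not_an_issue=1),
}
for name, ratio in defect_types.items():
    verdict = "on-target" if ratio >= 0.5 else "refine or remove"
    print(f"{name}: {ratio:.0%} confirmed -> {verdict}")
```

This is the quantitative version of the question in the chart annotations: Type X mostly produces noise, Type Y mostly produces real defects.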
Use measurement results to further optimize, e.g.:
Consistently missing a certain type of defect?
Add a new defect type and associated questions to the relevant perspectives.
Missing a whole set of defects?
Consider whether a new perspective should be added.
Do we never find defects of type X? Are there perspectives that don't seem to add much that the other perspectives didn't find?
Consider deleting the relevant defect type or perspective.
Do we consistently face a lot of problems of type X?
Add more reviewers using the relevant perspectives.
Continuous Improvement
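The tuning rules above can be encoded as a simple decision helper. This is a hypothetical sketch; the input measures and thresholds are assumptions, not part of the presented process:

```python
# Turn measurement results into a suggested experience-base update for one
# defect type. The 0.5 threshold and both rate definitions are invented
# for illustration.

def checklist_action(missed_rate, found_rate):
    """Suggest an update for one defect type across recent inspections.

    missed_rate -- fraction of defects of this type that slipped past review
    found_rate  -- fraction of inspections in which this type was found at all
    """
    if missed_rate > 0.5:
        return "add questions for this type to the relevant perspectives"
    if found_rate == 0.0:
        return "consider deleting this defect type from the checklist"
    return "keep as-is"

print(checklist_action(missed_rate=0.7, found_rate=0.4))
print(checklist_action(missed_rate=0.1, found_rate=0.0))
```

In practice such rules would be one input to a human decision, since a never-found defect type may simply be rare rather than useless.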
Summary
We have many lessons learned about the effectiveness of software reviews for verification at NASA.
Architecture reviews are configurable based on:
Specific quality goals of the mission
Specific process in which the architecture was produced / will be used
Review materials also provide several "hooks" for creating experience bases on practical and effective techniques for verification.
Forrest Shull
301-403-8970
Contact information