FTR Lecture




Introduction to Formal Technical Reviews

Philip Johnson, Associate Professor

    Department of Information and Comp. Sciences

    University of Hawaii

    [email protected]

    http://www.ics.hawaii.edu/~johnson/

    808 956-3489

Copyright 1998 Philip M. Johnson. Permission granted to redistribute and modify these materials freely as long as the copyright notice is retained.


    Objectives

    Understand the process of FTR.

    Understand the goals and benefits of FTR.


    Outline

    Basic Review Principles

    A Generic Inspection Process

    Critical Success Factors for Review


Basic Review Principles


What is Formal Technical Review?

A method involving a structured encounter in which a group of technical personnel analyzes or improves the quality of the original work product as well as the quality of the method.


Why review? We test!

Reviews improve schedule performance.

Cost escalation model: defects are much more expensive to fix the later they are discovered. A defect that isn't discovered until testing can be 100 times more expensive to repair than if it had been discovered during a review.

[Figure: two schedules across the Req, Design, Code, and Test phases. The schedule with reviews ("Revs") adds review effort (R) to the Req, Design, and Code phases; the schedule without reviews ("No Revs.") has a longer Test phase and finishes later.]
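As a purely illustrative calculation (the defect count here is made up; the per-defect costs come from this slide and from the Bell-Northern Research figures later in the lecture): if a review finds 20 defects at roughly 1 hour of inspection effort each, that is about 20 hours; letting those same defects slip through to testing, at up to 100 times the repair cost, would mean on the order of 2,000 hours of rework.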


Why review? We test!

Reviews reduce rework. Rework accounts for 44% of development cost: requirements (1%), design (12%), coding (12%), testing (19%).

Reviews are more productive than testing: they find more errors in the same amount of time.
Reviews are more effective than testing: they find errors that cannot be found through testing.

Reviews are training: in the domain, in corporate standards, and in the group.


    Why review? Who benefits?

Formal technical review provides:
Defect information to the author.
Information on the work product and its development to peers.
Fault likelihood data to testers.
Product status to management.
Process status to the SPI group.


    True FTR is well-defined

Well-defined process: phases (orientation, etc.) and procedures (checklists, etc.).
Well-defined roles: Moderator, Reviewer, Scribe, Author, etc.
Well-defined objectives: defect removal, requirements elicitation, etc.
Well-defined measurements: forms, consistent data collection, etc.


FTR is effective quality improvement

Reviews can find 60-100% of all defects.
Reviews are technical, not management.
Review data can assess and improve the quality of the work product, the software development process, and the review process.
Reviews reduce total project cost, but have a non-trivial cost (~15%). Upstream defect removal is 10-100 times cheaper.
Reviews disseminate domain knowledge, development skills, and corporate culture.


    Industry Experience with FTR

Aetna Insurance Company: FTR found 82% of errors; 25% cost reduction.
Bell-Northern Research: inspection cost of 1 hour per defect, versus 2-4 hours per defect in testing and 33 hours per defect post-release.
Hewlett-Packard: estimated inspection savings (1993) of $21,454,000.
IBM (using Cleanroom): C system software with no errors from the time of first compile.


    Who, What, and When

Who decides what should be reviewed? Senior technical personnel and the project leader.

What should be reviewed? Work products with high impact on project risks, and work products directly related to quality objectives. Upstream work products have higher impact.

When should review be planned? Specify the review method and target work products in the software development plan / quality plan.


    The range of review practice

[Figure: a map of review practice organized along two axes, development method (Non-Cleanroom vs. Cleanroom) and support (Manual vs. Tool-Based). Methods plotted include Walkthrough (Yourdon 89), Code Reading (McConnell 93), Software Review (Humphrey 90), Code Inspection (Fagan 76), Inspection (Gilb 93), 2-Person Inspection (Bisant 89), N-Fold Inspection (Martin 90), Phased Inspection (Knight 93), Active Design Reviews (Parnas 85), Verification-based Inspection (Dyer 92), TekInspect, and the tool-based ICICLE (Brothers 90), Scrutiny (Gintell 93), CAIS (Mashayekhi 94), and FTArm (Johnson 94).]


    Families of Review Methods

Method Family: Walkthroughs
Typical Goals: Minimal overhead; developer training; quick turnaround.
Typical Attributes: Little/no preparation; informal process; no measurement. Not FTR!

Method Family: Technical Reviews
Typical Goals: Requirements elicitation; ambiguity resolution; training.
Typical Attributes: Team authorship; formal process; author presentation; wide range of discussion.

Method Family: Inspections
Typical Goals: Detect and remove all defects efficiently and effectively.
Typical Attributes: Single author; formal process; checklists; measurements; verify phase.


An Exemplary Generic Inspection Process


The Generic Inspection Process

Planning: Choose team, materials, dates.
Orientation: Present product, process, goals.
Preparation: Check product, note issues.
Review Meeting: Consolidate issues.
Rework: Correct defects.
Verify: Verify product/process quality.
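As a compact summary of the flow above, here is a minimal sketch in Python (hypothetical names, not part of TekInspect or any tool described in this lecture) pairing each phase with its main activity:

```python
from enum import Enum

class Phase(Enum):
    """The six phases of the generic inspection process, in order."""
    PLANNING = "Choose team, materials, dates."
    ORIENTATION = "Present product, process, goals."
    PREPARATION = "Check product, note issues."
    REVIEW_MEETING = "Consolidate issues."
    REWORK = "Correct defects."
    VERIFY = "Verify product/process quality."

# Enum members iterate in definition order, so this walks the phases as they occur.
for phase in Phase:
    print(f"{phase.name.replace('_', ' ').title()}: {phase.value}")
```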


    Planning

Objectives:
Gather the review package: work product, checklists, references, and data sheets.
Form the inspection team.
Determine dates for meetings.

Procedure:
Moderator assembles the team and the review package.
Moderator enhances the checklist if needed.
Moderator plans dates for meetings.
Moderator checks the work product for readiness.
Moderator helps the Author prepare the overview.


Example Planning Data

1. Inspection ID ____________   Date: ____________
2. Team: Moderator ____________; Authors ____________; Reviewers ____________
3. Documents: Work Product ____________; References ____________; Checklists ____________
4. Meetings (date, location, start, end): Orientation ____________; Review Meeting ____________
5. Planning Objectives (check all that apply):
   References obtained for work product.
   Checklists obtained for work product.
   Moderator is trained in TekInspect method.
   Team members agree to proposed times/dates.
   Moderator's quick review yields less than 5 major issues.
   Reviewers understand responsibilities and are committed.
6. Planning Effort: ____ min


    Orientation

Objectives:
Author provides an overview.
Reviewers obtain the review package.
Preparation goals are established.
Reviewers commit to participate.

Procedure:
Moderator distributes the review package.
Author presents an overview, if necessary.
Scribe duty for the Review Meeting is assigned.
Moderator reviews the preparation procedure.


    Example Orientation Data

7. Preparation Goals: ____ min/pg x ____ pgs. = ____ min prep time per reviewer
8. Orientation Objectives (check all that apply):
   Reviewers understand the scope and purpose of the work product.
   Reviewers understand the checking process, checklists, and references.
   Work product, references, checklists, and checking forms provided.
9. Orientation Effort: ____ min meeting x ____ participants = ____ min


    Preparation

Objectives:
Find the maximum number of non-minor issues.

Procedure (for reviewers):
Allocate the recommended time to preparation.
Perform an individual review of the work product.
Use checklists and references to focus attention.
Note critical, severe, and moderate issues on the Reviewer Data Form.
Note minor issues and author questions on the work product.


Example Issue Classification

Critical: Defects that may cause the system to hang, crash, produce incorrect results or behavior, or corrupt user data. No known work-arounds.
Severe: Defects that cause incorrect results or behavior, with known work-arounds. Large and/or important areas of the system are affected.
Moderate: Defects that affect limited areas of functionality and can either be worked around or ignored.
Minor: Defects that can be overlooked with no loss of functionality.
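To make the classification concrete, here is a minimal Python sketch (hypothetical; the field names mirror the example Reviewer Data Form shown on a later slide) of the severity scale and a single reviewer issue record:

```python
from dataclasses import dataclass
from enum import IntEnum

class Severity(IntEnum):
    """Issue classification from this slide; a higher value means more severe."""
    MINOR = 1     # Can be overlooked with no loss of functionality.
    MODERATE = 2  # Limited functionality affected; work around or ignore.
    SEVERE = 3    # Incorrect results/behavior, but known work-arounds exist.
    CRITICAL = 4  # Hang, crash, wrong results, or corrupted user data; no work-arounds.

@dataclass
class Issue:
    """One row of a reviewer's data form (Num, Location, Severity, Chk/Ref, Description)."""
    num: int
    location: str      # where in the work product the issue was found
    severity: Severity
    chk_ref: str       # checklist or reference item that prompted the issue
    description: str

example = Issue(1, "Section 4.2", Severity.SEVERE, "Checklist item 5",
                "Tests are not traced back to requirements.")
```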


Example Checklist: Checklist for Software Quality Plans

1. Does the plan reference the Tektronix Test Plan process document to be used in this project?
2. Does the plan list the set of measurements to be used to assess the quality of the product?
3. Is a rationale provided for each feature to be tested?
4. According to this document, what features won't be tested? Are any missing? List all below: ______________________________
   Does the plan provide a rationale for why each of these features will not be tested?
5. How well does the plan describe how tests will be traced back to requirements?
   Check one of the following: Very well / Fairly well / Poorly / No traceability
6. Refer to the corresponding software development plan. Does the quality plan discuss each of the milestones and test transmittal events from this document?
   Check all that apply:
   I cannot access the software development plan.
   The software development plan has no test milestones.
   The software development plan has no test transmittal events.
   The quality plan has no test milestones.
   The quality plan has no test transmittal events.
   Both documents include the same set of test milestones and test transmittal events.


    Example references

Corporate standards: Procedure for Software Quality Plans
Exemplary documents: Foo System Software Quality Plan
High-quality reference texts: Software Quality: Concepts and Plans, Ch. 13 (Plan following an industrial model), Robert Dunn.
On-line resources: http://flute.lanl.gov/SWQA/SMP.html


    Example Preparation Data

1. Inspection ID _________   2. Document: _____________   3. Name: ________________
4. Critical, Severe, and Moderate Issues (Num, Location, Severity, Chk/Ref, Description):
   _____ _______ ______ ______ _____________________________________________________
   (additional rows as needed)
5. Effort: ____ min
6. Issue Totals: ____ critical   ____ severe   ____ moderate   ____ minor   ____ author Q's
7. Preparation Objectives (check all that apply):
   Work product has been completely checked.
   All critical, severe, and moderate issues are noted on this form.
   All minor issues and author questions are noted on the work product.


Why not write on the work product?

Advantages of the Reviewer Data Sheet:
Minor issues are pre-filtered from the review meeting, saving meeting time.
Reviewers articulate issues clearly during preparation, saving meeting time.
Preparation statistics gathering is simplified.
Preparation effectiveness (% true defects, % redundancy) and checklist effectiveness are measurable.
Issues can be presented in order of importance.
The data sheet indicates the effectiveness of checklists.


Why not write on the work product?

Disadvantages of the Reviewer Data Sheet:
Requires extra time (15 minutes?).
Discourages last-minute preparation.
Makes the quality of preparation more visible.


    Review Meeting

Objectives:
Create a consolidated, comprehensive listing of non-minor issues.
Provide an opportunity for group synergy.
Improve reviewing skill by observing others.
Create shared knowledge of the work product.

Procedure:
Moderator requests issues sequentially.
Reviewers raise issues.
Scribe notes issues on the Scribe Data Sheet.
The Scribe Data Sheet is visible to everyone.
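A minimal sketch (hypothetical helper, not part of TekInspect) of what the consolidation step amounts to: merge every reviewer's issue list, keep only non-minor issues (minor issues are noted on the work product rather than the data form), and order them by severity so the most important issues are raised first.

```python
SEVERITY_RANK = {"critical": 0, "severe": 1, "moderate": 2}  # lower rank = raised earlier

def consolidate(reviewer_issue_lists):
    """Merge per-reviewer issue lists into a single Scribe list.

    Each issue is a dict with at least a 'severity' key ('critical',
    'severe', 'moderate', or 'minor'); minor issues are filtered out.
    """
    merged = [issue
              for issues in reviewer_issue_lists
              for issue in issues
              if issue["severity"] in SEVERITY_RANK]
    return sorted(merged, key=lambda issue: SEVERITY_RANK[issue["severity"]])

# Example: consolidating two reviewers' preparation results.
scribe_list = consolidate([
    [{"severity": "severe", "description": "Quality plan omits test milestones."}],
    [{"severity": "minor", "description": "Typo in section 2."},
     {"severity": "critical", "description": "Tests are not traced to requirements."}],
])
```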


Example Review Meeting Data

Aggregate Checking Data (summed across reviewers R1-R6):
10. Preparation Effort = ____ min
11. Critical Issues = ____ issues
12. Severe Issues = ____ issues
13. Moderate Issues = ____ issues
14. Minor Issues = ____ issues
15. Author Q's = ____ Q's
16. Review Meeting Objectives (check all that apply):
   All reviewers present. List absent: ____________
   All reviewers prepared sufficiently for the meeting.
   All issues noted by the Scribe and understood by the Author for rework.
   Any problems with the inspection process have been noted.
17. Review Meeting Effort: ____ min meeting x ____ participants = ____ min


    Rework

Objectives:
Assess each issue, determine whether it is a defect, and remove it if necessary.
Produce a written disposition for each non-minor issue.
Resolve minor issues as necessary.


Rework (cont.)

Procedure:
Author obtains the Scribe Data Sheet containing the consolidated issues list, as well as copies of the work products.
Author assesses each issue and notes the action taken on the Author Data Sheet.
Author determines the type of each defect (requirements/spec/design/implementation, etc.).
When finished, the Author provides the Author Data Sheet and the reworked product to the Moderator for Verify.
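A minimal sketch (hypothetical field names, mirroring the example Rework Data Sheet that follows) of the bookkeeping the Author does here: record a disposition for each issue and confirm that every issue from the Scribe Data Sheet has one before handing the product back to the Moderator.

```python
def undispositioned(scribe_issue_nums, dispositions):
    """Return the Scribe Data Sheet issue numbers that still lack a disposition.

    dispositions: list of dicts like
        {"num": 3, "fixed": True, "type": "design", "explanation": "Added test milestones."}
    """
    covered = {d["num"] for d in dispositions}
    return [num for num in scribe_issue_nums if num not in covered]

# The Rework objective "outcome of every Review Meeting issue is noted on this form"
# holds when this returns an empty list.
missing = undispositioned([1, 2, 3], [{"num": 1, "fixed": True, "type": "spec",
                                       "explanation": "Clarified requirement."}])
```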


Example Rework Data

1. Inspection ID _________   2. Document: ____________   3. Author: ____________
4. Issue Disposition (Num, Fixed, Type, Explanation):
   _____ ____ __________ _____________________________________________________________
   (additional rows as needed)
5. Effort: ____ min
6. Rework Objectives (check all that apply):
   The outcome of every Review Meeting Data Sheet issue is noted on this form.
   All minor issues have been addressed.
   No known defects remain in the work product.


    Verify

Objectives:
Assess the (reworked) work product quality.
Assess the inspection process.
Pass or fail the work product.

Procedure (for the moderator):
Obtain the reworked product and the Author Data Sheet.
Review the work product and data sheet for problems.
Provide a recommendation for the work product.
Perform sign-off with reviewers.
Compute summary statistics for the inspection.
Generate any process improvement proposals.
Enter review data into the quality database.
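The "compute summary statistics" step is essentially the arithmetic on lines 18-19 of the Example Verify Data form that follows: total inspection effort is the sum of the per-phase efforts, and total defects removed is the sum of the per-severity counts from rework. A minimal sketch, with hypothetical field names:

```python
def verify_summary(phase_effort_min, defects_removed):
    """Summary statistics the Moderator records at Verify time.

    phase_effort_min: minutes per phase, e.g.
        {"planning": 30, "orientation": 20, "preparation": 180,
         "review_meeting": 120, "rework": 90, "verify": 25}
    defects_removed: counts per severity, e.g.
        {"critical": 1, "severe": 3, "moderate": 5, "minor": 12}
    """
    total_effort = sum(phase_effort_min.values())   # line 18: Total Inspection Effort
    total_defects = sum(defects_removed.values())   # line 19: Total Defects Removed
    # Minutes of inspection effort per defect removed (cf. the Bell-Northern
    # figure of roughly one hour of inspection per defect).
    minutes_per_defect = total_effort / total_defects if total_defects else None
    return total_effort, total_defects, minutes_per_defect
```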


Example Verify Data

18. Total Effort:
    ________ Planning (line 6)
    + ________ Orientation (line 9)
    + ________ Preparation (line 10)
    + ________ Review Meeting (line 17)
    + ________ Rework (see Rework Data Sheet)
    + ________ Verify
    = ________ Total Inspection Effort
19. Total Defects Removed:
    ________ Critical (all from Rework Data Sheet)
    + ________ Severe
    + ________ Moderate
    + ________ Minor
    = ________ Total Defects Removed
20. Method Variations (check all that apply):
    Reviewer forms were not filled out completely.
    Review meeting involved issue discussion and resolution.
    Checklists did not appear to be helpful.
    References did not appear to be helpful.
    Other: ________________________________________________
21. Verify Objectives (check all that apply):
    Moderator's quick review yields less than 2 major issues.
    Moderator has collected all TekInspect forms for filing.
    Moderator has entered data into the quality engineering database.
22. Process Improvement: _____________________________________________________
23. Inspection Status:
    Pass
    Conditional Pass: _______________________________________
    Fail: _________________________________________________
    Moderator signature: _________________________   Date: _________
    I agree/disagree with the moderator's decision: Agree / Disagree _________________________   Date: _________


Inspection Critical Success Factors


Critical Success Factor: Checklists

Checklists guide reviewers to areas prone to defects.
Checklists may be stated as yes/no questions: Are all strings null terminated?
Checklists can also stimulate mental modelling: After a fork, what happens if a child exits immediately?
Checklists should be combined with general analysis. Don't trust checklists to be comprehensive!
Checklists are specific to work product type and development phase.
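One way to read this slide is that a checklist is simply structured data keyed by work-product type and development phase. A minimal sketch (hypothetical layout) using the two example items above:

```python
# Checklists keyed by (work-product type, development phase); purely illustrative.
CHECKLISTS = {
    ("C code", "implementation"): [
        {"kind": "yes/no", "item": "Are all strings null terminated?"},
        {"kind": "mental model",
         "item": "After a fork, what happens if a child exits immediately?"},
    ],
}

def checklist_for(work_product_type, phase):
    """Return the checklist for a work product, or an empty list if none is defined."""
    return CHECKLISTS.get((work_product_type, phase), [])
```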


Critical Success Factor: Effective Preparation

Effective preparation requires both:
Comprehension: understanding the nature of the entire document.
Analysis: checking inter-document consistency and adequacy.

Focus on:
What is present but not adequate.
What is missing but should be there.
What unique skills and experiences can you bring to bear on the work product?

Allocate enough time to prepare!
Make multiple passes over the document.
Let it sit overnight.
Don't prepare right before the review.


Critical Success Factor: Measurement

The goal of inspection is to detect and remove all defects efficiently and completely.

We measure:
Time spent on each phase.
Number of issues of each type discovered.
Utility of the review meeting, checklists, etc.

Analysis over time suggests:
New and better checklist items.
Improvements to the inspection process, by identifying poor-quality reviews.
Improvements to the software development process, by identifying poor-quality work products.
Improvements to standards.
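A minimal sketch of the kind of over-time analysis described above (hypothetical data layout; the 50% threshold is an arbitrary illustration): given per-inspection records of preparation effort and issues found, compute issues found per preparation hour and flag inspections whose yield falls far below the group average, which may indicate a poor-quality review.

```python
def low_yield_inspections(inspections, threshold=0.5):
    """Flag inspections whose issues-per-preparation-hour falls below
    `threshold` times the average across all inspections.

    Each inspection is a dict like:
        {"id": "INSP-042", "prep_minutes": 240, "issues_found": 12}
    Returns (average yield, list of flagged inspection ids).
    """
    yields = {i["id"]: i["issues_found"] / (i["prep_minutes"] / 60.0)
              for i in inspections if i["prep_minutes"] > 0}
    if not yields:
        return 0.0, []
    average = sum(yields.values()) / len(yields)
    flagged = [insp_id for insp_id, y in yields.items() if y < threshold * average]
    return average, flagged
```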



Critical Success Factor: The Moderator

Indicators of effective inspection moderators:
Work products are inspected when ready.
Meeting dates are aggressive but do-able.
Author overviews are useful or omitted.
Checklists and reference materials are useful.
Review meeting focuses on issue detection.
Author does not feel threatened.
Rework is verified carefully.
Improvements to the inspection and software development process are discovered.
Participants feel the method effectively improved quality.
Everyone wants to do it again!


Further references

Software Inspection, Tom Gilb and Dorothy Graham, Addison-Wesley, 1993.
The WWW FTR Archive, http://www.ics.hawaii.edu/~johnson/FTR/
Software Inspection: An Industry Best Practice, David Wheeler, Bill Brykczynski, and Reginald Meeson.
(For PSP) A Discipline for Software Engineering, Watts Humphrey, Addison-Wesley, 1995.