
University of Paderborn, Software Engineering Group

Software Quality Assurance: III Quality Control

Dr. Holger Giese
Software Engineering Group
Room E 3.165
Tel. 60-3321
Email: [email protected]


Outline
I   Introduction
II  Software Life Cycle
III Quality Control
IV  Infrastructure
V   Management
VI  Standards
VII Conclusion & Summary


III Quality Control
III.1 Foundations
III.2 Static Analysis Techniques
III.3 Dynamic Analysis Techniques
III.4 Testing Process & Activities
III.5 Management & Automation
III.6 Paradigm & Domain Specific Aspects
III.7 Discussion & Summary
III.8 Bibliography


III.1 Foundations

Quality Control is the series of inspections, reviews, and tests used throughout the development cycle to ensure that each work product meets the requirements placed upon it.

[Pressman2004]


Verification & Validation

Verification – The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.

Validation – The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.

[IEEE Std 610.12-1990]


Verification & Validation

Verification refers to the set of activities that ensure that software correctly implements a specific function.

Validation refers to a different set of activities that ensure that the software that has been built is traceable to customer requirements.

Boehm [Boehm81]:
Verification: "Are we building the product right?"
Validation: "Are we building the right product?"

The definition of V&V encompasses many SQA activities, including:
formal technical reviews, quality and configuration audits;
performance monitoring, different types of software testing;
feasibility studies and simulation.


Terminology: Dynamic/Static Analysis

Dynamic analysis:
The process of evaluating a system or component based on its behavior during execution. Contrast with: static analysis. See also: demonstration; testing.

[IEEE-610.12-1990]

Static analysis:
The process of evaluating a system or component based on its form, structure, content, or documentation. Contrast with: dynamic analysis. See also: inspection; walk-through.

[IEEE-610.12-1990]


Terminology: Testing

Several definitions:

“Testing is the process of executing a program or system with the intent of finding errors.”

[Myers1979]

Testing is the process used to verify or validate a system or its components. [Storey1996]

Testing (according to IEEE):

(1) The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component.

(2) (IEEE Std 829-1983) The process of analyzing a software item to detect the differences between existing and required conditions (that is, bugs) and to evaluate the features of the software items. See also: acceptance testing; benchmark; checkout; component testing; development testing; dynamic analysis; formal testing; functional testing; informal testing; integration testing; interface testing; loopback testing; mutation testing; operational testing; performance testing; qualification testing; regression testing; stress testing; structural testing; system testing; unit testing.

[IEEE-610.12-1990]
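Myers' definition stresses intent: test cases are chosen to find errors, not to demonstrate that the program works. Below is a minimal sketch of that mindset using Python's unittest module; the absolute function and its test cases are illustrative assumptions, not material from the lecture.

```python
import unittest

# Hypothetical unit under test (illustration only).
def absolute(x: int) -> int:
    return x if x >= 0 else -x

class AbsoluteTest(unittest.TestCase):
    # Inputs picked with the intent of finding errors (Myers):
    # sign changes and boundaries are the likely failure points.
    def test_negative_input(self):
        self.assertEqual(absolute(-3), 3)

    def test_zero_boundary(self):
        self.assertEqual(absolute(0), 0)

if __name__ == "__main__":
    unittest.main()
```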


Dynamic vs. Static Analysis

Dynamic analysis (testing):
execution of system components; running the software.

Static analysis:
investigation without operation;
pencil-and-paper reviews, etc.;
modelling (mathematical representation).
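To make the contrast concrete, the sketch below applies both kinds of analysis to the same fragment: the first half inspects the program text without running it (static), the second half executes it and observes its behavior (dynamic). The toy divide function and the division-flagging rule are illustrative assumptions, not part of the lecture.

```python
import ast

# Static analysis: evaluate the code's form and structure without executing it.
source = """
def divide(a, b):
    return a / b
"""
tree = ast.parse(source)
for node in ast.walk(tree):
    # A crude structural rule applied to the program text alone:
    # flag every division as a potential divide-by-zero site.
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div):
        print("static finding: division at line", node.lineno)

# Dynamic analysis: run the code and observe its actual behavior.
namespace = {}
exec(source, namespace)
try:
    namespace["divide"](1, 0)
except ZeroDivisionError:
    print("dynamic finding: divide(1, 0) raises ZeroDivisionError")
```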


Quality Control Activities & Process

[Flattened table: rows are the life-cycle phases Requirements, Top-level design, Detailed design, Implementation, Integration testing, and System validation; columns are the techniques Modelling, Static analysis, and Dynamic analysis; X marks show which techniques apply in each phase. The individual marks are not recoverable from this transcription.]

[Storey1996]


III.2 Static Analysis Techniques

Overview

Reviews and Inspections:
walkthroughs, inspections, personal reviews;
formal technical reviews;
summary.

Other Techniques:
control-flow analysis, data-flow analysis, metrics, …
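As a small taste of these other techniques, the sketch below performs a data-flow style check on a code fragment: it reports names that are assigned but never read afterwards (dead stores). It relies on Python's ast module; the fragment and the rule are illustrative assumptions and far cruder than a real control-flow or data-flow analyzer (it ignores scopes and execution order).

```python
import ast

# Toy data-flow check: find "dead stores" (assigned but never read).
source = """
def f():
    x = 1
    y = 2   # assigned but never used
    return x
"""

tree = ast.parse(source)
assigned, used = set(), set()
for node in ast.walk(tree):
    if isinstance(node, ast.Name):
        if isinstance(node.ctx, ast.Store):
            assigned.add(node.id)
        elif isinstance(node.ctx, ast.Load):
            used.add(node.id)

for name in sorted(assigned - used):
    print(f"possible dead store: '{name}' is assigned but never read")
```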


III.2.1 Reviews and Inspections

A family of techniques:
(1) Personal reviews
(2) Inspections
(3) Walkthroughs
(4) Formal technical reviews

Review / inspect: to examine closely, with an eye toward correction or appraisal. People (peers) are the examiners. [SWENE]


Purpose/Objectives

Verify that:
software meets its requirements;
software follows predefined standards;
software is developed in a uniform manner.

Catching errors:
sooner; more and different errors; breaking the frame of reference.

Make projects more manageable:
identify new risks likely to affect the project.

Improving communication:
crossing organizational boundaries.

Providing education:
making software visible.

[SWENE]


(1) Personal Review

Features:
informal; done by the producer.

Implications:
not objective; available to any developer; different mindset.

Need for review:
product completion.

[SWENE]

Limited screening efficiency!


(2) Inspections

Features:
team reviews materials separately;
team and producers meet to discuss;
may review selected product aspects only.

Implications:
focus on important issues (if you know what they are);
more material per meeting;
less preparation time.

[SWENE]


Process and Participants [Galin2004]


(3) Walkthroughs

Features:
less formal; producer presents or provides information.

Implications:
larger groups can attend (education);
more material per meeting;
less preparation time.

Disadvantage: harder to separate
product and presenter;
explanation and justification.

[SWENE]


Process and Participants [Galin2004]


What to Omit and What to Include?

Inclusion:
sections of complicated logic;
critical sections, where defects severely damage essential system capability;
sections dealing with new environments;
sections designed by new or inexperienced team members.

Omission:
"straightforward" sections (no complications);
sections of a type already reviewed by the team in similar past projects;
sections that, if faulty, are not expected to affect functionality;
reused design and code;
repeated parts of the design and code.

[Galin2004]


(4) Formal Technical Review

Features:
formal: scheduled event, defined procedure, reported result;
independent review team: producers not present.

Implications:
more preparation time;
less material per meeting;
product must stand or fall on its own.

See also: IEEE Standard for Software Reviews and Audits [IEEE 1028-1988].

[SWENE]


The Players: Managers, Review Team, Producer

[SWENE]


Team Selection

Manager assigns:
vested interest in a good outcome;
review as delegation of the manager's responsibility.

Technical competence:
current technology.

Objectivity:
best buddies and "outsiders".

User involvement.

[SWENE]


Team Size

Smaller for:
focus; scheduling; reasonable output volume per person-hour.

Larger for:
expertise; making the review public; non-participating observers.

Typical size: 3-6 reviewers.

[SWENE]


Team Characteristics

Experienced senior technical staff. Representatives of:
the team that created the document;
the client;
the team for the next development phase;
the software quality assurance group.


Managers' Participation

"Review is a manager's job." Problems:
technical competence;
review of product vs. review of person.

Active participation:
as an "outsider" only!
as a team leader (and outsider), providing general competence.

Post facto participation:
review materials; review report.

[SWENE]


What and When to Review

Any software artifact:
requirements, designs, code, documentation, procedures, interfaces, ...

Design for review:
controlling product complexity;
controlling review length.

Scheduling reviews: limit sessions to about 2 hours, at a good time of day (e.g., 10 AM).

[SWENE]


Review Process

Producers provide materials.
Leader schedules the meeting.
Individuals prepare.
Team holds the review meeting.
Manager gets the report.

[SWENE]


Team Task Overview

Provide a good review:
the team is responsible for the review, not the product (don't shoot the messenger).

Find issues: raise them, don't solve them.

Render an assessment decision:
accept; accept with minor revision; revision needed; reject.
Unanimous approval is required: a single reviewer's veto rejects the product.

[SWENE]


The Review Team: Leader, Reviewers, Recorder

[SWENE]


Team Leader - Traits

Technical competence:
general strength; credibility; able to understand the issues.

Personal skills:
willing to confront people; willing to report failure; able to step back from the heat of discussion.

Administrative skills.

[SWENE]


Team Leader - Characteristics

The review leader should be an SQA representative:
SQA has the most to lose;
the creator is eager to get approval (to start the next job);
the client can wait for acceptance testing.

Further characteristics:
knowledge and experience in development of projects of the type reviewed (preliminary acquaintance with the current project is not necessary);
seniority at a level similar to, if not higher than, that of the project leader;
a good relationship with the project leader and his team;
a position external to the project team.

[Galin2004]


Team Leader - Tasks

Avoid premature reviews.

Coordinate arrangements:
materials distribution; meeting schedule; meeting location and facilities.

Ensure a good review, or report the reason for failure:
materials missing; reviewers missing or not prepared.

[SWENE]


Team Leader - Run the Meeting

Act as chairperson:
opening and introductions; procedure guide; closing.

Act as facilitator:
controlling the level of participation (enough but not too much);
conflict resolution.

Terminate the meeting if unproductive.

[SWENE]


Reviewers - Tasks

Prepare before:
thorough review of the materials.

Participate:
be there; no coming late or leaving early.

Act professionally:
no personal agendas; neither big egos nor shyness.

Give positive and negative comments:
balance; courtesy; preserving what's good.

[SWENE]


Reviewer - Preparation

Be prepared: evaluate the product before the review.
Be sure that you understand the context.
First, skim all the product material to understand the location and format of the information.
Next, read the product material and annotate a hardcopy.
Avoid issues of style.
Inform the review leader if you can't prepare.


Reviewers – Participation

Review the product, not the producer.
Keep your tone mild; ask questions instead of making accusations.
Stick to the review agenda.
Pose your written comments as questions.
Raise issues, don't resolve them!
Limit discussions (take them offline!).
Avoid discussions of style; stick to technical correctness.


Recorder

Selection:
any competent reviewer;
single or multiple recorders;
rotating responsibility within a meeting;
leaders as recorders (having too much to do; separation of power).

Task: get it in writing:
the basis for the report.

[SWENE]


Recording Medium

Issues:
public vs. private notes; speed and accuracy; usefulness after the meeting.

Media:
flip charts (posting prior pages); blackboards; overheads; PC and projector; video and audio recording.

[SWENE]


Managers - Tasks

Stay out of reviews in your own area.

Support reviews:
talk about it;
provide resources (time, the right people, place, materials);
change the reward system.

Abide by the review results.

[SWENE]


Process [Galin2004]

Review session agenda:
a short presentation of the design document;
comments made by members of the review team;
verification and validation of the comments, discussed to determine the required action items (corrections, changes, and additions);
decisions about the design product (document), which determine the project's progress.

Post-review activities:
preparation of the report;
follow-up of the performance of the corrections and examination of the corrected sections.


Review Report - Purpose

Tell managers the outcome.
Serve as an early warning system for major problems.
Provide a historical record:
for process improvement;
for tracking the people involved with projects.

[SWENE]


Review Report - Contents

A summary of the review discussions.

The decision about the product:
accept without further modification;
reject the work due to severe errors (the review must be repeated);
accept with minor modifications (that can be incorporated into the document by the producer).

A full list of the required action items (corrections, changes, and additions). For each action item, the completion date and the responsible project team member are listed.

The name(s) of the review team member(s) assigned to follow up.

All participants have to sign off:
this shows participation responsibility and their concurrence with the findings.
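To make the report contents concrete, the fields above map naturally onto a small data structure. The following is a minimal sketch using Python dataclasses; all names (ReviewReport, ActionItem, Decision) are hypothetical and not prescribed by the lecture or by any standard.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

# Hypothetical types mirroring the report contents listed above.
class Decision(Enum):
    ACCEPT = "accept without further modification"
    ACCEPT_MINOR = "accept with minor modifications"
    REJECT = "reject; review must be repeated"

@dataclass
class ActionItem:
    description: str       # a correction, change, or addition
    responsible: str       # the responsible project team member
    completion_date: date

@dataclass
class ReviewReport:
    summary: str                        # summary of the review discussions
    decision: Decision
    action_items: list[ActionItem] = field(default_factory=list)
    follow_up_assignees: list[str] = field(default_factory=list)
    sign_offs: list[str] = field(default_factory=list)  # all participants
```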


Review Guidelines (1/2) [Galin2004, from Pressman]

Design review infrastructure:
develop checklists for common types of design documents;
train senior professionals to serve as a reservoir for review teams;
periodically analyze the effectiveness of past reviews;
schedule the reviews as part of the project plan.

The design review team:
team size should be limited, with 3-5 members being the optimum.


Review Guidelines (2/2) [Galin2004, from Pressman]

The design review session:
discuss professional issues constructively, refraining from personalizing the issues;
keep to the review agenda;
focus on detecting defects by verifying and validating the participants' comments; refrain from discussing possible solutions;
in cases of disagreement about an error, end the debate by noting the issue and shifting its discussion to another forum;
properly document the discussed comments and the results of their verification and validation;
the duration of a review session should not exceed two hours.

Post-review activities:
prepare the review report, including the action items;
establish follow-up to ensure the satisfactory completion of all action items.


Comparison

Properties                      | Design review               | Inspection                        | Walkthrough
Overview meeting                | No                          | Yes                               | No
Participants' preparations      | Yes - thorough              | Yes - thorough                    | Yes - brief
Review session                  | Yes                         | Yes                               | Yes
Follow-up of corrections        | Yes                         | Yes                               | No
Formal training of participants | No                          | Yes                               | No
Participants' use of checklists | No                          | Yes                               | No
Error-related data collection   | Not formally required       | Formally required                 | Not formally required
Review documentation            | Formal design review report | 1) Inspection session findings report; 2) Inspection session summary report | Not formally required

[Galin2004]


TYPES OF REVIEWS [IEEE Std 1028-1988]

Objective
  Management: Ensure progress. Recommend corrective action. Ensure proper allocation of resources.
  Technical: Evaluate conformance to specifications and plans. Ensure change integrity.
  Inspection: Detect and identify defects. Verify resolution.
  Walkthrough: Detect defects. Examine alternatives. Forum for learning.

Delegated controls - decision making
  Management: Management team charts the course of action. Decisions are made at the meeting or as a result of recommendations.
  Technical: Review team petitions management or technical leadership to act on recommendations.
  Inspection: Team chooses predefined product dispositions. Defects must be removed.
  Walkthrough: All decisions are made by the producer. Change is the prerogative of the producer.

Delegated controls - change verification
  Management: Change verification left to other project controls.
  Technical: Leader verifies as part of the review report.
  Inspection: Moderator verifies rework.
  Walkthrough: Change verification left to other project controls.

Group dynamics - recommended size
  Management: 2 or more people. Technical: 3 or more people. Inspection: 3-6 people. Walkthrough: 2-7 people.

Group dynamics - attendance
  Management: Management, technical leadership, and peer mix.
  Technical: Technical leadership and peer mix.
  Inspection: College of peers meeting with documented attendance.
  Walkthrough: Technical leadership and peer mix.

Group dynamics - leadership
  Management: Usually the responsible manager. Technical: Usually the lead engineer. Inspection: Trained moderator. Walkthrough: Usually the producer.

Procedures - material volume
  Management: Moderate to high, depending on the specific "statement of objectives" at the meeting.
  Technical: Moderate to high, depending on the specific "statement of objectives" for the meeting.
  Inspection: Relatively low. Walkthrough: Relatively low.

Procedures - presenter
  Management: Project representative. Technical: Software element representative. Inspection: Presentation by a "reader" other than the producer. Walkthrough: Usually the producer.

Procedures - data collection
  Management: As required by applicable policies, standards, or plans.
  Technical: Not a formal project requirement; may be done locally.
  Inspection: Formally required.
  Walkthrough: Not a formal project requirement; may be done locally.

Outputs - reports
  Management: Management review report. Technical: Technical review report. Inspection: Defect list, defect summary, inspection report. Walkthrough: Walkthrough report.

Outputs - database entries
  Management: Any schedule changes must be entered into the project tracking database.
  Technical: No formal database required.
  Inspection: Defect counts, characteristics, severity, and meeting attributes are kept.
  Walkthrough: No formal database required.


Formality of Technical Reviews

[Figure: formality increases from personal review ("improved design", driven by individual initiative) through walkthroughs and inspections ("detection of flaws") to formal review ("proof of correctness"); the more formal techniques are team-oriented.]


When to Involve External Experts?

Insufficient in-house professional capabilities in a specialized area.
Temporary lack of in-house professionals for the review team.
Indecisiveness caused by major disagreements among the organization's senior professionals.
In small organizations, where the number of suitable candidates for a review team is insufficient.

[Galin2004]


Testing Efficiency

Catching most errors before test:
review plus test is much cheaper than test alone.

Sample results:
10x reduction in errors reaching test;
50-80% total cost reduction;
fewer defects after release;
substantial cost savings in maintenance.

Composite data from H-P (R. Grady), testing efficiency (defects found / hour):
System use          0.21
Black box           0.282
White box           0.322
Reading/inspection  1.057

[SWENE]
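A few lines of arithmetic over these rates make the comparison explicit; this is a sketch, and the numbers are only the H-P composite figures quoted above.

```python
# Testing efficiency (defects found per hour), H-P composite data above.
rates = {
    "system use": 0.21,
    "black box": 0.282,
    "white box": 0.322,
    "reading/inspection": 1.057,
}

inspection = rates["reading/inspection"]
for method, rate in rates.items():
    # Relative efficiency of reading/inspection versus each method.
    print(f"{method:>20}: {rate:.3f}/h (inspection is {inspection / rate:.1f}x)")
```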


Effectiveness of Inspections

[Fagan1976], inspections of design & code:
67%-82% of all faults were found by inspections;
25% of programmer resources were saved (despite the effort spent on inspections).

[Fagan1986]:
93% of all faults were found by inspections.

Cost reduction for fault detection (compared with testing):
[Ackerman+1989]: 85%;
[Fowler1986]: 90%;
[Bush1990]: US$25,000 saved per inspection.


Code Inspection Effectiveness

Percentage of defects found by each detection method, and defects per 1000 lines of maintained code:

Year | Test % | Design review % | Code inspection % | Defects per 1000 lines of maintained code
1977 |   85   |   ---           |   15              | 0.19
1978 |   80   |    5            |   15              | 0.13
1979 |   70   |   10            |   20              | 0.06
1980 |   60   |   15            |   25              | 0.05
1981 |   40   |   30            |   30              | 0.04
1982 |   30   |   40            |   30              | 0.02

[Galin2004] from [Cusumano1991]


Design Reviews vs. Code Reviews

Case study: 700 KLOC real-time project, over 400 developers, modern language, three-year duration. [Collofello&Woodfield1989]

Cost effectiveness = cost saved / cost consumed:
Design reviews  8.44
Code reviews    1.38
Testing         0.17

Time for correction (hours):
Design   7.5
Code     6.3
Testing 11.6
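The sketch below evaluates the cost-effectiveness ratio defined above, using the correction times from the case study. The fault count and review effort are made-up placeholders (the slide does not report them), so the resulting ratio is not meant to reproduce the study's figures.

```python
# Cost effectiveness = cost saved / cost consumed  [Collofello&Woodfield1989].
FIX_IN_DESIGN = 7.5    # hours to correct a fault found in design review
FIX_IN_TESTING = 11.6  # hours to correct the same fault found in testing

# Hypothetical inputs, NOT from the slide.
faults_caught = 20          # faults found by one design review
review_effort_hours = 40.0  # effort to prepare and hold the review

# Faults caught now avoid the more expensive testing-phase correction;
# the review consumes its own effort plus the cheaper design-phase fixes.
cost_saved = faults_caught * FIX_IN_TESTING
cost_consumed = review_effort_hours + faults_caught * FIX_IN_DESIGN
print(f"cost effectiveness = {cost_saved / cost_consumed:.2f}")
```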


Summary

Highly effective technique.
Low technology.
Not used nearly enough.

[SWENE]


Further Readings

[Ackerman+1989] A. F. Ackerman, L. S. Buchwald, F. H. Lewski, "Software inspections: an effective verification process", IEEE Software 6 (May 1989), pp. 31-36.
[Bush1990] M. Bush, "Improving software quality: the use of formal inspections at the Jet Propulsion Laboratory", Proceedings of the 12th International Conference on Software Engineering, Nice, France, March 1990, pp. 196-199.
[Collofello&Woodfield1989] J. S. Collofello and S. N. Woodfield, "Evaluating the Effectiveness of Reliability-Assurance Techniques", Journal of Systems and Software, March 1989.
[Cusumano1991] M. A. Cusumano, Japan's Software Factories – A Challenge to U.S. Management, Oxford University Press, New York, 1991.
[Dobbins1998] J. H. Dobbins, "Inspections as an up-front quality technique", in G. G. Schulmeyer and J. I. McManus (eds.), Handbook of Software Quality Assurance, Prentice Hall, Harlow, Essex, UK, 1998.
[Doolan1992] E. P. Doolan, "Experience with Fagan's Inspection Method", Software – Practice and Experience, 22(2), February 1992, pp. 173-182.
[Fagan1976] M. E. Fagan, "Design and code inspections to reduce errors in program development", IBM Systems Journal 15 (No. 3, 1976), pp. 182-211.
[Fagan1986] M. E. Fagan, "Advances in software inspections", IEEE Transactions on Software Engineering, SE-12 (July 1986), pp. 744-751.
[Fowler1986] P. J. Fowler, "In-process inspections of workproducts at AT&T", AT&T Technical Journal 65 (March/April 1986), pp. 102-112.
[Kelly+1992] J. C. Kelly, J. S. Sherif, and J. Hops, "An Analysis of Defect Densities Found During Software Inspections", Journal of Systems and Software, 1992.
[Parnas&Weiss1985] D. L. Parnas and D. M. Weiss, "Active Design Reviews: Principles and Practices", Proceedings of the 8th International Conference on Software Engineering, pp. 215-222, August 1985.
[Porter+1994] A. A. Porter, L. G. Votta, Jr., and V. R. Basili, Comparing Detection Methods for Software Requirements Inspections: A Replicated Experiment, University of Maryland Technical Report CS-TR-3327, 1994.
[Weller1993] E. F. Weller, "Lessons From Three Years of Inspection Data", IEEE Software, September 1993.