Essential Test Management


The key to successful testing is effective and timely planning. Rick Craig introduces proven test planning methods and techniques, including the Master Test Plan and level-specific test plans for acceptance, system, integration, and unit testing. Rick explains how to customize an IEEE-829-style test plan and test summary report to fit your organization’s needs. Learn how to manage test activities, estimate test efforts, and achieve buy-in. Discover a practical risk analysis technique to prioritize your testing and become more effective with limited resources. Rick offers test measurement and reporting recommendations for monitoring the testing process. Discover new methods and develop renewed energy for taking your organization’s test management to the next level.


ESSENTIAL TEST MANAGEMENT AND PLANNING

Rick Craig (rcraig@sqe.com)


Administrivia

· Course timing
· Breaks

© 2013 SQE Training V3.1

Course Agenda

1. The Culture of Testing and Quality
2. Introduction to STEP and Preventive Testing
3. Test Levels
4. Master Test Plan
5. The Test Summary Report

1. THE CULTURE OF TESTING AND QUALITY


What are your problems?

· What are your testing challenges?


What is quality?

• Quality: meeting requirements (stated and/or implied)


What is testing?

• Testing: the process of determining conformance to requirements, stated and/or implied


Class Questionnaire

• The overall quality of the software systems/products at my organization is:

– Outstanding: one of the best
– Acceptable: OK
– Poor: must be improved
– Unknown: a mystery to me


Class Questionnaire

• The time, effort, and money that my organization spends trying to achieve high software quality is:

– Too much: needs to be reduced
– About right: OK
– Too little: needs to be increased
– Unknown: a mystery to me


Corporate Culture

• Them and Us vs. One Team
• Early vs. Late


Economics of Test and Failure

• The cost of testing
• The cost of failure
• The savings of preventive testing and defect prevention

Cost to correct a defect, by the phase in which it is found (in 1970s dollars), from the TRW, IBM, and Rockwell landmark study:

– Requirements: $139
– Design: $455
– Code: $977
– Test: $7,136
– Production: $14,102
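The study's figures make the arithmetic of early detection easy to illustrate. The following is a small sketch, not course material; the phase table simply reproduces the slide's numbers:

```python
# Cost to correct a defect by the phase in which it is found
# (1970s dollars, per the TRW/IBM/Rockwell study cited above).
COST_TO_CORRECT = {
    "requirements": 139,
    "design": 455,
    "code": 977,
    "test": 7_136,
    "production": 14_102,
}

def savings(found_early: str, found_late: str) -> int:
    """Dollars saved per defect by catching it in the earlier phase."""
    return COST_TO_CORRECT[found_late] - COST_TO_CORRECT[found_early]

# One defect caught during requirements instead of production saves $13,963
print(savings("requirements", "production"))
```

Multiplied across even a modest defect count, this difference is the economic case for preventive testing.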


Software Psychology

What is “good enough”?

[Chart: number of bugs found over time]

2. INTRODUCTION TO STEP AND PREVENTIVE TESTING


Select a Method/Process

Example testing methodologies:
• Systematic Test and Evaluation Process (STEP™) – SQE
• MS Test Process – Microsoft
• TMap®
• Exploratory Testing


STEP™

• Level Plans
– Acceptance
– System
– Integration
– Unit

[Diagram: each test level (Unit, Integration, System, Acceptance) proceeds through Plan, Acquire (Analyze, Design, Implement), and Measure activities, rolling up into Project Testing]


STEP™ Activities

Plan the strategy
• P1. Establish master test plan
• P2. Develop detailed test plans

Acquire the testware
• A1. Analyze test objectives
• A2. Design tests
• A3. Implement plans and designs

Measure the software and testware
• M1. Execute tests
• M2. Check test set adequacy
• M3. Evaluate software and process


Level Timing

[Diagram: a timeline showing when each level's Plan, Acquire, and Measure activities occur. Master test planning, guided by the methodology, standards, and guidelines, starts from the project plan; planning for the acceptance, system, integration, and unit levels begins as the requirements specification, high-level design, and detailed design become available, and each level then runs its own Plan, Acquire, and Measure sequence through implementation]


Preventive Testing

• Testing begins early (i.e., during requirements) and test cases are used as requirements models

• Testware design leads to software design

• Defects are detected earlier or prevented altogether

• Defects are analyzed systematically
• Testers and developers work together


Opportunities for Defects

Business Needs

Requirements

Design

Coding

Installation

Operation


Testers Improve Requirements

• Testers ask:
– What does this mean?
– Are there any unspecified situations?
– How can this requirement be adequately demonstrated?
– What problems might occur?

• Tests define and help specify requirements
– “A function/method or constraint specification is unfinished until its test is defined”


Two Views

· Food and Drug Administration (FDA): “You CAN’T test quality into your software.”

· SQE: “You MUST test quality into your software.”

3. TEST LEVELS


What is a Level?

· A level is defined by the environment

· An environment is the collection of:
• Hardware
• System Software
• Application Software
• Documentation
• People
• Interfaces
• Data


The “V” Model

[Diagram: each development phase is paired with planning for a test level: Requirements with Acceptance, High-level design with System, Detailed design with Integration, and Coding with Unit]


What determines the number of levels?

· Risk
· Politics
· Organization
· Objectives


Acceptance Testing

· Objective: To demonstrate a system’s readiness for operational use and customer acceptance

• Focuses on user requirements
• Follows system testing and is expected to work
• Is a final stage of confidence building
• Provides protection to ensure production readiness
• Allows customers to “sign off” on the product

· Ends: When the system is approved for use


System Testing

· Objective: To develop confidence that a system is ready for acceptance testing. It forms the basis of a regression test set and often represents the bulk of the testing effort

• Most comprehensive testing
• Usually tests software design and requirements
• Usually consumes most test resources

· Ends: When a system is turned over for acceptance testing or moved into production


Integration Testing

· Objective: To progressively test the interfaces between units and modules

• Focuses on interfaces
• Can be top-down, bottom-up, or functional

· Ends: When the entire system has been integrated and stability has been demonstrated


Unit Testing

· Objective: To determine that each unit meets its requirements (program specification) and that internal consistency has been achieved

• Can be black-box as well as white-box
• Primary means of code coverage

· Ends: When each unit has met its exit criteria

4. MASTER TEST PLAN


Master Test Plan

[Diagram: the Master Test Plan sits above the Acceptance, System, Integration, and Unit test plans, each of which drives its own test cases]


Process vs. Documentation

The process may be more important than the product (i.e., paper)


Audience

Who will read a Master Test Plan?


IEEE Style Master Test Plan (2008)

1. Introduction
– Document identifier
– Scope
– References
– System overview and key features
– Test overview
• Organization
• Master test schedule
• Integrity level schema
• Resources summary
• Responsibilities
• Tools, techniques, methods, and metrics

2. Details of the MTP
– Test processes, including definition of test levels
• Management
• Acquisition
• Supply
• Development
• Operation
• Maintenance
– Test documentation requirements
– Test administration
– Test reporting

3. General
– Glossary
– Document change procedures and history


Master Test Plan Sections

1. Test Plan Identifier
2. Introduction
3. Test Items
4. Features to Be Tested
5. Features Not to Be Tested
6. Software Risks
7. Planning Risks and Contingencies
8. Approach
9. Item Pass/Fail Criteria
10. Suspension Criteria/Resumption Requirements
11. Test Deliverables
12. Testing Tasks
13. Environmental Needs
14. Responsibilities
15. Staffing and Training Needs
16. Schedule
17. Approvals


1. Test Plan Identifier / 2. Introduction

• Test Plan Identifier

• Introduction
– Scope of project
– Scope of plan (functional)


3. Test Items

• What is to be tested, from the library management perspective
• Which build/version (e.g., Version 2.1) of:
– Application software
– Documentation
– Databases
– etc.


4. Features to Be Tested / 5. Features Not to Be Tested

To test or not to test? That is the question!


6. Risk Analysis

Two kinds of risk:
• Software/product risk, which drives testing priority
• Project/planning risk, which drives contingencies and risk mitigation


6. Example Software/Product Risks

· Scalability
· Scope of use
· Environment
· Accessibility
· Usability
· Interface complexity
· Technical complexity
· Functional complexity
· Performance
· Reliability
· Third-party products
· Data integrity
· Recoverability
· New technology


6. Inventory Features and Attributes

Features
• Upload a file
• Add a record
• Log on to the system
• Update a user profile
• Generate a report

Attributes
• Accessibility
• Reliability
• Security
• Performance
• Backward compatibility


6. Risk Likelihood and Impact

· Likelihood: High, Medium, or Low
· Impact: High, Medium, or Low


6. Initial Risk Priority

Likelihood (High = 3, Medium = 2, Low = 1) × Impact (High = 3, Medium = 2, Low = 1) = Potential $ Loss

Risk (Test) Priority:

                      Impact
                   1     2     3
Likelihood   1     1     2     3
             2     2     4     6
             3     3     6     9
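The priority arithmetic is simple enough to sketch in code. This is an illustrative aid, not part of the course; the sample entries are taken from the risk inventory example that follows:

```python
# Risk (test) priority = likelihood score x impact score,
# using the three-point scales from the slide.
SCORE = {"Low": 1, "Medium": 2, "High": 3}

def risk_priority(likelihood: str, impact: str) -> int:
    """Return the initial risk priority for one feature/attribute."""
    return SCORE[likelihood] * SCORE[impact]

# Entries drawn from the course's web-site risk inventory example
inventory = [
    ("Slow performance", "High", "High"),
    ("Spelling", "High", "Low"),
    ("Invalid mail-to", "Low", "Medium"),
]

# Rank the inventory so the highest-priority risks are tested first
ranked = sorted(inventory, key=lambda r: risk_priority(r[1], r[2]), reverse=True)
for attr, lik, imp in ranked:
    print(attr, risk_priority(lik, imp))
```

Sorting by the product is what turns the raw inventory into the prioritized list shown on the next slides.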


6. Risk Inventory ― Example

Web Site Attribute             Likelihood   Impact   Priority
Spelling                       High         Low      3
Invalid mail-to                Low          Medium   2
Email viruses                  Medium       Medium   4
Wrong tel #s                   Low          High     3
Slow performance               High         High     9
Poor usability                 Medium       Medium   4
Ugly site                      Low          Medium   2
Does not work with Browser X   Medium       High     6
Hacker spam attack             Low          Medium   2
Site intrusion                 Low          High     3


6. Risk Inventory ― Example (cont.)

Web Site Attribute             Likelihood   Impact   Priority
Slow performance               High         High     9
Does not work with Browser X   Medium       High     6
Poor usability                 Medium       Medium   4
Email viruses                  Medium       Medium   4
Spelling                       High         Low      3
Wrong tel #s                   Low          High     3
Site intrusion                 Low          High     3
Invalid mail-to                Low          Medium   2
Ugly site                      Low          Medium   2
Hacker spam attack             Low          Medium   2




7. Class Questionnaire

What are your greatest planning risks?
1) _________________________________
2) _________________________________
3) _________________________________
4) _________________________________

What are your contingencies?
1) _________________________________
2) _________________________________
3) _________________________________
4) _________________________________


7. Example Planning Risk Checklist

• Delivery dates
• Staff availability
• Budget
• Environment options
• Tool inventory
• Acquisition schedule
• Participant buy-in/marketing
• Training needs
• Scope of testing
• Lack of product requirements
• Risk assumptions
• Usage assumptions
• Resource availability


7. Contingencies

Contingency Options:

+ Time
- Scope/Size
+ Resources
- Quality/Risk


8. Approach / Strategy

· The set of directions that has a major impact on the effectiveness or efficiency of the testing effort (i.e., forks in the road)


8. Methodology (Political) Decisions

• When will testers become involved in the project?
• What testing levels will be employed?
– Acceptance
– System
– Integration
– Unit
• How many (if any) beta sites will be used?
• Will there be a pilot?
• What testing techniques will be utilized?
– Inspections
– Walkthroughs


8. Entrance and Exit Criteria Decisions

• What criteria will be used?
• Are the criteria flexible/negotiable?
• Who decides?


8. Smoke Tests as Entrance Criteria

· Smoke tests are often used to evaluate whether or not a release is REALLY ready for system testing
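A smoke-test gate like this can be automated as a simple all-must-pass check. The sketch below is illustrative only; the individual check names are hypothetical placeholders, not part of the course:

```python
# A minimal smoke-test gate: every critical-path check must pass
# before a build is admitted into system testing.
# Each check function is a hypothetical placeholder.

def app_starts() -> bool:
    return True  # e.g., launch the build and confirm it comes up

def login_works() -> bool:
    return True  # e.g., log in with a known test account

def main_screen_loads() -> bool:
    return True  # e.g., open the application's main screen

SMOKE_CHECKS = [app_starts, login_works, main_screen_loads]

def ready_for_system_test() -> bool:
    """Entrance criterion: admit the release only if ALL smoke checks pass."""
    return all(check() for check in SMOKE_CHECKS)

print("Admit build:", ready_for_system_test())
```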


8. Test Coverage Decisions

• Functional coverage should be measured
– For every test domain

• Program-based coverage should be used
– On all code written (given sufficient resources)
– As a criterion for finishing unit testing

What strategy should be used for determining “where to make the cut”?


8. S/W Config Management Decisions

• Who owns/is responsible for each component?
• How many environments will be used, and what will each be used for?
• How often will the SUT be updated?
• How much regression testing is required for each build?
– Full
– Partial
• Who controls changes to the test environment?


8. Full Regression Strategy ― Arguments

Full regression usually requires maintenance of full, effective test sets

· For
• Easy to manage (one test set per level)
• Lowest risk

· Against
• Heavy and often impractical resource requirements (time and dollars)


8. Partial Regression Strategy ― Arguments

· For
• Solves the execution resource problem
• Only viable strategy for an emergency

· Against
• Must develop and maintain adequate relationship information between functions and tests, and between functions and components
• Must know which components have changed
• Must decide which features need to be re-tested and assemble subsets
• Adds risk for features not re-tested
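The "against" points boil down to maintaining a traceability map between components and tests. A minimal sketch of selecting a partial-regression subset from such a map (the component and test names are hypothetical):

```python
# Partial regression: given a map from components to the tests that
# cover them, re-run only the tests touching changed components.
# Component and test names are hypothetical.
component_to_tests = {
    "login": {"T1", "T2"},
    "search": {"T3"},
    "checkout": {"T2", "T4", "T5"},
}

def regression_subset(changed_components):
    """Union of all tests that cover any changed component."""
    subset = set()
    for comp in changed_components:
        subset |= component_to_tests.get(comp, set())
    return subset

# Only 'checkout' changed in this build, so T2, T4, and T5 are re-run;
# everything else carries the residual risk noted on the slide.
print(sorted(regression_subset({"checkout"})))
```

Keeping `component_to_tests` accurate as the product evolves is precisely the maintenance cost the slide warns about.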


8. Test Environment(s) Decisions

• Will multiple platforms be needed?
• What hardware will be used?
• Data
– Where will it come from?
– How much volume will be needed?
– How fresh (fragility) must it be?
– How inter-dependent (referential integrity) does it need to be?
• What user profiles will be used?
• Are simulators or other “test” software required?


8. Automation Decisions

• The four questions:
– What?
– When?
– Who?
– How?


Automated Testing

Tools are not THE answer


9. Item Pass/Fail Criteria

• Pass/fail criteria can be expressed using a number of different measurements:
– % of test cases passed/failed
– Number of bugs
• Type of bugs
• Severity of bugs
• Location of bugs
– Overall usability
– Overall stability

Bugs and/or test cases may be “weighted”
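One way such a criterion might be evaluated is a pass percentage combined with a severity-weighted bug score. The thresholds and weights below are illustrative assumptions, not values prescribed by the course:

```python
# Item pass/fail criterion: percent of test cases passed, plus a
# severity-weighted score for open bugs. Weights and thresholds
# are illustrative assumptions.
SEVERITY_WEIGHT = {"critical": 10, "major": 3, "minor": 1}

def item_passes(passed, total, open_bugs, min_pass_pct=95.0, max_bug_score=10):
    """open_bugs is a list of severity strings for unresolved defects."""
    pass_pct = 100.0 * passed / total
    bug_score = sum(SEVERITY_WEIGHT[s] for s in open_bugs)
    return pass_pct >= min_pass_pct and bug_score <= max_bug_score

# 97% of cases passed; one major and two minor bugs open -> score 5, item passes
print(item_passes(97, 100, ["major", "minor", "minor"]))
```

Adjusting `SEVERITY_WEIGHT` is one concrete way to "weight" bugs as the slide suggests.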


10. Suspension and Resumption Criteria

· Examples of suspension criteria include:
– Blocking bugs
– Test case failures > 10%
– GUI response > 5 seconds
– Requirements < 80% complete
– Environmental problems
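These example thresholds translate directly into a mechanical suspension check. A minimal sketch, in which the metric field names are hypothetical while the thresholds come from the slide:

```python
# Suspension check mirroring the slide's example criteria.
# Metric names are hypothetical; thresholds come from the slide.

def should_suspend(metrics) -> list:
    """Return the list of tripped suspension criteria (empty = keep testing)."""
    reasons = []
    if metrics["blocking_bugs"] > 0:
        reasons.append("blocking bugs open")
    if metrics["failure_rate_pct"] > 10:
        reasons.append("test case failures > 10%")
    if metrics["gui_response_sec"] > 5:
        reasons.append("GUI response > 5 seconds")
    if metrics["requirements_complete_pct"] < 80:
        reasons.append("requirements < 80% complete")
    if metrics["environment_down"]:
        reasons.append("environmental problems")
    return reasons

sample = {
    "blocking_bugs": 0,
    "failure_rate_pct": 12.5,
    "gui_response_sec": 3.2,
    "requirements_complete_pct": 90,
    "environment_down": False,
}
print(should_suspend(sample))  # only the failure-rate criterion trips
```

Resumption is the mirror image: testing resumes once `should_suspend` returns an empty list again.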


11. Testing Deliverables

• Test Plan
• Test Design Specification
• Test Case Specification
• Test Procedure Specification
• Test Log
• Test Incident Report
• Test Summary Report


12. Testing Tasks

[Graphic: a to-do list of testing tasks]


13. Environmental Needs

• Hardware configuration
• Data
• Interfaces
• Facilities
• Publications
• Security access
• System software
• Documentation


14. Responsibilities

[Matrix assigning testing tasks to owners. Tasks include: Create MTP, Unit Test Exit Criteria, Create Builds, Develop TDS for Features A through E, and Automate Scripts. Owners: Test Manager (Rick), Development Manager (Jennifer), Library Manager (Alison), Test Lead 1 (Dale), Test Lead 2 (Steve), and Tool Coordinator (Karen); an X marks each owner's responsibilities]


15. Staffing and Training Needs

Successful Testing Recipe

Take 1 experienced project leader

Add 5 senior testers, plus a handful of novice testers

Throw in liberal amounts of training

Allow to mix and then add tools to taste

Garnish with appropriate amounts of hardware


16. Schedule

A timeline with milestones


16. Class Discussion

· Where do milestones come from?
• Marketing business decision
• Wishful thinking
• Estimates/guesstimates
• Politics
• WAGs

I need it today!


17. Approvals

Who calls the shots?
• Users?
• Product manager?
• Development manager?
• Test manager?


Test Summary Report

• Report Identifier

• References
– Test items (with revision #s)
– Environments
– Test Plan

• Variances (Deviations)
– From test plan or requirements
– Reasons for deviations

• Summary of Incidents
– Resolved incidents
– Defect patterns
– Unresolved incidents

• Adequacy Assessment
– Evaluation of coverage
– Identify uncovered attributes

• Summary of Activities
– System/CPU usage
– Staff time
– Elapsed time

• Software Evaluation
– Limitations
– Failure likelihood

• Approvals
