
Requirements Testing: Turning Compliance into Commercial Advantage

Mike Bartley, Test and Verification Solutions
3rd Nov 2011

Agenda

Some theory, some practice, some reflection:
• Requirements management
• Mapping requirements to tests
• Using SQL
• Recording test results
• Business advantages

So what about software requirements?

• Projects in general fail through missed deadlines, exceeded budgets and an inability to meet project requirements
• Software requirements are much more complex
• Poor and changing requirements have been the main cause of project failures for years
  – Bull 1998 (203 interviews), major causes of failure: highest = breakdown in communications (57%)
  – Chaos Report 1995 (308 managers): highest = incomplete requirements (13.1%)
• Agile attempts to solve this issue with requirements

Capturing software requirements

• Over the years we have learned many ways to capture requirements:
  – Documents (User / Marketing Requirements Documents)
  – Use Cases or Stories
  – Specification by Example
  – Tests as Specifications
  – Formal specifications
• But how do we make sure requirements are implemented and tested?
  – Some industries mandate this

Ensuring Requirements are Implemented and Tested

[Diagram: Product A is broken down into Requirements (Req B, Req C, Req D), which map to Features (Feature E, F, G), then to Implementation (Impl H, Impl I), and finally to Testing (Test 1, Test 2, Test 3).]

Example

• Requirement
  – Client Service Operator finds all current clients in the system where their total value of sales between 2 specified dates is above a specified value
• Features
  – User (with suitable privilege) can search the system for current clients where the total value of sales between 2 specified dates is above a specified value
  – User Interface: allows a user to select the report, select the date range, etc.
• Implementation aspects
  – For every transaction in the system the following data must be stored: revenue value, date and client reference
  – Need a query to find clients where the total transaction value between 2 dates is above a specified value
  – Authorisation, exceptions, etc.
• Tests
  – Query test database XXX with dates “d1/m1/y1” and “d2/m2/y2”. Expected result = “client1”, ...
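A minimal SQL sketch of the query described above, assuming a hypothetical transactions table with client_ref, txn_date and revenue_value columns (all names here are illustrative, not from the talk):

  -- Clients whose total sales between two dates exceed a specified value
  SELECT client_ref,
         SUM(revenue_value) AS total_sales
  FROM   transactions
  WHERE  txn_date BETWEEN :start_date AND :end_date   -- the 2 specified dates
  GROUP  BY client_ref
  HAVING SUM(revenue_value) > :min_total_value;       -- the specified value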

Ensuring Requirements are Implemented and Tested (the example)

[Diagram: for Product A, the requirement “Client Service Operator finds …” maps to the Search and User Interface features, to the Query implementation, and to Tests 1–4, including “Query test database XXX”.]

Beyond tests!

[Diagram: the same layered view, but with a Signoff layer instead of Testing: alongside Tests 1–3, implementations can be signed off against function coverage, FSM coverage, line coverage and assertions.]

Test Holes and Test Orphans!

[Diagram: the layered view now also shows Req X with nothing mapped beneath it (a test hole), and Feature Y, Impl Z and Test 4 with no requirement above them (test orphans); the mappings shown are uni-directional.]

Test Orphans and Test Holes

• Test orphans waste time and effort!
  – How many tests do you have where you cannot remember what they are testing?
  – What % of your test suite is like this?
  – How much time do you spend running tests where you are not sure of the value of the test?
• Test holes introduce risk!
  – A requirement is missing a test
  – Do you know how many of your requirements are not tested?
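Both can be found with simple searches once the mapping is in a database. A sketch, assuming hypothetical requirements, tests and req_test_map tables (a fuller illustrative schema appears after the SQL-database slide below):

  -- Test holes: requirements with no test mapped to them
  SELECT r.req_id
  FROM   requirements r
  LEFT JOIN req_test_map m ON m.req_id = r.req_id
  WHERE  m.test_id IS NULL;

  -- Test orphans: tests mapped to no requirement
  SELECT t.test_id
  FROM   tests t
  LEFT JOIN req_test_map m ON m.test_id = t.test_id
  WHERE  m.req_id IS NULL;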

Mapping requirements to tests

• Many use a traceability matrix
  – Rows = test cases (1.1.1 to 1.1.5), columns = requirement identifiers (REQ 1.1, REQ 1.2, REQ 1.3)
  – An x marks each requirement a test case covers: here each of the 5 test cases covers 1 or 2 requirements, and REQ 1.1, REQ 1.2 and REQ 1.3 are covered by 3, 2 and 3 test cases respectively
  – This supports requirements tracing
  – But it will not support status, results and history
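With the mapping held in a database instead, the matrix totals fall out of a query like this (a sketch against the illustrative schema below):

  -- Tests mapped per requirement (the matrix column totals)
  SELECT r.req_id, COUNT(m.test_id) AS tests_mapped
  FROM   requirements r
  LEFT JOIN req_test_map m ON m.req_id = r.req_id
  GROUP  BY r.req_id;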

Requirements management has good tool support

• Example tools: DOORS, Reqtify, Enterprise Architect, Jira, …
• Requirements get mapped down to features, design and units, and can even get to code
• Until it comes to testing
  – At best they just map to tests without any connection to test status, test results or results history
  – Making it difficult to track progress

Requirements tracing

• “the ability to follow the life of a requirement, in both a backward and forward direction” [Gotel and Finkelstein, 2006]
• Requires bi-directional relationships in the requirements tree
• Advantages
  – Finding orphan features/code
  – Business advantages – later!

Recap

• Requirements management helps to record requirements and manage their implementation
  – Can identify test holes
• Bidirectional requirements mapping allows us to trace in both directions
  – Identify orphan code and tests
  – And, as we will see, support impact and risk analysis
• We also want to associate test status with requirements

Using SQL for Requirements Management and Testing

[Diagram: an SQL database sits at the centre. Into it go the requirements (and their hierarchy), the regression scripts, the configuration system and resources (staff, hardware); the regression tests write their results into it. Out of it come test holes, requirements signoff, requirements history and test history (versions, pass/fail).]
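A minimal sketch of such a database; every table and column name here is an illustrative assumption rather than something taken from the talk:

  CREATE TABLE requirements (
    req_id    TEXT PRIMARY KEY,
    version   INTEGER NOT NULL,             -- requirements history
    descr     TEXT NOT NULL
  );

  CREATE TABLE tests (
    test_id   TEXT PRIMARY KEY,
    descr     TEXT,
    status    TEXT NOT NULL                 -- defined / written / executing / passing
  );

  CREATE TABLE req_test_map (                -- the requirements-to-tests mapping
    req_id    TEXT REFERENCES requirements(req_id),
    test_id   TEXT REFERENCES tests(test_id),
    PRIMARY KEY (req_id, test_id)
  );

  CREATE TABLE regressions (                 -- one row per regression run
    regr_id   INTEGER PRIMARY KEY,
    started   TIMESTAMP NOT NULL,
    completed TIMESTAMP,
    config    TEXT                           -- configuration information
  );

  CREATE TABLE test_results (                -- test history: pass/fail per run
    regr_id   INTEGER REFERENCES regressions(regr_id),
    test_id   TEXT REFERENCES tests(test_id),
    result    TEXT NOT NULL CHECK (result IN ('pass', 'fail'))
  );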

So what do we want to track regarding testing?

• Start tracking testing from the start
• Are tests defined, written, executing, passing?
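The kind of progress view this enables, sketched against the illustrative schema above (only the four status values come from the slide):

  -- Percentage of tests in each lifecycle state
  SELECT status,
         100.0 * COUNT(*) / (SELECT COUNT(*) FROM tests) AS pct_of_tests
  FROM   tests
  GROUP  BY status;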

Experiences and Recommendations

• Importing requirements into the SQL database
  – Using XML
  – Able to export back using XML
• Add a simple API to get test results into the database (see the sketch below)
  – Regression started, configuration information
  – Test started, test status
  – Regression complete
• Extract coverage information automatically
  – And store it automatically in the database
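The three API events might reduce to statements like these (a sketch against the illustrative schema; the parameter names are assumptions):

  -- Regression started: record the run and its configuration information
  INSERT INTO regressions (regr_id, started, config)
  VALUES (:regr_id, CURRENT_TIMESTAMP, :config_info);

  -- Test run: record its status/result against this regression
  INSERT INTO test_results (regr_id, test_id, result)
  VALUES (:regr_id, :test_id, :result);

  -- Regression complete: close the run
  UPDATE regressions SET completed = CURRENT_TIMESTAMP WHERE regr_id = :regr_id;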

Experiences of applying Requirements-Based Testing: Supporting Sequential Development

[Diagram: a V-model flow from Requirements Definition, Feature Spec and Unit Spec down to Unit Build, then back up through Unit Test, Integration Test, Product Test and Acceptance Test, each driven by its own test spec (Unit Test Spec, Integration Test Spec, Feature Test Spec, Acceptance Test Spec), with static analysis alongside. A requirements tracing tool captures the specs and the mapping and automates capture of results. Annotations: develop verification plans and verification in parallel (improving time-to-market); improve specifications through test definitions; map tests to requirements, features, etc.; capture results.]

Experiences of applying Requirements-Based Testing: Supporting Sequential Development

• Important in building rigorous software systems
  – U.S. Food and Drug Administration (FDA)
  – Mandatory for CMMI level 2 and above
  – Mandatory for certification in aeronautics (DO-178B, DO-254), railway transportation (EN 5012x), automotive (ISO 26262), medical systems (FDA 21 CFR) and others (IEC 61508), …

Experiences of applying Requirements-Based Testing: Supporting Iterative Development

• Sprint 1 (from the product backlog): agree the feature spec, define tests, map tests to features
• Sprint 2: beta release; execute tests, record results in the database
• Sprint 3: production release; execute tests, record results in the database
• Sprints 4, 5, …: maintain the feature; execute tests, record results in the database
• (Not strictly Scrum?)

Recap

• Requirements management helps to record requirements and manage their implementation
• Bidirectional requirements mapping allows us to trace in both directions
  – Identify orphan code and tests
• Record multiple test statuses rather than just pass/fail
  – defined, written, executing, passing
• Use an SQL database to record the test data
  – The mapping
  – The status
• We will now see how this can create significant business advantage

Risk-based Testing

[Diagram: Product A carries Risk B, which contains Risk B.1; Tests 1–3 are mapped to the risk.]

• Risk
  – Search for clients whose total value of sales between 2 specified dates is above a specified value returns the wrong result
  – Probability = Low
  – Impact = High
• Tests
  – Query test database XXX with dates “d1/m1/y1” and “d2/m2/y2”. Expected result = “client1”, ...
• Searching the test database
  – Prioritise tests according to risk
  – Calculate remaining risk
  – A passing test mitigates risk
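A sketch of a remaining-risk search, assuming the illustrative schema is extended with hypothetical risks and risk_test_map tables:

  -- Remaining risk: risks with no mapped test, or with a mapped test not yet passing
  SELECT k.risk_id, k.probability, k.impact
  FROM   risks k
  WHERE  NOT EXISTS (SELECT 1 FROM risk_test_map m WHERE m.risk_id = k.risk_id)
     OR  EXISTS (SELECT 1
                 FROM   risk_test_map m
                 JOIN   tests t ON t.test_id = m.test_id
                 WHERE  m.risk_id = k.risk_id
                 AND    t.status <> 'passing');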

Improved Time-To-Market through Prioritisation and Risk Analysis

[Table: requirements 1–9, each with a priority (1, 2 or 3), tracked against the Unit, Integration, System and Acceptance test levels; ticks and gaps show where each requirement has passing tests at each level.]

• Release at unit level with known risk
  – Close the remaining risk at a higher level
• Release with known risk
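Releasing with known risk then becomes a query, for example (assuming hypothetical priority and level columns added to the illustrative schema):

  -- Known risk at a unit-level release: priority-1 requirements not yet passing at unit level
  SELECT r.req_id
  FROM   requirements r
  WHERE  r.priority = 1
  AND    NOT EXISTS (SELECT 1
                     FROM   req_test_map m
                     JOIN   tests t ON t.test_id = m.test_id
                     WHERE  m.req_id = r.req_id
                     AND    t.level  = 'unit'
                     AND    t.status = 'passing');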

Another view of Risk-based Testing (product risk)

[Diagram: Product A has Feature A (high risk) covered by Tests 1–3, and Feature B (medium risk).]

• Higher risk = more testing
  – Higher risks should have more tests
  – Can add a “risk field” to features, based on likelihood of failure and impact of failure
  – Can match the level of testing to the risk and ensure a sufficient level of testing (using easy searches)
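One such easy search, sketched with a hypothetical risk column on a features table and a feature_test_map linking table:

  -- Features whose level of testing looks too low for their risk
  SELECT f.feature_id, f.risk, COUNT(m.test_id) AS tests_mapped
  FROM   features f
  LEFT JOIN feature_test_map m ON m.feature_id = f.feature_id
  GROUP  BY f.feature_id, f.risk
  HAVING f.risk = 'High' AND COUNT(m.test_id) < 3;   -- threshold is illustrative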

Filtering Requirements based on Customers

[Diagram: the layered view again, with the requirements tagged by customer: Customer 1 and Customer 2 each care about their own subset of Req B, C and D and the features, implementations and tests beneath them. Can we release to customer X?]
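“Can we release to customer X?” then becomes a query, sketched here with a hypothetical req_customer_map table tagging requirements by customer:

  -- Requirements for customer X that are not yet fully passing
  SELECT r.req_id
  FROM   requirements r
  JOIN   req_customer_map c ON c.req_id = r.req_id AND c.customer = :customer_x
  WHERE  EXISTS (SELECT 1
                 FROM   req_test_map m
                 JOIN   tests t ON t.test_id = m.test_id
                 WHERE  m.req_id = r.req_id
                 AND    t.status <> 'passing')
     OR  NOT EXISTS (SELECT 1 FROM req_test_map m WHERE m.req_id = r.req_id);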

Filtering based on release

[Diagram: the same layered view, filtered by release: the Beta release and the Production release each cover their own subset of the requirements, features, implementations and tests.]

Impact Assessment

[Diagram: the layered view, with Req C highlighted.]

• What is the impact of changing Req C?
• How often do people ignore testing when assessing the impact of a change?

Impact Assessment

[Diagram: the layered view, with Test 3 highlighted.]

• What is the impact of dropping Test 3?
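Both impact questions become traversals of the mapping; a sketch against the illustrative schema:

  -- Impact of changing a requirement: which tests does it reach?
  SELECT t.test_id
  FROM   req_test_map m
  JOIN   tests t ON t.test_id = m.test_id
  WHERE  m.req_id = :changed_req;

  -- Impact of dropping a test: which requirements lose coverage entirely?
  SELECT r.req_id
  FROM   requirements r
  JOIN   req_test_map m ON m.req_id = r.req_id AND m.test_id = :dropped_test
  WHERE  NOT EXISTS (SELECT 1
                     FROM   req_test_map m2
                     WHERE  m2.req_id = r.req_id
                     AND    m2.test_id <> :dropped_test);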

Return on Investment?

• The initial investment is relatively high
  – Building the initial SQL database
  – Ensuring requirements are recorded
  – Ensuring test information is stored
  – Mapping requirements to tests
• The business advantage is huge
  – Identify test holes and test orphans
  – Understanding status at all points in the project
    • “Defined, written, running, passing”
  – Automation of analysis (risks, impact, release readiness)
  – Trend analysis
    • Better prediction of release readiness

Summary

• Map requirements to tests
• Use a database to record the mappings and results
  – Store results from test automation
  – Record the %’s of tests defined, written, executing, passing
• Advantages
  – Identify test holes and test orphans
  – Track the status of the whole verification effort
  – Use the historical perspective for more accurate predictions
  – Better reporting of requirements status
  – Support for
    • Risk-based testing
    • Prioritisation and risk analysis
    • Filtering requirements based on customers and releases
    • Impact analysis
  – Support for regulatory-based requirements signoff

Q&A

• Mike Bartley
  – mike@testandverification.com
  – Mike Bartley on LinkedIn
  – Other materials on www.testandverification.com
