Page 1

B. Computer Sci. (SE) (Hons.) CSEB233: Fundamentals of Software Engineering

Software Verification and Validation

Page 2

Objectives

• Discuss the fundamental concepts of software verification and validation
• Conduct software testing and determine when to stop
• Describe several types of testing:
  – unit testing,
  – integration testing,
  – validation testing, and
  – system testing
• Produce standards for software test documentation
• Use a set of techniques for the creation of test cases that meet overall testing objectives and the testing strategies

Page 3

Software Verification & Validation

Fundamental Concepts

Page 4

Verification & Validation (1)

V & V must be applied at each framework activity in the software process

Verification refers to the set of tasks that ensure that software correctly implements a specific function

Validation refers to a different set of tasks that ensure that the software that has been built is traceable to customer requirements

Boehm states this another way:
  Verification: "Are we building the product right?"
  Validation: "Are we building the right product?"

Page 5

Verification & Validation (2)

• V&V have two principal objectives:
  – Discover defects in a system
  – Assess whether or not the system is useful and usable in an operational situation
• V&V should establish confidence that the software is fit for purpose
  – This does NOT mean completely free of defects
  – Rather, it must be good enough for its intended use, and the type of use will determine the degree of confidence that is needed

Page 6

Verification & Validation (3)

• V & V (SQA) activities include:

SQA activities:
  – Technical reviews
  – Quality and configuration audits
  – Performance monitoring
  – Simulation
  – Feasibility study
  – Documentation review
  – Database review
  – Algorithm analysis

Testing activities:
  – Development testing
  – Qualification testing
  – Acceptance testing
  – Installation testing

Page 7

Software Verification & Validation

Software Testing

Page 8

Software Testing

The process of exercising a program with the specific intent of finding errors prior to delivery to the end user

Must be planned carefully to avoid wasting development time and resources, and conducted systematically

What testing shows: [figure]

Page 9

Who Tests the Software? (1)

Developer:
  – Understands the system
  – but will test "gently"
  – Driven by "delivery"

Independent Tester:
  – Must learn about the system
  – Will attempt to break it
  – Driven by quality

Page 10

Who Tests the Software? (2)

• Misconceptions:
  – The developer should do no testing at all
  – Software should be "tossed over the wall" to strangers who will test it mercilessly
  – Testers are not involved with the project until it is time for it to be tested

Page 11

Who Tests the Software? (3)

• The developer and Independent Test Group (ITG) must work together throughout the software project to ensure that thorough tests will be conducted
  – An ITG does not have the "conflict of interest" that the software developer might experience
  – While testing is conducted, the developer must be available to correct errors that are uncovered

Page 12

Testing Strategy (1)

Identifies the steps to be undertaken; when these steps are undertaken; and how much effort, time, and resources are required

Any testing strategy must incorporate:
  – Test planning
  – Test case design
  – Test execution
  – Resultant data collection and evaluation

Should provide guidance for practitioners and a set of milestones for the manager

Page 13

Testing Strategy (2)

• Characteristics of software testing strategies proposed in the literature:
  – To perform effective testing, you should conduct effective technical reviews.
    • By doing this, many errors will be eliminated before testing commences.
  – Testing begins at the component level and works "outward" toward the integration of the entire computer-based system

Page 14

Testing Strategy (3)

– Different testing techniques are appropriate for different software engineering approaches and at different points in time.

– Testing is conducted by the developer of the software and (for large projects) an independent test group.

– Testing and debugging are different activities, but debugging must be accommodated in any testing strategy.

Page 15

Overall Software Testing Strategy

May be viewed in the context of the spiral

Begins by 'testing-in-the-small' and moves toward 'testing-in-the-large'

Page 16

Overall Software Testing Strategy

• Unit Testing
  – focuses on each unit of the software (e.g., component, module, class) as implemented in source code
• Integration Testing
  – focuses on issues associated with verification and program construction as components begin interacting with one another

Page 17

Overall Software Testing Strategy

• Validation Testing
  – provides assurance that the software meets the validation criteria (established during requirements analysis): all functional, behavioral, and performance requirements
• System Testing
  – verifies that all system elements mesh properly and that overall system function and performance have been achieved

Page 18

When to Stop Testing?

• Testing is potentially endless
  – We cannot test until all the defects are unearthed and removed, which is impossible
• At some point, we have to stop testing and ship the software
  – The question is, when?
• Realistically, testing is a trade-off between budget, time, and quality
• It is driven by profit models (Pan, 1999)

Page 19

When to Stop Testing?

• The pessimistic, and unfortunately most often used, approach is to stop testing whenever some or all of the allocated resources (time, budget, or test cases) are exhausted

• The optimistic stopping rule is to stop testing when either reliability meets the requirement, or the benefit from continuing testing cannot justify the testing cost

Page 20

Software Verification & Validation

Types of Test

Page 21

Unit Testing

Focuses on assessing:
  – internal processing logic and data structures within the boundaries of a component (module)
  – proper information flow across module interfaces
  – local data, to ensure that integrity is maintained
  – boundary conditions
  – basis (independent) paths
  – all error-handling paths

If resources are too scarce for comprehensive unit testing, select critical or complex modules and unit test only these
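
As an illustration, a minimal unit test can exercise a module's internal logic, boundary conditions, and error-handling paths with simple assertions. The sketch below uses C++ to match the lecture's later example; clampReading() is a hypothetical module invented for this sketch, not one from the slides.

#include <cassert>

// Hypothetical module under test: clamps a reading into [lo, hi].
float clampReading(float value, float lo, float hi) {
    if (value < lo) return lo;   // error-handling path: below range
    if (value > hi) return hi;   // error-handling path: above range
    return value;                // normal path
}

int main() {
    // Internal logic: an in-range value passes through unchanged
    assert(clampReading(5.0f, 0.0f, 10.0f) == 5.0f);
    // Boundary conditions: exactly at the limits
    assert(clampReading(0.0f, 0.0f, 10.0f) == 0.0f);
    assert(clampReading(10.0f, 0.0f, 10.0f) == 10.0f);
    // Independent paths: out-of-range values are clamped
    assert(clampReading(-3.0f, 0.0f, 10.0f) == 0.0f);
    assert(clampReading(42.0f, 0.0f, 10.0f) == 10.0f);
    return 0;   // reaching here means every assertion passed
}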

Page 22

Unit Testing

Page 23

Integration Testing

After unit testing of individual modules, they are combined into a system

Question commonly asked once all modules have been unit tested:
  "If they work individually, why do you doubt that they'll work when we put them together?"

The problem is "putting them together" - interfacing:
  – Data can be lost across an interface
  – Global data structures can present problems
  – Subfunctions, when combined, may not produce the desired function

Page 24

Integration Testing

• Incremental integration testing strategies:
  – Bottom-up integration
  – Top-down integration
  – Regression testing
  – Smoke testing

Page 25

Bottom-up Integration

• An approach where the lowest-level modules are tested first, then used to facilitate the testing of higher-level modules
  – The process is repeated until the module at the top of the hierarchy is tested
  – Top-level modules are the most important, yet they are tested last
• Is helpful only when all or most of the modules of the same development level are ready

Page 26

Bottom-up Integration

The steps:
  – Test D and E individually, using a dummy program called a 'driver'
  – Low-level components are combined into clusters that perform a specific software function
  – Test C such that it calls D/E; if an error occurs, we know that the problem is in C or in the interface between C and D/E
  – The cluster is tested
  – Drivers are removed and clusters are combined, moving upward in the program structure
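
For instance, a driver is just throwaway code that feeds test inputs to a low-level module before its callers exist. The sketch below assumes a hypothetical low-level module sumCluster() standing in for D/E; none of these names come from the slides.

#include <cassert>

// Low-level module under test (stands in for module D or E).
int sumCluster(const int *data, int n) {
    int total = 0;
    for (int i = 0; i < n; ++i) total += data[i];
    return total;
}

// Driver: a dummy main() that exercises the module directly,
// taking the place of the not-yet-integrated module C.
int main() {
    int data[] = {1, 2, 3};
    assert(sumCluster(data, 3) == 6);   // normal case
    assert(sumCluster(data, 0) == 0);   // boundary: empty cluster
    return 0;
}

Once module C is ready, the driver is discarded and C calls the module directly.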

Page 27

Top-down Integration

The steps:
  1. The main/top module is used as a test driver, and stubs are substituted for the modules directly subordinate to it.
  2. Subordinate stubs are replaced one at a time with real modules (following a depth-first or breadth-first approach).
  3. Tests are conducted as each module is integrated.
  4. On completion of each set of tests, another stub is replaced with a real module.
  5. Regression testing may be used to ensure that new errors are not introduced.
  6. The process continues from step 2 until the entire program structure is built.

Page 28

Top-down Integration

Example steps:
  – Test A individually (use stubs for the other modules)
  – Depending on the integration approach selected, subordinate stubs are replaced one at a time with actual components
  – In a 'depth-first' structure:
    • Test A such that it calls B (use stubs for the other modules)
    • If an error occurs, we know that the problem is in B or in the interface between A and B
  – Replace stubs one at a time, 'depth-first', and re-run tests
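
A stub, conversely, is a placeholder with fixed, predictable behavior that stands in for a subordinate module that has not yet been integrated. The sketch below is illustrative only: computeCharge() plays the role of top module A, and lookupRate_stub() stands in for its subordinate B.

#include <iostream>

// Stub for subordinate module B: returns a canned value so that
// the top module can be exercised before the real B exists.
int lookupRate_stub(int customerId) {
    (void)customerId;   // the stub deliberately ignores its input
    return 7;           // fixed, known reply
}

// Top module A, under test, calling its subordinate through the stub.
int computeCharge(int customerId, int units) {
    return lookupRate_stub(customerId) * units;
}

int main() {
    // Because the stub's reply is fixed, a failure here points to A
    // or to the A/B interface, not to B itself.
    std::cout << (computeCharge(1, 3) == 21 ? "pass" : "fail") << "\n";
    return 0;
}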

Page 29

Regression Testing (1)

• Focuses on retesting after changes are made
  – Whenever software is corrected, some aspect of the software configuration is changed
    • e.g., the program, its documentation, or the data that support it
  – Regression testing helps to ensure that changes - due to testing or for other reasons - do not introduce unintended behavior or additional errors

Page 30

Regression Testing (2)

• In traditional regression testing, we reuse the same tests
• In risk-oriented regression testing, we test the same areas as before, but we use different (increasingly complex) tests
• Regression testing may be conducted manually, by re-executing a subset of all test cases, or using automated capture/playback tools
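
In its simplest automated form, a regression suite is just a stored table of inputs and expected outputs that is re-executed after every change. A minimal sketch, with an assumed discount() function and made-up expected values:

#include <iostream>

// Function that was just modified; the suite guards the behavior
// that the rest of the system already depends on.
int discount(int amount) {
    return (amount >= 100) ? amount / 10 : 0;
}

struct RegressionCase { int input; int expected; };

int main() {
    // The same cases are re-run, unchanged, after every modification.
    const RegressionCase suite[] = { {50, 0}, {100, 10}, {250, 25} };
    int failures = 0;
    for (const RegressionCase &tc : suite) {
        if (discount(tc.input) != tc.expected) {
            std::cout << "regression at input " << tc.input << "\n";
            ++failures;
        }
    }
    return failures;   // a nonzero exit code signals a regression
}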

Page 31

Smoke Testing (1)

• A common approach for creating “daily builds” for product software

• Software components that have been translated into code are integrated into a “build”

• A build includes all data files, libraries, reusable modules, and engineered components that are required to implement one or more product functions

• A series of tests is designed to expose errors that will keep the build from properly performing its function

Page 32

Smoke Testing (2)

• The intent should be to uncover “show stopper” errors that have the highest likelihood of throwing the software project behind schedule

• The build is integrated with other builds and the entire product (in its current form) is smoke tested daily

• The integration approach may be top down or bottom up

Page 33

Validation Testing (1)

Focuses on uncovering errors at the software requirements level.

The SRS might contain a 'Validation Criteria' section that forms the basis for a validation-testing approach

Page 34

Validation Testing (2)

Validation-test criteria:
  – all functional requirements are satisfied
  – all behavioral characteristics are achieved
  – all content is accurate and properly presented
  – all performance requirements are attained
  – documentation is correct, and
  – usability and other requirements are met

Page 35

Validation Testing (3)

• An important element of the validation process is a configuration review/audit
  – Ensures that all elements of the software configuration have been properly developed, are cataloged, and have the necessary detail to strengthen the support activities

Page 36

Validation Testing (4)

• A series of acceptance tests is conducted to enable the customer to validate all requirements
  – To make sure the software works correctly for the intended user in his or her normal work environment
  – Alpha test
    • A version of the complete software is tested by the customer under the supervision of the developer, at the developer's site
  – Beta test
    • A version of the complete software is tested by the customer at his or her own site, without the developer being present

Page 37

System Testing (1)

A series of different tests to verify that system elements have been properly integrated and perform allocated functions

Types of system tests:
  – Recovery testing
  – Security testing
  – Stress testing
  – Performance testing
  – Deployment testing

Page 38

System Testing (2)

• Recovery Testing
  – forces the software to fail in a variety of ways and verifies that recovery is properly performed
• Security Testing
  – verifies that protection mechanisms built into a system will, in fact, protect it from improper penetration
• Stress Testing
  – executes a system in a manner that demands resources in abnormal quantity, frequency, or volume

Page 39

System Testing (3)

• Performance Testing
  – tests the run-time performance of software within the context of an integrated system
• Deployment Testing
  – examines all installation procedures and specialized installation software that will be used by customers
  – examines all documentation that will be used to introduce the software to end users

Page 40

Software Verification & Validation

Software Test Documentation

Page 41

Software Test Documentation (1)

IEEE 829-2008, Standard for Software Test Documentation:
  an IEEE standard that specifies the form of a set of documents for use in eight defined stages of software testing

The documents are:
  – Test Plan
  – Test Design Specification
  – Test Case Specification
  – Test Procedure Specification
  – Test Item Transmittal Report
  – Test Log
  – Test Incident Report
  – Test Summary Report

Page 42

Software Test Documentation (2)

• Test Plan - a management planning document that shows:
  – How the testing will be done
    • including System Under Test (SUT) configurations
  – Who will do it
  – What will be tested
  – How long it will take (may vary, depending upon resource availability)
  – What the test coverage will be, i.e. what quality level is required

Page 43

Software Test Documentation (3)

• Test Design Specification:
  – detailing test conditions and the expected results, as well as test pass criteria
• Test Procedure Specification:
  – detailing how to run each test, including any set-up preconditions and the steps that need to be followed

Page 44

Software Test Documentation (4)

• Test Item Transmittal Report:
  – reporting on when tested software components have progressed from one stage of testing to the next
• Test Log:
  – recording which test cases were run, who ran them, in what order, and whether each test passed or failed
• Test Incident Report:
  – detailing, for any test that failed, the actual versus expected result, and other information intended to throw light on why a test has failed

Page 45

Software Test Documentation (5)

• Test Summary Report:
  – A management report providing any important information uncovered by the tests accomplished, including assessments of the quality of the testing effort, the quality of the software system under test, and statistics derived from Incident Reports
  – The report also records what testing was done and how long it took, in order to improve any future test planning
  – This final document is used to indicate whether the software system under test is fit for purpose, according to whether or not it has met the acceptance criteria defined by the project stakeholders

Page 46

Software Verification & Validation

Creating Test Cases

Page 47

Test-case Design (1)

Focuses on a set of techniques for the creation of test cases that meet overall testing objectives and the testing strategies

These techniques provide systematic guidance for designing tests that:
  – exercise the internal logic and interfaces of every software component/module
  – exercise the input and output domains of the program to uncover errors in program function, behaviour, and performance

Page 48

Test-case Design (2)

• For conventional applications, software is tested from two perspectives:

'White-box' testing:
  – focuses on the program control structure (internal program logic)
  – test cases are derived to ensure that all statements in the program have been executed at least once during testing and that all logical conditions have been exercised
  – performed early in the testing process

'Black-box' testing:
  – examines some fundamental aspect of a system with little regard for the internal logical structure of the software
  – performed during the later stages of testing
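
As a small illustration of the black-box view, the test below checks only the specified input/output behavior of a leap-year routine, one case per class of input, with no reference to how the routine is implemented. The function and its cases are assumptions for this sketch, not material from the slides.

#include <cassert>

// Specification: a year is a leap year if it is divisible by 4,
// except century years, which must also be divisible by 400.
bool isLeapYear(int year) {
    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
}

int main() {
    // Black-box cases drawn from the input domain of the specification,
    // ignoring the internal logic entirely.
    assert(isLeapYear(2024));    // ordinary leap year
    assert(!isLeapYear(2023));   // ordinary non-leap year
    assert(!isLeapYear(1900));   // century year not divisible by 400
    assert(isLeapYear(2000));    // century year divisible by 400
    return 0;
}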

Page 49

White-box Testing (1)

Using the white-box testing method, you may derive test cases that:
  – guarantee that all independent paths within a module have been exercised at least once
  – exercise all logical decisions on their true and false sides
  – execute all loops at their boundaries and within their operational bounds
  – exercise internal data structures to ensure their validity

Example method: basis path testing

Page 50

White-box Testing (2)

Basis path testing:
  – Test cases derived to exercise the basis set are guaranteed to execute every statement in the program at least once during testing

Page 51

Deriving Test Cases (1)

Steps to derive test cases by applying the basis path testing method:
  1. Using the design or code, draw a corresponding flow graph. The flow graph depicts logical control flow using the notation illustrated on the next slide. (Refer to Figure 18.2 on page 486 for a comparison between a flowchart and a flow graph.)
  2. Calculate the cyclomatic complexity V(G) of the flow graph.
  3. Determine a basis set of independent paths.
  4. Prepare test cases that will force execution of each path in the basis set.

Page 52

Deriving Test Cases (2)

Flow graph notation:

[Figure: flow graph notation for the Sequence, IF, WHILE, UNTIL, and CASE constructs]

Page 53

Drawing Flow Graph: Example

#include <cmath>      // sin, tan, cos
#include <iostream>   // cout
using namespace std;

void foo(float y, float *a, int n)
{
    float z;
    float x = sin(y);
    if (x > 0.01)
        z = tan(x);
    else
        z = cos(x);

    for (int i = 0; i < x; ++i)   // note: the loop bound is x, as on the original slide
    {
        a[i] = a[i] * z;
        cout << a[i];
    }
}

[Figure: the corresponding flow graph, with numbered nodes, regions R1-R3, and the predicate nodes marked at the if and for conditions]

Page 54

Deriving Test Cases (3)

The arrows on the flow graph, called edges or links, represent flow of control and are analogous to flowchart arrows

Areas bounded by edges and nodes are called regions; when counting regions, we include the area outside the graph as a region

Page 55

Deriving Test Cases: Example

Step 1: Draw a flow graph

Page 56

Deriving Test Cases: Example

Step 2: Calculate the cyclomatic complexity, V(G)
• Cyclomatic complexity can be used to count the minimum number of independent paths.
• A number of industry studies have indicated that the higher V(G) is, the higher the probability of errors.
• The SEI provides the following basic risk assessment based on the value of V(G):

  Cyclomatic complexity   Risk evaluation
  1 to 10                 A simple program, without very much risk
  11 to 20                A more complex program, moderate risk
  21 to 50                A complex, high-risk program
  > 50                    An untestable program (very high risk)

Page 57

Deriving Test Cases: Example

• Ways to calculate V(G):
  – V(G) = the number of regions of the flow graph
  – V(G) = E - N + 2, where E is the number of edges and N is the number of nodes
  – V(G) = P + 1, where P is the number of predicate nodes in the flow graph (each node that contains a condition)

• Example:
  – V(G) = number of regions = 4
  – V(G) = E - N + 2 = 16 - 14 + 2 = 4
  – V(G) = P + 1 = 3 + 1 = 4

Page 58

Deriving Test Cases: Example

• Step 3: Determine a basis set of independent paths
  – Path 1: 1, 2, 3, 4, 5, 6, 7, 8, 12
  – Path 2: 1, 2, 3, 12
  – Path 3: 1, 2, 3, 4, 5, 9, 10, 3, …
  – Path 4: 1, 2, 3, 4, 5, 9, 11, 3, …

• Step 4: Prepare test cases
  – Test cases should be derived so that all of these paths are executed
  – A dynamic program analyser may be used to check that the paths have been executed
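
The path numbering above refers to the textbook's flow graph. As a concrete sketch of step 4, the inputs below force the two branch paths through the earlier foo() routine; the chosen y values are assumptions for illustration, not from the slides.

#include <cmath>
#include <iostream>
using namespace std;

// foo() as corrected on the earlier slide, repeated for self-containment.
void foo(float y, float *a, int n)
{
    (void)n;   // unused in this fragment, as in the original
    float z;
    float x = sin(y);
    if (x > 0.01) z = tan(x);
    else          z = cos(x);
    for (int i = 0; i < x; ++i) {
        a[i] = a[i] * z;
        cout << a[i] << "\n";
    }
}

int main()
{
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    // Test case 1: y = 1.0 gives x = sin(1.0) ~ 0.84 > 0.01, so the
    // tan branch is taken and the loop body runs exactly once (i = 0).
    foo(1.0f, a, 4);
    // Test case 2: y = 0.0 gives x = 0, so the cos branch is taken
    // and the loop is skipped entirely (0 < 0 is false).
    foo(0.0f, a, 4);
    return 0;
}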

Page 59

Summary (1)

• Software testing plays an extremely important role in V&V, but many other SQA activities are also necessary

• Testing must be planned carefully to avoid wasting development time and resources, and conducted systematically

• The developer and ITG must work together throughout the software project to ensure that thorough tests will be conducted

Page 60

Summary (2)

• The software testing strategy begins with 'testing-in-the-small' and moves toward 'testing-in-the-large'
• The IEEE 829-2008 standard specifies a set of documents for use in eight defined stages of software testing
• The 'white-box' and 'black-box' techniques provide systematic guidance for designing test cases
• We need to know when it is the right time to stop testing

Page 61

THE END

Copyright © 2013, Mohd. Sharifuddin Ahmad, PhD, College of Information Technology