Software Verification Introduction & The Model-Driven Test Design Process

Page 1: Software Verification Introduction & The Model-Driven Test Design Process

Software Verification

Introduction &

The Model-Driven Test Design Process

Page 2: Software Verification Introduction & The Model-Driven Test Design Process

Testing in the 21st Century

Today’s software market:

– is much bigger

– is more competitive

– has more users

Embedded Control Applications:

– airplanes, air traffic control

– spaceships

– watches

– ovens

– remote controllers

– PDAs

– memory seats

– DVD players

– garage door openers

– cell phones

All of this increases pressure on testers:

– Programmers must unit test – with no training, education, or tools!

– Tests are key to functional requirements – but who builds those tests?

Page 3: Software Verification Introduction & The Model-Driven Test Design Process


Cost of Testing

In the real world, testing is the principal post-design activity

Restricting early testing usually increases cost

Extensive hardware-software integration requires more testing

You’re going to spend at least half of your development budget on testing, whether you want to or not

Page 4: Software Verification Introduction & The Model-Driven Test Design Process


Why Test?

What fact is each test trying to verify?

If you don’t start planning for each test when the functional requirements are formed, you’ll never know why you’re conducting the test

Program Managers often say: “Testing is too expensive.”

Not testing is even more expensive

Page 5: Software Verification Introduction & The Model-Driven Test Design Process

Testing

[Diagram: testing as event generation and event evaluation. Generated events such as send(42) drive the system according to an instrumentation specification, and the resulting events are evaluated.]

Page 6: Software Verification Introduction & The Model-Driven Test Design Process

Test Design in Context

Test Design is the process of designing input values that will effectively test software

Test design is one of several activities for testing software. It is:

– the most mathematical

– the most technically challenging

Page 7: Software Verification Introduction & The Model-Driven Test Design Process

The Sieve program

int P[101];
int V=1; int N=1; int I;

main() {
    while (N < 101) {
        for (I=1; I<=N; I++)
            if (P[I]=0) {
                P[I] = V++;
                N++;
            }
            else if (V % P[I] == 0)
                I = N+1;
            else
                I++;
    }
}

Is this program correct?

Page 8: Software Verification Introduction & The Model-Driven Test Design Process

Types of Test Activities

Testing can be broken up into four general types of activities:

1. Test Design
   a) Criteria-based
   b) Human-based
2. Test Automation
3. Test Execution
4. Test Evaluation

Each type of activity requires different skills, background knowledge, education, and training

Page 9: Software Verification Introduction & The Model-Driven Test Design Process

1. Test Design – (a) Criteria-Based

This is the most technical job in software testing

Test design is analogous to software architecture on the development side

Design test values to satisfy coverage criteria or other engineering goals

Page 10: Software Verification Introduction & The Model-Driven Test Design Process

1. Test Design – (b) Human-Based

Requires knowledge of:

– domain, testing, and user interfaces

Requires almost no traditional CS:

– A background in the domain of the software is essential

– An empirical background is very helpful (biology, psychology, …)

– A logic background is very helpful (law, philosophy, math, …)

Design test values based on domain knowledge of the program and human knowledge of testing

Page 11: Software Verification Introduction & The Model-Driven Test Design Process

2. Test Automation

This is slightly less technical. Requires knowledge of programming:

– fairly straightforward programming: small pieces and simple algorithms

Programming is out of reach for many domain experts

Embed test values into executable scripts

Page 12: Software Verification Introduction & The Model-Driven Test Design Process

What is JUnit?

Open source Java testing framework used to write and run repeatable automated tests (junit.org). JUnit features include:

– Assertions for testing expected results

– Test fixtures for sharing common test data

– Test suites for easily organizing and running tests

– Graphical and textual test runners

JUnit is widely used in industry. JUnit tests can be run as stand-alone Java programs (from the command line) or within an IDE such as Eclipse
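As a minimal sketch of such a test (JUnit 4 style; the tiny Calculator class is defined inline so the example is self-contained, not part of the original slide):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class CalculatorTest {

        // Tiny illustrative class under test, defined inline for self-containment
        static class Calculator {
            int add(int a, int b) { return a + b; }
        }

        @Test
        public void addShouldSumTwoNumbers() {
            Calculator calc = new Calculator();
            assertEquals(5, calc.add(2, 3));  // assertion for the expected result
        }
    }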

Page 13: Software Verification Introduction & The Model-Driven Test Design Process

3. Test Execution

This is easy – and trivial if the tests are well automated

Run tests on the software and record the results

4. Test Evaluation

This is much harder than it may seem

Evaluate results of testing, report to developers

Page 14: Software Verification Introduction & The Model-Driven Test Design Process

Types of Test Activities – Summary

These four general test activities are quite different. It is a poor use of resources to use people inappropriately.

1a. Design (criteria-based): Design test values to satisfy engineering goals. Requires knowledge of discrete math, programming, and testing.

1b. Design (human-based): Design test values from domain knowledge and intuition. Requires knowledge of domain, UI, and testing.

2. Automation: Embed test values into executable scripts. Requires knowledge of scripting.

3. Execution: Run tests on the software and record the results. Requires very little knowledge.

4. Evaluation: Evaluate results of testing, report to developers. Requires domain knowledge.

Page 15: Software Verification Introduction & The Model-Driven Test Design Process

Model-Driven Test Design

[Diagram: a software artifact is abstracted into a model/structure, from which test requirements and then refined requirements / test specs are derived (the design abstraction level); these yield input values, test cases, test scripts, test results, and finally a pass/fail verdict (the implementation abstraction level).]

Page 16: Software Verification Introduction & The Model-Driven Test Design Process

Model-Driven Test Design – Steps

[The same diagram, annotated with steps: analysis takes the software artifact to a model/structure; a criterion produces test requirements; refine yields refined requirements / test specs; generate produces input values (domain analysis can also contribute test requirements here); prefix, postfix, and expected values turn input values into test cases; automate produces test scripts; execute produces test results; evaluate yields pass/fail, with feedback flowing back into the process.]

Page 17: Software Verification Introduction & The Model-Driven Test Design Process

Model-Driven Test Design – Activities

[The same diagram, grouped by activity: Test Design spans the design abstraction level, from the software artifact and model/structure through refined requirements / test specs; Test Execution covers running test cases and test scripts to obtain test results; Test Evaluation turns test results into a pass/fail verdict.]

Page 18: Software Verification Introduction & The Model-Driven Test Design Process

How to cover the executions?

    IF (A>1)&(B=0) THEN X=X/A; END;
    IF (A=2)|(X>1) THEN X=X+1; END;

Choose values for A, B, X. The value of X may change, depending on A and B.

What do we want to cover? Paths? Statements? Conditions?

Page 19: Software Verification Introduction & The Model-Driven Test Design Process

Execute every statement at least once

By choosing A=2, B=0, X=3, each statement will be executed:

    IF (A>1)&(B=0) THEN X=X/A; END;
    IF (A=2)|(X>1) THEN X=X+1; END;

The case where the IF tests fail (evaluate to false) is not checked!
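A sketch of the same fragment in Java (using Java’s && and ||; class and method names are illustrative), with the single statement-covering test traced:

    public class CoverageExample {

        // Direct translation of the two IF statements above
        static int compute(int a, int b, int x) {
            if (a > 1 && b == 0)
                x = x / a;      // statement 1
            if (a == 2 || x > 1)
                x = x + 1;      // statement 2
            return x;
        }

        public static void main(String[] args) {
            // A=2, B=0, X=3 makes both conditions true, so every statement runs:
            // x = 3/2 = 1, then x = 1+1 = 2
            System.out.println(compute(2, 0, 3));  // prints 2
            // Neither false branch is ever exercised by this single test.
        }
    }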

Page 20: Software Verification Introduction & The Model-Driven Test Design Process

IMPORTANT TERMS


Page 21: Software Verification Introduction & The Model-Driven Test Design Process


Validation & Verification

Validation: The process of evaluating software at the end of software development to ensure compliance with intended usage

Verification: The process of determining whether the products of a given phase of the software development process fulfill the requirements established during the previous phase

Page 22: Software Verification Introduction & The Model-Driven Test Design Process

According to Boehm

Verification means “we are building the product right.”

Validation means “we are building the right product.”

Page 23: Software Verification Introduction & The Model-Driven Test Design Process

Verification or validation?

Example: elevator response

Unverifiable (but validatable) spec: … if a user presses a request button at floor i, an available elevator must arrive at floor i soon …

Verifiable spec: … if a user presses a request button at floor i, an available elevator must arrive at floor i within 30 seconds …

Page 24: Software Verification Introduction & The Model-Driven Test Design Process


Software Faults, Errors & Failures

Software Fault: A static defect in the software

Software Failure: External, incorrect behavior with respect to the requirements or other description of the expected behavior

Software Error: An incorrect internal state that is the manifestation of some fault

Faults in software are equivalent to design mistakes in hardware. They were there at the beginning and do not “appear” when a part wears out.

Page 25: Software Verification Introduction & The Model-Driven Test Design Process


Testing & Debugging

Testing: Finding inputs that cause the software to fail

Debugging: The process of finding a fault given a failure

Page 26: Software Verification Introduction & The Model-Driven Test Design Process


Static and Dynamic Testing

Static Testing: Testing without executing the program

– This includes software inspections and some forms of analysis

– Very effective at finding certain kinds of problems – especially “potential” faults, that is, problems that could lead to faults when the program is modified

Dynamic Testing: Testing by executing the program with real inputs

Page 27: Software Verification Introduction & The Model-Driven Test Design Process


White-box and Black-box Testing

Black-box testing: Deriving tests from external descriptions of the software, including specifications, requirements, and design

White-box testing: Deriving tests from the source code internals of the software, specifically including branches, individual conditions, and statements

Model-based testing: Deriving tests from a model of the software (such as a UML diagram)

Page 28: Software Verification Introduction & The Model-Driven Test Design Process

Stress Testing

Tests that are at the limit of the software’s expected input domain:

– Very large (or very small) numeric values

– Very long strings, empty strings

– Null references

– Very large files

– Many users making requests at the same time

– Invalid values
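A sketch of what stress-style JUnit tests might look like (the Parser class is a trivial inline stand-in invented for this example):

    import org.junit.Test;
    import static org.junit.Assert.*;

    public class StressTest {

        // Trivial stand-in for a unit under test: counts characters, rejects null
        static class Parser {
            static int parse(String s) {
                if (s == null) throw new IllegalArgumentException("null input");
                return s.length();
            }
        }

        @Test
        public void handlesEmptyString() {
            assertEquals(0, Parser.parse(""));
        }

        @Test
        public void handlesVeryLongString() {
            // Build a string near the limit of the expected input domain
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 1_000_000; i++) sb.append('a');
            assertTrue(Parser.parse(sb.toString()) >= 0);
        }

        @Test(expected = IllegalArgumentException.class)
        public void rejectsNullReference() {
            Parser.parse(null);  // invalid value: should fail fast, not crash later
        }
    }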

Page 29: Software Verification Introduction & The Model-Driven Test Design Process


Top-Down and Bottom-Up Testing

Top-Down Testing: Test the main procedure, then go down through the procedures it calls, and so on

Bottom-Up Testing: Test the leaves in the tree (procedures that make no calls), and move up to the root

– Each procedure is not tested until all of its children have been tested

Page 30: Software Verification Introduction & The Model-Driven Test Design Process


Test Case

Test Case Values: The values that directly satisfy one test requirement

Expected Results: The result that will be produced when executing the test if the program satisfies its intended behavior

Page 31: Software Verification Introduction & The Model-Driven Test Design Process

Search routine specification

    procedure Search (Key : INTEGER;
                      T : array 1..N of INTEGER;
                      Found : BOOLEAN;
                      L : 1..N);

    Pre-condition
    -- the array has at least one element
    1 <= N

    Post-condition
    -- the element is found and is referenced by L
    (Found and T (L) = Key)
    or
    -- the element is not in the array
    (not Found and not (exists i, 1 <= i <= N, T (i) = Key))

Page 32: Software Verification Introduction & The Model-Driven Test Design Process

Search routine test cases

    Input array (T)               Key   Output (Found, L)
    17                            17    true, 1
    17                            0     false, ??
    17, 29, 21, 23                17    true, 1
    41, 18, 9, 31, 30, 16, 45     45    true, 7
    17, 18, 21, 23, 29, 41, 38    23    true, 4
    21, 23, 29, 33, 38            25    false, ??
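To connect this with test automation, one row of the table might be embedded in an executable script like the sketch below (the SearchResult holder and Search.search wrapper are hypothetical Java stand-ins for the procedure above, defined inline so the sketch runs):

    import org.junit.Test;
    import static org.junit.Assert.*;

    public class SearchTest {

        static class SearchResult {
            final boolean found; final int index;
            SearchResult(boolean f, int i) { found = f; index = i; }
        }

        static class Search {
            // Stand-in for the specified procedure, using 1-based indexing
            static SearchResult search(int key, int[] t) {
                for (int i = 0; i < t.length; i++)
                    if (t[i] == key) return new SearchResult(true, i + 1);
                return new SearchResult(false, 0);
            }
        }

        // Fourth row of the table: key 45 is the last of seven elements
        @Test
        public void findsKeyAtEndOfArray() {
            int[] t = {41, 18, 9, 31, 30, 16, 45};
            SearchResult r = Search.search(45, t);
            assertTrue(r.found);
            assertEquals(7, r.index);  // 1-based, as in the specification
        }
    }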

Page 33: Software Verification Introduction & The Model-Driven Test Design Process

TESTING LEVELS


Page 34: Software Verification Introduction & The Model-Driven Test Design Process

Testing Levels Based on Test Process Maturity

Level 0: There’s no difference between testing and debugging

Level 1: The purpose of testing is to show correctness

Level 2: The purpose of testing is to show that the software doesn’t work

Level 3: The purpose of testing is not to prove anything specific, but to reduce the risk of using the software

Level 4: Testing is a mental discipline that helps all IT professionals develop higher quality software

Page 35: Software Verification Introduction & The Model-Driven Test Design Process


Level 0 Thinking

Testing is the same as debugging

Does not distinguish between incorrect behavior and mistakes in the program

Does not help develop software that is reliable or safe

Page 36: Software Verification Introduction & The Model-Driven Test Design Process

Level 1 Thinking

Purpose is to show correctness.

What do we know if there are no failures? Good software or bad tests?

Test engineers have no:

– strict goal

– real stopping rule

– formal test technique

Test managers are powerless.

Page 37: Software Verification Introduction & The Model-Driven Test Design Process


Level 2 Thinking

Purpose is to show failures

Looking for failures is a negative activity

This describes most software companies.

Page 38: Software Verification Introduction & The Model-Driven Test Design Process


Level 3 Thinking

Testing can only show the presence of failures

Whenever we use software, we incur some risk

Risk may be small and consequences unimportant

Risk may be great and the consequences catastrophic

Testers and developers work together to reduce risk

Page 39: Software Verification Introduction & The Model-Driven Test Design Process


Level 4 Thinking

Testing is only one way to increase quality

Test engineers can become technical leaders of the project

Page 40: Software Verification Introduction & The Model-Driven Test Design Process

TESTING MODELS


Page 41: Software Verification Introduction & The Model-Driven Test Design Process

Testing at Different Levels

[Diagram: a program with a main class P, Class A (methods mA1(), mA2()), and Class B (methods mB1(), mB2())]

Acceptance testing: Is the software acceptable to the user?

System testing: Test the overall functionality of the system

Integration testing: Test how modules interact with each other

Module testing: Test each class, file, module, or component

Unit testing: Test each unit (method) individually

This view obscures underlying similarities.

Page 42: Software Verification Introduction & The Model-Driven Test Design Process

Criteria Based on Structures

Structures: Four ways to model software

1. Graphs

2. Logical Expressions, e.g., (not X or not Y) and A and B

3. Input Domain Characterization, e.g., A: {0, 1, >1}, B: {600, 700, 800}, C: {swe, cs, isa, infs}

4. Syntactic Structures, e.g., if (x > y) z = x - y; else z = 2 * x;

Page 43: Software Verification Introduction & The Model-Driven Test Design Process

1. Graph Coverage – Structural

[Diagram: an example graph with nodes 1–7; node 1 is the entry and node 7 the exit]

This graph may represent:

• statements & branches

• methods & calls

• components & signals

• states and transitions

Node (Statement) coverage – cover every node:

• 1 2 5 6 7
• 1 3 4 3 5 6 7

Edge (Branch) coverage – cover every edge:

• 1 2 5 6 7
• 1 3 4 3 5 6 7
• 1 3 5 7

Path coverage – cover every path:

• 1 2 5 6 7
• 1 2 5 7
• 1 3 5 6 7
• 1 3 5 7
• 1 3 4 3 5 6 7
• 1 3 4 3 5 7 …

Page 44: Software Verification Introduction & The Model-Driven Test Design Process

Example

Software Artifact: Java Method

    /**
     * Return the index of node n at the
     * first position it appears,
     * -1 if it is not present
     */
    public int indexOf (Node n)
    {
        for (int i = 0; i < path.size(); i++)
            if (path.get(i).equals(n))
                return i;
        return -1;
    }

[Control Flow Graph: node 1 is i = 0, node 2 is the loop test i < path.size(), node 3 is the if, node 4 is return i, node 5 is return -1]
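A sketch of two JUnit tests that together achieve node coverage of this graph (minimal Node and Path stand-ins are defined inline so the sketch is self-contained; the real classes come from the example):

    import java.util.Arrays;
    import java.util.List;
    import org.junit.Test;
    import static org.junit.Assert.*;

    public class IndexOfTest {

        static class Node { }

        static class Path {
            private final List<Node> path;
            Path(Node... nodes) { this.path = Arrays.asList(nodes); }

            public int indexOf(Node n) {  // the method from the slide
                for (int i = 0; i < path.size(); i++)
                    if (path.get(i).equals(n))
                        return i;
                return -1;
            }
        }

        Node a = new Node(), b = new Node(), c = new Node();

        // Covers CFG nodes 1, 2, 3, 4: the loop runs and 'return i' executes
        @Test
        public void nodePresent() {
            assertEquals(1, new Path(a, b, c).indexOf(b));
        }

        // Covers CFG node 5: the loop exits without a match and 'return -1' executes
        @Test
        public void nodeAbsent() {
            assertEquals(-1, new Path(a, b).indexOf(c));
        }
    }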

Page 45: Software Verification Introduction & The Model-Driven Test Design Process

1. Graph – FSM Example: Memory Seats in a Lexus ES 300

[State machine diagram. States: Driver 1 Configuration, Driver 2 Configuration, Modified Configuration, New Configuration Driver 1, New Configuration Driver 2. Each transition is labeled guard | trigger, where the guard is a safety constraint and the trigger an input: [Ignition = off] | Button1 and [Ignition = off] | Button2 select a driver configuration; [Ignition = on] | sideMirrors(), lumbar(), seatBottom(), or seatBack() lead to the Modified Configuration; [Ignition = on] | Reset AND Button1 / Button2 lead to the New Configuration states; Ignition = off leaves the New Configuration states (to Modified).]

Page 46: Software Verification Introduction & The Model-Driven Test Design Process

2. Logical Expressions

( (a > b) or G ) and (x < y)

[Diagram: logical expressions arise from program decision statements, software specifications, and finite state machine transitions]

Page 47: Software Verification Introduction & The Model-Driven Test Design Process

2. Logic – Active Clause Coverage

( (a > b) or G ) and (x < y)

         (a > b)   G    (x < y)
    1      T       F      T
    2      F       F      T
    3      F       T      T
    4      F       F      T
    5      T       T      T
    6      T       T      F

With these values for G and (x < y) (rows 1 and 2), (a > b) determines the value of the predicate
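A sketch of that determining pair as executable checks (the predicate method simply wraps the expression above in Java):

    public class ActiveClauseExample {

        // The predicate from the slide: ((a > b) or G) and (x < y)
        static boolean predicate(int a, int b, boolean g, int x, int y) {
            return ((a > b) || g) && (x < y);
        }

        public static void main(String[] args) {
            // Fix G = false and (x < y) = true; toggling (a > b) flips the result,
            // so (a > b) is the active (determining) clause for this pair.
            System.out.println(predicate(5, 1, false, 0, 1));  // (a > b) = T -> true
            System.out.println(predicate(1, 5, false, 0, 1));  // (a > b) = F -> false
        }
    }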

Page 48: Software Verification Introduction & The Model-Driven Test Design Process

3. Input Domain Characterization

Describe the input domain of the software:

– Identify inputs, parameters, or other categorization

– Partition each input into finite sets of representative values

– Choose combinations of values

System level:

– Number of students: { 0, 1, >1 }

– Level of course: { 600, 700, 800 }

– Major: { swe, cs, isa, infs }

Unit level:

– Parameters: F (int X, int Y)

– Possible values: X: { <0, 0, 1, 2, >2 }, Y: { 10, 20, 30 }

– Tests: F (-5, 10), F (0, 20), F (1, 30), F (2, 10), F (5, 20)
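A sketch of how the unit-level tests above can be chosen mechanically, pairing one representative per block so every value of X and of Y appears at least once (the printed calls match the tests on this slide; F itself is left hypothetical):

    public class EachChoice {
        public static void main(String[] args) {
            int[] xs = {-5, 0, 1, 2, 5};   // one representative per block of X
            int[] ys = {10, 20, 30};       // one representative per block of Y
            // Cycle the shorter list so every value of each parameter is used
            for (int i = 0; i < Math.max(xs.length, ys.length); i++)
                System.out.printf("F(%d, %d)%n", xs[i % xs.length], ys[i % ys.length]);
        }
    }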

Page 49: Software Verification Introduction & The Model-Driven Test Design Process

4. Syntactic Structures

Based on a grammar, or other syntactic definition. The primary example is mutation testing:

1. Induce small changes to the program: mutants

2. Find tests that cause the mutant programs to fail: killing mutants

3. Failure is defined as different output from the original program

4. Check the output of useful tests on the original program

Example program and mutants (each mutant changes one line of the original):

    if (x > y)              ∆1:  if (x >= y)
        z = x - y;          ∆2:      z = x + y;
    else                    ∆3:      z = x - m;
        z = 2 * x;
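As a sketch of what it takes to kill mutant ∆1, the input must make the original and the mutant produce different output:

    public class MutationExample {

        static int original(int x, int y) {
            int z;
            if (x > y) z = x - y; else z = 2 * x;
            return z;
        }

        static int mutant1(int x, int y) {   // condition mutated from > to >=
            int z;
            if (x >= y) z = x - y; else z = 2 * x;
            return z;
        }

        public static void main(String[] args) {
            // x == y is exactly where > and >= disagree: the original takes the
            // else branch (z = 2*x) while the mutant takes the then branch (z = 0).
            System.out.println(original(4, 4));  // 8
            System.out.println(mutant1(4, 4));   // 0, so this test kills mutant ∆1
        }
    }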

Page 50: Software Verification Introduction & The Model-Driven Test Design Process

Coverage Overview

Four ways of modeling software, and the artifacts each is applied to:

– Graphs, applied to: use cases, specs, design, source

– Logic, applied to: DNF, specs, FSMs, source

– Input Space

– Syntax, applied to: input, models, integration, source