Software Design (F28SD2): Verification & Validation
Andrew Ireland, Department of Computer Science
School of Mathematical and Computer Sciences, Heriot-Watt University, Edinburgh




Verification & Validation

Verification:
- building the product right
- ensuring the software implements the design.

Validation:
- building the right product
- ensuring the software achieves the customer's requirements.

V&V is a broad set of activities that span the software life cycle: an evidence-gathering activity that enables the evaluation of our work, e.g.
- Does it meet the users' requirements?
- What are the limitations?
- What are the risks of releasing it?


Focus on Testing: Why Test?

Devil's Advocate:

"Program testing can be used to show the presence of defects, but never their absence!" (Dijkstra)

"We can never be certain that a testing system is correct." (Manna)

In Defence of Testing:
- Testing is the process of showing the presence of defects.
- There is no absolute notion of "correctness".
- Testing is not limited to code-level execution ...


Static and Dynamic Perspectives

Static Perspective:
- techniques which do not involve executing code
- provide early feedback on design and implementation decisions.

Dynamic Perspective:
- techniques which involve executing the code on specific data and analyzing the results, typically referred to as testing.

Note: both approaches have strengths and weaknesses, so they should be seen as complementary. But dynamic testing remains the predominant approach.


Testing and Debugging

Debugging:
- A developer-led process that aims to identify and correct bugs. It may also involve some level of checking to ensure bug fixes have been correctly implemented.

Testing:
- A systematic analysis of a component or system with the aim of identifying bugs, as well as ensuring conformance to requirements and/or specifications.
- A systematic analysis to ensure that bug fixes have been implemented correctly, and have not regressed the system as a whole, i.e. regression testing.
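A regression check of the kind described can be sketched with Python's built-in unittest framework (PyUnit, which these slides name later). The leap-year function and its century-year defect are a hypothetical example, not taken from the lecture:

```python
import unittest

def is_leap_year(year):
    # Bug fix under test: century years are leap only when divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class RegressionSuite(unittest.TestCase):
    """Re-run after every bug fix to confirm earlier behaviour still holds."""

    def test_ordinary_leap_year(self):
        self.assertTrue(is_leap_year(2024))

    def test_century_is_not_leap(self):
        self.assertFalse(is_leap_year(1900))  # the original defect

    def test_quadricentennial_is_leap(self):
        self.assertTrue(is_leap_year(2000))
```

Keeping the test for the original defect in the suite is what turns a one-off debugging check into a regression test.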


Testing is for “Life”

- The causes of many software bugs, and certainly the most expensive ones, can be traced to requirements and/or design decisions.
- Early identification of bugs and prevention of bug migration are therefore key goals of a test process.
- Testing and analysis should therefore span the software life cycle, i.e.
  - Requirements
  - Design
  - Coding
  - Maintenance
- A systematic approach is required ...


The Process of Testing

Planning and control:

- Exhaustive testing is not possible, so the available time needs to be used effectively. Planning is required in order to identify:
  - what needs to be tested, and in what priority order
  - what level of testing (evidence) will be required
  - exit or success criteria
  - personnel responsible for each testing activity
- Control is required when reality, i.e. what is actually achieved, deviates from the plan.
- Effective control requires a mechanism for recording, tracking and analyzing defects.
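The recording-and-tracking mechanism can be as simple as a structured defect log. The fields and statuses below are illustrative, not prescribed by the lecture:

```python
from dataclasses import dataclass

@dataclass
class Defect:
    defect_id: int
    summary: str
    severity: str          # e.g. "critical", "major", "minor"
    status: str = "open"   # typical flow: open -> fixed -> verified/closed

class DefectLog:
    """Minimal tracker supporting the control activity described above."""

    def __init__(self):
        self._defects = {}

    def record(self, defect):
        self._defects[defect.defect_id] = defect

    def transition(self, defect_id, new_status):
        self._defects[defect_id].status = new_status

    def open_count(self):
        return sum(1 for d in self._defects.values() if d.status == "open")

log = DefectLog()
log.record(Defect(1, "crash on empty input", "critical"))
log.record(Defect(2, "typo in label", "minor"))
log.transition(2, "fixed")
```

A count of open defects against the plan is exactly the signal that tells you when control action is needed.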


The Process of Testing

Analysis and design:

- The focus here is on deciding what tests are required, the aim being to have as few test cases as possible: test cases need to be maintained, so the fewer high-yield cases you require, the cheaper testing will be.
- Analysis and design also focus on what data will be required in order to implement the tests, as well as the expected results of executing them.


The Process of Testing

Implementation and execution: Realizing the test cases and setting up the environment, e.g. test harness and automated test scripts, within which tests can be efficiently executed and results can be logged.

Evaluation: Making a judgement as to whether or not the exit/success criteria determined during the planning phase have been achieved, e.g. 85% of all executable statements within the code base have been executed.

Note: In the main, the activities described above represent a sequence. However, problems encountered in later phases may require earlier phases to be revisited.
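An exit criterion like the 85% statement-coverage figure can be evaluated mechanically once you know which lines a test run executed. A minimal sketch (the line numbers are hypothetical):

```python
def statement_coverage(executable_lines, executed_lines):
    """Fraction of executable statements exercised by the test run."""
    if not executable_lines:
        return 1.0
    return len(executable_lines & executed_lines) / len(executable_lines)

def exit_criterion_met(coverage, threshold=0.85):
    """Has the coverage target set during planning been achieved?"""
    return coverage >= threshold

# Hypothetical run: 20 executable statements, of which 17 were executed.
cov = statement_coverage(set(range(1, 21)), set(range(1, 18)))
```

In practice a coverage tool produces the executed-line set; the evaluation step is just this comparison against the planned threshold.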


Testing and Traceability

Requirement                   Sub-system            Module       Code           Tests
reverse-thruster activation   Avionics controller   EngineCtrl   Lines 99,101   100,239
conditional on landing                              BrakeCtrl    Lines 11,51    52,123
gear deployment
...                           ...                   ...          ...            ...

Volatility of requirements calls for systematic tracking through to code-level test cases.


The V-Model for Software Development

[V-model: development phases descend the left arm, test levels ascend the right; each phase pairs with the test level that verifies it]

Requirements   <-->  Acceptance Test
Architecture   <-->  System Test
Sub-systems    <-->  Sub-system Test
Code Specs     <-->  Unit Test

Coding (at the base of the V)


Planning for Testing

[V-model annotated for test planning: the plan for each test level is drawn up during the corresponding development phase, long before that test level is executed]

Requirements   -- plan -->  Acceptance Test
Architecture   -- plan -->  System Test
Sub-systems    -- plan -->  Sub-system Test
Code Specs     -- plan -->  Unit Test

Coding (at the base of the V)


Requirements Analysis

Unambiguous: Are the definitions and descriptions of the required capabilities precise? Is there clear delineation between the system and its environment?

Consistent: Freedom from internal & external contradictions?

Complete: Are there any gaps or omissions?

Implementable: Can the requirements be realized in practice?

Testable: Can the requirements be tested effectively?

Aside: Before requirements analysis, a Feasibility Study will have been undertaken, e.g. is the project cost-effective from a business perspective? Are the right skills and technologies available for the project? Information Systems students will pursue this thread during weeks 8, 9 and 10.


Requirements Analysis

- 80% of defects can typically be attributed to requirements.
- Late life-cycle fixes are generally costly, i.e. 100 times more expensive than corrections in the early phases.
- Standard approaches to requirements analysis:
  - "Walk-throughs" or Fagan-style inspections: a formal style of meeting where specific tasks relating to quality are undertaken, e.g. bug detection.
  - Graphical aids, e.g. cause-effect graphs, data-flow diagrams.
  - Modelling tools, e.g. simulation, temporal reasoning. Note: modelling will provide the foundation for high-level design.


Design Analysis

- Getting the system architecture right is often crucial to the success of a project. Alternatives should be explored explicitly, i.e. by review early on in the design phase.
- Without early design reviews there is a high risk that the development team will quickly become locked into one particular approach and be blinkered from "better" designs.
- Where possible, executable models should be developed in order to evaluate key design decisions, e.g. communication protocols. Executable models can also provide early feedback from the customer, e.g. interface prototypes.
- Aside: Design-for-test, i.e. put in the "hooks" or "test-points" that will ease the process of testing in the future.


Exploiting Design Notations: UML

Object Constraint Language (OCL): provides a language for expressing conditions that implementations must satisfy (feeds directly into unit testing; see the dynamic analysis lecture).

Use Case Diagrams: provide a user perspective of a system:
- Functionality
- Allocation of functionality
- User interfaces

Use case diagrams also provide a handle on defining equivalence classes for unit testing (dynamic analysis lecture).
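The equivalence-class idea can be made concrete. For a hypothetical "withdraw cash" use case (invented here for illustration), the inputs partition into a few classes, and one representative per class keeps the test set small:

```python
def classify_withdrawal(amount, balance):
    """Partition inputs for a hypothetical cash-withdrawal use case."""
    if amount <= 0:
        return "invalid-amount"
    if amount > balance:
        return "insufficient-funds"
    return "valid"

# One representative (amount, balance) pair per equivalence class:
# any member of a class should exercise the same behaviour.
representatives = {
    "invalid-amount": (-5, 100),
    "insufficient-funds": (200, 100),
    "valid": (50, 100),
}
```

Testing one representative per class, plus the boundaries between classes, is what lets a small test set stand in for an infinite input space.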


Exploiting Design Notations: UML

State Diagrams: provide a diagrammatic presentation of a finite-state representation of a system. State transitions provide strong guidance in testing the control component of a system.

Activity Diagrams: provide a diagrammatic presentation of activity co-ordination constraints within a system. Synchronization bars provide strong guidance in testing for key co-ordination properties, e.g. that the system is free from deadlock.

Sequence Diagrams: provide a diagrammatic presentation of the temporal ordering of object messages. Can be used to guide the testing of both synchronous and asynchronous systems.
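The state-diagram guidance above can be sketched by encoding the diagram's transition table and checking that every legal transition is exercised and every illegal event rejected. The turnstile states and events here are a standard textbook example, not from the lecture:

```python
# Transition table for a hypothetical two-state turnstile diagram.
TRANSITIONS = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",     # pushing a locked turnstile does nothing
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def step(state, event):
    """Advance the machine one transition; reject events the diagram omits."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"illegal event {event!r} in state {state!r}")
```

A test suite derived this way covers each table entry at least once, which is the "strong guidance" the state diagram offers.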


Unit Testing

- Unit testing is concerned with the low-level structure of program code. A unit might be a function or procedure. Within object-oriented programming the smallest executable unit is an object of a class, although within an object a number of methods may exist.
- Unit tests aim to test whether the code achieves its associated specification. In addition, unit testing should aim to identify redundant code.
- Effective unit testing requires a mechanized framework, e.g. JUnit, AUnit, PyUnit, ...
- Test Driven Development is an evolutionary approach to code-level development that is led by test case design, i.e. write the tests, then develop the code.

Aside: Unit testing techniques will be the focus of the static & dynamic analysis lectures (Computer Science thread: weeks 8, 9 & 10).
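The Test Driven Development order can be sketched with unittest (the PyUnit framework named above): the test cases below are written as the specification, and the `Stack` unit (a hypothetical example) is then the minimal code that passes them:

```python
import unittest

class Stack:
    """Unit written only after the tests below were drafted (TDD order)."""

    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def is_empty(self):
        return not self._items

class StackSpec(unittest.TestCase):
    """Written first: these tests define what the unit must do."""

    def test_new_stack_is_empty(self):
        self.assertTrue(Stack().is_empty())

    def test_pop_returns_last_pushed(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)

    def test_pop_on_empty_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()
```

Because the tests exist before the code, any method the tests never demand is a candidate for the redundant code that unit testing should expose.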


Sub-System Testing

- Focuses on the integration and testing of groups of modules which define sub-systems; often referred to as integration testing.
- Non-incremental or "big bang" approach:
  - Costly on environment simulation, i.e. stub and driver modules.
  - Debugging is non-trivial.
- Incremental approach:
  - Fewer stub and driver modules.
  - Debugging is more focused.
- Strategies: top-down, bottom-up, function-based, thread-based, critical-first, opportunistic.
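The stub-and-driver idea can be sketched: testing a module top-down before its dependency exists means replacing the missing module with a stub that returns canned results, and exercising it from a small driver. The sensor names are hypothetical:

```python
class SensorStub:
    """Stand-in for an unimplemented Sensor module; returns canned readings."""

    def __init__(self, readings):
        self._readings = iter(readings)

    def read(self):
        return next(self._readings)

def average_temperature(sensor, samples):
    """Module under integration test; depends only on the sensor interface."""
    return sum(sensor.read() for _ in range(samples)) / samples

# Driver: exercise the module under test against the stub.
stub = SensorStub([20.0, 22.0, 24.0])
avg = average_temperature(stub, 3)
```

In an incremental strategy, the stub is later replaced by the real Sensor module and the same driver re-run, which is why incremental integration needs fewer throwaway stubs overall.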


Testing Interfaces

Interface misuse: type mismatch, incorrect ordering, missing parameters; should be identified via basic static analysis.

Interface misunderstanding: the calling component or client makes incorrect assumptions about the called component or server; can be difficult to detect if behaviour is mode or state dependent.

Temporal errors: mutual exclusion violations, deadlock, liveness issues; typically very difficult to detect. Model checking provides one approach (static analysis).


System Testing

Volume and stress testing: Can the system handle the required data throughput, requests etc.? What are the upper bounds?

Configuration testing: Does the system operate correctly on all the required software and hardware configurations?

Resource management testing: Can the system exceed memory allocation limits?

Security testing: Is the system secure enough?

Recovery testing: Use pathological test cases to test system recovery capabilities.

Availability/reliability: Does the system meet the requirements?


Acceptance Testing

- The objective here is to determine whether or not the system is ready for operational use.
- Focuses on user requirements, and user involvement is high since users are typically the only people with "authentic" knowledge of the system's intended use.
- Test cases are typically designed to show that the system does not meet the customer's requirements; if they are unsuccessful, then the system is accepted.
- Acceptance testing is very much to do with validation, i.e. have we built the right product, rather than verification, i.e. have we built the product right.


Summary

- The analysis & testing life-cycle.
- Prevention is better than cure: analysis & testing should start early, both in terms of immediate analysis and planning for testing. Developing test cases helps clarify requirements.
- Planning is crucial given the time-limited nature of the testing activity. Planning and test case generation should be, as far as possible, integrated within your design notations and formalisms.


References

- "Software Testing: An ISTQB-ISEB Foundation Guide" (Second Edition), BCS, 2010.
- "Test Driven Development: By Example", K. Beck, Addison-Wesley, 2002.
- "Testing Object-Oriented Systems", R.V. Binder, Addison-Wesley, 1999.
- "Software Testing in the Real World", E. Kit, Addison-Wesley, 1995.
- "The Object Constraint Language: Precise Modeling with UML", J. Warmer & A. Kleppe, Addison-Wesley, 1998.
- IEEE Standard for Software Verification and Validation Plans, 1992 (IEEE/ANSI Std 1012-1986).
- IEEE Standard for Software Test Documentation, 1991 (IEEE/ANSI Std 829-1983).
- "The Art of Software Testing", G.J. Myers, Wiley & Sons, 1979.