Verification and Validation. Mark C. Paulk. July 16, 2003, SEEK 2003. PaulkConsulting@att.net, Mark.Paulk@ieee.org, mcp@cs.cmu.edu


Page 1: Verification and Validation Mark C. Paulk July 16, 2003 SEEK 2003 PaulkConsulting@att.net Mark.Paulk@ieee.org mcp@cs.cmu.edu

Verification and Validation

Mark C. Paulk

July 16, 2003 SEEK 2003

PaulkConsulting@att.net Mark.Paulk@ieee.org mcp@cs.cmu.edu

Page 2:

V&V in IEEE 729-1983

Verification
• The process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.

Validation
• The process of evaluating software at the end of the software development process to ensure compliance with software requirements.

Page 3:

The V-Model

Sami Zahran, Software Process Improvement: Practical Guidelines for Business Success, 1998, pages 377-386.

“Software Development Standard for the German Federal Armed Forces, General Directive 250 – Software Life Cycle Process Model,” 1992.

[Figure: the V-Model. Specification branch: System Requirements Analysis and Design → DP Requirements Analysis and Design → Software Requirements Analysis → Preliminary Design → Detailed Design → Implementation. Integration branch: Software Components Integration → DP Integration → System Integration.]

Page 4:

V&V in IEEE 610.12-1990

Verification
• The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.

Validation
• The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.

Page 5:

IV&V in IEEE 610.12-1990

Verification and Validation (V&V)
• The process of determining whether (1) the requirements for a system or component are complete and correct, (2) the products of each development phase fulfill the requirements or conditions imposed by the previous phase, and (3) the final system or component complies with the specified requirements.

Independent Verification and Validation (IV&V)
• V&V performed by an organization that is technically, managerially, and financially independent of the development organization.

Page 6:

Techniques

Name some verification techniques.

Name some validation techniques.

Page 7:

Concept V&V in IEEE 1012-1998

Concept documentation evaluation

Evaluation of operational procedures

Criticality analysis

Hardware / software / user requirements

Allocation analysis

Traceability analysis

Hazards analysis

Risk analysis

Software V&V Plan generation or updating

Page 8:

Requirements V&V in IEEE 1012-1998

Traceability analysis

Software requirements evaluation

Interface analysis

V&V test plan generation and verification
• system
• acceptance

Configuration management assessment

Criticality / hazard analysis update

Risk analysis

Page 9:

Design V&V in IEEE 1012-1998

Traceability analysis

Software design evaluation

Interface analysis

Component V&V test plan generation and verification

V&V test design generation and verification
• component
• integration
• system
• acceptance

Criticality / hazard / risk analysis updates

Page 10:

Implementation V&V in IEEE 1012-1998

Traceability / interface / criticality / hazard / risk analysis updates

Source code and source code documentation evaluation

Component V&V test plan generation and verification

V&V test case and test procedure generation and verification
• component
• integration
• system

Acceptance test case generation and verification

Component V&V test execution and verification

Page 11:

Test V&V in IEEE 1012-1998

Traceability / hazard / risk analysis updates

Acceptance V&V test procedure generation and verification

V&V test execution and verification
• acceptance
• integration
• system

Page 12:

Installation and Checkout V&V in IEEE 1012-1998

Hazard / risk analysis updates

Installation configuration audit

Installation checkout

V&V final report generation

Page 13:

Operation V&V in IEEE 1012-1998

Software V&V Plan revision

Proposed change assessment

Anomaly evaluation

Criticality / hazard / risk analysis updates

Migration assessment

Retirement assessment

Page 14:

CMMI Overview

Level 1 (Initial): process is unpredictable, poorly controlled, and reactive.

Level 2 (Managed): process is characterized for projects and is often reactive.
Process areas: Requirements Management, Project Planning, Project Monitoring and Control, Supplier Agreement Management, Measurement and Analysis, Process and Product Quality Assurance, Configuration Management.

Level 3 (Defined): process is characterized for the organization and is proactive.
Process areas: Requirements Development, Technical Solution, Product Integration, Verification, Validation, Organizational Process Focus, Organizational Process Definition, Organizational Training, Integrated Project Management, Risk Management, Decision Analysis and Resolution.

Level 4 (Quantitatively Managed): process is measured and controlled.
Process areas: Organizational Process Performance, Quantitative Project Management.

Level 5 (Optimizing): focus is on quantitative continuous process improvement.
Process areas: Organizational Innovation and Deployment, Causal Analysis and Resolution.

Page 15:

V&V in CMMI

Verification
• Confirmation that work products properly reflect the requirements specified for them. In other words, verification ensures that “you built it right.”

Validation
• Confirmation that the product, as provided (or as it will be provided), will fulfill its intended use. In other words, validation ensures that “you built the right thing.”

Page 16:

CMMI - Verification

The purpose of Verification is to ensure that selected work products meet their specified requirements.

SG 1 Prepare for Verification: Preparation for verification is conducted.
SP 1.1-1 Select Work Products for Verification: Select the work products to be verified and the verification methods that will be used for each.
SP 1.2-2 Establish the Verification Environment: Establish and maintain the environment needed to support verification.
SP 1.3-3 Establish Verification Procedures and Criteria: Establish and maintain verification procedures and criteria for the selected work products.

SG 2 Perform Peer Reviews: Peer reviews are performed on selected work products.
SP 2.1-1 Prepare for Peer Reviews: Prepare for peer reviews of selected work products.
SP 2.2-1 Conduct Peer Reviews: Conduct peer reviews on selected work products and identify issues resulting from the peer review.
SP 2.3-2 Analyze Peer Review Data: Analyze data about preparation, conduct, and results of the peer reviews.

SG 3 Verify Selected Work Products: Selected work products are verified against their specified requirements.
SP 3.1-1 Perform Verification: Perform verification on the selected work products.
SP 3.2-2 Analyze Verification Results and Identify Corrective Action: Analyze the results of all verification activities and identify corrective action.

Page 17:

CMMI - Validation

The purpose of Validation is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.

SG 1 Prepare for Validation: Preparation for validation is conducted.
SP 1.1-1 Select Products for Validation: Select products and product components to be validated and the validation methods that will be used for each.
SP 1.2-2 Establish the Validation Environment: Establish and maintain the environment needed to support validation.
SP 1.3-3 Establish Validation Procedures and Criteria: Establish and maintain procedures and criteria for validation.

SG 2 Validate Product or Product Components: The product or product components are validated to ensure that they are suitable for use in their intended operating environment.
SP 2.1-1 Perform Validation: Perform validation on the selected products and product components.
SP 2.2-1 Analyze Validation Results: Analyze the results of the validation activities and identify issues.

Page 18:

CMMI Engineering Hierarchy

[Figure: the CMMI engineering process areas applied across levels. Customers and stakeholders feed Requirements Management (REQM); Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), and Validation (VAL) then apply at the systems, software, and hardware levels.]

Page 19:

Testing Principles

All tests should be traceable to customer requirements.

Tests should be planned long before testing begins.

The Pareto principle applies to software testing.

Exhaustive testing is not possible.

To be most effective, testing should be conducted by an independent third party.

Roger Pressman, Software Engineering: A Practitioner’s Approach, Fifth Edition, 2001.

Page 20:

Software Program Managers Network

SPMN was established in 1992 to identify proven industry and government software best practices and convey these practices to managers of large-scale DoD system acquisition programs.

The 16 Critical Software Practices™ specifically address underlying cost and schedule drivers that have caused many software-intensive systems to be delivered over budget, behind schedule, and with significant performance shortfalls.

<URL: http://www.spmn.com/>

Page 21:

SPMN #8. Manage and Trace Requirements

Before any design is initiated, requirements for that segment of the software need to be agreed to.

Requirements tracing should be a continuous process providing the means to trace from the user requirement to the lowest level software component.

Tracing shall exist not only to user requirements but also between products and the test cases used to verify their successful implementation.

All products that are used as part of the trace need to be under configuration control.

Requirements tracing should use a tool and be kept current as products are approved and placed under CM.

Requirements tracing should address system, hardware, and software and the process should be defined in the system engineering management plan and the software development plan.
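The lowest-level check a tracing tool performs can be sketched mechanically. The fragment below is illustrative only (the data, IDs, and function name are hypothetical, not from SPMN): it flags requirements with no linked test case and test cases that cite unknown requirements.

```python
# Illustrative requirements-traceability check. A real program would pull
# requirement IDs and trace links from a CM-controlled tracing tool.

def check_traceability(requirements, trace_links):
    """requirements: set of requirement IDs under CM.
    trace_links: dict of test-case ID -> set of requirement IDs it verifies.
    Returns (requirements with no test case, test cases with bad/missing links)."""
    covered = set().union(*trace_links.values()) if trace_links else set()
    untested = requirements - covered
    dangling = {tc for tc, reqs in trace_links.items()
                if not reqs or not reqs <= requirements}
    return untested, dangling

reqs = {"REQ-1", "REQ-2", "REQ-3"}
links = {"TC-10": {"REQ-1"}, "TC-11": {"REQ-2", "REQ-9"}}
untested, dangling = check_traceability(reqs, links)
print(sorted(untested))   # REQ-3 has no test case
print(sorted(dangling))   # TC-11 cites unknown REQ-9
```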

Page 22:

SPMN #14. Inspect Requirements and Design

All products that are placed under CM and are used as a basis for subsequent development need to be subjected to successful completion of a formal inspection prior to their release to CM.

The inspection needs to follow a rigorous process defined in the software development plan and should be based on agreed-to entry and exit criteria for that specific product.

At the inspection, specific metrics should be collected and tracked that describe defects, defect removal efficiency, and the efficiency of the inspection process.

All products to be placed under CM should be inspected as close to their production as feasible.

Inspections should be conducted beginning with concept definition and ending with completion of the engineering process.

The program needs to fund inspections and track rework savings.
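Defect removal efficiency, one of the metrics this practice calls for, is commonly computed as the defects caught by an inspection divided by all defects eventually found (caught now plus escaped downstream). A minimal sketch with hypothetical numbers:

```python
# Defect removal efficiency (DRE) for one inspection phase.
# The counts below are hypothetical, not from the SPMN material.

def defect_removal_efficiency(found_by_inspection, escaped_downstream):
    total = found_by_inspection + escaped_downstream
    return found_by_inspection / total if total else 1.0

# 45 defects caught in a design inspection, 5 found later in test/operation.
dre = defect_removal_efficiency(45, 5)
print(f"DRE = {dre:.0%}")   # DRE = 90%
```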

Page 23:

SPMN #15. Manage Testing as a Continuous Process

All testing should follow a preplanned process, which is agreed to and funded.

Every product that is placed under CM should be tested by a corresponding testing activity.

All tests should consider not only a nominal system condition but also address anomalous and recovery aspects of the system.

Prior to delivery, the system needs to be tested in a stressed environment, nominally in excess of 150 percent of its rated capacities.

All test products (test cases, data, tools, configuration, and criteria) should be released through CM and be documented in a software version description document.

Every test should be described in traceable procedures and have pass-fail criteria included.
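The nominal-plus-anomalous coverage and explicit pass-fail criteria described above can be sketched as ordinary test functions; `parse_rate` here is a hypothetical unit under test, invented for illustration.

```python
# Sketch: each test names its condition (nominal vs. anomalous) and states
# an explicit pass-fail criterion as an assertion.

def parse_rate(text):
    """Parse a percentage such as '75%' into a float in [0, 1]."""
    if not text.endswith("%"):
        raise ValueError("missing % sign")
    value = float(text[:-1]) / 100.0
    if not 0.0 <= value <= 1.0:
        raise ValueError("rate out of range")
    return value

def test_nominal():
    # Pass-fail criterion: '75%' parses to exactly 0.75.
    assert parse_rate("75%") == 0.75

def test_anomalous():
    # Pass-fail criterion: malformed and out-of-range inputs are rejected.
    for bad in ("75", "150%"):
        try:
            parse_rate(bad)
        except ValueError:
            continue
        raise AssertionError(f"{bad!r} was accepted")

test_nominal()
test_anomalous()
print("all pass-fail criteria met")
```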

Page 24:

SPMN #16. Compile and Smoke Test Frequently

All tests should use systems that are built on a frequent and regular basis (nominally no less than twice a week).

All new releases should be regression tested by CM prior to release to the test organization.

Smoke testing should qualify new capability or components only after successful regression test completion.

All smoke tests should be based on a pre-approved and traceable procedure and run by an independent organization (not the engineers who produced it).

All defects identified should be documented and be subject to the program change control process.

Smoke test results should be visible and provided to all project personnel.
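A smoke-test harness of the kind described can be sketched as a short runner that executes each build-and-check step against a single pass-fail criterion (exit status 0). The commands here are stand-ins for a real build and a thin end-to-end check, not an actual build procedure.

```python
# Illustrative smoke-test runner: each step is a command; a step passes
# if and only if it exits with status 0.

import subprocess
import sys

SMOKE_STEPS = [
    ("build",      [sys.executable, "-c", "print('built')"]),
    ("end-to-end", [sys.executable, "-c", "assert 1 + 1 == 2"]),
]

def run_smoke():
    results = {}
    for name, cmd in SMOKE_STEPS:
        proc = subprocess.run(cmd, capture_output=True, text=True)
        results[name] = (proc.returncode == 0)
    return results

results = run_smoke()
print(results)                 # results stay visible to project personnel
assert all(results.values()), f"smoke test failed: {results}"
```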

Page 25:

Generic V&V Techniques

• Technical reviews, management reviews, joint reviews
• Symbolic execution, program proving, proof of correctness, formal methods
• Anomaly analysis, syntactical checks
• Functional testing, black-box testing, equivalence partitioning, boundary value analysis
• Structural testing, white-box testing, basis path testing, condition testing, data flow testing, loop testing
• Unit testing
• Regression testing, daily build and smoke test
• Integration testing
• Random testing, adaptive perturbation testing, mutation testing, be-bugging
• Operational profile testing, stress testing, performance testing
• System testing, acceptance testing
• Peer reviews, structured walkthroughs, inspections, active design reviews, pair programming, …
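Two of the techniques named above, equivalence partitioning and boundary value analysis, can be illustrated against a hypothetical grading function (the function, valid range, and threshold are assumptions for the example, not from the slides):

```python
# Equivalence partitioning and boundary value analysis for a hypothetical
# grader: valid scores are 0-100, passing threshold is 60 (assumed spec).

def grade(score):
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 60 else "fail"

# One representative per valid equivalence class.
assert grade(30) == "fail"    # class: valid score below threshold
assert grade(80) == "pass"    # class: valid score at or above threshold

# Boundary values, where defects tend to cluster.
for score, expected in [(0, "fail"), (59, "fail"), (60, "pass"), (100, "pass")]:
    assert grade(score) == expected

# Invalid partitions must be rejected.
for invalid in (-1, 101):
    try:
        grade(invalid)
    except ValueError:
        pass
    else:
        raise AssertionError(f"{invalid} was accepted")

print("equivalence classes and boundaries covered")
```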

Page 26:

Effective V&V

What is the most effective verification technique (or techniques)?

What is the most effective validation technique (or techniques)?

Page 27:

The W-Model

[Figure: the W-Model. The development branch (customer needs, requirements, architectural design, detailed design, code) is paired with inspection activities (requirements inspections, HLD inspections, DD inspections, code inspections, and test procedure and test case inspections) and with the corresponding test levels (unit test, integration test, system test, acceptance test).]

Page 28:

Questions and Answers…

?

Page 29:

Some Useful Internet Links

Software Engineering Institute
• www.sei.cmu.edu/
• www.sei.cmu.edu/cmm/
• www.sei.cmu.edu/cmm/cmm.articles.html
• www.sei.cmu.edu/cmmi/
• www.sei.cmu.edu/cmmi/models/
• www.sei.cmu.edu/str/descriptions/inspections_body.html

Software Program Managers Network
• www.spmn.com/

Formal Technical Review Archive
• www2.ics.hawaii.edu/~johnson/FTR/

Page 30:

Copyrights and Trademarks

Capability Maturity Model, Capability Maturity Modeling, CMM Integration, and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

eSCM, IDEAL, SCAMPI, SCAMPI Lead Assessor, SCAMPI Lead Appraiser, Personal Software Process, PSP, Team Software Process, and TSP are service marks of Carnegie Mellon University.