Test Strategy

Test Strategy Sample



Page 1: Test Strategy Sample

Test Strategy


Program Test Strategy Revision History

Version | Date | Prepared by | Comments

Master Test Plan Approval Signatures

Deliverable Name: Test Strategy
Version Number: 0.8
Clarity ID: TBD
Clarity Project Name: TBD

Title | Approver Name | Approval (email back with “Approved”) | Date

Table of Contents

1. Introduction
   1.1. Purpose
2. Scope
   2.1. Scope Items
   2.2. Types of Testing
3. Test Approach
   3.1. Testing Phases and Approach
      3.1.1. Unit Testing
      3.1.2. Functional Testing
      3.1.3. System Integration Testing
      3.1.4. Regression Testing
      3.1.5. End to End Testing
      3.1.6. Usability Testing
      3.1.7. OS/Browser Compatibility Testing
      3.1.8. Accessibility Testing
      3.1.9. Data Migration Testing
      3.1.10. Performance Testing
      3.1.11. Security Testing
      3.1.12. Disaster Recovery / MDHA Testing
      3.1.13. Content Validation Testing
      3.1.14. User Acceptance Testing
      3.1.15. Cutover Testing
      3.1.16. User Assessment
   3.2. Testing Team Model in Iterative Development
4. Entry and Exit Criteria
   4.1. Entry and Exit Criteria for Story Testing during Iteration
   4.2. Entry and Exit Criteria for UAT
   4.3. Test Suspension Criteria
   4.4. Test Resumption Requirements
5. Testing Organization
   5.1. Program Test Management
   5.2. Roles and Responsibilities
   5.3. Program Teams Roles and Responsibilities
6. Training
   6.1. On-boarding Sapient India
   6.2. Knowledge Transfer to Project India
   6.3. Knowledge Transfer Process to Project India
7. Test Environments
8. Test Tools
   8.1. HP Quality Center
   8.2. HP Quick Test Professional
   8.3. HP LoadRunner
   8.4. Other Tools
9. Testing Processes
   9.1. Defect Management
   9.2. Test Case Writing Approach
   9.3. Test Case Management
   9.4. Traceability
10. Communication and Escalation
   10.1. Meetings Schedule
   10.2. Purpose of the Meetings
11. Test Deliverables and Reporting
   11.1. Deliverable List
   11.2. Reporting Metrics
12. Test Data Management
13. Test Automation Approach
14. Risks, Assumptions and Dependencies
   14.1. Risks
   14.2. Assumptions
   14.3. Dependencies


1. Introduction

1.1. Purpose

The following points are outlined in this document:

• Scope of the test
• Definition of the testing strategy at a high level
• Test Control Processes
• High-level Timeline
• Test Management Reporting
• Testing Roles and Responsibilities
• Testing Tools
• Test Environments
• Test Data


2. Scope

This section covers the scope of testing and the types of tests that will be performed throughout the life cycle of the program.

2.1. Scope Items

2.2. Types of Testing

The following tests will be performed across the lifecycle of the program. The overall approach for each, and where it fits in, is covered in the next section of the document.

Unit Testing

Functional Testing

System Integration Testing

Regression Testing

End to End Testing

Usability Testing

OS/Browser Compatibility Testing

Accessibility Testing

Data Migration Testing

Performance Testing

Security Testing

Disaster Recovery Testing

Content Validation Testing

User Acceptance Testing

Cutover Testing

User Assessment


3. Test Approach

This section covers the various phases and describes the testing approach for the different types of testing across the program.

3.1. Testing Phases and approach

The approach we are following is an early, phased approach to testing, in which the testing activity is embedded within the development story and starts in parallel with the development effort. No story is deemed complete until the exit criteria for the story, as defined in the Entry and Exit Criteria section below, have been met. Completed stories will then be integrated as part of End to End test stories, where testing will focus on complete business processes.

3.1.1. Unit Testing

The purpose of Unit Testing is to validate that each distinct module of the application is implemented as per the technical and functional specification of the story, at the unit level.


Unit tests will be automated where possible using frameworks like JUnit and will be executed throughout the development process across all iterations.

Development teams in the Functional tracks will conduct unit tests as part of the development process and will publish the unit test results as part of each build release to QA.
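The kind of automated unit test described above can be sketched as follows. The program names JUnit for Java; the same shape is shown here with Python's built-in `unittest` module, and the `apply_discount` function is a hypothetical unit invented purely for illustration:

```python
import unittest

# Hypothetical unit under test, invented for this sketch.
def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100.0), 2)

class ApplyDiscountTest(unittest.TestCase):
    # Positive case: a typical discount is applied correctly.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    # Boundary case: a 0% discount leaves the price unchanged.
    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    # Negative case: invalid input is rejected at the unit level.
    def test_invalid_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Tests in this shape can run on every build (for example via `python -m unittest`), which is what allows the results to be published automatically with each release to QA.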

3.1.2. Functional Testing

The purpose of Functional Testing is to validate that the functionality implemented as part of the story meets the requirements.

3.1.3. System Integration Testing

The key purpose of System Integration testing is to verify that the information flowing between different components and systems is in the right format and contains correct data. The testing will include both positive and negative scenarios.

3.1.4. Regression Testing

The purpose of regression testing is to validate that code/configuration changes have not broken functionality that was working earlier.

3.1.5. End to End Testing

The purpose of E2E testing is to validate that the entire application, including interfaces, works correctly as a whole when linked together.

3.1.6. Usability Testing

Usability Testing is a qualitative methodology used to assess a solution with Project end users. The purpose is to assess the usefulness and usability of the solution: how well the solution enables users to accomplish their goals and meet their needs, and how easy it is to use.

The Information Architect (IA) supports the overall track by serving as the dedicated note-taker for the sessions, generating ideas for future-state recommendations, and, depending on the project, producing wireframes depicting those recommendations. The IA will be allocated for execution, to shape the protocol, and to provide input into the analysis and recommendations.

3.1.7. OS/Browser Compatibility Testing

There will be separate stories created for OS/Browser testing and the test team within the Functional tracks will perform these tests for the web pages that are developed over various iterations.

3.1.8. Accessibility Testing

The purpose is to validate that the web pages conform to the W3C AAA Web Content Accessibility Guidelines. The test team within the Functional tracks will perform these tests for the web pages that are developed over various iterations.

3.1.9. Data Migration Testing

The purpose of Data Migration testing is to validate that all the entities (e.g. Customer records, Products, etc.) have been migrated from the source data into the Project data, and that all the attributes for each of the entities are loaded correctly.
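A minimal sketch of the source-to-target comparison this implies, using hypothetical customer records; a real run would query the source and Project data stores rather than in-memory dictionaries:

```python
# Hypothetical records, invented for illustration only.
source_customers = {
    "C-1": {"name": "Ada", "email": "ada@example.com"},
    "C-2": {"name": "Grace", "email": "grace@example.com"},
}
migrated_customers = {
    "C-1": {"name": "Ada", "email": "ada@example.com"},
    "C-2": {"name": "Grace", "email": None},  # attribute lost in migration
}

def migration_issues(source, target):
    """Report entities that are missing or whose attributes differ after migration."""
    issues = []
    for key, row in source.items():
        if key not in target:
            issues.append((key, "missing"))
        elif target[key] != row:
            issues.append((key, "attributes differ"))
    return issues

print(migration_issues(source_customers, migrated_customers))
# -> [('C-2', 'attributes differ')]
```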

3.1.10. Performance Testing

Performance tests will be carried out to validate that the application meets its performance requirements.

3.1.11. Security Testing

The purpose of Security testing is to validate the application and infrastructure for vulnerabilities.

3.1.12. Disaster Recovery / MDHA Testing

Disaster Recovery testing involves testing how the system recovers from disasters, crashes, or other catastrophic failures. As part of the Disaster Recovery test, we will bring down one of the environments (TBD), set up an environment that is a replica of it, and perform sanity tests to confirm that everything works correctly, functionally and operationally, in the new environment. The detailed approach for DR testing and its ownership are TBD at this stage.

3.1.13. Content Validation Testing

3.1.14. User Acceptance Testing

The purpose of UAT is to allow the business users to validate that the application performs end-to-end business transactions and meets the criteria outlined within the FSDs.

3.1.15. Cutover Testing

Cutover Testing involves conducting a dry run of the cutover process with the intent of validating that the cutover process will work as planned.

3.1.16. User Assessment

This test will be performed as part of the training that will be provided to business users in order to familiarize them with the website and the associated functions. Details will be added as these are finalized.

The Organization Change vertical will be responsible for the User Assessment tests.

3.2. Testing Team Model in Iterative Development


4. Entry and Exit Criteria

This section provides the entry and exit criteria for each of the iterations and for the overall testing effort.

4.1. Entry and Exit Criteria for Story Testing during Iteration:

4.2. Entry and Exit Criteria for UAT:


4.3. Test Suspension Criteria

• High number of Show Stopper or Severe defects
• Show Stopper or Severe defects that affect the ability to effectively run other test cases
• The number of unresolved Sev 0, Sev 1, and Sev 2 defects is impacting testing progress significantly
• Environmental instability

4.4. Test Resumption Requirements

• Show Stopper and Severe defects are resolved, with fixes verified in development before being pushed to the test environment
• A mitigation plan is in place to address a large number of unresolved defects
• Environmental issues are resolved


5. Testing Organization

5.1. Program Test Management


5.2. Roles and Responsibilities

The anticipated roles for the QA team have been identified in the illustration and are summarized in the roles and responsibilities table.


5.3. Program Teams Roles and Responsibilities

Test Phase | Team Accountable for Test Phase | Team Responsible for Test Script Creation | Team Responsible for Test Script Execution | Team Responsible for Test Environment Set-up
Unit Testing | Development Teams | Development Teams | Development Teams | Infrastructure Team
Functional Testing | PMO Test Mgmt Team | Functional tracks | Functional tracks | Infrastructure Team
System Integration Testing | PMO Test Mgmt Team | Functional tracks | Functional tracks | Infrastructure Team


6. Training

Knowledge Transfer



7. Test Environments

This section covers the list of environments and their usage. The following pictures reflect the environments and the code propagation flow:


8. Test Tools

This section covers the tools we will use during the testing process.

8.1. HP Quality Center

This section will cover the QC customizations, processes, and standards once these are closed.

8.2. HP Quick Test Professional

HP QTP will be used for automated regression testing. Please refer to the Test Automation Approach section for QTP usage.

8.3. HP LoadRunner

HP LoadRunner will be used to run the performance tests.

8.4. Other Tools

9. Testing Processes

This section provides details on the testing processes that will be followed by the testing team.

9.1. Defect Management

The defects logged during testing will be discussed in defect triage meetings. A daily defect triage meeting will be held for each of the Functional tracks. Participants will include the Functional Track Lead, the Testing Lead, and the Architect, and optionally a Tester and a Developer.

During these meetings the group will review the new defects logged and the status of outstanding showstopper defects.

The following severity definitions for defects will be followed on the program:

Severity | Description
0 – Show Stopper | There is no workaround and the project will be halted from making further progress until the defect is resolved
1 – Severe | The defect prevents further use of this work product until resolved
2 – Major | The defect will prevent further use of a portion of this work product until it is resolved
3 – Significant | The defect should be addressed, but work on future work products may proceed according to plan
4 – Minor | The defect has little or no impact on current or future work products
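As an illustration of how these severity levels might drive the daily triage view, here is a minimal sketch; the defect records and field names are invented examples, not the program's actual QC schema:

```python
# Each defect is a simple record; in practice these would be pulled from QC.
defects = [
    {"id": "D-101", "severity": 0, "status": "Open",     "track": "Checkout"},
    {"id": "D-102", "severity": 2, "status": "Open",     "track": "Search"},
    {"id": "D-103", "severity": 1, "status": "Resolved", "track": "Checkout"},
    {"id": "D-104", "severity": 4, "status": "Open",     "track": "Search"},
]

SEVERITY_NAMES = {0: "Show Stopper", 1: "Severe", 2: "Major",
                  3: "Significant", 4: "Minor"}

def triage_queue(defects):
    """Open defects ordered most severe first, as reviewed in the daily triage."""
    open_defects = [d for d in defects if d["status"] == "Open"]
    return sorted(open_defects, key=lambda d: d["severity"])

for d in triage_queue(defects):
    print(d["id"], SEVERITY_NAMES[d["severity"]], d["track"])
```

Sorting open defects by severity surfaces the Show Stopper and Severe items first, matching the triage focus on outstanding showstoppers.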


9.2. Test Case Writing Approach

This section covers the approach we will take for writing test cases. The test team will follow these steps:

• Requirements Understanding:

• Test Case Creation:

• Traceability:

• Test Case Reviews: test cases will be reviewed and approved by Project business users; this will be one of the entry criteria for End to End test execution.

9.3. Test Case Management

9.4. Traceability

The following picture reflects how traceability will work between Requirements, Test Cases, and Test Execution. The folder structure in the Requirements module of QC will be as follows:

Vertical -> Functional Track -> Functionality -> Use Case -> Requirements

In the Test Plan module we will break down Functionality into Use Cases. Each Dev story, consisting of requirements, will be listed here. All the test cases associated with a Dev story will be listed within the Dev Story folder. These test cases will be mapped to the Requirements and Use Cases in the Requirements module. The folder structure in the Test Plan module of QC will be as follows:

Vertical -> Functional Track -> Story -> Test Case

In the Test Lab module we will create a folder for the ADR release number, which will have sub-folders for iterations. Each iteration will in turn have test sets representing the Dev stories. All the test cases for the scope delivered will be added to these test sets. The folder structure will be as follows:

ADR -> Iteration -> Functional Track -> Story -> Test Case
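The point of this mapping can be sketched in a few lines: given a test-case-to-requirement mapping like the one QC will hold, traceability exposes any requirement with no covering test case. All IDs below are hypothetical examples, not real program data:

```python
# Hypothetical requirement and test case IDs, for illustration only.
requirements = ["REQ-101", "REQ-102", "REQ-103"]
test_case_to_requirement = {
    "TC-001": "REQ-101",
    "TC-002": "REQ-101",
    "TC-003": "REQ-103",
}

def uncovered(requirements, mapping):
    """Requirements with no mapped test case — the gaps traceability exposes."""
    covered = set(mapping.values())
    return [r for r in requirements if r not in covered]

print(uncovered(requirements, test_case_to_requirement))  # -> ['REQ-102']
```

This is the check that the Traceability Matrix deliverable makes possible: every requirement either maps to executed test cases or shows up as a coverage gap.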


10. Communication and Escalation

10.1. Meetings Schedule

The following picture represents how communication will work across geographies.

10.2. Purpose of the Meetings


11. Test Deliverables and Reporting

11.1. Deliverable List

Test Strategy
A high-level program document that will include:
• Testing phases
• Metrics
• R&R
• Traceability strategy

Test Plan
A test plan will be created covering:
• Scope of testing
• Testing schedule & logistics
• Testing resources allocation
• Types of tests planned, including:
  • Data Conversion Test Plan
  • Security Testing Test Plan
  • Performance Test Plan
  • End-to-End Test Plan

Test Scripts
A detailed document for each test case including:
• Description of the test scenario and the purpose of the test
• Steps involved in executing the test, with expected results
• Any special set-up requirements (infrastructure, data, etc.)
Test scripts will be created in QC for testing the various stories.

Traceability Matrix
Mapping of requirements/business transactions to test cases, to confirm that testing validates the requirements. This will be stored in QC.

Test Results
Summary results of the testing phase:
• Test results documentation as appropriate
• List of test scripts executed, with their execution status
• List of defects recorded and their status
These will be recorded in QC.

Testing Control Processes
Processes developed to establish consistency of testing activities across the project teams:
• Test planning, covering documentation of test scripts
• Test execution, covering documentation of test results
• Defect management process
• Management of testing activities and progress reporting

Test Training Materials
Training materials used for reviewing the testing strategy, testing control processes, and usage of the testing tools.

11.2. Reporting Metrics

The metrics that are reported as part of the testing program are intended to provide an accurate view of the testing status and progress. Specifically, the following items will be measured and reported for each team for each testing phase:

• Test Scripts created compared with the expected number to be created (tracked by the Functional Teams or other teams required to create the Test Scripts)
• Test Scripts executed compared to the total number of test scripts expected to be executed (by iteration and team)
• Number of defects (by severity, track, and iteration)
• Number of open defects (by severity, team, and iteration)
• Total defect count by severity (by week, by iteration and track)
• Defect severity by cause
• Defect aging (by week, by iteration and track)
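A minimal sketch of how the first two metrics and the severity counts might be computed; all figures below are invented examples, since the real numbers come from QC reports:

```python
from collections import Counter

# Illustrative numbers only; real figures would be extracted from QC.
scripts_expected = 120
scripts_created = 96
scripts_executed = 72

def percent(done, total):
    """Progress as a percentage, guarded against a zero denominator."""
    return round(100.0 * done / total, 1) if total else 0.0

# One severity entry per logged defect (hypothetical data).
defect_severities = [0, 1, 1, 2, 2, 2, 4]
by_severity = Counter(defect_severities)

print("Scripts created:", percent(scripts_created, scripts_expected), "%")
print("Scripts executed:", percent(scripts_executed, scripts_expected), "%")
print("Defects by severity:", dict(sorted(by_severity.items())))
```

Computing the created and executed percentages against the expected totals, per iteration and team, gives exactly the progress view the list above calls for.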


12. Test Data Management

This section describes the approach we will take for test data creation and management.


13. Test Automation Approach


14. Risks, Assumptions and Dependencies

14.1. Risks

14.2. Assumptions

14.3. Dependencies