STANDARD TEST APPROACH
Version: 3.0
XXXX Software Development Projects
Prepared for
Prepared by: André J. Jackson, Software QA Engineer
Proprietary and Confidential
Standard Test Approach
January 2011
Version: DRAFT 3
Page 1 of 47
XXXX (XXXX)
Revision History
Date Version Description Author
11/02/2010 1.0 Standard Approach Document Andre Jackson
1/14/2011 2.0 Standard Approach Document Andre Jackson
1/21/2011 2.0 Standard Approach Document Andre Jackson
1/25/2011 3.0 Test Plan/Approach Doc Merge Andre Jackson
Document Sign-Off
Title Signature
IT Director
Sr. App Engineer
Sr. Systems Analyst
App Engineer
QA Engineer
Table of Contents
1 INTRODUCTION...............................................................................................6
2 QUALITY OBJECTIVES.......................................................................................7
2.1 TEST APPROACH OBJECTIVES.....................................................................................................7
2.2 HIGH-LEVEL PRODUCT DEVELOPMENT OBJECTIVE..............................................................................8
2.2.1 PRIMARY OBJECTIVE.............................................................................................................8
2.2.2 SECONDARY OBJECTIVE.........................................................................................................9
2.3 SCOPE OF APPLICATIONS UNDER TEST..........................................................................................9
3 TEST METHODOLOGY.....................................................................................10
3.1 TEST STAGES....................................................................................................10
Overview......................................................................................................10
Milestone 1 - Planning Phase........................................................................10
Milestone 2 - Design Phase...........................................................................................10
Milestone 2a - Usability Testing....................................................................................11
Milestone 3 - Development Phase..................................................................11
Milestone 3a - Unit Testing (Multiple)...........................................................................12
Milestone 3b - Acceptance into Internal Release Testing..............................................12
Milestone 3c - Internal Release Testing........................................................................12
Milestone 3d - Acceptance into Alpha Testing..............................................................13
Milestone 3e - Alpha Testing.........................................................................................13
Milestone 4 - Stabilization Phase..................................................................................14
Milestone 4a - Acceptance into Beta Testing................................................................14
Milestone 4b - Beta Testing/Business Acceptance Testing (BAT)..................................14
Milestone 4c - Release to Production............................................................................15
Milestone 4d - Post Release..........................................................................................15
3.2 TEST LEVELS.....................................................................................................................15
3.2.1 Build Tests...........................................................................................................16
3.2.2 Milestone Tests....................................................................................................16
3.2.3 Release Tests......................................................................................................16
3.3 BUG REGRESSION................................................................................................................17
3.4 BUG TRIAGE......................................................................................................................17
3.5 SUSPENSION CRITERIA AND RESUMPTION REQUIREMENTS....................................................................18
3.6 TEST COMPLETENESS............................................................................................................18
3.6.1 Standard Conditions............................................................................18
3.6.2 Bug Reporting & Triage Conditions......................................................18
4 SOFTWARE RISK ISSUES................................................................................19
4.1 SCHEDULE.........................................................................................................................19
4.2 TECHNICAL........................................................................................................................19
4.3 MANAGEMENT.....................................................................................................................19
4.4 PERSONNEL.......................................................................................................................19
4.5 REQUIREMENTS...................................................................................................................20
5 TEST APPROACH............................................................................................20
5.1 SPECIAL TESTING TOOLS........................................................................................................20
5.2 SPECIAL TRAINING ON TESTING TOOLS........................................................................21
5.3 TEST METRICS....................................................................................................21
5.4 CONFIGURATION MANAGEMENT.................................................................................................22
5.5 REGRESSION TESTING............................................................................................................23
5.6 REQUIREMENTS MANAGEMENT..................................................................................................23
6 TEST STRATEGY.............................................................................................23
6.1 SYSTEM TEST....................................................................................................23
6.2 PERFORMANCE TEST.............................................................................................................24
6.3 SECURITY TEST...................................................................................................................24
6.4 AUTOMATED TEST................................................................................................................24
6.5 STRESS AND VOLUME TEST....................................................................................24
6.6 RECOVERY TEST.................................................................................................24
6.7 DOCUMENTATION TEST........................................................................................24
6.8 BETA TEST.......................................................................................................................24
6.9 BUSINESS ACCEPTANCE TEST (BAT) .........................................................................................25
7 ENTRY AND EXIT CRITERIA.............................................................................26
7.1 TEST PLAN........................................................................................................................26
7.1.1 Test Plan Entry Criteria........................................................................................26
7.1.2 Test Cycle Exit Criteria........................................................................26
8 TEST DELIVERABLES......................................................................................27
8.1 DELIVERABLES MATRIX..........................................................................................................29
8.2 DOCUMENTS......................................................................................................................31
8.2.1 Test Approach Document....................................................................................31
8.2.2 Test Schedule......................................................................................................31
8.2.3 Test Specifications...............................................................................................32
8.3 TEST CASE/BUG WRITE-UPS..................................................................................................32
8.3.1 Test Manager Test Cases.....................................................................................32
8.3.2 Test Case Coverage Reports................................................................................32
8.3.3 Bug Tracking System and Regression Results.....................................................32
8.3.4 TFS Bug Tracking Reports....................................................................................32
8.4 REPORTS..........................................................................................................................33
8.4.1 Weekly Status Reports.........................................................................................33
8.4.2 Phase Completion Reports...................................................................33
8.4.3 Test Final Report - Sign-Off..................................................................33
8.4.4 TFS Source Control Archive..................................................................................33
9 RESOURCE & ENVIRONMENT NEEDS................................................................34
9.1 BASE SYSTEM HARDWARE.....................................................................................34
9.2 BASE SOFTWARE ELEMENTS IN THE TEST ENVIRONMENT....................................................................35
9.3 PRODUCTIVITY AND SUPPORT TOOLS...........................................................................................35
9.4 TESTING TOOLS..................................................................................................................35
9.4.1 Tracking Tools.....................................................................................................35
9.4.1.1 Team Foundation Server...................................................................................35
9.4.1.2 MS Test Manager 2010.....................................................................................35
9.4.1.3 Configuration Management..............................................................35
9.4.1.4 Issues Database................................................................................36
9.4.2 Diagnostic Tools..................................................................................................36
9.4.2.1 ExamDiff Pro.....................................................................................................36
9.4.3 Automation Tools.................................................................................................36
9.4.3.1 Microsoft Visual Studio Ultimate/Test Professional 2010...................................36
9.5 TEST ENVIRONMENT..............................................................................................................36
9.5.1 Hardware.............................................................................................................37
9.5.2 Software..............................................................................................................37
10 RESPONSIBILITIES, STAFFING AND TRAINING NEEDS.....................................38
10.1 PEOPLE AND ROLES............................................................................................................38
10.2 STAFFING AND TRAINING NEEDS.............................................................................39
11 TEST MANAGER TEST CASES.........................................................................40
11.1 TEST CASES....................................................................................................................40
11.2 TEST CASE CONTENTS........................................................................................................40
Test Case Title..............................................................................................................40
TCID.............................................................................................................40
Status...........................................................................................................40
Classification................................................................................................41
Steps............................................................................................................41
Summary......................................................................................................41
Tested User Stories.......................................................................................41
All Links........................................................................................................41
Attachments.................................................................................................................41
Associated Automation.................................................................................................42
Parameter Values.........................................................................................................42
12 BUG MANAGEMENT......................................................................................42
12.1 BUG DOCUMENTATION.........................................................................................................42
12.1.1 Bug Severity and Priority Definition...................................................................43
12.1.1.1 Severity List....................................................................................................43
12.1.1.2 Priority List......................................................................................................44
12.2 TEST RUNNER BUG ENTRY FIELDS...........................................................................................44
Bug Title.......................................................................................................44
Status...........................................................................................................44
Classification................................................................................................................45
Planning........................................................................................................................45
Details..........................................................................................................45
System Info..................................................................................................................45
Test Cases....................................................................................................................45
All Links........................................................................................................................45
Attachments.................................................................................................................45
12.3 BUG REPORTING PROCESS....................................................................................................45
13 DOCUMENTATION........................................................................................47
1 Introduction
This document identifies the XXXX Software Quality Assurance Department’s
methodology as implemented across all projects. This test approach describes the
high-level strategies and methodologies used to plan, organize, and manage testing
of software projects within XXXX. This test approach also includes descriptions of
XXXX Software Quality Assurance Department’s role at various phases of the
project development cycle. It also establishes the goals, processes, and
responsibilities required to implement effective quality assurance functions across
all XXXX software development and release projects.
The details outlined in this document provide the framework necessary to ensure a
consistent approach to software quality assurance throughout the project life cycle.
It defines the approach that will be used by the Quality Assurance (QA) personnel to
monitor and assess software development processes and products to provide
objective insight into the maturity and quality of the software. XXXX software
products, processes, and services will be systematically monitored and evaluated
to ensure they meet requirements and comply with XXXX policies, standards, and
procedures, as well as applicable Institute of Electrical and Electronics
Engineers (IEEE) standards.
The overall purpose of this test approach strategy is to gather all of the information
necessary to plan and control the test effort for testing XXXX applications. It
describes the approach to testing the software, and will be the top-level plan used
by testers to direct the test effort.
The approach is designed to create clear and precise documentation of the test
methods and processes that XXXX will use throughout the course of system
verification testing.
The strategy covers SQA activities throughout the formulation and implementation
phases of the application mission. SQA activities will continue through operations
and maintenance of the system.
This documenting of the test methods and processes will serve as the basis for
ensuring that all major milestones and activities required for effective verification
testing can efficiently and successfully be accomplished. This plan may be modified
and enhanced as required throughout all future verification testing engagements.
NOTE: For the remainder of this document, and for the sake of simplicity and
consistency, the XXXX Software Development Department will simply be referred to
as “Development”, and the XXXX Software Quality Assurance Department will be
referred to as “QA Testing”.
2 Quality Objectives
2.1 Test Approach Objectives
This Test Approach supports the following objectives:
Outlines and defines the overall test approach that will be used;
Identifies hardware, software, and tools to be used to support the
testing efforts;
Defines the types of tests to be performed;
Defines the types of data required for effective testing;
Defines the types of security threats and vulnerabilities against which
each system will be tested;
Identifies and establishes traceability from the Requirements Matrix to
test cases and from test cases to the Requirements Matrix;
Serves as a foundation for the development of Test Plans and Test
Cases;
Defines the process for recording and reporting test results;
Defines the process for regression testing and closure of discrepancies;
Identifies the items that should be targeted by the tests;
Identifies the motivation for and ideas behind the test areas to be
covered;
Identifies the required resources and provides an estimate of the test efforts;
Lists the deliverable elements of the test activities;
Defines the activities required to prepare for and conduct System, Beta,
and User Acceptance testing;
Communicates the System Test strategy to all responsible parties;
Defines deliverables and responsible parties;
Communicates the various dependencies and risks to all responsible
parties; and
Defines the scope of the testing effort.
2.2 High-Level Product Development Objective
The high-level objectives of this project are as follows:
Update the software to a current technology platform while
maintaining the core functionality of the original product;
Re-design the user interface to take advantage of current
technology standards, and to deliver a better user experience;
Provide a method for delivering updates to the product
electronically and seamlessly to the user; and
Allow internal non-technical personnel to manage the creation and
maintenance of new versions.
2.2.1 Primary Objective
The primary objective of testing our application systems is to:
Identify and expose all issues and associated risks;
Communicate all known issues to the project team; and
Ensure that all issues are addressed in an appropriate manner before
release.
As an objective, this requires careful and methodical testing of the application to
first ensure all areas of the system are scrutinized and, consequently, all issues
(bugs) found are dealt with appropriately.
2.2.2 Secondary Objective
A secondary objective of testing our application systems is to:
Assure that the system meets the full requirements of our customer(s);
Maintain the quality of the product; and
Remain within the cost range established at the project outset.
At the end of the project development cycle, the user should find that the project
has met or exceeded all of their expectations as detailed in the requirements.
Any changes, additions, or deletions to the Requirements document, Functional
Specification, or Design Specification will be documented and tested at the
highest level of quality allowed within the remaining time of the project and
within the ability of the test team.
2.3 Scope of Applications Under Test
The scope of this quality assurance effort is to validate the full range of activities
related to the functionality of all XXXX applications under test (AUT) as they
undergo development, re-design, and re-building.
This test plan describes the unit, subsystem integration, and system level tests
that will be performed on components of the applications. It is assumed that
prior to testing, each subsystem to be tested will have undergone an informal
peer review, and only code that has successfully passed a peer review will be
tested.
Unit tests will initially be done by the software design agency (i.e., Eureka,
Avectra, etc.) and subsequently by the XXXX Development Department, which
performs secondary unit testing, boundary checking, and basic black-box
testing.
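The boundary checking mentioned above can be illustrated with a small sketch. The `validate_exam_score` function and its 0-100 range are hypothetical placeholders, not part of any actual XXXX application:

```python
# Hypothetical function under test: accepts scores in the range 0-100.
def validate_exam_score(score):
    """Return True if score is within the accepted 0-100 range."""
    return 0 <= score <= 100

# Boundary-value checks: exercise values at, just inside, and just outside
# each boundary, as the secondary unit testing described above would.
def run_boundary_checks():
    cases = {
        -1: False,   # just below lower boundary
        0: True,     # lower boundary
        1: True,     # just above lower boundary
        99: True,    # just below upper boundary
        100: True,   # upper boundary
        101: False,  # just above upper boundary
    }
    # Collect any values whose result differs from the expected outcome.
    return [v for v, expected in cases.items()
            if validate_exam_score(v) != expected]
```

An empty failure list from `run_boundary_checks()` indicates the function behaves correctly at its edges; in practice each subsystem would have its own set of such checks.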
3 Test Methodology
3.1 Test Stages
Overview
There are four major milestones in the Development Cycle: Planning Phase,
Design Phase, Development Phase, and Stabilization Phase. The Planning Phase
culminates in the completion of the Planning Docs Milestone (Requirements plus
Functional Specs). The Design Phase culminates in the completion of the Design
Specs and Test Plan/Test Specs. The Development Phase culminates in the Code
Complete Milestone. The Stabilization Phase culminates in the Release
Milestone.
During the first two phases, QA Testing plays a supporting role, providing ideas
and limited testing of the planning and design documents. Throughout the final
two phases, QA Testing plays a key role in the project.
Milestone 1 - Planning Phase
During the first phase of the Development Cycle, QA Testing should focus upon
the Requirements (User Stories) and Functional Specs. QA Testing reviews these
documents for their comprehensibility, accuracy, and feasibility. Specific tasks
that QA Testing may carry out during this phase include:
Assessing the impact of Requirements on testing;
Providing metrics factors (preliminary schedule, estimated test case
and bug counts, etc.);
Identifying project infrastructure issues; and
Identifying risks related to the testing and development process.
Milestone 2 - Design Phase
During the second phase of the Development Cycle, QA Testing is focused upon
evaluating the design and is required to produce and distribute its draft Test
Plan. To generate the Test Plan, Test Spec, and Test Cases, QA Testing requires
that the Requirements, Functional Spec, Design Documents, Business Rules, and
Project Plan/Schedule be completed and a copy emailed to the test point person.
During this phase, QA Testing may participate in the design reviews (with
Development) and have access to the Design Spec under construction. This will
help QA Testing to better prepare its Test Plan, Test Spec, and Test Cases. The
Test Plan defines much of the detailed strategy and specific testing information
that will be used for testing the application.
The purpose of the Test Plan is to achieve the following:
Divide the Design Spec into testable areas and sub-areas. This should not be
confused with more detailed test specs. The plan will also identify and
include areas that are to be omitted (not tested);
Define testing strategies for each area and sub-area;
Define bug-tracking procedures;
Define release and drop criteria;
Define list of configurations for testing;
Identify testing risks;
Identify required resources and related information; and
Provide a testing schedule.
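As a rough illustration, the division of the Design Spec into testable areas, omitted areas, and per-area strategies could be captured in a simple structure like the following. The area names and strategies are hypothetical placeholders, not drawn from any actual XXXX Test Plan:

```python
# Hypothetical sketch of a Test Plan breakdown: testable areas and
# sub-areas, areas explicitly omitted, and a strategy per area.
test_plan = {
    "areas": {
        "User Interface": ["Navigation", "Data Entry Forms"],
        "Reporting": ["Report Generation", "Export"],
    },
    # Omitted areas are identified in the plan but deliberately not tested.
    "omitted": ["Legacy Import"],
    "strategies": {
        "User Interface": "manual test cases plus usability review",
        "Reporting": "automated regression suite",
    },
}

def coverage_summary(plan):
    """Count sub-areas in scope and list the areas the plan omits."""
    in_scope = sum(len(subs) for subs in plan["areas"].values())
    return {"sub_areas_in_scope": in_scope, "omitted": plan["omitted"]}
```

Keeping omitted areas explicit in the structure, rather than simply absent, mirrors the plan's requirement that untested areas be identified rather than silently skipped.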
Milestone 2a - Usability Testing
The purpose of Usability Testing is to ensure that the new components and
features will function in a manner that is acceptable to the customer.
Development will typically create a non-functioning prototype of the UI
components to evaluate the proposed design. Usability Testing can be
coordinated by QA Testing, but actual testing must be performed by non-testers
(XXXX business unit members). QA Testing will review the findings and provide
the project team with its evaluation of the impact these changes will have on the
testing process and on the project as a whole.
Milestone 3 - Development Phase
During the third phase of the development cycle, QA Testing begins to execute
its primary role by identifying issues (bugs). At the beginning of this phase, QA
Testing will be spending most of its time entering requirements (User Stories)
and generating test cases (Work Items) in Microsoft Visual Studio Test
Professional 2010 (Test Manager), which will be housed on the Team
Foundation Server. As this phase progresses, however, QA Testing will
receive release candidates (builds) from Development with increasing
functionality to test. By the time the development phase closes, QA Testing
will primarily be executing test cases.
Development’s internal releases during this phase ultimately drive toward a
static Alpha build (recognized as code-complete). While Development works on
the code for an interim milestone (builds generated as features are completed),
QA Testing writes the test specification and test cases for that feature set.
During this phase, Development will also conduct its Unit Testing (White Box
Testing) prior to every internal release to QA Testing.
Milestone 3a - Unit Testing (Multiple)
Unit testing is conducted by Eureka Software Solutions, Inc., Avectra, and some
third-party vendors; however, the XXXX Development Department conducts
informal Unit Testing to ensure that proper functionality and code coverage have
been achieved both during coding and in preparation for acceptance into Alpha
Testing. Involvement in Unit Testing by the QA Testing Department during this
phase should be in an advisory capacity only. It is the responsibility of QA Testing to require that a set level of quality control is adopted and maintained by
Development throughout this phase.
The following areas of the project must be unit-tested and signed-off before
being passed on to QA Testing:
Databases, Stored Procedures, Triggers, Tables, and Indexes;
Database conversion; and
.OCX, .DLL, .EXE, and other binary-formatted executables.

The exit criterion for this milestone is “code-complete”; that is, all functionality
and logical and physical components of the application must be completed and
made available to QA Testing according to the requirements within the drop
criteria.
Milestone 3b - Acceptance into Internal Release Testing
Internal releases are builds issued by Development containing new
functionality (and possibly bug fixes). Before accepting a project for Internal
Release Testing, QA Testing must be assured that adequate Unit Testing has
been done by Eureka, Avectra, some third-party vendors and XXXX
Development.
A build must pass the following Build Acceptance Test before moving into
internal release testing:
Sign-off from the Senior Application Engineer/Architect that Unit Testing is complete on all modules released;
Verification that the code is located at a specified drop point;
Verification that Release Notes accompany the new build (discussion of changes, areas not to test, etc.);
QA Testing can install the build from a SETUP.EXE file (e.g.,
CFEExamPrep.exe, CFEExam.exe) located at the drop point; and
Verification that all “Build Acceptance Test” test cases (preferably
automated) pass, thereby indicating that the basic components work.
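Parts of the Build Acceptance Test above could be automated along these lines. The drop-point layout, file names, and check functions here are illustrative assumptions, not the actual XXXX environment:

```python
import os

# Hypothetical build-acceptance gate mirroring the criteria above:
# a drop is accepted only if every named check passes.
def accept_build(drop_point, checks):
    """Run each check against the drop point; return (accepted, failed names)."""
    failed = [name for name, check in checks.items() if not check(drop_point)]
    return (len(failed) == 0, failed)

# Illustrative checks; a real gate would also run the automated
# "Build Acceptance Test" test cases and confirm the Unit Testing sign-off.
def has_installer(drop_point):
    # The build must be installable from a SETUP.EXE at the drop point.
    return os.path.exists(os.path.join(drop_point, "SETUP.EXE"))

def has_release_notes(drop_point):
    # Release Notes must accompany the new build.
    return os.path.exists(os.path.join(drop_point, "ReleaseNotes.txt"))
```

Returning the list of failed check names, rather than a bare pass/fail, gives Development a concrete reason when a drop is rejected back.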
Milestone 3c - Internal Release Testing
Internal release testing is the evaluation of the new UI and functionality that has
been incorporated into the new build. Several cycles of internal release testing
will occur (every time QA Testing receives a new drop of a build). Completion of
this phase will occur when code is complete, and subsequent drops of new builds
will include only bug fixes with no new feature code.
Internal release testing will include:
Execution of Test Manager test cases;
Documentation into Test Manager of any variations from expected results; and
Addition of newly discovered test cases into Test Manager.
Milestone 3d - Acceptance into Alpha Testing
Alpha Testing occurs after code complete has been achieved; all subsequent
builds throughout alpha will thus consist of bug fixes only. Initial receipt of the
application defined as code complete into Alpha Testing requires that all critical
path test cases pass to ensure that general aspects of the application are robust
enough to undergo the testing process. The application must be functionally
complete. The drop of the application to QA Testing should meet the same
requirements as any drop criteria (see above).
Passing this milestone indicates that Alpha Testing is ready to commence.
Failure to gain acceptance requires that the drop be rejected back to
Development. This would occur in only two instances: one, where the drop
criteria had not been properly met; or two, when a bug of sufficient severity
prevents running test cases against the application.
Milestone 3e - Alpha Testing
During the repeated cycles of identifying bugs and taking receipt of new builds
(containing bug fix code changes), there are several processes which are
common to this phase across all projects. These include the various types of
tests: functionality, performance, stress, configuration, etc. There is also the
process of communicating results from testing and ensuring that new drops
contain stable fixes (regression). The project should plan for a minimum of 3-4
cycles of testing (drops of new builds) in this phase, with 6-8 cycles for
occasions where more extensive testing is appropriate.
Throughout the course of Alpha Testing, a brief daily report should be submitted
by the QA Engineer to key senior management personnel indicating testing
results to date. Also, frequent and regular triage meetings shall be held with the
project team (more frequently than in previous phases of the development cycle).
QA Testing shall present its bug findings as recorded within Test Manager
and Team Foundation Server. The objective of the triage meeting is to
determine priorities for bug fixes.
Milestone 4 - Stabilization Phase
During the fourth and final stage of the Development Cycle, QA Testing performs
most of the work (relative to other groups). Here is where testing resource
loading is at its peak. Upon entering this phase, the application has been
internally tested module by module, and has undergone Alpha Testing cycles.
The objective of this phase is to arrive at the Release Milestone with a robust
release candidate (build). There is still an emphasis on testing to continue to
identify issues (bugs) and regression test the bug fixes. All project members
may become involved in this process during this phase.
Milestone 4a - Acceptance into Beta Testing
Upon entering this milestone, QA Testing will continue to provide a brief daily
report indicating testing results to date. When applicable, the report must show
that to the best degree achievable during the Alpha Testing phase, all identified
severity 1 and severity 2 bugs have been communicated and addressed. At a
minimum, all priority 1 and priority 2 bugs should be resolved prior to entering
the beta phase.
Important deliverables required for acceptance into Beta Testing include:
Application SETUP.EXE;
Installation instructions; and
All documentation (beta test scripts, manuals or training guides, etc.).
Milestone 4b - Beta Testing/Business Acceptance Testing (BAT)
This milestone process is typically conducted by the various XXXX business unit
subject matter experts (testers). QA Testing and the Sr. Systems Analyst
participate in this milestone process as well, providing confirmation feedback
on new issues uncovered and input based on identical or similar issues detected
earlier. The intention is to verify that the product is ready for distribution and
for acceptance by the (business) customer, and to iron out potential operational
issues. It is a resource-intensive process in which the business unit testers
cooperate with Product Management to develop a focused Business Acceptance
Test (BAT) Plan specifying the objective, scope and duration of the Beta phase.
Throughout the beta test cycle, bug fixes will be focused on minor and trivial
bugs (severity 3 and 4). QA Testing will continue its process of verifying the
stability of the application through Regression Testing (existing known bugs, as
well as existing test cases). QA Testing will also assist with confirmation
feedback to beta test results (yes, it is a bug; yes, Development is working on a
fix; etc.).
The milestone target of this phase is to establish that the Application-Under-Test
(AUT) has reached a level of stability. For the future web-based version of the
product, testing will also ensure that the system operates appropriately under
expected usage (transaction response times, HTTP hits per second, throughput,
number of simultaneous users, etc.) before it can be released to the client users.
BAT usually involves 1 – 2 weeks of focused testing for an average project and
5 – 8 weeks for a major version release.
Milestone 4c - Release to Production
Release to Production occurs only after the successful completion of the
application-under-test throughout all of the phases and milestones previously
discussed above. The milestone target is to place the release candidate (build)
into production after it has been shown that the application has reached a level
of stability that meets or exceeds the client expectations as defined in the
Requirements, Functional Spec., and XXXX Production Standards (not yet
established).
QA Testing will ensure that the Final Release Candidate (RC) Set passes the
following test cases:
Check for extra or missing files (Diff operation on directory);
Proper date stamps on all files (Diff operation on directory);
Binary comparison of source files conducted;
Installation test on clean machine one more time; and
Basic functionality test (preferably automated smoke tests).
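The “extra or missing files” check in the first bullet can be sketched as a simple directory diff against an expected manifest. The following is a hypothetical Python sketch; in practice the check is a manual Diff operation against the drop point, and all file names here are illustrative:

```python
import os

def diff_file_lists(actual_files, expected_files):
    """Report files that are extra or missing relative to the expected
    release-candidate manifest (both inputs are plain file-name lists)."""
    actual, expected = set(actual_files), set(expected_files)
    return {"missing": sorted(expected - actual),
            "extra": sorted(actual - expected)}

def diff_drop_point(drop_dir, manifest):
    """Run the same comparison against a real drop-point directory."""
    return diff_file_lists(os.listdir(drop_dir), manifest)
```

A release candidate passes this check only when both the "missing" and "extra" lists come back empty.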
Milestone 4d - Post Release
During this phase, QA Testing archives all project documents, notes, test-ware
(automation scripts, etc.), email, source code, etc. Archiving is the responsibility
of the QA Engineer. QA Testing also prepares a post-implementation review
document discussing the testing process throughout the development cycle.
3.2 Test Levels
Testing of an application can be broken down into three primary categories and
several sub-levels. The three primary categories include tests conducted on
every build (Build Tests), tests conducted at every major milestone (Milestone
Tests), and tests conducted at least once every project release cycle (Release
Tests). The test categories and test levels are defined below:
3.2.1 Build Tests
Level 1 - Build Acceptance Tests
Build Acceptance Tests typically take approximately 1 week. These test
cases simply ensure that the application has been built and can be installed
successfully. Other related test cases ensure that QA Testing received the
proper Development Build from the Sr. Systems Analyst and was informed of any
build related matters, including the location of the build drop point. The
objective is to determine if further testing is possible. If any Level 1 test case
fails, the build is returned to the developers untested.
Level 2 - Smoke Tests
Smoke Tests are preferably automated; however, manual testing will suffice and
typically takes less than 2 days. These test cases verify the major functionality
at a high level. The objective is to determine if further testing is possible. These
test cases should emphasize breadth more than depth. All components should
be touched, and every major feature should be tested briefly by the Smoke Test.
If any Level 2 test case fails, the build is returned to the developers untested.
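The breadth-over-depth intent of a smoke suite can be sketched as a short script that touches every major module once. This is a hypothetical Python illustration; the actual XXXX suites are built with Visual Studio Test Professional, and the module names and check functions below are invented stand-ins:

```python
# Hypothetical smoke suite: touch every major component briefly
# (breadth over depth). Each check is a stand-in for a real verification.

def check_login():         return True  # stand-in: a known-good login succeeds
def check_main_window():   return True  # stand-in: the main window opens
def check_report_export(): return True  # stand-in: one report exports cleanly

SMOKE_CHECKS = {
    "login": check_login,
    "main_window": check_main_window,
    "report_export": check_report_export,
}

def run_smoke_suite(checks):
    """Run every check once; any failure means the build goes back untested."""
    failures = [name for name, check in checks.items() if not check()]
    return {"passed": not failures, "failures": failures}
```

Because every major feature is exercised exactly once, a failing check identifies the broken component without the cost of deep per-feature testing.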
Level 2a - Bug Regression Testing
Every bug that was marked as “Open” during the previous build, but marked as
“Fixed, Needs Re-Testing” for the current build under test, will need to be
regressed, or re-tested. Once the smoke test is completed, all resolved bugs
need to be regressed. It should take between 5 minutes and 1 hour to regress
most bugs.
3.2.2 Milestone Tests
Level 3 - Critical Path Tests
Critical Path test cases must pass by the end of every 3-5 Build Test Cycles.
Critical Path test cases are targeted on features and functionality that the user
will see and use every day. They do not need to be tested on every drop, but must
be tested at least once per milestone. Thus, the Critical Path test cases must all
be executed at least once during the Alpha cycle, and once during the Beta
cycle.
3.2.3 Release Tests
Level 4 - Standard Tests
Standard Test Cases need to be run at least once during the entire test cycle for
this release. These cases are run once, not repeated as are the test cases in
previous levels. However, Functional Testing and Detailed Design Testing can be
tested multiple times for each Milestone Test Cycle (alpha, beta, etc.). Standard
test cases usually include Installation, Data, GUI, and other test areas.
Level 5 - Suggested Tests
Suggested Test Cases are those cases which would be nice to execute, but may
be omitted due to time constraints.
Most Performance and Stress Test Cases are classic examples of Suggested Test
Cases (although some should be considered standard test cases). Other
examples of Suggested Test Cases include WAN, LAN, Network, and Load
Testing.
3.3 Bug Regression
Bug Regression will be a central tenet throughout all testing phases. All bugs
that are resolved as “Fixed, Needs Re-Testing” will be regressed when QA
Testing is notified of the new drop containing the fixes. When a bug passes
regression, it will be considered “Closed, Fixed”. If a bug fails regression, QA
Testing will notify Development by entering bug notes into Test Manager. When
a Severity 1 bug fails regression, QA Testing should also send an immediate
email to Development. The QA Engineer will be responsible for tracking and
reporting to Development and product management the status of Regression
Testing.
It is recommended that a separate cycle of Regression Testing occur at the end
of each phase to confirm the resolution of Severity 1 and 2 bugs. The scope of
this last cycle should be determined by the test point person and product
management.
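The regression workflow above amounts to a simple status transition. As a hypothetical sketch (the two quoted statuses come from the text; the failed-regression status name is illustrative, since the text only says QA enters notes into Test Manager):

```python
def regress_bug(status, passed_retest):
    """Apply the Section 3.3 regression rule: a 'Fixed, Needs Re-Testing' bug
    becomes 'Closed, Fixed' on a passing re-test; on a failing re-test it goes
    back to Development (status name below is illustrative)."""
    if status != "Fixed, Needs Re-Testing":
        return status  # only resolved bugs are candidates for regression
    return "Closed, Fixed" if passed_retest else "Open, Failed Regression"
```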
3.4 Bug Triage
Bug Triages will be held throughout all phases of the development cycle. Bug
triages will be the responsibility of the QA Engineer. Triages will be held on a
regular basis with the time frame being determined by the bug find rate and
project schedules. Thus, it would be typical to hold a few triages during the
Planning phase, then maybe one triage per week during the Design phase,
ramping up to twice per week during the latter stages of the Development
phase. Then, the Stabilization phase should see a substantial reduction in the
number of new bugs found, thus a few triages per week would be the maximum
(to deal with status on existing bugs).
The QA Engineer, Sr. Systems Analyst, Sr. Application Engineer, Application
Engineer, and IT Director should all be involved in these triage meetings. The
QA Engineer will provide required documentation and reports on bugs for all
attendees. The purpose of the triage is to determine the type of resolution for
each bug and to prioritize and determine a schedule for all “To Be Fixed” bugs.
QA Testing will then assign the bugs to Development for fixing and report the
resolution of each bug back into Test Manager/TFS. The QA Engineer will be
responsible for tracking and reporting on the status of all bug resolutions.
3.5 Suspension Criteria and Resumption Requirements
Testing will be suspended on the affected software module when Smoke Test
(Level 2) or Critical Path (Level 3) test case bugs are discovered. A bug report
will be filed in Test Manager/TFS and Development and Product Management will
be notified. After fixing the bug, Development will follow the drop criteria
(described above) to provide its latest drop to QA Testing. At that time, QA
Testing will regress the bug and, if it passes, continue testing the module.
Notice that some discretion is in order here on the part of the QA Engineer. To
suspend testing, the bug must be reproducible, it must be clearly defined, and it
must be significant.
3.6 Test Completeness
Testing will be considered complete when the following conditions have been
met:
3.6.1 Standard Conditions
When QA Testing, Development and Product Management agree that testing is complete, the application is stable, and all parties agree that the application meets functional requirements;
Script execution of all test cases, in all areas, has passed;
Automated test cases have passed in all areas;
All priority 1 and 2 bugs have been resolved and closed;
Each test area has been signed off as completed by the QA Engineer;
50% of all resolved severity 1 and 2 bugs have been successfully regressed a second time as final validation; and
Ad hoc testing in all areas has been completed.
3.6.2 Bug Reporting & Triage Conditions
Bug find rate indicates a decreasing trend prior to Zero Bug Rate (no new Severity 1/2/3 bugs found);
Bug find rate remains at 0 new bugs found (Severity 1/2/3) despite a constant test effort across 3 or more days;
Bug severity distribution has changed to a steady decrease in Severity 1 and 2 bugs discovered; and
No ‘Must Fix’ bugs remaining, despite sustained testing.
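The bug-rate conditions above can be checked mechanically from a series of daily new-bug counts. The following is a hypothetical sketch; the three-day zero-bug threshold comes straight from the text, while the trend check is a simplified stand-in for a real trend analysis:

```python
def bug_rate_conditions_met(daily_new_bugs, quiet_days=3):
    """Check the Section 3.6.2 bug-rate conditions: the find rate trends
    downward, and the last `quiet_days` days show zero new Severity 1/2/3
    bugs despite a constant test effort."""
    if len(daily_new_bugs) < quiet_days:
        return False
    # Simplified trend check: the series ends no higher than it started.
    decreasing_trend = daily_new_bugs[0] >= daily_new_bugs[-1]
    zero_bug_run = all(n == 0 for n in daily_new_bugs[-quiet_days:])
    return decreasing_trend and zero_bug_run
```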
4 Software Risk Issues
4.1 Schedule
The schedule for each phase is very aggressive and could affect testing. A slip in
the schedule in one of the other phases could result in a subsequent slip in the
test phase. Close project management is crucial to meeting the forecasted
completion date.
4.2 Technical
Since these are new XXXX systems, in the event of a failure, the old system can
be used. We will run our tests in parallel with the production system so that there
is no downtime of the current system.
4.3 Management
Management support is required so that, when the project falls behind, the test
schedule does not get squeezed to make up for the delay. Management can
reduce the risk of delays by supporting the test team throughout the testing
phase and assigning people with the required skill set to this project.
4.4 Personnel
Due to the aggressive schedule, it is very important to have experienced testers
on this project. Unexpected turnovers can impact the schedule. If attrition does
happen, all efforts must be made to replace the experienced individual. This
may pose a challenge, since most of our testers are XXXX employees with
primary responsibilities in other business units.
4.5 Requirements
The test plan and test schedule are based on the current Requirements
Document. Any changes to the requirements could affect the test schedule and
will need to be approved by senior management.
5 Test Approach
Functional testing will be conducted during the entire Application Development
Life Cycle by the XXXX Software QA and Development departments. Formal
testing will be conducted by XXXX’s business unit subject matter experts in two
cycles, while overall testing of the entire system will be conducted by the
Software QA and Development departments in one final cycle.
The overall testing approach of the project will address and encompass the
following tools and processes:
5.1 Special Testing Tools
Microsoft Visual Studio 2010 Ultimate - The comprehensive suite of
application lifecycle management tools used by the XXXX Software
Development Team to ensure quality results, from design to deployment.
It includes: Integrated Development Environment, Development Platform
Support, Team Foundation Server, MSDN Subscription, Testing Tools,
Database Development, Debugging and Diagnostics, Application Lifecycle
Management, Architecture and Modeling, and Lab Management.
Visual Studio Test Professional 2010 - An integrated testing toolset
included with Microsoft Visual Studio 2010 Ultimate that delivers a
complete plan-test-track workflow for in-context collaboration between
testers and developers. This tool includes Test Manager 2010 and Test
Runner and is used by the QA Engineer to facilitate manual/automated
testing and traceability.
Microsoft Visual Studio Team Foundation Server 2010 (TFS) - The
collaboration platform at the core of our application lifecycle management
(ALM) process. Team Foundation Server 2010 automates the software
delivery process and enables everyone on our team to collaborate more
effectively, be more agile, and deliver better quality software while
building and sharing institutional knowledge. Project artifacts like
requirements, tasks, bugs, source code, build and test results are stored
in a data warehouse. The tool also provides reporting, historical
trending, full traceability, and real-time visibility into quality and progress.
5.2 Special Training on Testing Tools
Since Microsoft Visual Studio 2010 Ultimate is a relatively new tool on the
market, some special training will be required. Due to the current time
constraints of our projects, this training will be scheduled and completed in-house
utilizing AppDev OnDemand, a self-paced online courseware learning tool
that covers the latest development products and technologies, including
SharePoint 2007, SQL Server 2005/2008, SQL Server 2005 Business Intelligence,
Visual Basic 2005/2008, Visual C# 2005/2008, ASP.NET, AJAX, .NET Framework,
Windows Workflow Foundation, Microsoft Windows Server, Microsoft Office,
Microsoft Certification; all with hands-on lab exercises, sample code, and
pre/post exams.
5.3 Test Metrics
The Microsoft Visual Studio Test Professional 2010 application will be configured
to collect test metric data, to analyze the current level of maturity in testing,
and to project how testing activities will proceed by allowing us to set goals
and predict future trends.
The objective of our test metrics is to capture the planned and actual quantities
of the effort, time, and resources required to complete all the phases of testing
of the XXXX application improvement projects.
The SQA Engineer will use the test metrics as a mechanism for measuring the
effectiveness of testing quantitatively. They provide a feedback mechanism for
improving the current testing process and will be used to track actual testing
progress against the plan, making it possible to be proactive upon early
indications that testing activity is falling behind. The
SQA Engineer has created test metrics as a standard means of measuring
different attributes of the software testing process. The metrics are a means of
establishing test progress against the test schedule and may be an indicator of
expected future results. Our metrics will be produced in two forms – Base
Metrics and Derived Metrics as outlined below:
Base Metrics
Number of Test Cases
Number of New Test Cases
Number of Test Cases Executed
Number of Test Cases Unexecuted
Number of Test Cases Re-executed
Number of Passes
Number of Fails
Number of Test Cases Under Investigation
Number of Test Cases Blocked
Number of 1st Run Fails
Number of Testers
Test Case Execution Time
Derived Metrics
Percentage of Test Cases Complete
Percentage of Test Cases Passed
Percentage of Test Cases Failed
Percentage of Test Cases Blocked
Percentage of Test Defects Corrected
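The derived metrics are simple ratios over the base counts. As a hypothetical illustration (the metric names mirror the lists above; the shorthand dictionary keys are invented for the sketch, since the real metrics live in Test Professional 2010):

```python
def derived_metrics(base):
    """Compute the Section 5.3 derived percentages from base counts.
    `base` maps shorthand metric names (illustrative keys) to integers."""
    total = base["test_cases"]          # Number of Test Cases
    pct = lambda n: round(100.0 * n / total, 1) if total else 0.0
    return {
        "pct_complete": pct(base["executed"]),  # Percentage of Test Cases Complete
        "pct_passed":   pct(base["passed"]),    # Percentage of Test Cases Passed
        "pct_failed":   pct(base["failed"]),    # Percentage of Test Cases Failed
        "pct_blocked":  pct(base["blocked"]),   # Percentage of Test Cases Blocked
    }
```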
5.4 Configuration Management
Test configuration management will be managed by the Lab Manager tool;
included with the Microsoft Visual Studio 2010 Ultimate package.
Lab Manager can fully provision and ready multiple environments for testing so
that build scripts can explicitly target a particular lab configuration at build time.
Lab Management stores the environments as virtual machine images in a library
of pre-built images using System Center Virtual Machine Manager (SCVMM) to
ensure teams always begin their testing from a known configuration.
The following operating systems and browsers will be used in multiple
combinations for testing the XXXX applications:
Operating Systems
Windows XP, Service Pack 3 or greater
Windows Vista
Windows 7
Browsers
Firefox 3.0
Internet Explorer 7.0
Internet Explorer 8.0
5.5 Regression Testing
Regression testing will be conducted at the conclusion of each iteration of
testing by the SQA Department and any assigned business unit subject matter
experts. In most cases, the testing will be based on severity of defects detected.
5.6 Requirements Management
The business requirements will be elicited and managed by the Sr. Systems
Analyst. Any elements in the requirements and design that do not make sense
or are not testable will be immediately documented and reported to the Sr.
Systems Analyst, who will in turn address these issues with the stakeholders for
further clarification.
6 Test Strategy
The test strategy consists of a series of different tests that will fully exercise the
applications. The primary purpose of these tests is to uncover the system’s
limitations and measure its full capabilities. The list of the various planned tests
and a brief explanation of each follows:
6.1 System Test
The System tests will focus on the behavior of the applications. User scenarios
will be executed against the system as well as screen mapping and error
message testing. Overall, the system tests will test the integrated system and
verify that it meets the requirements defined in the requirements document.
6.2 Performance Test
Performance tests will be conducted to ensure that the application’s response
times meet the user expectations and meet and/or exceed the specified
performance criteria. During these tests, response times will be measured under
heavy stress and/or volume.
6.3 Security Test
Security tests will determine how secure the applications are. The tests will
verify that unauthorized user access to confidential data is prevented.
6.4 Automated Test
A suite of automated tests will be developed to test the basic functionality of the
systems and perform regression testing on areas of the systems that previously
had critical/major defects. The tool will also assist us by executing user scenarios,
thereby emulating several users.
6.5 Stress and Volume Test
We will subject the systems to high input conditions and a high volume of data
during the peak times. The systems will be stress tested using twice the number
of expected users (20 users).
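A minimal sketch of driving that many simulated users concurrently is shown below. This is a hypothetical Python illustration only (real stress runs would use the Visual Studio load tooling); the 10-user baseline is inferred from "twice (20 users)" above, and `simulated_user` is a caller-supplied stand-in for a scripted user session:

```python
import threading

EXPECTED_USERS = 10  # assumption: inferred from "twice ... (20 users)"
STRESS_FACTOR = 2

def run_stress(simulated_user, n_users=EXPECTED_USERS * STRESS_FACTOR):
    """Run `n_users` simulated user sessions concurrently and collect
    each session's outcome."""
    results = []
    lock = threading.Lock()

    def worker(user_id):
        outcome = simulated_user(user_id)
        with lock:                      # results list is shared across threads
            results.append(outcome)

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```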
6.6 Recovery Test
Recovery tests will force the system to fail in various ways and verify the
recovery is properly performed. It is vitally important that all data is recovered
after a system failure and that no corruption of the data occurs.
6.7 Documentation Test
Tests will be conducted to check the accuracy of the user documentation. These
tests will ensure that no features are missing, and the contents can be easily
understood.
6.8 Beta Test
The internal business unit testers will beta test the new systems and will report
any defects they find. This will subject the system to tests that could not be
performed in our test environment.
6.9 Business Acceptance Test (BAT)
Once the applications are ready for implementation, the internal business unit
testers will perform Business Acceptance Testing (BAT). The purpose of these
tests is to confirm that the system is developed according to the specified user
requirements and is ready for operational use.
7 Entry and Exit Criteria
7.1 Test Plan
7.1.1 Test Plan Entry Criteria
The entrance criteria specified by the system test controller should be fulfilled
before System Test can commence. In the event that any criterion has not been
achieved, the System Test may commence if the Business Team and Test Controller
are in full agreement that the risk is manageable.
• All developed code must be unit tested. Unit and Link Testing must be
completed and signed off by development team;
• System Test plans must be signed off by Sr. App Engineer and Sr. Systems
Analyst;
• All human resources must be assigned and in place;
• All test hardware and environments must be in place, and free for System
Test use; and
• The Acceptance Tests must be completed, with a pass rate of not less
than 80%.
7.1.2 Test Cycle Exit Criteria
The exit criteria detailed below must be achieved before the Round 1 software
can be recommended for promotion to Acceptance status. Furthermore, it is
recommended that a minimum of 1 day of Final Integration testing take place
AFTER the final fix/change has been retested.
• All High Priority errors from System Test must be fixed and tested;
• If any medium or low-priority errors are outstanding - the implementation
risk must be signed off as acceptable by Sr. App Engineer and Sr. Systems
Analyst;
• Project Integration Test must be signed off by Sr. App Engineer and Sr.
Systems Analyst; and
• Business Acceptance Test must be signed off by Business Experts.
8 Test Deliverables
The following artifacts will be testing deliverables, available to the stakeholders:
Deliverable | Responsibility | Delivery Date
Develop Test Cases | QA Engineer | TBD
Test Case Review | QA Engineer, Sr. App Engineer, Testers | TBD
Develop Automated Test Suites | QA Engineer | TBD
Requirements Validation Matrix | QA Engineer | TBD
Execute Manual and Automated Tests | QA Engineer and Testers | TBD
Complete Defect Reports | Testers | TBD
Document and Communicate Test Status/Coverage | QA Engineer and Sr. Systems Analyst | TBD
Execute Beta Tests | Testers | TBD
Document and Communicate Beta Test Status/Coverage | QA Engineer and Sr. Systems Analyst | TBD
Execute User Acceptance Tests | Testers | TBD
Document and Communicate Acceptance Test Status/Coverage | QA Engineer and Sr. Systems Analyst | TBD
Final Test Summary Report | QA Engineer | TBD
Here is a diagram indicating the dependencies of the various deliverables:
[Diagram: deliverable dependency flow, showing XXXX Business Requirements (PM/BA), XXXX Project Plan (PM/BA), XXXX Functional Specs (PM/BA), Test Plan (QA), Detailed Design (DEV), Test Specs/Outline (QA), Test Cases, Bugs, Test Case Results, Bug Results, Test Case Coverage Reports, Bug Reports, and Daily Status Reports.]
As the diagram above shows, there is a progression from one deliverable to the
next. Each deliverable has its own dependencies, without which it is not possible
to fully complete the deliverable.
8.1 Deliverables Matrix
The following matrix depicts all of the deliverables that QA Testing will use. This
matrix should be updated routinely throughout the project development cycle in
the project specific Test Plan.
Deliverable | Milestone | Sign-Off

Documents
Test Approach | Planning |
Test Schedule | Design |
Test Specifications | Development |

Test Case / Bug Write-Ups
Test Manager Test Cases/Results | All |
Test Manager Coverage Reports | All |
Bug Tracking System - Bugs and Regression Results | All |
TFS Bug Tracking Reports | All |

Reports
Daily Status Reports | All |
Phase Completion Reports | All |
Test Final Report - Sign-Off | Stabilization |
TFS Source Control Archive | Stabilization |
8.2 Documents
8.2.1 Test Approach Document
The Test Approach document is derived from the Project Plan, Requirements and
Functional Specification documents. This document defines the overall test
approach to be taken for the project. The Standard Test Approach document
that you are currently reading is a boilerplate from which the more specific
project Test Approach document can be extracted.
When this document is completed, the QA Engineer will distribute it to the IT
Director, Sr. Application Engineer/Architect, Sr. Systems Analyst, Application
Engineer, and others as needed for review and sign-off.
The purpose of the Standard Test Approach document is to:
• Specify the approach that QA Testing will use to test the product, and the deliverables (extracted from the Test Approach);
• Break the product down into distinct areas and identify features of the product that are to be tested;
• Specify the procedures to be used for testing sign-off and product release;
• Indicate the tools used to test the product;
• List the resource and scheduling plans;
• Indicate the contact persons responsible for various areas of the project;
• Identify risks and contingency plans that may impact the testing of the product;
• Specify bug management procedures for the project; and
• Specify criteria for acceptance of development drops to QA Testing (of builds).
8.2.2 Test Schedule
The Test Schedule is the responsibility of the QA Engineer and will be based on information from the Project Scheduler (maintained by the Product Manager). The project-specific Test Schedule will be done in Microsoft Project.
8.2.3 Test Specifications
A Test Specification document is derived from the Test Plan as well as the
Requirements, Functional Spec., and Design Spec documents. It provides
specifications for the construction of Test Cases and includes list(s) of test case
areas and test objectives for each of the components to be tested as identified in
the project’s Test Plan.
8.3 Test Case/Bug Write-Ups
8.3.1 Test Manager Test Cases
Test Cases will be documented in Test Manager. Test Cases are developed from
the Test Specifications, Functional Spec., and Design Spec. Test Cases are
detailed at the lowest level of complexity. Results will be tracked as either Pass
or Fail in Test Manager and subsequently in Team Foundation Server.
There must be an associated bug tracking number for every Failed Test Case.
8.3.2 Test Case Coverage Reports
Test Case Coverage Reports will be generated in Team Foundation Server’s
reporting component and will provide the current status of test cases and
pass/fail information. The reports can break down the status information across
the different test case areas, by level (Smoke, Critical Path, Standard,
Suggested, etc.), and in other formats.
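The coverage numbers themselves come out of TFS reporting, but the aggregation it performs can be sketched in a few lines. The record layout below is hypothetical, purely to illustrate the breakdown by level:

```python
from collections import Counter

# Hypothetical test-case records; in practice these come from TFS reporting.
test_cases = [
    {"area": "Login",   "level": "Smoke",         "result": "Pass"},
    {"area": "Login",   "level": "Critical Path", "result": "Fail"},
    {"area": "Reports", "level": "Smoke",         "result": "Pass"},
    {"area": "Reports", "level": "Standard",      "result": None},  # not yet run
]

def coverage_report(cases):
    """Summarize pass/fail/not-run counts per test-case level."""
    summary = {}
    for case in cases:
        counts = summary.setdefault(case["level"], Counter())
        counts[case["result"] or "Not Run"] += 1
    return summary

for level, counts in coverage_report(test_cases).items():
    print(level, dict(counts))
```

The same grouping by "area" instead of "level" yields the per-feature-area breakdown mentioned above.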
8.3.3 Bug Tracking System and Regression Results
Bugs found will be documented and tracked in Team Foundation Server. There
will be an associated Test Case (in Test Manager) for every bug written.
Standards for writing up bugs are detailed in the Test Manager and Bug Tracking Standards section of this document.
8.3.4 TFS Bug Tracking Reports
Reports from the TFS Bug Reports component will be used to communicate
information on all bugs to appropriate project personnel.
The following bug reports will be generated from TFS:
Bug Status Report
Helps you track the team's progress toward resolving bugs and shows the number of bugs in each state over time, a breakdown of bugs by priority or severity, and the number of bugs that are assigned to each team member.
Bug Trends Report
Helps you track the rate at which the team is discovering and resolving
bugs. Shows a moving average of bugs discovered and resolved over
time.
Reactivation
Helps you track how effectively the team is resolving bugs and shows the number of bugs that the team resolved over time in relation to the number of bugs that the team resolved and later reactivated.
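The "moving average" in the Bug Trends report is a simple rolling mean over daily counts. TFS computes this internally; the sketch below (with invented daily counts and window size) just illustrates the calculation:

```python
def moving_average(daily_counts, window=7):
    """Rolling mean of daily bug counts, as in a Bug Trends report.

    Early entries average over however many days are available so far.
    """
    averages = []
    for i in range(len(daily_counts)):
        span = daily_counts[max(0, i - window + 1): i + 1]
        averages.append(sum(span) / len(span))
    return averages

# Invented daily "bugs discovered" counts for one week.
discovered = [5, 8, 6, 9, 4, 3, 7]
print(moving_average(discovered, window=3))
```

Running the same function over the "resolved" series and plotting both gives the discovered-versus-resolved trend the report shows.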
8.4 Reports
The QA Engineer will be responsible for writing and disseminating the following
reports to the appropriate XXXX project personnel as required:
8.4.1 Daily Status Reports
A daily status report will be provided by the QA Engineer to project personnel.
This report will summarize daily testing activities, issues, risks, bug counts, test
case coverage, and other relevant metrics.
8.4.2 Phase Completion Reports
When each phase of testing is completed, the QA Engineer will distribute a Phase
Completion Report to the Product Manager, Development Lead, and other
relevant project team members for review and sign-off.
The document must contain the following metrics:
• Total Test Cases, Number Executed, Number of Passes/Fails, Number Yet to Execute;
• Number of Bugs Found to Date, Number Resolved, and Number Still Open;
• Breakdown of Bugs by Severity/Priority Matrix;
• Discussion of Unresolved Risks; and
• Discussion of Schedule Progress (are we where we are supposed to be?).
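The required counts are straightforward to derive from the raw test-case and bug records. A minimal sketch, with hypothetical record shapes (real counts come from Test Manager/TFS reports):

```python
def phase_metrics(test_cases, bugs):
    """Compute the Phase Completion Report counts from raw records."""
    executed = [c for c in test_cases if c["result"] is not None]
    return {
        "total": len(test_cases),
        "executed": len(executed),
        "passed": sum(1 for c in executed if c["result"] == "Pass"),
        "failed": sum(1 for c in executed if c["result"] == "Fail"),
        "yet_to_execute": len(test_cases) - len(executed),
        "bugs_found": len(bugs),
        "bugs_resolved": sum(1 for b in bugs if b["state"] == "Resolved"),
        "bugs_open": sum(1 for b in bugs if b["state"] != "Resolved"),
    }

# Invented sample data to show the output shape.
cases = [{"result": "Pass"}, {"result": "Fail"}, {"result": None}]
bugs = [{"state": "Resolved"}, {"state": "Active"}]
print(phase_metrics(cases, bugs))
```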
8.4.3 Test Final Report - Sign-Off
A Final Test Report will be issued by the QA Engineer. It will certify the extent to which testing was actually completed (a test case coverage report is suggested) and provide an assessment of the product's readiness for Release to Production.
8.4.4 TFS Source Control Archive
Once a project release cycle has been completed, all source code, documentation (including requirements, functional spec, design spec, test plan, etc.), and all testware automation should be archived into TFS for source control and permanent storage.
9 Resource & Environment Needs
This section presents the non-human resources required for the Test Plan.
9.1 Base System Hardware
The following table sets forth the system resources for the test effort presented
in this Test Plan.
System Resources

Resource            Quantity  Name and Type
Application Server  1         HP ProLiant DL360 G5
                              CPUs: 2 x Intel Xeon L5240 Processor
                              Memory: 4 GB RAM
                              Hard Disk 1: 72 GB (Mirrored)
                              Hard Disk 2: 146 GB (Mirrored)
                              Server Name: ADMINPREP
                              IP Address: 150.1.1.88
Test Development PCs  TBD       TBD
9.2 Base Software Elements in the Test Environment
The following base software elements are required in the test environment for
this Test Plan.
Software Element Name         Version  Type and Other Notes
Windows Server 2008 (64-bit)           Operating System
SQL Server 2008 (64-bit)               Database Engine
9.3 Productivity and Support Tools
The following tools will be employed to support the test process for this Test
Plan.
Tool Category or Type  Tool Brand Name    Vendor or In-house  Version
Test Management        Visual Studio TFS  Microsoft           2010
Defect Tracking        TBD
9.4 Testing Tools
9.4.1 Tracking Tools
9.4.1.1 Team Foundation Server
The TFS bug tracking and reporting component is used by XXXX to enter and track all bugs and project issues (bugs can also be entered in Test Manager and Test Runner). The component produces meaningful bug reports.
9.4.1.2 MS Test Manager 2010
Test Manager 2010 will be used by QA Testing to track the status of test cases and to avoid duplication of effort (re-using test cases with slight modifications). The QA Engineer is responsible for managing the Test Manager/Team Foundation Server testing and bug reporting component. User Stories (requirements) and Work Items (test cases) in TFS will be created for the project by the QA Engineer and Sr. Application Engineer, when applicable.
9.4.1.3 Configuration Management
With Visual Studio Lab Management you can manage a set of virtual machines
as a single entity called a virtual environment. Each environment consists of one
or many virtual machines for each role required for your application. The
environments that you create make up your virtual lab. Then you use the virtual
lab to deploy applications and run tests using Microsoft Test Manager. These
environments are created by using the Lab Center in Microsoft Test Manager.
Applications are deployed to these environments in your lab by using Team
Foundation Build. Tests are run on these environments in your lab from the
Test Center by using Microsoft Test Manager. Development and QA Testing will
use Microsoft Lab Center to manage the virtual test environments and
configurations.
9.4.1.4 Issues Database
All projects will have issues that do not belong in Test Manager/TFS but nonetheless need to be dealt with. Too often these issues are not written down and are thus forgotten in the verbal maze of "go-dos". Thus, we will at some point build a tool to track RFIs (Requests for Information), COPs (Change Order Proposals: RFIs that have a schedule or cost impact), and COs (Change Orders: COPs that have been approved). Until then, we will put these issues into Test Manager/TFS as suggested bugs.
9.4.2 Diagnostic Tools
9.4.2.1 ExamDiff Pro
Used to compare file differences: registry dumps, directory listings (dir c:\*.* /s), or .ini file snapshots that change over time. Take snapshots both before and after the broken state occurs, then compare them (fc.exe can also be used).
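The same before/after comparison can be scripted. A minimal sketch using the standard-library difflib (the snapshot lines are invented for illustration; in practice they would be the captured registry dump or directory listing):

```python
import difflib

# Invented before/after snapshots, standing in for registry dumps or
# "dir c:\*.* /s" listings captured before and after the broken state.
before = ["KeyA=1", "KeyB=2", "KeyC=3"]
after = ["KeyA=1", "KeyB=9", "KeyC=3", "KeyD=4"]

# lineterm="" because the snapshot lines carry no trailing newlines.
diff = list(difflib.unified_diff(before, after, "before", "after", lineterm=""))
for line in diff:
    print(line)
```

Only the changed and added keys show up with "-"/"+" prefixes, which is exactly the signal you want when hunting for what broke between snapshots.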
9.4.3 Automation Tools
9.4.3.1 Microsoft Visual Studio Ultimate/Test Professional 2010
Use Visual Studio 2010 for automation where appropriate (read: repeatable). We have to be careful not to be trapped by the allure of automating everything. It is most valuable to automate Smoke Tests and many Critical Path Tests, as they are repeated the most.
9.5 Test Environment
9.5.1 Hardware
QA Testing will have access to one or more application/database servers (Dev and Test) separate from any used by non-test members of the project team. QA Testing will also have access to an adequate number of variously configured PC workstations to assure testing across the range from the minimum to the recommended client hardware configurations listed in the project's Requirements, Functional Specification, and Design Specification documents.
9.5.2 Software
In addition to the application and any other customer-specified software, the
following list of software should be considered a minimum:
MS Visual Studio Ultimate Test Professional 2010 (Testing Tool)
MS Team Foundation Server (Testing Tool Server)
MS Project
10 Responsibilities, Staffing and Training Needs
This section outlines the personnel necessary to successfully test the XXXX applications. Because staff size is fixed, these role allocations may change.
10.1 People and Roles
This table shows the staffing assumptions for the test effort:
Human Resources
(numbers in parentheses indicate recommended full-time roles allocated)

QA Engineer (1)
Identifies and defines the specific tests to be conducted.
Responsibilities include:
• identify test ideas
• define test details
• determine test results
• document change requests
• evaluate product quality

QA Engineer (1)
Defines the technical approach to the implementation of the test effort.
Responsibilities include:
• define test approach
• define test automation architecture
• verify test techniques
• define testability elements
• structure test implementation

Business Unit Testers (6)
Implement and execute the tests.
Responsibilities include:
• implement tests and test suites
• execute test suites
• log results
• analyze and recover from test failures
• document incidents

Database Administrator / Database Manager (1)
Ensures the test data (database) environment and assets are managed and maintained.
Responsibilities include:
• support the administration of test data and test beds (database)
Sr. Application Engineer, Application Engineer, and Sr. Systems Analyst (2)
Identify and define the operations, attributes, and associations of the test classes.
Responsibilities include:
• define the test classes required to support testability requirements as defined by the test team

Sr. Application Engineer, Application Engineer, and Sr. Systems Analyst (2)
Implement and unit test the test classes and test packages.
Responsibilities include:
• create the test components required to support testability requirements as defined by the designer
10.2 Staffing and Training Needs
Staffing is fixed for the duration of this project. It is likely most of the staff will
assume some testing role.
11 Test Manager Test Cases
QA Testing has defined standards for the structure of, and entry of data into, both Test Manager and TFS. These standards will be adhered to in all projects.
11.1 Test Cases
QA Testing’s test cases will be entered and administered by the QA Engineer.
The QA Engineer will be responsible for the maintenance of data entered into
Test Manager and Team Foundation Server. QA Testing will track and report the
success or failure of the test cases. Tracking of test cases will include when
tests were run, by whom, and code version/build number as well as any
comments logged after the test case was run. The QA Engineer will provide theproject team and Product Management with reports.
11.2 Test Case Contents
Test Case Title
The title should include a brief description of the test case’s purpose.
TCID
This field is automatically generated by the Test Manager application. It
becomes the Test Case Number and you cannot alter it.
Status
o Assigned To: The person currently working on the test case.
o State: The workflow state of the test case, such as:
   Closed - The test case is no longer required for future iterations of this team project
   Design - The test case is being designed and has not yet been reviewed and approved
   Ready - The test case has been reviewed and approved and is ready to be run
o Priority: Importance of the test case to the business goals of the product
o Automation Status: Identifies the test case as manual or automated; useful if you plan to automate in the future. Values:
   Not Automated - This is a manual test case only
   Planned - The plan is to add automation for this test case in the future
   Automated - This value is automatically set if an automated test is added to this test case
Classification
o Area: The area of the product with which the test case is associated; maps to the feature areas in the application under development
o Iteration: The phase within which the bug will be fixed
Steps
o You can write a step and its expected result in the Steps section.
o You can write a common step, or set of common steps, by creating "Shared steps". Shared steps can be added to other tests. You may want to write common actions that you know you will need in many other tests as shared steps, such as launching the application under test, logging in to the application, or closing the application.
o A normal step or shared step can be parameterized: you can make your manual test data-independent, or execute a single test case against a set of data over iterations. Common parameters here could be the application URL, login credentials, or any test data that you want to use during the test.
o You can also attach a file to a step, such as a document for reference or a screen shot.
Summary
This is where you would add a detailed description of the test case
Tested User Stories
You can attach a test case to a user story (requirement) when you are creating it by adding a link to the user story(ies) being tested by that test case
All Links
Bugs can be attached/linked to the test case by using “All links” tab
Attachments
This is where you can add any file to the test case, for example a video recording, a screen shot, or a log file
Associated Automation
If you have an automated test case, you can link it to the automated test method from the Associated Automation tab
Parameter Values
These are the variable values set to replace data on each iteration of the test
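The parameter mechanism amounts to running one test body once per data row. A sketch of the idea outside Test Manager, where the parameter rows, user names, and pass condition are all hypothetical:

```python
# Hypothetical parameter rows, as a Test Manager parameterized test case
# would define them; "user", "password", and "expect" stand in for the
# parameter names the test author chooses.
parameter_rows = [
    {"user": "alice", "password": "secret", "expect": "ok"},
    {"user": "",      "password": "secret", "expect": "error"},
]

def login_test_body(params):
    """Stand-in for the manual/automated test steps; returns an outcome."""
    return "ok" if params["user"] and params["password"] else "error"

# One iteration of the test case per parameter row; True means the
# observed outcome matched the expected one.
results = [login_test_body(row) == row["expect"] for row in parameter_rows]
print(results)
```

This is why parameterization is worth the setup cost: one test case definition covers an arbitrary number of data variations.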
12 Bug Management
12.1 Bug Documentation
All bugs will be documented in Test Manager and/or TFS. Bug descriptions will follow the XXXX test case/bug write-up standard (included below). The QA Engineer will be responsible for the maintenance of the Test Manager/TFS database and the bugs therein. All necessary project team members should have access to TFS.

Every bug entered into TFS's tracking system will have an associated Test Manager test case number. Where a bug does not fit into an existing test case, a new test case will be created and its number referenced in the Test Manager/TFS bug entry. The new test case will be categorized or listed as a Smoke Test item in the Test Plan. Older cases may be updated to extend their potential for catching the new bug, as long as this does not significantly alter the objective of the original test case. In the event that a bug is found by a person outside of QA Testing, the bug should be reported to the QA Engineer, who will then assure that further testing is completed to verify the existence of the bug, refine the repro steps, incorporate it into Test Manager, and add it to TFS for bug tracking.
12.1.1 Bug Severity and Priority Definition
Bug Severity and Priority fields are both important for categorizing bugs and prioritizing if and when they will be fixed. The Severity and Priority levels are defined in the tables below. QA Testing will assign a severity level to all bugs. The QA Engineer is responsible for seeing that a correct severity level is assigned to each bug.

The QA Engineer, Sr. App Engineer, App Engineer, Sr. Systems Analyst, and IT Director will participate in bug review meetings to assign the priority of all currently active bugs. These meetings will be known as "Bug Triage Meetings". The QA Engineer is responsible for setting up these meetings on a routine basis to address the current set of new and existing but unresolved bugs.
12.1.1.1 Severity List
The tester entering a bug into the Bug Tracking System is also responsible for entering the bug Severity.
Severity ID  Severity  Severity Description
1            Crash     The module/product crashes or the bug causes non-recoverable conditions. System crashes, GP faults, database or file corruption, potential data loss, or program hangs requiring a reboot are all examples of a Severity 1 bug.
2            Major     Major system component unusable due to failure or incorrect functionality. Severity 2 bugs cause serious problems, such as a lack of functionality or insufficient/unclear error messages, that have a major impact on the user or prevent other areas of the app from being tested. Severity 2 bugs can have a workaround, but the workaround is inconvenient or difficult.
3            Minor     Incorrect functionality of a component or process. There is a simple workaround for a Severity 3 bug.
4            Trivial   Documentation errors or signed-off Severity 3 bugs.
12.1.1.2 Priority List
Priority  Priority Level      Priority Description
1         Must Fix            This bug must be fixed immediately; the product cannot ship with this bug.
2         Should Fix          These are important problems that should be fixed as soon as possible. It would be an embarrassment to the company if this bug shipped.
3         Fix When Have Time  The problem should be fixed within the time available. If the bug does not delay the shipping date, then fix it.
4         Low Priority        It is not important (at this time) that these bugs be addressed. Fix these bugs after all other bugs have been fixed.
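As an illustration of how the two scales interact, a hypothetical triage helper might seed an initial priority from the severity before the Bug Triage Meeting makes the final call. The mapping and the blocks_testing flag are assumptions for the sketch, not part of the documented process:

```python
# Assumed default mapping: severity 1 (Crash) suggests priority 1 (Must Fix),
# and so on down the scale. Triage always has the final say.
DEFAULT_PRIORITY = {1: 1, 2: 2, 3: 3, 4: 4}

def suggest_priority(severity, blocks_testing=False):
    """Suggest an initial priority (1=Must Fix) from a severity (1=Crash)."""
    priority = DEFAULT_PRIORITY[severity]
    if blocks_testing:
        # Anything preventing QA from testing is at least "Should Fix".
        priority = min(priority, 2)
    return priority

print(suggest_priority(3, blocks_testing=True))
```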
12.2 Test Runner Bug Entry Fields
The QA Engineer will be responsible for managing the bug reporting process.
QA Testing’s standard bug reporting tools and processes will be used. TFS is the
company-wide standard Bug Logging/Tracking tool. QA Testing and
Development will enter their data into Test Manager/TFS database following the
field entry definitions defined below:
Bug Title
The title should include a brief description of the bug.
Status
o Assigned To - Select the person the bug is being assigned to.
o State - Select whether the bug is "Active" (default) or "Inactive" in the test cycle.
o Reason - Select whether the bug is related to a "New" defect or a "Build Failure".
o Resolved Reason - Enter a short description of the resolution.

Classification
o Area - Select the appropriate area in the team project for this bug.
o Iteration - Select the appropriate iteration for this bug.

Planning
o Stack Rank - Used to prioritize work; the lower the stack rank, the higher the priority of the work item.
o Priority - Select a priority rating from 1 to 4, with 1 representing the most urgent.
o Severity - Select the severity of the bug from 1 to 4, with 1 representing the most critical.

Details
The Details tab displays the test steps and detailed actions that were automatically added to a specific test step, such as input data, expected and actual results, comments, and attachments.
System Info
The System Info displays detailed information about the configuration of the
computer used during the test.
Test Cases
Displays additional test cases related to the bug.
All Links
Displays test result attachments that are added as links. This includes
diagnostic trace data.
Attachments
A set of files attached to help provide additional information to support the
issue.
12.3 Bug Reporting Process
The bug reporting workflow proceeds as follows:

1. A Tester finds an issue (bug/defect) and "Creates a Bug" in Microsoft Test Runner, assigning it to a Developer for analysis.
2. Is it really a bug?
   • No: the Developer marks the issue as "Not a Bug" and reassigns it back to the Tester, with an explanation, for re-testing.
   • Yes: continue to step 3.
3. Is the bug a netForum issue outside the scope of XXXX code customization?
   • Yes: the Developer submits the bug to Avectra for analysis/fix. Avectra resolves the issue, publishes a new build, and marks it as "fixed". The Developer receives email confirmation from Avectra acknowledging the bug fix, checks in the new source code from the build, and reassigns the bug back to the Tester for re-testing.
   • No: continue to step 4.
4. Is the bug a Prep Course or Vendor issue outside the scope of XXXX code customization?
   • Yes: the Developer submits the bug to Eureka or the Vendor for analysis/fix. The Vendor resolves the issue, publishes a new build, and marks it as "fixed". The Developer receives email confirmation from Eureka or the Vendor acknowledging the bug fix, checks in the new source code from the build, and reassigns the bug back to the Tester for re-testing.
   • No: the Developer fixes the issue, marks it as "Resolved", and reassigns it to the Tester for re-testing.
5. Is it really fixed?
   • Yes: the Tester marks the bug as "Resolved" and continues testing.
   • No: the bug is reassigned to the Developer for further analysis.
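The routing decision at the heart of this process can be summarized as a small function. This is only a sketch: the names mirror the parties in the flow (Avectra for netForum issues, Eureka or the Vendor for Prep Course/Vendor issues), and the boolean flags are hypothetical inputs standing in for the triage questions:

```python
def route_bug(is_real_bug, netforum_issue=False, vendor_issue=False):
    """Return who handles the bug next, per the reporting flow."""
    if not is_real_bug:
        return "tester"         # marked "Not a Bug", back for re-testing
    if netforum_issue:
        return "avectra"        # external fix; Avectra publishes a new build
    if vendor_issue:
        return "eureka/vendor"  # external fix; Eureka/Vendor publishes a build
    return "developer"          # in-house fix, mark "Resolved", re-test

print(route_bug(True, netforum_issue=True))
```

Whatever branch is taken, the bug always returns to the Tester, who alone decides whether it is really fixed.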
13 Documentation
The following documentation will be available at the end of the test phase:
• Standard Approach Document
• Test Cases
• Test Case Reviews
• Requirements Validation Matrix
• Defect Reports
• Final Test Summary Report