RS test plan sample of project

This document contains the following:

1. INTRODUCTION
   1.1. Overview of System X
   1.2. Purpose of this Document
   1.3. Formal Reviewing
   1.4. Objectives of System Test

2. SCOPE AND OBJECTIVES
   2.1. Scope of Test Approach - System Functions
        2.1.1. Inclusions
        2.1.2. Exclusions
   2.2. Testing Process
   2.3. Testing Scope
        2.3.1. Functional Testing
        2.3.2. Integration Testing
        2.3.3. Business (User) Acceptance Test
        2.3.4. Performance Testing
        2.3.5. Regression Testing
        2.3.6. Bash & Multi-User Testing
        2.3.7. Technical Testing
        2.3.8. Operations Acceptance Testing (OAT)
   2.4. System Test Entrance/Exit Criteria
        Entrance Criteria
        Exit Criteria

3. TEST PHASES AND CYCLES
   3.1. System Testing Cycles
        3.1.1. Test Cases by Release Version
        3.1.2. Automated Testing
   3.2. Software Delivery
        3.2.1. Release Schedule
   3.3. Formal Review Points
        3.3.1. Review Points
        3.3.2. Progress/Results Monitoring

4. System Test Schedule

5. RESOURCES
   5.1. Human
   5.2. Hardware
        Hardware components required
   5.3. Software
        Test Host environments
        Test Branch Software
        Error Measurement System

6. ROLES AND RESPONSIBILITIES

   6.1. Management Team
   6.2. Testing Team
   6.3. Business Team
   6.4. Testing Support Team
   6.5. External Support Team

7. Error Management/Configuration Management

8. STATUS REPORTING
   8.1. Status Reporting

9. ISSUES/RISKS & ASSUMPTIONS
   9.1. Issues/Risks
   9.2. Assumptions

10. Signoff

11. APPENDICES
   11.1. Purpose of Error Review Team
   11.2. Error Review Team Meeting Agenda
   11.3. Classification of Bugs
   11.4. Procedure for Maintenance of Error Logging System
   11.5. Overnight Processing - Checking Bookkeeping & CIS
   11.7. SOFTWARE QUALITY ASSURANCE MEASURES
        (i) DATES
        (ii) EFFORT
        (iii) VOLUME
        (iv) QUALITY
        (v) TURNAROUND

1. INTRODUCTION

1.1. Overview of System X

The aim of this phase of the project is to implement a new X System platform that will enable:

- Removal of legacy office systems
- Introduction of ABC
- Processing of Special Transactions
- No constraint on location of capture
- Enable capture of transactions for other processing systems
- New Reconciliation Process
- Positioning for European ECU Currency and future initiatives

    This programme will result in significant changes to the current departmental and inter-office processes. The functionality will be delivered on a phased basis.

Phase 1 will incorporate the following facilities:

- Replacement of the legacy System A
- New Reconciliation System
- Outsourcing system for departments in different European countries
- New/Revised Audit Trail & Query Facilities

    [Detailed inclusions are listed later in this document]

    1.2. Purpose of this Document

This document is to serve as the Draft Test Approach for the Business Systems Development Project.

    Preparation for this test consists of three major stages:-

[V Model diagram of the testing process]

The above V Model shows the optimum testing process, where test preparation commences as soon as the Requirements Catalogue is produced. System Test planning commenced at an early stage, and for this reason the System Test will benefit from Quality initiatives throughout the project lifecycle.

The responsibility for testing is divided between the Project and Software Quality Assurance (SQA) as follows:

- Unit Test is the responsibility of the Development Team
- System Testing is the responsibility of SQA
- User Acceptance Testing is the responsibility of the User Representatives Team
- Technology Compliance Testing is the responsibility of the Systems Installation & Support Group

2. SCOPE AND OBJECTIVES

2.1. Scope of Test Approach - System Functions

2.1.1. INCLUSIONS

The contents of this release are as follows:-

    Phase 1 Deliverables

- New & revised Transaction Processing with automated support
- New Customer Query Processes and systems
- Revised Inter-Office Audit process
- Relocate Exceptions to Head Office
- New centralised Agency Management system
- Revised Query Management process
- Revised Retrievals process
- New International Reconciliation process
- New Account Reconciliation process

2.1.2. EXCLUSIONS

Once the scope of each Phase has been agreed and signed off, no further items will be considered for inclusion in this release, except:

(1) where there is the express permission and agreement of the Business Analyst and the System Test Controller;

(2) where the changes/inclusions will not require significant effort on behalf of the test team (i.e. will not require extra preparation such as new test conditions) and will not adversely affect the test schedule.

    [See Section 9.1.]

    2.1.3. SPECIFIC EXCLUSIONS

- Cash management is not included in this phase
- Sign On/Sign Off functions are excluded - this will be addressed by existing processes
- The existing Special Order facility will not be replaced
- Foreign Currency Transactions
- International Data Exchanges
- Accounting or reporting of Euro transactions

    Reference & Source Documentation:

1. Business Processes Design Document - Document Ref: BPD-1011
2. Transaction Requirements for Phase 1 - Document Ref: TR_PHASE1-4032
3. Project Issues & Risks Database - T:\Data\Project\PROJECT.MDB
4. The System Development Standards - Document Ref: DEVSTD-1098-2
5. System Development Lifecycle - Document Ref: SDLC-301

    2.2. Testing Process

[Test Process diagram]

    The diagram above outlines the Test Process approach that will be followed.

a. Organise Project involves creating a System Test Plan, Schedule & Test Approach, and requesting/assigning resources.

b. Design/Build System Test involves identifying Test Cycles, Test Cases, Entrance & Exit Criteria, Expected Results, etc. In general, test conditions/expected results will be identified by the Test Team in conjunction with the Project Business Analyst or Business Expert. The Test Team will then identify Test Cases and the data required. The test conditions are derived from the Business Design and the Transaction Requirements documents.

c. Design/Build Test Procedures includes setting up procedures such as Error Management systems and status reporting, and setting up the data tables for the Automated Testing Tool.

d. Build Test Environment includes requesting/building hardware, software and data set-ups.

e. Execute Project Integration Test - see Section 3, Test Phases & Cycles.

f. Execute Operations Acceptance Test - see Section 3, Test Phases & Cycles.

g. Signoff - signoff happens when all pre-defined exit criteria have been achieved. See Section 2.4.

2.2.1. Exclusions

SQA will not deal directly with the business design regarding any design/functional issues or queries.

The development team is the supplier to SQA - if design/functional issues arise they should be resolved by the development team and its suppliers.

    2.3. Testing Scope

Outlined below are the main test types that will be performed for this release. All system test plans and conditions will be developed from the functional specification and the requirements catalogue.

2.3.1. Functional Testing

The objective of this test is to ensure that each element of the application meets the functional requirements of the business as outlined in the:

- Requirements Catalogue
- Business Design Specification
- Year 2000 Development Standards
- Other functional documents produced during the course of the project, i.e. resolutions to issues/change requests/feedback

This stage will also include Validation Testing, which is intensive testing of the new front-end fields and screens: Windows GUI standards; valid, invalid and limit data input; screen and field look and appearance; and overall consistency with the rest of the application.

The third stage includes Specific Functional Testing - these are low-level tests which aim to test the individual processes and data flows.

2.3.2. Integration Testing

This test proves that all areas of the system interface with each other correctly and that there are no gaps in the data flow. The Final Integration Test proves that the system works as an integrated unit when all the fixes are complete.

2.3.3. Business (User) Acceptance Test

This test, which is planned and executed by the Business Representative(s), ensures that the system operates in the manner expected, and that any supporting material such as procedures, forms etc. is accurate and suitable for the purpose intended. It is high-level testing, ensuring that there are no gaps in functionality.

2.3.4. Performance Testing

These tests ensure that the system provides acceptable response times, which should not exceed 4 seconds.
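Purely as an illustration (not part of the original plan), a response-time check against this 4-second limit could be scripted along the following lines; the submit_transaction() call and its payload are hypothetical placeholders for driving a System X transaction.

    # Hypothetical sketch: verify that a sample transaction responds within the
    # 4-second limit stated above. submit_transaction() is a placeholder, not a
    # real System X interface.
    import time

    MAX_RESPONSE_SECONDS = 4.0  # acceptable response time per this test plan


    def submit_transaction(payload):
        """Placeholder for driving one System X transaction (e.g. via the test tool)."""
        time.sleep(0.5)  # simulated work; replace with the real call
        return {"status": "ok", "payload": payload}


    def check_response_time(payload):
        start = time.perf_counter()
        result = submit_transaction(payload)
        elapsed = time.perf_counter() - start
        return result, elapsed, elapsed <= MAX_RESPONSE_SECONDS


    if __name__ == "__main__":
        _, elapsed, passed = check_response_time({"account": "TEST-001", "amount": 100})
        print(f"Response time {elapsed:.2f}s - {'PASS' if passed else 'FAIL'}")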

2.3.5. Regression Testing

A regression test will be performed after the release of each Phase to ensure that:

- there is no impact on previously released software, and
- there is an increase in the functionality and stability of the software.

    The regression testing will be automated using the automated testing tool.

2.3.6. Bash & Multi-User Testing

Multi-user testing will attempt to prove that it is possible for an acceptable number of users to work with the system at the same time. The object of Bash testing is an ad-hoc attempt to break the system.

2.3.7. Technical Testing

Technical Testing will be the responsibility of the Development Team.

2.3.8. Operations Acceptance Testing (OAT)

This phase of testing is to be performed by the Systems Installation and Support group, prior to implementing the system in a live site. The SIS team will define their own testing criteria and carry out the tests.

    2.4. System Test Entrance/Exit Criteria

2.4.1. Entrance Criteria

The Entrance Criteria specified by the System Test Controller should be fulfilled before System Test can commence. In the event that any criterion has not been achieved, the System Test may commence if the Business Team and Test Controller are in full agreement that the risk is manageable.

- All developed code must be unit tested. Unit and Link Testing must be completed and signed off by the development team.
- System Test plans must be signed off by the Business Analyst and Test Controller.
- All human resources must be assigned and in place.
- All test hardware and environments must be in place, and free for System Test use.
- The Acceptance Tests must be completed, with a pass rate of not less than 80%.

Acceptance Tests: 25 test cases will be performed for the acceptance tests. To achieve the acceptance criteria, 20 of the 25 cases must be completed successfully - i.e. a pass rate of 80% must be achieved before the software will be accepted for System Test proper to start. This means that any errors found during acceptance testing should not prevent the completion of 80% of the acceptance test applications.

Note: These tests are not intended to perform in-depth testing of the software. [For details of the acceptance tests to be performed see X:\Testing\Phase_1\Testcond\Criteria.doc]
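For illustration only, the 80% entrance criterion above reduces to a simple calculation such as the following sketch; the sample results are hypothetical.

    # Sketch of the entrance-criterion arithmetic: 20 of the 25 acceptance test
    # cases (80%) must pass before System Test proper starts.
    def acceptance_criterion_met(results, required_rate=0.80):
        """results: one boolean per acceptance test case (True = passed)."""
        passed = sum(1 for r in results if r)
        rate = passed / len(results)
        return rate >= required_rate, passed, rate


    if __name__ == "__main__":
        sample = [True] * 21 + [False] * 4  # hypothetical outcome: 21 of 25 passed
        met, passed, rate = acceptance_criterion_met(sample)
        print(f"{passed}/{len(sample)} passed ({rate:.0%}) - criterion met: {met}")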

    Resumption Criteria

In the event that system testing is suspended, resumption criteria will be specified, and testing will not re-commence until the software meets these criteria.

2.4.2. Exit Criteria

The Exit Criteria detailed below must be achieved before the Phase 1 software can be recommended for promotion to Operations Acceptance status. Furthermore, I recommend that there be a minimum of 2 days' Final Integration testing effort AFTER the final fix/change has been retested. [See section 9.3]

- All High Priority errors from System Test must be fixed and tested.
- If any medium or low priority errors are outstanding, the implementation risk must be signed off as acceptable by the Business Analyst and Business Expert.
- Project Integration Test must be signed off by the Test Controller and Business Analyst.
- Business Acceptance Test must be signed off by the Business Expert.

3. TEST PHASES AND CYCLES

There will be two main stages of testing for the new application during System Test:-

- System Testing
- Operations Acceptance Testing

    3.1. System Testing Cycles

The main thrust of the approach is to test the front end intensively in the first two releases, thus raising approximately 80% of errors in this period. With the majority of these errors fixed, standard and/or frequently used actions will be tested to prove individual elements and total system processing in Release v0.3. Regression testing of outstanding errors will be performed on an ongoing basis.

When all errors which potentially impact overall processing are fixed, an additional set of test cases is processed in Release v0.4 to ensure the system works in an integrated manner. It is intended that Release v0.4 be the final proving of the system as a single application. There should be no A or B class errors outstanding prior to the start of Release v0.4 testing.

3.1.1. Test Cases by Release Version

Testing by phase:

Release v0.1    Acceptance 1, Functional 1, User Acceptance
Release v0.2    Acceptance 2, Functional 2, Regression 1
Release v0.3    Acceptance 3, Functional 3, Performance 1, Bash & Multi-User Testing, Regression 1, Regression 2, Integration 1, Technical 1
Release v0.4    Regression 1, Regression 2, Regression 3, Installation Test
Contingency     Per Bug Fix Test Only

3.1.2. Automated Testing

Automated testing tools will be used in the test environment for functional and regression testing. The main focus of the automated testing will be the regression testing of the previously delivered functionality - i.e. when development version 0.2 of the software is delivered, the majority of the regression testing of the functionality delivered in development version 0.1 will be automated. It is estimated that the full benefit of the automated testing will only occur when the tests have been executed three or more times.
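The plan does not name the automated testing tool, but a data-table-driven regression run of the kind described in 2.2(c) and in this section might look roughly like the following sketch; the CSV column names and the run_test_case() helper are assumptions, not part of the actual tool.

    # Sketch of data-table-driven regression testing: each row of a CSV table
    # describes one previously delivered test case to be replayed against the
    # new software version. Table columns and run_test_case() are hypothetical.
    import csv


    def run_test_case(case_id, inputs, expected):
        """Placeholder: drive the application with 'inputs' and compare the result."""
        actual = expected  # replace with a real call into the test tool / application
        return actual == expected


    def run_regression_suite(table_path):
        failures = []
        with open(table_path, newline="") as fh:
            for row in csv.DictReader(fh):
                if not run_test_case(row["case_id"], row["inputs"], row["expected"]):
                    failures.append(row["case_id"])
        return failures


    if __name__ == "__main__":
        # Create a tiny sample table so the sketch runs standalone.
        with open("regression_v0_1.csv", "w", newline="") as fh:
            fh.write("case_id,inputs,expected\nTC001,open account,account opened\n")
        print("Regression failures:", run_regression_suite("regression_v0_1.csv") or "none")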

    3.2. Software Delivery

During System Test the release of new versions of the software will be co-ordinated between the Development Team Leader and the System Test Controller. However, unless it concerns a fix to a very serious error, new versions should only be released when agreed targets have been reached (i.e. the next version contains fixes to X or more bugs).

3.2.1. Release Schedule

Release dates:
- v0.1 - 1st May
- v0.2 - 17th May
- v0.3 - 31st May
- v0.4 - 18th June (no new functionality to be delivered in this release)
- v1.0 - 29th June (bug fix contingency release only)

Functionality to be delivered *:
1. Function A
2. Process B
3. Euro Reqs'
4. Y2K Reqs.
5. Inter Office Trans
6. International Trans.
7. Other

* (per functional spec, by priority)

Notes:
- It is intended that 80% of the functionality will have been tested in full prior to the Phase 3 Release.
- All the functionality must be present in the Phase 3 Release.
- No previously undelivered functionality will be accepted for testing after Phase 3.

    3.3. Formal Reviewing

There will be several formal review points before and during system test, including the review of this document. This is a vital element in achieving a quality product.

3.3.1. Formal Review Points

1. Design Documentation - Requirements Specification & Functional Specification
2. System Test Plan
3. Unit Test Plans & Test Conditions
4. Unit Test Results
5. System Test Conditions
6. System Test Progress/Results
7. Post System Test Review
8. Integration Test Results
9. Pilot Implementation Review
10. Project Review

The review points above outline the Test Approach. Points 1-6 are the major review stages prior to test execution; points 7-10 are the reviews planned for and after test execution.

While the above concentrates on the testing aspect of SQA's role, there is also an ongoing role in ensuring the quality of the major deliverables throughout the lifecycle of the project. SQA's role will be to ensure that all Quality Inspections occur for all the agreed deliverables and that follow-up actions and initiatives are pursued.

3.3.2. Progress/Results Monitoring

- Acceptance Test 1 Results
- Test Results - Release v0.1
- Test Results - Release v0.2
- Test Results - Release v0.3
- Performance Test 1 Results
- Regression 1 & 2 Results
- Test Results - Release v0.4
- Technical Test Results

    4. System Test Schedule

The project schedule is presented below as screenshots of several high-level views. These schedules are intended as examples only and will probably not correspond exactly with the rest of the test plan.

[Project schedule screenshots]

5. RESOURCES

    5.1. Human

Resource Type              Resource Title                            No.  Date Req'd  Who        Status
Project Mgmt/Functional    Business Analyst                          1    -           A.N.Other  Assigned
Testing                    Test Controller                           1    -           A. Smith   Assigned
                           Testers                                   4    1st May     -          To be Assigned
Test Support Team          Support Programmers                       4    15th May    -          To be Assigned
                           Technical Support                         1    1st May     -          To be Assigned
                           WAN Support                               1    25th May    -          To be Assigned
Technical - External       CIS Support                               1    25th May    -          To be Assigned
                           Bookkeeping Support                       1    15th May    -          To be Assigned
                           External Liaison Support                  1    25th May    C. Jones   Assigned
Business                   Business Expert/Business Representative   1    1st May     -          To be Assigned

    5.2. Hardware

One separate, controlled system will be required for the initial phase of testing, set up as per one standard, complete office environment. In order to maintain the integrity of the test environment, this network will not be accessible to anybody outside this project. The printers are also exclusively for use by the test network.

Hardware components required:

- 1 Network Controller
- 6 Networked PCs (see below)
- 1 DAP Workstation
- 1 Motorola 6520
- 1 Alpha AXP Server
- 1 Batch Waste Printer
- 1 HP LaserJet 4v Printer

PC Specifications

The 6 PCs required for the test environment will include the following:

- 1 x P100, 1Gb HD, 16Mb RAM [Current Minimum Specification]
- 3 x P166, 1.5Gb HD, 32Mb RAM [Current Standard Specification]
- 1 x P333, 2.5Gb HD, 64Mb RAM [Current Maximum Specification]

These are the specifications currently in use in different branches.

1 x Pentium running Windows NT is also required as the Test center for controlling and executing the automated testing.

    5.3. Software

Test IMS environments
Test IMS region X will be required for System Testing. Additional or amended data will be populated where required.

Test Environment Software
System Test will be run on the following software versions:-

- Custom Desktop Vers. 97.0.1
- Windows 95 Operating System
- Visual Basic 5 Runtime Files
- MS Office 97
- Novell Netware

    Error Measurement System

This system test will use a bespoke MS Access database Error Management system. A new database will be implemented for the sole use of this project.

[See Chapter x]

6. ROLES AND RESPONSIBILITIES

6.1. Management Team

Project Leader - B. Ruthlenn
- Ensure Phase 1 is delivered to schedule, budget & quality
- Ensure Exit Criteria are achieved prior to System Test signoff
- Regularly review testing progress with the Test Controller
- Liaise with external groups, e.g. New Systems
- Raise and manage issues/risks relating to the project or outside the Test Team's control
- Review & sign off Test approach, plans and schedule

SQA Project Leader - C. Nicely
- Ensure Phase 1 is delivered to schedule, budget & quality
- Regularly review testing progress
- Manage issues/risks relating to the System Test Team
- Provide resources necessary for completing system test

6.2. Testing Team

Test Planner / Controller - D. Everyman
- Ensure Phase 1 is delivered to schedule, budget & quality
- Produce high level and detailed Test Conditions
- Produce Expected Results
- Report progress at regular status reporting meetings
- Co-ordinate review & signoff of Test Conditions
- Manage individual test cycles & resolve tester queries/problems
- Ensure test system outages/problems are reported immediately and followed up
- Ensure Entrance Criteria are achieved prior to System Test start
- Ensure Exit Criteria are achieved prior to System Test signoff

Testers
- Identify Test Data
- Execute Test Conditions and mark off results
- Raise Software Error Reports
- Administer the Error Measurement System

6.3. Business Team

Business Analyst - E. Showman
- Review high level / detailed test plans for System Test
- Define procedures
- Resolve design issues
- Resolve business issues
- Take part in daily Error Review Team meetings

Business Representative - ?? (To be Assigned)
- Execute User Acceptance Testing
- Define Test Conditions/Expected Results for Business Acceptance Test
- Resolve user issues
- Resolve design issues

6.4. Testing Support Team

Support Programmers
- Take part in daily Error Review Team meetings
- Co-ordinate/provide support for system test
- Resolve errors
- Re-release test software after amendments
- Support System Testers

6.5. External Support Team

CIS Support
- Provide CIS support, if required
- Resolve CIS queries, if required

IMS Support
- Provide System Test support
- Support IMS regions
- Resolve spooling issues (if necessary)
- Bookkeeping integration & compliance (if necessary)
- Resolve queries arising from remote backup

Bookkeeping Support
- Provide Bookkeeping technical support, if required
- Resolve queries, if required

Technical Support
- Provide support for the hardware environment
- Provide support for test software
- Promote software to the system test environment

Access Support
- Provide and support Test Databases

7. Error Management & Configuration Management

During System Test, errors will be recorded as they are detected on Error Report forms. These forms will be input to the Error Management System each evening with status "Error Raised" or "Query Raised". The Error Review Team will meet each morning (10am, Conference Room) to review and prioritise DNs raised the previous day, and assign them or drop them as appropriate. This team will consist of the following representatives:-

- A. Boring - Development Team Leader
- B. Curie - Business Analyst
- C. Durine - Test Controller
- D. Ewards - Business Representative

Errors which are agreed as valid will be categorised by the Error Review Team as follows:-

- Category A - serious errors that prevent System Test of a particular function continuing, or serious data type errors
- Category B - serious or missing data related errors that will not prevent implementation
- Category C - minor errors that do not prevent or hinder functionality

Category A errors should be turned around by the Bug Fix Team in 48 hours (this is turnaround from the time raised at the Error Review Team meeting to the time the fix is released to the System Test environment). In the event of an A error that prevents System Test continuing, the turnaround should be within 4 hours. Category B errors should be turned around in 1 day, while Category C errors should be turned around in 3 days.
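As a minimal sketch of how these turnaround targets could be tracked against the error log: the category codes and times below come from this section, while the field names and sample dates are illustrative assumptions.

    # Sketch: map the error categories above to their fix turnaround targets and
    # flag overdue errors. Field names and the sample data are illustrative only.
    from datetime import datetime, timedelta

    TURNAROUND = {
        "A": timedelta(hours=48),             # standard Category A target
        "A-showstopper": timedelta(hours=4),  # A error that prevents System Test continuing
        "B": timedelta(days=1),
        "C": timedelta(days=3),
    }


    def is_overdue(category, raised_at, now=None):
        """raised_at: time the error was raised at the Error Review Team meeting."""
        now = now or datetime.now()
        return now - raised_at > TURNAROUND[category]


    if __name__ == "__main__":
        raised = datetime(1999, 5, 3, 10, 0)  # raised at a 10am Error Review Team meeting
        print("Overdue:", is_overdue("B", raised, now=datetime(1999, 5, 5, 10, 0)))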

However, the release of newer versions of the software will be co-ordinated with the Test Controller - new versions should only be released when agreed, and where there is a definite benefit (i.e. the version contains fixes to X or more bugs).

8. STATUS REPORTING

8.1. Status Reporting

Test preparation and testing progress will be formally reported during a weekly Status Meeting. The attendees at this meeting are:-

- Byron Ruthlenn - Project Manager
- Dion Ryan - Business Design Team
- Pat Smith - Development Team Leader

A status report will be prepared by the Test Controller to facilitate this meeting. This report will contain the following information:-

1. Current Status v. Plan (Ahead/Behind/On Schedule)
2. Progress of tasks planned for the previous week
3. Tasks planned for next week, including tasks carried over from the previous week
4. Error statistics from the Error Measurement system
5. Issues/Risks
6. AOB
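Purely as an illustration, the weekly status report described above could be captured in a simple structure such as the following sketch; the field names are assumptions derived from the six listed items.

    # Sketch of the weekly status report structure; field names follow the six
    # items listed above and are not prescribed by the plan.
    from dataclasses import dataclass, field
    from typing import Dict, List


    @dataclass
    class WeeklyStatusReport:
        status_vs_plan: str                                      # "Ahead", "Behind" or "On Schedule"
        progress_previous_week: List[str] = field(default_factory=list)
        tasks_next_week: List[str] = field(default_factory=list)
        error_statistics: Dict[str, int] = field(default_factory=dict)  # from the Error Measurement system
        issues_risks: List[str] = field(default_factory=list)
        any_other_business: List[str] = field(default_factory=list)


    if __name__ == "__main__":
        report = WeeklyStatusReport(
            status_vs_plan="On Schedule",
            error_statistics={"A": 2, "B": 5, "C": 11},
        )
        print(report)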

9. Issues, Risks and Assumptions

    9.1. Issues/Risks

1. No further changes or inclusions will be considered for this release except (1) where there is the express permission and agreement of the Business Analyst and the System Test Controller; (2) where the changes/inclusions will not require significant effort on behalf of the test team and will not adversely affect the test schedule. This is a potentially serious issue, as any major changes to design will entail additional time to re-plan testing and to create or amend test conditions.
Resp: Byron Ruthlenn
Final list of inclusions to be signed off.

2. The design of the software must be final, and design documentation must be complete, informative and signed off by all parties before System Test proper commences.
Resp: D.A. Stone

3. A weakness in the 'phased delivery' approach is that the high degree of interdependency in the code means that the smallest changes can have serious effects on areas of the application which apparently have not been changed. The assumption of the test team is that previously delivered and tested functionality will only require regression testing to verify that it 'still' works, i.e. testing will not be intended to discover new errors. Because of this I recommend that there be a minimum of 2 days' regression testing AFTER the final fix/change has been retested. This, however, imposes a fixed time constraint on the completion of system testing which requires the agreement of the Project Leader.
Resp: Byron Ruthlenn

4. Automated Testing: The majority of the regression testing will be performed using the automated test tool. However, due to the workload required to implement (and debug) the test tool fully, it is likely that the return will only be maximised after the third run of the regression test suite for each release. The other major uses of the test tool are for (1) Load Testing, (2) Multi-User Testing, and (3) repetitive data entry.
Resp: Test Controller

9.2. Assumptions

- Software will be delivered on time.
- Software is of the required quality.
- The software will not be impacted by impending Y2K compliance changes to the external software infrastructure - i.e. any external software changes will have to be compatible with this application.
- All "Show-Stopper" bugs receive immediate attention from the development team.
- All bugs found in a version of the software will be fixed and unit tested by the development team before the next version is released.
- Functionality is delivered to schedule.
- Required resources are available.
- All service agreements will be met.
- The automated test tool will function & interface correctly with the software.
- All documentation will be up to date and delivered to the system test team.
- Functional and technical specifications will be signed off by the business.
- The Intranet will be fully functional prior to project commencement.

10. Formal Signoff

This document must be formally approved before System Test can commence. The following people will be required to sign off:-

Group               Signature
Project Manager     Byron Ruthlenn
SQA                 Colm Jones
Testing Team        Dion Hais
Development Team    Erwin Smith

11. APPENDICES

11.1. Purpose of Error Review Team

Ensure maximum efficiency of the development and system testing teams for the release of the new office software through close co-operation of all involved parties.

This will be achieved through daily meetings whose function will be to:

- Agree the status of each raised Error
- Prioritise valid Errors
- Ensure that enough documentation is available with Errors
- Agree content and timescale for software releases into System Test
- Ensure one agreed source of Error reporting information
- Identify any issues which may affect the performance of system testing

11.2. Error Review Team Meeting Agenda

- Review any actions from the last meeting.
- Classify and prioritise each Error.
- Review Errors raised for duplicates etc.
- Agree the priority of each Error.
- Determine the adequacy of documentation associated with raised Errors.
- Agree release content and timescale.
- Review of assigned actions from the meeting.
- AOB

    11.3. Classification of Bugs

1. An "A" bug is either a showstopper or of such importance as to radically affect the functionality of the system, i.e.:

   Examples of showstoppers:
   - If, because of a consistent crash during processing of a particular type of application, a user could not complete that type of application.
   - Incorrect data is passed to a legacy system, resulting in corruption or system crashes.

   Examples of severely affected functionality:
   - Calculation of repayment term/amount is incorrect.
   - Incorrect credit agreements produced.

2. Bugs would be classified as "B" where:

   - A less important element of functionality is affected.
     Example: a value is not defaulting correctly and it is necessary to input the correct value.
   - Data is affected which does not have a major impact.
     Example: where, for instance, some element of client capture was not propagated to the database.
   - There is an alternative method of completing a particular process.
     Example: a problem might occur reading all the details of a credit - the change can be manually input.

3. "C" type bugs are mainly cosmetic bugs, i.e.:

   - Incorrect / no help text on screens.
   - Drop-down lists repeat an option.

    11.4. Procedure for maintenance of Error Management system.

1. The Test Controller will refer any major error/anomaly to either the Development Team Leader or a designated representative on the development team before raising a formal error record. This has several advantages:-
   - it prevents the testers trying to proceed beyond 'showstoppers'
   - it puts the developer on immediate notice of the problem
   - it allows the developer to put on any traces that might be necessary to track down the error

2. All bugs raised will be on the correct Error form, and contain all relevant data.

3. These errors will be logged on the day they occur with a status of 'RAISED'.

4. There will be a daily 'System Test Support Group' meeting to discuss, prioritise and agree all logged errors. During this meeting some errors may be dropped, identified as duplicates, passed to a programmer, etc.

5. The Error Log will be updated with the status of all errors after this meeting, e.g. with pgmr, dropped, duplicate.

6. Once errors have been fixed and 'rebundled' for a release, the paper forms must be passed to the Test Controller, and he will change their status to 'Fixed to be retested'.

7. Once the error has been retested and proven to be corrected, the status will be changed to 'Closed'.

8. Regular status reports will be produced from the Error system, for use in the Error Review Team meetings.
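The statuses quoted in steps 3-7 imply a simple error lifecycle; the following sketch shows one possible set of allowed transitions. Only the quoted status names come from the text - the transition map itself is an interpretation, not a prescribed design.

    # Sketch of the error status lifecycle implied by the procedure above.
    ALLOWED_TRANSITIONS = {
        "RAISED": {"WITH PGMR", "DROPPED", "DUPLICATE"},   # outcomes of the daily meeting
        "WITH PGMR": {"FIXED TO BE RETESTED"},             # fix bundled into a release
        "FIXED TO BE RETESTED": {"CLOSED", "RAISED"},      # retest passes, or the error is re-raised
        "DROPPED": set(),
        "DUPLICATE": set(),
        "CLOSED": set(),
    }


    def change_status(current, new):
        if new not in ALLOWED_TRANSITIONS[current]:
            raise ValueError(f"Illegal status change: {current} -> {new}")
        return new


    if __name__ == "__main__":
        status = "RAISED"
        for nxt in ("WITH PGMR", "FIXED TO BE RETESTED", "CLOSED"):
            status = change_status(status, nxt)
            print(status)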

    11.5. Overnight Processing - Checking Accounting & Audit & CIS

Test Requirement / Check Items / Level of Testing:

1. Accounting - when spooling is complete, the Summary report should be checked against: (1) similar Legacy Transactions; (2) Test Input forms.
   Check Items: (1) Legacy Txs on Report v Office Transactions on Report; (2) Summary report v Applic input forms.
   Level of Testing: (1) checking at field level; (2) checking at field level.

2. Accounting - after open/amend, the amendment report should be checked: (1) for rejected open/amend instructions; (2) detail should correspond to input Applic Forms.
   Check Items: (1) Amendment report; (2) Amendment report v Test input forms.
   Level of Testing: (1) satisfy as to reasons for rejection; (2) checking at field level.

3. Bookkeeping (input tx's) - print off Account and Customer records and check field detail against applic input forms/branch summary report.
   Check Items: Input tx's v Test Input forms/Amend rpt.
   Level of Testing: checking at field level.

11.7. SOFTWARE QUALITY ASSURANCE MEASURES

(i) DATES.

    - Start date of SQA involvement.

    (ii) EFFORT.

- No. of SQA Man Days Test Planning
- No. of SQA Man Days Reviewing Test Plans
- No. of SQA Man Days Executing Tests

    (iii) VOLUME.

    - No. of Tests Identified

    (iv) QUALITY.

- No. of Tests Passed First Time
- Percentage of Tests Passed First Time
- No. of Errors Raised During Regression Testing
- No. of Errors Generated as a Result of Incorrect Fixes
- No. of Errors Raised by Category (A/B/C)
- No. of Errors Raised by Reason Code
- No. of Errors Raised by High Level Business Function

    (v) TURNAROUND.

    - Average Error Turnaround Time
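As an illustration of how a few of these measures could be derived from the Error Measurement system, the following sketch assumes an error-log extract with 'category', 'raised_at' and 'closed_at' fields; these field names are assumptions, not part of the bespoke Access database design.

    # Sketch: derive two of the measures above from an error-log extract.
    from collections import Counter
    from datetime import datetime


    def errors_by_category(errors):
        """No. of Errors Raised by Category (A/B/C)."""
        return Counter(e["category"] for e in errors)


    def average_turnaround_hours(errors):
        """Average Error Turnaround Time, in hours, over closed errors."""
        closed = [e for e in errors if e.get("closed_at")]
        if not closed:
            return 0.0
        total = sum((e["closed_at"] - e["raised_at"]).total_seconds() for e in closed)
        return total / len(closed) / 3600


    if __name__ == "__main__":
        log = [
            {"category": "A", "raised_at": datetime(1999, 5, 3, 10), "closed_at": datetime(1999, 5, 4, 16)},
            {"category": "C", "raised_at": datetime(1999, 5, 3, 11), "closed_at": None},
        ]
        print(errors_by_category(log), f"{average_turnaround_hours(log):.1f}h")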