
Source: moct.gov.sy/ICTSandards/en_pdf/31.pdf


Institutional and Sector Modernisation Facility

ICT Standards

Project Funded by the European Union

ICT Acceptance procedures
Document number: ISMF-ICT/3.17
Version: 1.00


1 Document control


1.1 List of Abbreviations

Abbreviation  Description
E-Gov         E-Government
ISMF          Institutional and Sector Modernisation Facility
MoCT          Ministry of Communications and Technology
MoAg          Ministry of Agriculture
MoEl          Ministry of Electricity
MoET          Ministry of Economy and Trade
MoF           Ministry of Finance
MoI           Ministry of Industry
MoP           Ministry of Petroleum
MoT           Ministry of Transport
NISFED        National Information System For Economic Development
PICU          Project Implementation and Coordination Unit (from ISMF)
PMO           Office of the Prime-Minister
SCS           Syrian Computer Society
SPC           State Planning Commission
STE           Syrian Telecommunications Establishment
TOR           Terms Of Reference
TFIP          Task Force Industrial Policy
SVGA          Super Video Graphics Array
OS            Operating System
DDR           Double Data Rate
CD            Compact Disc
CD-RW         Compact Disc Re-Writable
DVD           Digital Versatile Disc
DVD-RW        Digital Versatile Disc Re-Writable
Wi-Fi         Wireless Fidelity
LCD           Liquid Crystal Display
S/N           Serial Number
CPU           Central Processing Unit
USB           Universal Serial Bus
SMART         Self-Monitoring, Analysis, and Reporting Technology
RAID          Redundant Array of Independent Disks
IP            Internet Protocol
MAC           Media Access Control
SNMP          Simple Network Management Protocol
VLAN          Virtual Local Area Network
QOS           Quality Of Service


1.2 Purpose of the document The purpose of this document is to define the standards and deliverables for User Acceptance Testing for the Syrian Public Sector (Ministries and their directorates, public institutions and organizations). This document also provides a template and instructions for developing a User Acceptance Test Plan for a specific application.


2 Introduction

This user acceptance test standard guide aims to implement a programme of acceptance procedures for Syrian Public Sector information systems and/or related studies. This document is to be used by Project Managers or anyone tasked with preparing a User Acceptance Test Plan. It consists of two major parts:
• Software and Integrated Systems Acceptance Test Plan
• Hardware and Network Acceptance Test Plan

2.1 User Acceptance Test Standards The purpose of this section is to define the levels of testing required for User Acceptance Testing, depending on the extent of change being applied to the system.

2.1.1 Overview

This section is a descriptive overview, which will assist in developing a User Acceptance Test Plan. The possible levels of testing required in a User Acceptance Test Plan are:
• New System Test: where the system to be tested is entirely new (not an enhancement or system upgrade).
• Regression Test: where the amount of change in an existing system requires a full system retest.
• Limited Test: where the amount of change in an existing system requires only change-specific testing.

2.1.2 New System Test

The purpose is:
• To ensure that the system meets all specified objectives;
• To ensure that all requirements are included in the new system.

Planning of the User Acceptance Test must begin as early in the life cycle as possible. It is important that the original project scope is verified in the User Acceptance Test Plan by including the key points of the original project documentation in the test scripts.

The following material can contribute to the Test Plan (if available): Systems Analysis documentation, System Design Document, Project Statement, Technical documentation, Online Help/Documentation, User Training material, User Manual, Application Change Requests, Contractor Test Plans, Contractor Test Results, Similar User Acceptance Test Plans (from other systems).

2.1.3 Regression Test

The purpose is:
• To ensure the "entire" system is working correctly;
• To ensure that system changes have not changed existing functionality;
• To ensure that the new development work meets all requirements.


Planning of the User Acceptance Test must begin as early in the life cycle as possible. It is important that the original project scope is verified in the User Acceptance Test Plan by including the key points of the original project documentation in the test scripts.

The following material can contribute to the Test Plan (if available): Requirement documentation, System Design documentation, Project Statement, Technical documentation specific to the changes, Online Help/Documentation, User Training material, User Manual, Contractor Test Plans, Contractor Test Results, Previous User Acceptance Test Plans.

2.1.4 Limited Testing

The purpose is:
• To ensure the "entire" system is working correctly;
• To ensure that system changes have not changed existing functionality;
• To ensure that the new development work meets all requirements.

2.2 Methodology

The key to a successful project is gaining acceptance from the customer that each deliverable produced meets (or exceeds) his/her requirements. To clarify the criteria used to judge each deliverable for customer acceptance, an Acceptance Test Plan is produced. The Acceptance Test Plan provides the criteria for obtaining customer acceptance, a schedule of acceptance reviews within which customer acceptance will be sought, and a summary of the process used to gain acceptance of each deliverable from the customer.

The acceptance test plan incorporates four phases:
• Defining the acceptance test criteria;
• Developing an acceptance test plan;
• Executing the acceptance test plan;
• Reaching an acceptance decision based on the test results.

The testing process begins by developing a comprehensive plan to test the general functionality and special features on a variety of platform combinations, under strict quality control procedures. The process verifies that the application meets the requirements specified in the system requirements document and is free of defects. At the end of each testing day, the team prepares a summary of completed and failed tests. A report is prepared at the end of testing to show exactly what was tested and to list the final outcomes. Without a testing methodology, testing tends to be unfocused; a methodology is required to test anything thoroughly.

The acceptance committee should perform inspections and testing on samples from the delivered hardware, software or network devices to ensure that they conform to the contract. The delivered equipment should conform to contract requirements in both quantity and specifications.
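The committee's conformity check can be sketched as a simple comparison of delivered quantities and specifications against the contract. This is only an illustration; all item names, quantities and specifications below are invented examples, not taken from any actual contract:

```python
# Hypothetical sketch: verify that delivered items match the contract in
# quantity and specification before the acceptance committee signs off.
CONTRACT = {  # item -> (required quantity, required specification)
    "workstation": (50, "P4 2.8GHz/512MB DDR"),
    "switch": (4, "24-port, SNMP, VLAN, QoS"),
}

def check_delivery(delivered):
    """Return a list of discrepancies between the delivery and the contract."""
    problems = []
    for item, (qty, spec) in CONTRACT.items():
        got_qty, got_spec = delivered.get(item, (0, None))
        if got_qty != qty:
            problems.append(f"{item}: quantity {got_qty} != contracted {qty}")
        if got_spec != spec:
            problems.append(f"{item}: spec {got_spec!r} != contracted {spec!r}")
    return problems

# A delivery that is short one switch and has the wrong switch specification:
issues = check_delivery({
    "workstation": (50, "P4 2.8GHz/512MB DDR"),
    "switch": (3, "24-port, no SNMP"),
})
```

An empty result list would mean the delivery conforms; any entries are discrepancies for the acceptance report.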

2.3 Assessing the Test Types for a Test Plan


The purpose of this section is to describe the process of planning and assessing the level of testing required for a specific application.

2.3.1 Overview

To complete a User Acceptance Test Plan for a specific system, the tests must first be planned based on the initial system project documentation. By researching the purpose of the development work, and the degree to which this development work will affect the rest of the system, the scripts needed for testing can then begin to be created. (A 'test script' includes the individual test steps to be executed in order to verify that a system is working as expected.)

The development of a User Acceptance Test Plan involves a number of iterative steps:
• Assess the type of testing required
• Develop the procedures and instructions for testing
• Develop the necessary test scripts
• Execute the test scripts
• Report any defects
• Retest any fixes
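The test-script idea above can be sketched as a minimal data structure and execution loop. The step names, the `execute_script` helper and the simulated system are all hypothetical illustrations, not part of the standard:

```python
# Illustrative sketch: a test script as a list of steps, each pairing an
# action with an expected result, plus a loop that records pass/fail.
test_script = [
    {"step": 1, "action": "open login form", "expected": "login form displayed"},
    {"step": 2, "action": "enter valid credentials", "expected": "main menu displayed"},
    {"step": 3, "action": "press Reports button", "expected": "report list displayed"},
]

def execute_script(script, perform):
    """Run each step with perform(action) and record pass/fail per step."""
    results = []
    for s in script:
        actual = perform(s["action"])
        results.append({"step": s["step"], "pass": actual == s["expected"]})
    return results

# A stand-in for the system under test: step 3 misbehaves, so it would be
# reported as a defect and retested after the fix.
def fake_system(action):
    return {"open login form": "login form displayed",
            "enter valid credentials": "main menu displayed"}.get(action, "error dialog")

results = execute_script(test_script, fake_system)
defects = [r["step"] for r in results if not r["pass"]]
```

The failing step numbers feed directly into the "Report any defects" and "Retest any fixes" steps of the cycle.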

2.3.2 Assessing the Test Type Required

The criteria for determining the degree and type of testing that may be required are listed below. They can be used as a guide for determining what test scripting will be required for a particular User Acceptance Test Plan. If the system changes to be tested fall under more than one criterion, multiple test script types may be required. Once the appropriate test types have been determined, the User Acceptance Test Plan can be completed.

• New System – (not replacing an existing system).

When developing the User Acceptance Test Plan, the testers should be involved with the contractor in the design and reviews of the new system. The User Acceptance Test Plan should be developed in communication with the contractor and with as much information as possible gathered from the system documentation.

• New System – (replacing an existing system).

The Test Plan should be developed using the required aspects of the system that is being replaced. Test scripts for any enhancements to the new (replacement) system should be developed as if the system were a new system, basing information on requirement and design documents. Running a parallel test with the old system and comparing critical report results would be the optimal test scenario for any functionality that is to be duplicated in the new system.

• Database Change – (no other change to the system).

In this case, running a parallel test between the system using the old database and the system using the new database is advised. Comparing critical report results would be the optimal test scenario to ensure that the new database and drivers are producing identical results to the old system. Performance testing should be included in these test criteria. Create a User Acceptance Test Plan to manage the parallel test.
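The parallel-test comparison of critical report results might be sketched as a set difference between the two report outputs. The report rows and the `compare_reports` helper are invented for illustration:

```python
# Hedged sketch: compare critical report rows produced by the system running
# on the old database against the same report from the new database.
def compare_reports(old_rows, new_rows):
    """Return (missing_from_new, unexpected_in_new) for a parallel test."""
    old_set, new_set = set(old_rows), set(new_rows)
    return sorted(old_set - new_set), sorted(new_set - old_set)

# Example rows (account id, balance); the second row differs by 0.05:
old_report = [("ACC-001", 1500.00), ("ACC-002", 320.50), ("ACC-003", 99.90)]
new_report = [("ACC-001", 1500.00), ("ACC-002", 320.55), ("ACC-003", 99.90)]

missing, unexpected = compare_reports(old_report, new_report)
identical = not missing and not unexpected   # True only if reports match
```

Any non-empty difference is a candidate defect: the old and new databases are not producing identical results.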

• System Enhancement – (the existing system is enhanced with new functionality).

Test scripts should be developed and tested to ensure that the new functionality is integrated with the existing system correctly, and to ensure that no existing functionality has been lost in the enhancement process. Test scripts should be developed based on the specific requirements for new functions. In this case, unchanged functions may not require testing, if they are not involved in any business processes which have been changed.

• System Enhancement – (the existing system is being enhanced to change the existing functionality).

Test scripts should be developed based on the requirements documents to ensure that the system meets its enhanced requirements. Existing test scripts can be amended in this case. Test scripts should also be developed and tested to ensure that the enhanced functionality is integrated with the existing system correctly, and to ensure that no existing functionality has been lost in the enhancement process. In this scenario, for every function that has been changed, or is affected by a change, new data or new functionality should be tested to assure that existing functionality is not lost. This will ensure that the new requirements are being met.

• Infrastructure Change – (the system is not changing, but a test is required to port it to a new environment or new server, or the system is being utilized for a new division or purpose).

In this case, a full test of test scripts and all business processes should be repeated as new environments can create unexpected problems for an existing system.

2.4 Testing Environment The hardware, software and network environment within which a system will be tested must be defined.

2.5 Status Reporting Test preparation and testing progress should be formally reported during a weekly Status Meeting. The status report will be prepared by the Test Controller.

2.6 Test Schedules

A test schedule that shows a high-level view of the project must be defined. The test schedule is a timeline for the software and integrated systems tests, and the hardware and network tests.

2.7 Roles and Resources


Roles and resources must be defined, giving the time frame within which their tasks must be finished. The following table can be used for initial planning, assignment and subsequent follow-up.

Resource Type            Resource Title                           No.  Date Required  Who  Status
Project Mgmt/Functional  Business Analyst                         1
Testing                  Test Controller                          1
                         Testers                                  4
Test Support Team        Support Programmers                      4
                         Technical Support                        1
                         Network Support                          1
Technical - External     External Liaison Support                 1
Business                 Business Expert/Business Representative  1


3 Appendix A: Software and Integrated Systems Acceptance Test Plan

3.1 Testing Approach

The test mechanism that will be adopted on the project can be defined from the following, or a combination of them:
• Manual Test;
• GUI testing tools;
• Code;
• Script.

3.2 Test Reports

A status report will be prepared by the Test Controller. This report will contain the following information:
• Current Status v. Plan (Ahead/Behind/On Schedule);
• Progress of tasks planned for the previous week;
• Tasks planned for the next week, including tasks carried over from the previous week;
• Error statistics from the Error Measurement system;
• Issues/Risks.

3.3 Software Tools The software tools that will be used during the test process must be defined (if applicable).

3.4 Testing Process

The diagram below outlines the test process approach that should be followed.
• Organise Project involves creating a System Test Plan, Schedule & Test Approach, and requesting/assigning resources;
• Design/Build System Test involves identifying Test Cycles, Test Cases, Entrance & Exit Criteria, Expected Results, etc. In general, test conditions/expected results will be identified by the test team in conjunction with the project business analyst or business expert. The test team will then identify test cases and the data required. The test conditions are derived from the business design and the transaction requirements documents;
• Design/Build Test Procedures includes setting up procedures such as error management systems and status reporting, and setting up the data tables for the automated testing tool;
• Build Test Environment includes requesting/building hardware, software and data set-ups;
• Execute Project Integration Test;
• Execute Operations Acceptance Test;
• Signoff – happens when all pre-defined exit criteria have been achieved.

Testing Process Roles and Responsibilities

• Project Manager:
  • Creating a System Test Plan;
  • Creating the Schedule & Test Approach;
  • Assigning Resources;
  • Signoff.
• Testing Team & Business Analyst:
  • Identify Test Cycles;
  • Identify Test Cases;
  • Identify Entrance & Exit Criteria;
  • Identify Expected Results;
  • Build Test Environment.
• Testers:
  • Execute Project Integration Test;
  • Execute Operations Acceptance Test.

[Diagram: Testing Process — Organise Project (System Test Plan; Schedule & Test Approach; requesting/assigning resources) → Design/Build System Test (identify Test Cycles; identify Test Cases; Identify Entrance & Exit Criteria; Identify Expected Results; Build Test Environment) → Execute (Execute Project Integration Test; Execute Operations Acceptance Test) → decision "all pre-defined exit criteria have been achieved?": YES → Signoff → End; NO → back to Execute.]

3.5 Testing Scope Outlined below are the main test types that will be performed. All system test plans and conditions will be developed from the functional specification and the requirements catalogue.

3.5.1 Functional Testing

The objective of this test is to ensure that each element of the application meets the functional requirements of the business as outlined in the:
• Requirements Catalogue;
• Business Design Specification;
• Development Standards;
• Other functional documents, e.g. resolutions to issues/change requests/feedback.

This stage will also include Validation Testing, which is intensive testing of the new front-end fields and screens: Windows GUI standards; valid, invalid and limit data input; screen and field look and appearance; and overall consistency with the rest of the application. A further stage includes Specific Functional Testing: low-level tests which aim to test the individual processes and data flows.

3.5.2 Integration Testing

This test proves that all areas of the system interface with each other correctly and that there are no gaps in the data flow. A final Integration Test proves that the system works as an integrated unit once all the fixes are complete.

3.5.3 Business (User) Acceptance Test

This test, which is planned and executed by the Business Representative(s), ensures that the system operates in the manner expected, and any supporting material such as procedures, forms etc. are accurate and suitable for the purpose intended. It is high level testing, ensuring that there are no gaps in functionality.

3.5.4 Performance Testing

These tests ensure that the system provides acceptable response times.
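An acceptable response time has to be agreed in advance; a crude measurement harness might look like the following. The 2-second threshold, the `measure` helper and the stand-in transaction are examples, not figures from the standard:

```python
import time

# Illustrative sketch: time a transaction several times and check the worst
# observed response time against an agreed acceptance threshold.
THRESHOLD_SECONDS = 2.0   # example figure; take the real one from the contract

def measure(transaction, repetitions=5):
    """Return the observed response times of `transaction` in seconds."""
    times = []
    for _ in range(repetitions):
        start = time.perf_counter()
        transaction()
        times.append(time.perf_counter() - start)
    return times

def acceptable(times, threshold=THRESHOLD_SECONDS):
    return max(times) <= threshold

# A stand-in transaction that simply sleeps briefly:
samples = measure(lambda: time.sleep(0.01))
```

Using the worst case rather than the average is a deliberate choice here: a system that is usually fast but occasionally very slow still fails the user's expectation.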

3.5.5 Regression Testing

A regression test will be performed after the release of each phase to ensure that:
• There is no impact on previously released software, and
• There is an increase in the functionality and stability of the software.

The regression testing can be automated using automated testing software (if applicable).

3.5.6 Bash & Multi-User Testing

Multi-user testing will attempt to prove that it is possible for an acceptable number of users to work with the system at the same time. The object of Bash testing is an ad-hoc attempt to break the system.
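One lost-update check from a multi-user test can be simulated with threads. This is a hedged sketch, not a prescribed tool: several simulated users update a shared record concurrently, and the final count shows whether any updates were lost:

```python
import threading

# Illustrative sketch: 8 simulated users each apply 1000 updates to a shared
# record. The lock models the application's concurrency control; removing it
# would demonstrate the lost-update problem this test is designed to catch.
counter = {"value": 0}
lock = threading.Lock()

def simulated_user(updates=1000):
    for _ in range(updates):
        with lock:
            counter["value"] += 1

threads = [threading.Thread(target=simulated_user) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With correct concurrency control, all 8 * 1000 updates survive.
```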

3.5.7 Technical Testing

Technical Testing will be the responsibility of the Development Team.

3.5.8 Operations Acceptance Testing (OAT)


This phase of testing is to be performed by the Systems Installation and Support group, prior to implementing the system in a live site. The testing team will define their own testing criteria, and carry out the tests.

3.5.9 Types of Testing

There are three types of testing:

• Form-Based Testing
  • To verify that individual application forms are performing correctly;
  • To ensure that all required fields and buttons exist on the application forms;
  • To ensure that the flow of information and data entry is logical and correct based on the application's business requirements;
  • To ensure that a new form added to an existing application is functioning according to specifications;
  • To verify that new functionality added to an existing form has not adversely affected the existing functionality on that form;
  • To ensure that the flow of fields on a new form is sensible;
  • To check common relationships of fields between forms;
  • To verify navigation between forms.

  Material to Review:
  • Requirement documentation;
  • Technical documentation;
  • Analysis documentation;
  • Contractor Test Plans;
  • Contractor Test Results.

• Business Process Testing
  • To ensure that all business related functions and processes are supported by the application;
  • To ensure that the flow of data and screens is logical for each business process;
  • To ensure that security and access requirements are met;
  • To test the application with "real" scenarios;
  • To test all batch processes;
  • To test for performance related problems;
  • To test the application's interfaces.

  Material to Review:
  • Requirement documentation;
  • Technical documentation;
  • Analysis documentation;
  • Contractor Test Plans;
  • Contractor Test Results;
  • Previous User Acceptance Test Plans;
  • Security Information;
  • Business Process Documentation.

• Report Testing
  • To ensure that the new report meets its requirements;
  • To ensure that the data extracted for the new report is correct;
  • To ensure that the report format is correct and logical;
  • To ensure that the print process is working correctly.

  Material to Review:
  • Requirement documentation;
  • System Design Documentation;
  • Technical Specifications;
  • Contractor Test Plans;
  • Contractor Test Results.

3.6 Creating a Software and Integrated Systems Acceptance Test Plan

3.6.1 Instructions for the User Acceptance Tester

New System and Regression Testing Instructions
The purpose of regression testing and new system testing is to ensure that the entire application is functioning correctly. This test plan is designed to ensure that all components of the application are tested. Complete all test scripts included within this test plan for all applicable user levels and modules as listed.

Form-Based Testing Procedures
The purpose of testing the individual forms and the user interface is to ensure that all of the menus and graphical interface buttons, pull-down lists, scrolling lists, and check boxes are performing correctly. It is important to exercise each interface element for each module/screen, as not all modules have the same buttons or menus.

Business Process Testing Procedures
The purpose of testing the Business Processes is to ensure that all of the functional requirements for the application are performing correctly. It is important to complete all test cases for each User Security Level (e.g. System Administrator, Processing Clerk, etc.). This will ensure that each Security Level has access to the appropriate functions.

Report Testing Procedures
Ask users to specify which data they would like to report on. It is important that this process is tested for each module. If the result of printing is not as expected, be sure to fill out a Defect Report form for that test. It is important to repeat each test script for the different User Security Levels listed in the User Security Matrix to ensure that report security rules are functioning correctly.


Defect Reporting

• Defect Tracking Form – A Defect Tracking Form must be completed at the time a problem is found, in order to ensure that all details are documented correctly.
• Defect Tracking Log – A Defect Tracking Log records all reported defects and their status.
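A defect record could be kept as structured data so the weekly status report can be produced mechanically. This is a sketch only; the field names follow the Defect Tracking Form and Log in this document, but the class itself is invented:

```python
from dataclasses import dataclass

# Illustrative sketch: a defect record plus a log that can be filtered by
# status for the Test Controller's weekly status report.
@dataclass
class Defect:
    defect_id: int
    tester: str
    description: str
    severity: int          # 1 = system crash (data loss) ... 5 = cosmetic
    status: str = "Open"   # "Open" or "Fixed"

log = [
    Defect(1, "CLERK1", "Save button crashes module", severity=2),
    Defect(2, "REPORT", "Wrong date format on report", severity=5),
]
log[1].status = "Fixed"    # defect 2 retested and closed
open_defects = [d for d in log if d.status == "Open"]
```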

3.6.2 Form-Based Testing

Listed below are the modules to be tested. Each Screen should be reviewed for correctness as per the Form-Based Test Script.

List of Modules to be reviewed for Form-Based Testing

Module ID | Module name

Form-Based Test Script
This checklist must be completed for ALL modules listed above. Photocopy this page for each online module and fill in the module identifier in the space provided. Complete the test environment information. Attach a Defect Report if necessary to describe any anomalies.

MODULE: ____________________
TEST ENVIRONMENT:
Operating System: _____________
Network: _____________________
Workstation Memory: __________

Form-Based Testing Component | Pass/Fail | Date | Initials

1. Are all fonts, colours, shading and toolbars consistent with standards and project guidelines?
2. Is the online help module available?
3. Are all date formats correct (DD-MON-YYYY)? Are the date edits being correctly applied? Are dates greater than 2000 accepted?
4. Does all text wrap when displayed in the text editor?
5. Is there a scroll bar on every applicable block?
6. Is the Toolbar List button enabled only when a list of values is available for an item?
7. Do the window titles correctly identify each module?
8. Is there hint text available for all applicable items?
9. Do all of the initial 'window display' sizes fit entirely on the screen (assuming an SVGA 800x600 resolution)?
10. Are the correct items case sensitive? (i.e. Do fields allow lower case text if they should only accept upper case?)
11. Are error, warning and information messages accurate and understandable?
12. Do all DELETE operations display a 'Delete Confirmation' alert?
13. Is the field tab order correct?
14. Are the appropriate edits done on all fields (range of values, valid values etc.)?
15. Are defaults appropriate?
16. Are the correct fields mandatory?
17. Is the tool bar present and appropriate buttons enabled?
18. Are screen & field labels appropriate?
19. Are fields & buttons ordered appropriately?
20. Are all codes valid?
21. Are field labels consistent across the application?
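Check 3 above (date formats and the year-2000 edit) is the kind of rule that can also be verified mechanically. A minimal sketch using only the standard library; the `valid_date` helper is hypothetical:

```python
from datetime import datetime

# Illustrative sketch of check 3: validate the DD-MON-YYYY format and the
# "dates greater than 2000 accepted" rule.
def valid_date(text):
    """True if `text` matches DD-MON-YYYY (e.g. 05-JAN-2024) with year > 2000."""
    try:
        parsed = datetime.strptime(text, "%d-%b-%Y")  # %b matches month names
    except ValueError:
        return False
    return parsed.year > 2000

# Valid format and year; wrong format; valid format but year not > 2000:
checks = [valid_date("05-JAN-2024"), valid_date("2024-01-05"), valid_date("05-JAN-1999")]
```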

3.6.3 User Security Matrix

List all User IDs available for testing. Examples:
SYS_ADMIN  System Administrator – access to all
CLERK      Processing Clerk – limited access
REPORT     Reporting Clerk – access to reports only

List all security access availability for each user level in the spreadsheet below.

User Security/Access Level Matrix

Module Label | Module Description | Details | Module Type | USER ROLE 1 | 2 | 3 | 4 | …
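The matrix can be represented as a simple role-to-modules mapping that test scripts consult when deciding what each user level should be able to reach. The role and module names below are invented examples, not from the standard:

```python
# Illustrative sketch of a User Security/Access Level Matrix: which modules
# each role may access. Test scripts check both directions - access granted
# where expected, and denied everywhere else.
MATRIX = {
    "SYS_ADMIN": {"MAIN", "DATA_ENTRY", "REPORTS", "ADMIN"},
    "CLERK": {"MAIN", "DATA_ENTRY"},
    "REPORT": {"MAIN", "REPORTS"},
}

def allowed(role, module):
    """True if the security matrix grants `role` access to `module`."""
    return module in MATRIX.get(role, set())

clerk_can_admin = allowed("CLERK", "ADMIN")        # should be denied
report_can_report = allowed("REPORT", "REPORTS")   # should be granted
```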


3.6.4 User Acceptance Business Process Test Scripts

Sample – Main Menu

Using module <insert module ID>, complete the following checklist. Complete the User Security Level information and Test Environment information at the top of each form. Attach a Defect Report if necessary to describe any anomalies.

USER SECURITY LEVEL: ___________________________
TEST ENVIRONMENT:
Operating System: _____________
Network: ______________
Workstation Memory: __________

Acceptance Testing Action | Pass/Fail | Date | Initials
• Does each button navigate to the correct module?
• Is the correct module name/date/version # displayed at the top of the window?
• Are the appropriate buttons available (based on the security matrix)?
• Are the appropriate menu options available (based on the security matrix)?
• Is access to the application denied unless the user is set up in Staffs and as an active application user (as defined in user detail maintenance)?
• Does the message/text box display text correctly?
• Do the graphics appear clearly and display correctly?
• Does the menu bar appear at the top of the page?

3.6.5 Report Test Scripts

Sample – Detailed Information Report

Using report <insert report ID>, complete the following checklist. Complete the User Security Level information and Test Environment information at the top of each form. Attach a Defect Report if necessary to describe any anomalies.

USER SECURITY LEVEL: ___________________________
TEST ENVIRONMENT:
Operating System: _____________
Network: ______________
Workstation Memory: __________

Acceptance Testing Action | Pass/Fail | Date | Initials
• Can you access the Report module from the main menu?
• Are all appropriate reports listed, based on the security of the user being tested?
• Can you access the Detailed Information Report from the list of available reports?
• Is the correct report name/date/version # displayed at the top of the window?
• Are the appropriate buttons available (based on the security matrix)?
• Are the appropriate menu options available (based on the security matrix)?
• Can you pull up the report to view on screen?
• Does all appropriate information list correctly?
• Does all information sort correctly?
• Do all dates display correctly (DD-MM-YYYY)?
• Are the fields available appropriate to the data displayed?
• Does the report display with adequate room for each column (no overlapping text or columns running into other columns)?
• Is all formatting correct (bold, italics, underline)?
• Can you print to the printer?
• Does all appropriate information list correctly on the printed copy?
• Does all information sort correctly?
• Do all dates display correctly (DD-MM-YYYY)?
• Are the fields available appropriate to the data displayed?
• Does the report display with adequate room for each column (no overlapping text or columns running into other columns)?
• Is all formatting correct (bold, italics, underline)?

3.6.6 Defect Tracking Form

User Acceptance Testing                                           DEFECT #

Application Defect Tracking Form

Test case Step #:                         Tester Name:
Module ID:                                Date:
User Security Level tested:
  APPMAN  CRS_APPLICATION_MANAGER  Access available to all functions
  CLERK1  CRS_PROCESSING_CLERK     Data Entry Functionality
  CLERK2  CRS_PROCESSING_CLERK     Data Entry Functionality
  CLERK3  CRS_PROCESSING_CLERK     Data Entry Functionality

Problem Severity (check one):
  1. System Crash (Data Loss)
  2. System Crash (No Data Loss)
  3. Incorrect Functionality (No Work Around)
  4. Incorrect Functionality (Workaround)
  5. Cosmetic

Can be Reproduced (check one):
  (E) Every Time
  (S) Sometimes
  (O) Occasionally
  (1x) Happened Once

Defect Summary Description (one-sentence description of problem):

Defect Description (please print):        Screen Print/Error Message Attached

Steps to Reproduce:

3.6.7 Defect Tracking Log

Defect # | Tester | Description | Date Reported | User Level Tested | Severity (1,2,3,4,5) | Repeated | Date Reported to Contractor | Status (Open or Fixed)


4 Appendix B: Hardware and Network Acceptance Test Plan In accordance with the Ministry acceptance test plan, this document outlines the procedures and suggested tools used to test and accept newly delivered systems (workstations, servers and networks). The purpose of the computer acceptance procedure is to establish a standard, common acceptance-testing process within the ministry. Note that for large volumes of similar hardware (e.g. 2000 PCs), it is not feasible to subject every unit to individual testing, so a sampling method is advised.
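The document does not mandate a particular sampling rule. One common rule of thumb (an assumption on our part, not a requirement of this procedure) is "square root plus one" sampling, sketched below; the function name and the minimum-sample floor are our own choices.

```python
import math

def sample_size(lot_size, minimum=5):
    """Suggest how many units to pull from a delivery for full burn-in
    testing, using the common 'square root + 1' rule of thumb.
    For very small lots, every unit is tested."""
    if lot_size <= minimum:
        return lot_size
    return max(minimum, int(math.sqrt(lot_size)) + 1)

print(sample_size(2000))  # a 2000-PC delivery -> 45 units sampled
```

Any defect found in the sample would normally trigger wider (or full) testing of the lot.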

4.1 Acceptance procedure for New Workstations To ensure that newly delivered workstations comply with the required specifications, are in proper working order, and have not been damaged by manufacturing defects, shipping damage or other sources of infant-mortality failure (early failure in the life of a component), a "burn-in" test is performed. This test verifies that all major subcomponents operate within the device's specified parameters at the maximum rated operating conditions. The test should include at least the following components:

4.1.1 Processor

Tests: Execute a processor-intensive operation in which faults can be quantitatively measured.
Requirements to Pass: No faults recorded for the duration of the test.

4.1.2 Memory

Tests:
- Test memory using software with a small OS/kernel footprint (e.g. FreeDOS) and various test algorithms/patterns. Run each available algorithm/pattern once.
  Requirements to Pass: No errors detected.
- Test memory bandwidth using software that detects and measures memory bandwidth.
  Requirements to Pass: The expected bandwidth is achieved (e.g. DDR-400 memory should achieve 3.2 GB/s with a 200 MHz FSB).
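The expected figure follows directly from the memory type: DDR transfers data twice per bus clock across a 64-bit data bus. A quick sanity calculation (a sketch; the function name is our own):

```python
def ddr_peak_bandwidth_mb_s(bus_clock_mhz, bus_width_bits=64):
    """Theoretical peak bandwidth for DDR SDRAM: two transfers per
    clock cycle across the data bus (64-bit for a standard DIMM)."""
    return bus_clock_mhz * 2 * (bus_width_bits // 8)

print(ddr_peak_bandwidth_mb_s(200))  # DDR-400 on a 200 MHz bus -> 3200 MB/s
```

Measured bandwidth will normally fall somewhat below this theoretical peak; a large shortfall suggests a misconfigured FSB, wrong memory timings, or a faulty module.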

4.1.3 Video Card

Tests: Test 2D and 3D rendering with a Windows XP-based application. Requirements to Pass: The picture should remain stable on the test monitor for the duration of the test.

4.1.4 Hard Drive

Tests:


Execute a read test. The test should be executed at least once.

Requirements to Pass: The test should report no errors.
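In the dedicated tools recommended later in this appendix the read test is built in, but the essence of a surface read test can be sketched in a few lines: read the whole drive (or a test file) sequentially and treat any I/O error as a failure. This is a minimal sketch, not the procedure's mandated tool; the function name is our own.

```python
import hashlib

def surface_read_check(path, chunk_mb=4):
    """Sequentially read an entire file or raw device, returning
    (bytes read, SHA-256 digest). Any I/O error surfaces as an
    exception, which the burn-in harness should record as a failure."""
    digest = hashlib.sha256()
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_mb * 1024 * 1024)
            if not chunk:
                break
            digest.update(chunk)
            total += len(chunk)
    return total, digest.hexdigest()
```

Running the check twice and comparing digests additionally guards against intermittent read errors.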

4.1.5 Optical Drive

Tests: Perform a write test (a read test is performed during write validation) on a 700 MB CD-RW or DVD±RW. The test data should consist of multiple files of various sizes, with a total size of approximately 700 MB (CD) or 4.7 GB (DVD). Configure the fastest write speed that the optical drive supports. Perform a quick erase on the CD-RW or DVD±RW first. Validate the files written, and note the write and read throughput.

Requirements to Pass: The test should report no errors.
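When noting the throughput, it is useful to convert the measurement into the familiar "Nx" rating so it can be compared with the drive's advertised speed. A small sketch (constants are the standard base speeds; the function name is our own):

```python
CD_1X_KB_S = 150    # CD base (1x) transfer rate in KB/s
DVD_1X_KB_S = 1385  # DVD base (1x) transfer rate in KB/s

def write_speed_rating(bytes_written, seconds, base_kb_s):
    """Convert a measured write throughput into an 'Nx' speed rating,
    so the result can be checked against the drive's rated speed."""
    kb_s = bytes_written / 1024 / seconds
    return round(kb_s / base_kb_s, 1)

# Example: 700 MB written to a CD-RW in 100 seconds
print(write_speed_rating(700 * 1024 * 1024, 100, CD_1X_KB_S))
```

A measured rating far below the drive's advertised maximum may indicate defective media, a forced fallback speed, or a faulty drive.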

4.1.6 Network Interface Card

Tests:
- (Wired) Test the auto-sensing and clocking of the 10/100 and/or 1000 Mbps modes by connecting the Ethernet port to a network running at the highest rated Ethernet speed.
  Requirements to Pass: The link light on the network card is lit, and the OS reports the connection at the correct Ethernet speed.
- (Wireless) Test the auto-sensing and association of the Wi-Fi interface by connecting to an 802.11a, 802.11b or 802.11g network.
  Requirements to Pass: The OS reports the connection to the correct 802.11 network.

4.1.7 Sound Card

Tests: Test line-in/line-out using a loop-back cable and an application that can play out and record simultaneously. The application should compare the captured audio file to the played-out file for distortion. Requirements to Pass: Distortion should be minimal.

4.1.8 Motherboard

Tests:
- For onboard audio, see the tests under "Sound Card"
- For onboard network, see the tests under "Network Interface Card"
- For onboard video, see the tests under "Video Card"
- PS/2 keyboard & mouse

4.1.9 Monitor

Tests: For all monitors, display the following solid-colour test patterns: black, white, red, green, blue, yellow, cyan and magenta. For each colour pattern, check that the colours of the test pattern are uniform and that there are no dead or dark spots. For LCD monitors, look for permanently dark or lit pixels.

Requirements to Pass: No significant non-uniformity in the test pattern. No more than 2 bright or dark pixels or 4 malfunctioning sub-pixels on monitors with a native resolution of 1024x768, and no more than 3 bright or dark pixels or 4 malfunctioning sub-pixels on monitors with a native resolution of 1280x1024.
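The pixel-defect thresholds above can be captured as a simple pass/fail rule, which is handy when many monitors are inspected against the same criteria. A sketch (function and variable names are our own):

```python
def lcd_passes(native_resolution, defective_pixels, bad_subpixels):
    """Apply this procedure's acceptance thresholds: up to 2 defective
    (bright or dark) pixels at a 1024x768 native resolution, up to 3
    at 1280x1024, and up to 4 malfunctioning sub-pixels in both cases."""
    pixel_limits = {(1024, 768): 2, (1280, 1024): 3}
    max_pixels = pixel_limits.get(native_resolution)
    if max_pixels is None:
        raise ValueError("no threshold defined for this resolution")
    return defective_pixels <= max_pixels and bad_subpixels <= 4

print(lcd_passes((1024, 768), 2, 4))   # True: at the limit, still accepted
print(lcd_passes((1280, 1024), 4, 0))  # False: too many defective pixels
```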

Sample Test Report for Workstation Acceptance

Workstation Brand Name:          Workstation S/N:          Test Date:          Tester:

| Specification | Test | Results | Test Duration | S/W Tools | Notes and records |
| CPU | | | | | |
| Motherboard | Keyboard & mouse, USB, Firewire, Audio, Network | | | | |
| Video Card | 2D & 3D | | | | |
| Memory | Bandwidth | | | | |
| Hard Drive | Write test, SMART | | | | |
| Optical Drive | Write/read test (CD), Write/read test (DVD) | | | | |
| Floppy Drive | | | | | |
| Chassis & power supply | Voltage | | | | |
| Network Card | | | | | |
| Sound Card | | | | | |

4.2 Acceptance procedure for New Servers To ensure that the delivered servers comply with the required specifications, are in proper working order, and have not been damaged by manufacturing defects, the following components should be tested during the acceptance test procedure.


The test should include at least the following components:
- Processor
- Memory
- Internal storage
- CD/DVD-ROM drive
- Tape drive
- Power supply
- Console
- External storage
- RAID controller
- Hot-plug and redundant power supplies

4.3 Acceptance procedure for Network Devices The aim of this procedure is to provide thorough acceptance test procedures for the delivered network devices, to ensure that they comply with the required specifications. The test procedure covers the following devices:

4.3.1 Switches

- Rechecking the IP address(es) of the switch and its default gateway
- Checking that the switch is appropriately labelled and named
- Recording the MAC address of the switch
- Checking that SNMP is available
- Checking that the VLANs are operating
- Checking the 100 or 1000 Mbps uplink(s)
- Verifying that the correct number of RJ45 (or RJ21) ports has been delivered
- Verifying that QOS works
- Possibly checking protocols
- Possibly verifying that multicast is functioning properly
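Since the recorded MAC address ends up in the acceptance record, a small format check helps catch transcription errors before the record is filed. This is a sketch of our own (format validation only, not vendor lookup, and not part of the mandated checklist):

```python
import re

# Accepts the common colon- or hyphen-separated MAC notation,
# e.g. 00:1A:2B:3C:4D:5E or 00-1A-2B-3C-4D-5E.
MAC_RE = re.compile(r"^([0-9A-Fa-f]{2}[:-]){5}[0-9A-Fa-f]{2}$")

def valid_mac(mac):
    """Sanity-check a MAC address transcribed into the acceptance record."""
    return bool(MAC_RE.match(mac))

print(valid_mac("00:1A:2B:3C:4D:5E"))  # True
print(valid_mac("00:1A:2B"))           # False: too short
```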

4.3.2 Firewalls

To ensure that the delivered firewalls comply with the required specifications, the following should be tested during the acceptance test procedure:
- Security credentials
- Performance and reliability
- Manageability
- Versatility
- Application proxies
- Routing

4.3.3 Twisted pair cabling

- Verify that the identification number (label) is identical for both the jacks in the workstation wall plate and the corresponding patch panel jacks to be tested.

- Verify that a mechanically sound connection exists for each wire at each of the attached terminations on an RJ-11 or RJ-45 jack.


- Verify correct straight-through correspondence of all four (4) wires at each RJ-11 or RJ-45 termination of the cable.

- If a single-end test device is to be used, verify the pin-to-pin continuity of each pair within the loopback plugs to be used at the opposite end.
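The "straight-through correspondence" check above boils down to confirming that each pin maps to the same pin at the far end. A minimal sketch of that rule, assuming a cable tester that reports pin-to-pin results as a mapping (function name and sample data are our own):

```python
def is_straight_through(pin_map):
    """A straight-through cable maps every pin at one end to the same
    pin at the other end (pin 1 -> 1, pin 2 -> 2, and so on)."""
    return all(near == far for near, far in pin_map.items())

# Hypothetical tester output for the four wires used by 10/100
# Ethernet (pins 1, 2, 3 and 6):
good = {1: 1, 2: 2, 3: 3, 6: 6}
crossed = {1: 3, 2: 6, 3: 1, 6: 2}  # a crossover cable, not straight-through

print(is_straight_through(good))     # True
print(is_straight_through(crossed))  # False
```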

4.4 Recommended testing software for workstations and servers
- Memtest86+ (for memory testing)
- MicroProbe 2005 (for motherboard, hard drive and memory testing)
- SpeedFan (for voltage, temperature, fan and SMART monitoring)
- Nero CD-DVD Speed (for DVD±R testing)
- CheckeMon (for test patterns during monitor testing)
- IPerf (for network testing)
- Passmark BurnInTest (for CPU, hard drive, video card, USB, serial, audio, firewire, network, optical drive and floppy testing)

4.5 Recommended testing software for network devices and performance Acceptance of the network devices, workstations and servers should be followed by an overall testing and management process to ensure that the performance of the network is up to the needed level. Many software packages can be used for this, such as:
- HP OpenView
- IBM Tivoli
- SolarWinds
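When recording throughput measurements (e.g. from IPerf) in the acceptance report, expressing them as a percentage of the nominal link speed makes marginal links easy to spot. A small sketch (the function name and the example figures are our own; what counts as an acceptable utilisation should come from the ministry's own targets):

```python
def link_utilisation(measured_mbit_s, link_mbit_s):
    """Express a measured throughput as a percentage of the nominal
    link speed, rounded to one decimal place for the report."""
    return round(100.0 * measured_mbit_s / link_mbit_s, 1)

print(link_utilisation(94.1, 100))   # a healthy 100 Mbps link
print(link_utilisation(412.0, 1000)) # a 1000 Mbps link worth investigating
```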