M7.1 Software Test Plan - Group Project
Software Test Plan
Supply Chain Risk Management
Tracker Analytics
CSOL 560 Group
Adrian Cordero
Bennie Hill
Steve Gomez
University of San Diego
October 1st, 2018
Global Software Enterprises
WORLDWIDE SUPPLY CHAIN ANALYTICS TEST PLAN
Version 1.0 10/22/2018
<Project Name>
Page 3 of 24
VERSION HISTORY

| Version # | Implemented By | Revision Date | Approved By | Approval Date | Reason |
|-----------|----------------|---------------|-------------|---------------|--------|
| 1.0 | GS Engineering Team | 10/22/18 | Customer Project Manager (Name Redacted) | 10/22/18 | Initial Test Plan |
TABLE OF CONTENTS

1 INTRODUCTION ........ 6
   1.1 Purpose of the Test Plan Document ........ 6
2 UNIT TESTING ........ 7
   2.1 Purpose ........ 7
   2.2 Test Risks / Issues ........ 7
   2.3 Items to be Tested / Not Tested ........ 7
   2.4 Test Approach(s) ........ 8
   2.5 Test Regulatory / Mandate Criteria ........ 9
   2.6 Test Pass / Fail Criteria ........ 9
   2.7 Test Deliverables ........ 9
   2.8 Test Suspension / Resumption Criteria ........ 9
3 FUNCTIONAL TESTING ........ 10
   3.1 Purpose ........ 10
   3.2 Test Risks / Issues ........ 10
   3.3 Items to be Tested / Not Tested ........ 10
   3.4 Test Approach(s) ........ 13
   3.5 Test Regulatory / Mandate Criteria ........ 13
   3.6 Test Pass / Fail Criteria ........ 13
   3.7 Test Entry / Exit Criteria ........ 13
   3.8 Test Deliverables ........ 14
   3.9 Test Suspension / Resumption Criteria ........ 14
4 REGRESSION TESTING ........ 15
   4.1 Purpose ........ 15
   4.2 Test Risks / Issues ........ 15
   4.3 Example of Regression Testing Plan (If these adjustments were made) ........ 15
   4.4 Test Approach(s) ........ 16
   4.5 Test Regulatory / Mandate Criteria ........ 16
   4.6 Test Pass / Fail Criteria ........ 16
   4.7 Test Entry / Exit Criteria ........ 16
   4.8 Test Deliverables ........ 16
   4.9 Test Suspension / Resumption Criteria ........ 16
5 VERIFICATION PROCESS ........ 17
   5.1 Purpose ........ 17
   5.2 Test Risks / Issues ........ 17
   5.3 Items to be Tested / Not Tested ........ 17
   5.4 Test Approach(s) ........ 18
   5.5 Test Regulatory / Mandate Criteria ........ 18
   5.6 Test Pass / Fail Criteria ........ 18
   5.7 Test Entry / Exit Criteria ........ 18
   5.8 Test Deliverables ........ 19
   5.9 Test Suspension / Resumption Criteria ........ 19
6 VALIDATION PROCESS ........ 20
   6.1 Test Risks / Issues ........ 20
   6.2 Items to be Tested / Not Tested ........ 20
   6.3 Test Approach(s) ........ 21
   6.4 Test Regulatory / Mandate Criteria ........ 21
   6.5 Test Pass / Fail Criteria ........ 21
   6.6 Test Entry / Exit Criteria ........ 21
   6.7 Test Deliverables ........ 21
   6.8 Test Suspension / Resumption Criteria ........ 22
7 MITIGATION STRATEGIES ........ 23
   7.1 Test Risks / Issues ........ 23
   7.2 Mitigation Approach ........ 23
      7.2.1 Poorly Targeted/Misaligned Test Mitigation Strategy ........ 23
      7.2.2 Excessive Testing Risk Mitigation Strategy ........ 23
TEST PLAN APPROVAL ........ 24
1 INTRODUCTION

1.1 PURPOSE OF THE TEST PLAN DOCUMENT
This Test Plan records the information needed to define the approach used in testing Worldwide Supply Chain Analytics. It was created during the Planning Phase of the project and will be revised as required as the project progresses. Its intended audience is the project manager, project team, and testing team. Portions of this Test Plan may occasionally be shared with the client, users, auditors, and other stakeholders whose input or approval is required for the testing process.
2 UNIT TESTING

2.1 PURPOSE
Unit testing is a software development process in which the smallest testable parts of an application, called units, are individually and independently scrutinized for proper operation. Unit testing can be done manually but is often automated. Unit tests ensure that each module, class, and/or function/method within the Worldwide Supply Chain Analytics application is performing as intended.
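As an illustration of the shape these tests take (the plan names JUnit, NUnit, JMockit, EMMA, and PHPUnit as the actual tools; the sketch below uses Python's built-in unittest purely for illustration, and InventoryService is a hypothetical stand-in for the real unit under test):

```python
import unittest

class InventoryService:
    """Hypothetical stand-in for the Inventory Service unit under test."""
    def __init__(self):
        self._records = {}

    def update_record(self, tracking_number, weight):
        if weight <= 0:
            raise ValueError("weight must be positive")
        self._records[tracking_number] = weight

    def view_record(self, tracking_number):
        return self._records.get(tracking_number)

class TestInventoryService(unittest.TestCase):
    """Mirrors UT2.1 (update inventory records) and UT2.2 (view records)."""
    def setUp(self):
        # Standalone: the unit is built fresh with no external dependencies.
        self.svc = InventoryService()

    def test_update_record(self):
        self.svc.update_record("TRK-1", 12.5)
        self.assertEqual(self.svc.view_record("TRK-1"), 12.5)

    def test_update_rejects_bad_weight(self):
        with self.assertRaises(ValueError):
            self.svc.update_record("TRK-2", -1)
```

Run with `python -m unittest <file>`; each pass/fail result would feed the test execution log described in Section 2.7.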
2.2 TEST RISKS / ISSUES
Because unit testing verifies the program at its smallest units, the principal risk is missing an error, which would result in an unstable or unusable platform. Additionally, a false positive could unnecessarily consume resources, resulting in higher program costs.
2.3 ITEMS TO BE TESTED / NOT TESTED
| Test ID | Service / Req | Unit | Test Description | Test Date | Responsibility |
|---------|---------------|------|------------------|-----------|----------------|
| UT1 | Geo Location Service | | Run NUnit auto test for Item Location component | 10/18/18 | GSE Team |
| UT1.1 | Req: DR2, TR5 | Request/Obtain Location Unit | Execute against Location Unit | 10/18/18 | GSE Team |
| UT1.2 | Req: BR10 | Map Location Unit | Execute against Map Location Unit | 10/18/18 | GSE Team |
| UT2 | Inventory Service | | Run JUnit and PHPUnit auto test for Item Inventory Service | 10/18/18 | GSE Team |
| UT2.1 | Req: DR1, BR7 | Update Inventory Records | Execute against Update Inventory Records | 10/18/18 | GSE Team |
| UT2.2 | Req: BR15 | View Inventory Records | Execute against View Inventory Records | 10/18/18 | GSE Team |
| UT3 | Shipping Service | | Run NUnit, JUnit, and PHPUnit auto test for Shipping Services | 10/19/18 | GSE Team |
| UT3.1 | Req: BR2 | View Item Location Unit | Execute EMMA and PHPUnit test scripts | 10/19/18 | GSE Team |
| UT3.2 | Req: BR4, BR8, DR3 | View Source and Destination Unit | Execute NUnit, EMMA, and PHPUnit test scripts | 10/19/18 | GSE Team |
| UT3.3 | Req: BR14 | Shipping Cancellation Unit | Execute JUnit and PHPUnit test scripts | 10/19/18 | GSE Team |
| UT4 | Account Service | | Run JUnit and PHPUnit auto test for Account Service | 10/19/18 | GSE Team |
| UT4.1 | Req: BR16 | Validate User RBAC Access Request | Execute JUnit and PHPUnit test scripts | 10/19/18 | GSE Team |
| UT5 | Inventory Attribute Service | | Run JUnit and PHPUnit auto test for Inventory Attribute Service | 10/19/18 | GSE Team |
| UT5.1 | Req: BR1 | Update/Display Inventory Attribute Unit | Execute auto test against Update/Display Inventory Attribute Unit | 10/19/18 | GSE Team |
| UT6 | Item Risk Service | | Run JMockit and PHPUnit for Item Risk Service | 10/19/18 | GSE Team |
| UT6.1 | Req: BR11 | Priority Update Unit | Execute test against Priority Update Unit | 10/19/18 | GSE Team |
| UT7 | API Gateway / Global Requirements | | Run JMockit and PHPUnit for API Gateway / Global Requirements | 10/20/18 | GSE Team |
| UT7.1 | Req: BR5, BR6, BR17 | Data Reporting and Export Unit | Execute test against Data Reporting and Export Unit | 10/20/18 | GSE Team |
| UT7.2 | Req: TR3 | AutoArchive Unit | Execute test against AutoArchive Unit | 10/20/18 | GSE Team |
2.4 TEST APPROACH(S)

Business Requirements (BR) Testing
Unit tests for these requirements will be executed automatically. Because code is involved, the following unit testing tools will be used: JTest, JUnit, JMockit, and EMMA. These functions will also be tested in a standalone environment to eliminate possible functionality dependencies; PHPUnit will serve as the automated tool for the standalone tests, with mock objects created wherever needed.

Data Requirements (DR) Testing
Automated unit testing will be conducted for this section using NUnit, which was selected because it is a data-driven tool.

Timing Requirements (TR) Testing
Unit tests in this section will be performed manually. Manual testing is used because this area is among the most important to the customer, so catching every error is critical. These functions will also be tested in a standalone environment (again using PHPUnit) to eliminate possible functionality dependencies.
2.5 TEST REGULATORY / MANDATE CRITERIA
No regulatory testing is required. Testing must pass the criteria for the requirements listed in the table in Section 2.3.
2.6 TEST PASS / FAIL CRITERIA
Pass: the new unit functions as required and has no negative impact on other units tested.
Fail: the unit fails to function as required or negatively impacts other units tested.
2.7 TEST DELIVERABLES
1) Individual Test Reports, comprised of:
   a. Test Execution Log;
   b. Tester Report; and
   c. Tester Signature of Completed Test;
2) Compilation of all Test Reports; and
3) Acceptance by the customer PM.
2.8 TEST SUSPENSION / RESUMPTION CRITERIA
Testing is suspended when a test fails or yields an unexplained result. Testing resumes when the failure or unexplained result is addressed by the coding team. A report of the occurrence is logged, reviewed, and approved by the Global Software Engineering Team Lead.
3 FUNCTIONAL TESTING

3.1 PURPOSE
Functional testing is a software testing process in which the software is exercised to confirm that it conforms to its requirements; it checks that the software provides all of the functionality specified in its functional requirements.
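The FT1.2 refresh check (TR5: refresh every 5 seconds) can be sketched as follows. This is an illustration only: FakeClock and FakeGeoDisplay are hypothetical stand-ins introduced here; the real test drives the deployed Tracker Analytics application.

```python
class FakeClock:
    """Deterministic stand-in for wall-clock time so the check is repeatable."""
    def __init__(self):
        self.t = 0.0
    def now(self):
        return self.t
    def advance(self, seconds):
        self.t += seconds

class FakeGeoDisplay:
    """Hypothetical stand-in for the live tracker UI; refreshes itself
    whenever polled after the configured interval has elapsed."""
    def __init__(self, clock, refresh_interval=5.0):
        self.clock = clock
        self.refresh_interval = refresh_interval
        self.last_refresh = clock.now()
    def poll(self):
        if self.clock.now() - self.last_refresh >= self.refresh_interval:
            self.last_refresh = self.clock.now()
        return self.last_refresh

def check_refresh_requirement(display, clock, duration=60.0, step=1.0):
    """FT1.2 (TR5): poll for `duration` simulated seconds and confirm the
    display is never staler than its refresh interval allows."""
    end = clock.now() + duration
    while clock.now() < end:
        clock.advance(step)
        staleness = clock.now() - display.poll()
        if staleness > display.refresh_interval:
            return False
    return True
```

Against the real application, `poll()` would read the on-screen "last refreshed" timestamp (BR9), so the same check also exercises that requirement.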
3.2 TEST RISKS / ISSUES
• Severe weather concerns
• Limited wireless connectivity
Alternative plan: in the case of low connectivity, displaying the most up-to-date available data is sufficient.
3.3 ITEMS TO BE TESTED / NOT TESTED
| Test ID | Item to Test / Requirement | Test Description | Test Date | Responsibility |
|---------|----------------------------|------------------|-----------|----------------|
| FT1 | Geo Location Service | | | |
| FT1.1 | DR2: Item location (latitude and longitude) | Log into a user account and verify latitude and longitude against Google Earth for accuracy. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT1.2 | TR5: Refresh every 5 seconds | Keep the app running and time the refresh to confirm a 5-second interval. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT1.3 | BR9: Allow user to see the last time the system was refreshed | Ensure that the system shows the most recent refresh. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT1.4 | BR10: Allow user to view the most optimal shipping routes | Compare routes with other known routes to ensure the most optimal route is displayed. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT2 | Inventory Service | | | |
| FT2.1 | DR1: Item description (weight, tracking number, part number, priority) | Verify that the entered weight, part number, priority, and tracking number are all accurate for a test order. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT2.2 | BR7: Allow users to add and remove an item | Attempt to remove and add an item from a test order. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT2.3 | BR15: Allow user to view information on the individual who last scanned the document | Verify that the last-scanned item information is accurate. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT3 | Shipping Service | | | |
| FT3.1 | BR2: Allow users to view the current position of an item | Confirm that the current position is accurate. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT3.2 | BR4: Allow users to view where the item came from | Confirm that the original delivery location is accurate. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT3.3 | BR8: Allow users to view the destination of an item | Confirm that the final destination is accurate. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT3.4 | BR14: Allow user to cancel shipments | Attempt to cancel a test order. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT3.5 | DR3: Dates (current, arrival, and departure) | Confirm all dates match. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT4 | Account Service; BR16: Allow only authorized users to access | Attempt to access the system without a proper login. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT5 | Inventory Attribute Service; BR1: Allow users to create descriptions of items | Attempt to create an item description; verify that the description is accurate once finished. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT6 | Item Risk Service; BR11: Allow users to set the priority of the item (red, yellow, and green) | Set the priority of an item and verify that the priority is accurate. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT7 | API Gateway / Global Requirements | | | |
| FT7.1 | TR3: Duration: data shall be archived for as long as the customer wants | Archive test orders and check back weekly to ensure data integrity. | 12/1/18 | Bennie Hill |
| FT7.2 | TR1: Latency: no more than 1-second response time on any task of the interface | Time and record the response time of every task and verify the 1-second response time. | 11/8/18 | Bennie Hill |
| FT7.3 | BR5: Allow users to export current and past items for inventory | Attempt to export current and past test orders. | 11/8/18 | Adrian Cordero, using customer login with GSE support |
| FT7.4 | BR6: Allow generating reports on logistical trends | Generate a report on logistical trends. | 11/8/18 | Adrian Cordero, using customer login with GSE support |
| FT7.5 | BR17: Allow user to interface with the map | Test the GUI to ensure a proper interface. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT7.6 | DR5: External interfaces (Excel and satellite imagery) | Check external interfaces for accuracy. | 11/1/18 | Adrian Cordero, using customer login with GSE support |
| FT7.7 | TR2: Frequency: upgrade | Ensure that the defined upgrade frequency is met by checking version numbers weekly. | 11/8/18 thru 11/10/18 | Steve Gomez |
3.4 TEST APPROACH(S)
Agile testing: functional tests are executed as soon as each functional requirement is built, with a test for each individual component to confirm that it operates as intended.
3.5 TEST REGULATORY / MANDATE CRITERIA
No regulatory testing is required. Testing must pass the criteria for the requirements listed in the table in Section 3.3.
3.6 TEST PASS / FAIL CRITERIA
Pass: each unit performs as intended by the requirements in the table in Section 3.3.
Fail: the unit does not perform as intended by the requirements in the table in Section 3.3.
3.7 TEST ENTRY / EXIT CRITERIA
Entry: prior testing phases are cleared and the program is ready for functional testing.
Exit: testing has concluded successfully without failures, or failures prevent testing from moving forward and require additional coding corrections.
3.8 TEST DELIVERABLES
Report detailing successful execution of the following sections:

Geo Location Service provides accurate longitude and latitude of each item, refreshed every 5 seconds.
• Allows user to view the last time the system was refreshed.
• Allows user to view optimal shipping routes.

Inventory Service provides item descriptions (weight, tracking number, part number, priority).
• Allows user to add or remove items.
• Allows user to view information on the individual who last scanned the document.

Shipping Service allows users to view the current item location.
• Allows user to see where the item came from.
• Allows user to see where the item is going.
• Provides user with current arrival and departure dates.

Account Service allows only authorized users into an account.
Inventory Attribute Service allows users to create item descriptions.
Item Risk Service allows users to set item priority (red, yellow, green).

API Gateway allows data to be archived for as long as the user needs.
• No more than 1-second response time on any part of the interface.
• Allows users to export current and past inventory items.
• Allows generating reports on logistical trends.
• Allows user to interface with the map.
• Provides external interfaces (Excel and satellite imagery).

Customer sign-off on the report deliverable.
3.9 TEST SUSPENSION / RESUMPTION CRITERIA
Testing is suspended by any bugs or behavior that does not comply with the requirements in the table in Section 3.3.
• Testing resumes after the bug or behavior is isolated and corrected.
Testing is also suspended by an inability to establish a connection with the servers due to weather, lack of connectivity, or a bug.
• Testing resumes once the weather clears, the connection is reestablished, or the bug is isolated and corrected.
4 REGRESSION TESTING

4.1 PURPOSE
Regression testing confirms that recent program or code changes do not adversely impact existing features. It is the full or partial re-execution of already-executed test cases to ensure that existing functionality still operates correctly.
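The partial-selection idea behind the regression plan in Section 4.3 can be sketched as follows (the coverage map and change list are illustrative, drawn from the unit tests listed in Section 2.3):

```python
# Map each service to the unit tests that cover it (subset of Section 2.3).
TEST_COVERAGE = {
    "Inventory Service": ["UT2", "UT2.1", "UT2.2"],
    "Account Service": ["UT4", "UT4.1"],
    "Inventory Attribute Service": ["UT5", "UT5.1"],
    "Geo Location Service": ["UT1", "UT1.1", "UT1.2"],
}

def select_regression_tests(changed_components):
    """Re-execute only the already-passed tests that cover a changed
    component, per the partial-selection approach in Section 4.1."""
    selected = []
    for component in changed_components:
        selected.extend(TEST_COVERAGE.get(component, []))
    return selected

# Changes driving Table 4.3: new functionality in two services, one bug fix.
changed = ["Inventory Service", "Account Service", "Inventory Attribute Service"]
print(select_regression_tests(changed))
```

Unchanged services (e.g. Geo Location, Shipping, Item Risk) select no tests, which matches the "No retest required" rows in Section 4.3.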
4.2 TEST RISKS / ISSUES
Because regression testing ensures that program modifications do not adversely impact other components of the platform, the main risk is missing a compatibility conflict, resulting in an unstable or unusable platform or other unintended negative consequences of the software changes. Additionally, a false positive could unnecessarily consume resources, resulting in higher program costs.
4.3 EXAMPLE OF REGRESSION TESTING PLAN (IF THESE ADJUSTMENTS WERE MADE)
| Test ID | Item to Test | Test Description | Scheduled Test Date | Responsibility |
|---------|--------------|------------------|---------------------|----------------|
| RT1 | Geo Location Service | No retest required | N/A | GS Engineering Team |
| RT2 | Inventory Service | New functionality added; rerun tests: integration testing of Inventory Service; rerun UT2 | 10/31/18 | GS Engineering Team |
| RT3 | Shipping Service | No retest required | N/A | GS Engineering Team |
| RT4 | Account Service | New functionality added; rerun tests: integration testing of Account Service; rerun UT4 | 10/31/18 | GS Engineering Team |
| RT5 | Inventory Attribute Service | Bug fixed; rerun tests: integration testing of Inventory Attribute Service; rerun UT5 | 10/31/18 | GS Engineering Team |
| RT6 | Item Risk Service | No retest required | N/A | GS Engineering Team |
| RT7 | API Gateway | API Gateway tested through retested services; no additional testing required | N/A | GS Engineering Team |
4.4 TEST APPROACH(S)
• Take components offline one at a time to test the impact on other services.
• Place different versions of each service online in random sequences to ensure no impact on other services.
4.5 TEST REGULATORY / MANDATE CRITERIA
Updating a component does not disable any non-dependent functionality of any other service.
4.6 TEST PASS / FAIL CRITERIA
Pass: the new/corrected service functions as expected, with no negative impact on other services.
Fail: the service does not function as expected or negatively impacts other services.
4.7 TEST ENTRY / EXIT CRITERIA
Entry: regression testing starts once errors from previous tests are addressed or added functionality is completed.
Exit: regression testing is complete once all required tests rerun successfully without errors that impact the required operation of the software. Completion includes all test execution logs, tester reports, and tester signatures for each re-executed test, compiled into a regression testing report that is approved and accepted by the customer's Project Manager.
4.8 TEST DELIVERABLES
1) Individual Test Reports, comprised of:
   a. Test Execution Log;
   b. Tester Report; and
   c. Tester Signature of Completed Test;
2) Compilation of all Test Reports; and
3) Acceptance by the customer PM.
4.9 TEST SUSPENSION / RESUMPTION CRITERIA
Testing is suspended when a test fails or yields an unexplained result. Testing resumes when the failure or unexplained result is addressed by the coding team. A report of the occurrence is logged, reviewed, and approved by the Global Software Engineering Team Lead.
5 VERIFICATION PROCESS

5.1 PURPOSE
Verification and validation together ensure that the product meets all of its requirements (verification) and that it is ultimately useful to the intended end user (validation).
5.2 TEST RISKS / ISSUES
The purpose of this verification testing is to ensure that program modifications do not adversely impact other components of the platform. The resulting risk is missing a compatibility conflict, leading to an unstable or unusable platform or other unintended negative consequences of the software changes. Additionally, a false positive could unnecessarily consume resources, resulting in higher program costs.
5.3 ITEMS TO BE TESTED / NOT TESTED
| Test ID | Item to Test | Test Description | Scheduled Test Date | Responsibility |
|---------|--------------|------------------|---------------------|----------------|
| VT1 | Geo Location Service | Execute style checker; execute static analysis; execute robustness analysis; run consistency checking | 11/12/18 | GS Engineering Team |
| VT2 | Inventory Service | Execute style checker; execute static analysis; execute robustness analysis; run consistency checking | 11/12/18 | GS Engineering Team |
| VT3 | Shipping Service | Execute style checker; execute static analysis; execute robustness analysis; run consistency checking | 11/14/18 | GS Engineering Team |
| VT4 | Account Service | Execute style checker; execute static analysis; execute robustness analysis; run consistency checking | 11/14/18 | GS Engineering Team |
| VT5 | Inventory Attribute Service | Execute style checker; execute static analysis; execute robustness analysis; run consistency checking | 11/18/18 | GS Engineering Team |
| VT6 | Item Risk Service | Execute style checker; execute static analysis; execute robustness analysis; run consistency checking | 11/18/18 | GS Engineering Team |
| VT7 | API Gateway | Execute style checker; execute static analysis; execute robustness analysis; run consistency checking | 11/12/18 | GS Engineering Team |
5.4 TEST APPROACH(S)
Each service shall receive a week of testing, with two GS engineering teams each handling one service.
1) Style Checker: leverage an automated test to confirm compliance with standard coding practice.
2) Static Analysis: leverage an automated test for the standard coding errors identified during unit testing, run against the final product.
3) Consistency Checking and Robustness Analysis: rerun the functional use cases on the completed programs prior to final acceptance to confirm that errors have been successfully identified in the final product.
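As a small illustration of what an automated style check does (the plan does not name a specific tool; the sketch below uses Python's standard ast module for a toy check that flags undocumented functions, one of many rules a real style checker would enforce):

```python
import ast

def find_undocumented_functions(source):
    """Toy style check: flag functions that lack a docstring.
    Production verification would use dedicated style/static-analysis
    tooling rather than this sketch."""
    tree = ast.parse(source)
    return [node.name
            for node in ast.walk(tree)
            if isinstance(node, ast.FunctionDef)
            and ast.get_docstring(node) is None]

sample = '''
def documented():
    """Has a docstring."""
    return 1

def undocumented():
    return 2
'''
print(find_undocumented_functions(sample))  # ['undocumented']
```

Each violation reported this way would be logged against the relevant VT row in Section 5.3 before the consistency and robustness passes run.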
5.5 TEST REGULATORY / MANDATE CRITERIA
No regulatory requirements. Testing must pass the criteria for the requirements listed in the table in Section 3.3.
5.6 TEST PASS / FAIL CRITERIA
Pass: each test performs as intended by the requirements in the table in Section 3.3, without bugs, crashes, loops, deadlocks, or freezes.
Fail: the unit does not perform as intended by the requirements in the table in Section 3.3, or is deemed unstable by any test.
5.7 TEST ENTRY / EXIT CRITERIA
Entry: prior testing phases are cleared and the program is ready for verification testing.
Exit: testing has concluded successfully without failures and the program is deemed stable under all conditions and tests, or failures prevent testing from moving forward and require additional coding corrections.
5.8 TEST DELIVERABLES
Report detailing successful execution of the following sections without bugs, crashes, loops, deadlocks, or freezes:

Geo Location Service provides accurate longitude and latitude of each item, refreshed every 5 seconds.
• Allows user to view the last time the system was refreshed.
• Allows user to view optimal shipping routes.

Inventory Service provides item descriptions (weight, tracking number, part number, priority).
• Allows user to add or remove items.
• Allows user to view information on the individual who last scanned the document.

Shipping Service allows users to view the current item location.
• Allows user to see where the item came from.
• Allows user to see where the item is going.
• Provides user with current arrival and departure dates.

Account Service allows only authorized users into an account.
Inventory Attribute Service allows users to create item descriptions.
Item Risk Service allows users to set item priority (red, yellow, green).

API Gateway allows data to be archived for as long as the user needs.
• No more than 1-second response time on any part of the interface.
• Allows users to export current and past inventory items.
• Allows generating reports on logistical trends.
• Allows user to interface with the map.
• Provides external interfaces (Excel and satellite imagery).

Customer and peer review sign-off on the report deliverable.
5.9 TEST SUSPENSION / RESUMPTION CRITERIA
Testing is suspended when a test fails or yields an unexplained result. Testing resumes when the failure or unexplained result is addressed by the coding team. A report of the occurrence is logged, reviewed, and approved by the Global Software Engineering Team Lead.
6 VALIDATION PROCESS

6.1 TEST RISKS / ISSUES
A failed test could result in product delays and/or significant re-coding.
6.2 ITEMS TO BE TESTED / NOT TESTED
| Test ID | Item to Test | Test Description | Scheduled Test Date | Responsibility |
|---------|--------------|------------------|---------------------|----------------|
| VT1 | Geo Location Service | Prototyping; modeling; model checking; goal analysis; model/specification inspection | 12/01/18 | GS Engineering Team 1 |
| VT2 | Inventory Service | Prototyping; modeling; model checking; goal analysis; model/specification inspection | 12/01/18 | GS Engineering Team 2 |
| VT3 | Shipping Service | Prototyping; modeling; model checking; goal analysis; model/specification inspection | 12/08/18 | GS Engineering Team 1 |
| VT4 | Account Service | Prototyping; modeling; model checking; goal analysis; model/specification inspection | 12/08/18 | GS Engineering Team 2 |
| VT5 | Inventory Attribute Service | Prototyping; modeling; model checking; goal analysis; model/specification inspection | 12/15/18 | GS Engineering Team 1 |
| VT6 | Item Risk Service | Prototyping; modeling; model checking; goal analysis; model/specification inspection | 12/15/18 | GS Engineering Team 2 |
| VT7 | API Gateway | Prototyping; modeling; model checking; goal analysis; model/specification inspection | 12/22/18 | GS Engineering Teams 1 & 2 |
6.3 TEST APPROACH(S)
Each service shall receive a week of testing, with two GS engineering teams each handling one service.
1) Prototyping: test a near-final version of the app by creating a user account and running through a typical customer experience, including logging in, placing orders, removing orders, canceling orders, and exercising the Geo Location and account services.
2) Modeling: use the prototyping results to design a software model that best fits the final solution, using UML (Unified Modeling Language).
3) Model Checking: run through every possible event to ensure there are no system crashes, loops, or deadlocks, while ensuring every service functions as described in Section 3.3.
4) Goal Analysis: non-functional testing performed by a select group of testers who use the app during a trial phase to refine and analyze quality requirements, manage change, and assess customer acceptance.
5) Model/Specification Inspection: a group of peer reviewers, chosen by Steve Gomez, looks for any defects missed by the above tests.
6.4 TEST REGULATORY / MANDATE CRITERIA
No regulatory testing is required. Testing must pass the criteria for the requirements listed in the table in Section 3.3.
6.5 TEST PASS / FAIL CRITERIA
Pass: each test performs as intended by the requirements in the table in Section 3.3, without bugs, crashes, loops, deadlocks, or freezes.
Fail: the unit does not perform as intended by the requirements in the table in Section 3.3, or is deemed unstable by any test.
6.6 TEST ENTRY / EXIT CRITERIA
Entry: prior testing phases are cleared and the program is ready for validation testing.
Exit: testing has concluded successfully without failures and the program is deemed stable under all conditions and tests, or failures prevent testing from moving forward and require additional coding corrections.
6.7 TEST DELIVERABLES
Report detailing successful execution of the following sections without bugs, crashes, loops, deadlocks, or freezes:

Geo Location Service provides accurate longitude and latitude of each item, refreshed every 5 seconds.
• Allows user to view the last time the system was refreshed.
• Allows user to view optimal shipping routes.

Inventory Service provides item descriptions (weight, tracking number, part number, priority).
• Allows user to add or remove items.
• Allows user to view information on the individual who last scanned the document.

Shipping Service allows users to view the current item location.
• Allows user to see where the item came from.
• Allows user to see where the item is going.
• Provides user with current arrival and departure dates.

Account Service allows only authorized users into an account.
Inventory Attribute Service allows users to create item descriptions.
Item Risk Service allows users to set item priority (red, yellow, green).

API Gateway allows data to be archived for as long as the user needs.
• No more than 1-second response time on any part of the interface.
• Allows users to export current and past inventory items.
• Allows generating reports on logistical trends.
• Allows user to interface with the map.
• Provides external interfaces (Excel and satellite imagery).

Customer and peer review sign-off on the report deliverable.
6.8 TEST SUSPENSION / RESUMPTION CRITERIA
Testing is suspended by any bugs, loops, deadlocks, freezes, or behavior that does not comply with the requirements in Section 3.3, or that does not receive sign-off under the criteria in Section 6.6.
• Testing resumes after the bug or behavior is isolated and corrected and the test is re-run.
7 MITIGATION STRATEGIES

7.1 TEST RISKS / ISSUES
The primary risks of an insufficient testing process are increased cost and schedule, requiring additional time and funding and impacting Global Software Enterprises' revenue and potentially its reputation. The most likely causes of cost and schedule growth are:

Poorly targeted/misaligned tests, which miss issues that must then be addressed later in the software development process (the later an issue is discovered, the greater the cost in time and resources to address it) in the following areas:
• Product Stability
• Product Security
• Product Functionality

Excessive testing of properly functioning components, where addressing a false positive consumes additional time and resources.
7.2 MITIGATION APPROACH

7.2.1 Poorly Targeted/Misaligned Test Mitigation Strategy
Our test plan leverages best practices to target known common software design issues; however, as the project progresses, we will re-evaluate our test plan to ensure it remains aligned to the risks posed to the software.
7.2.2 Excessive Testing Risk Mitigation Strategy
To reduce the risk of excessive testing, we will re-run a failed test step before requiring remediation. If the step fails a second time, we will begin remediation analysis. If it passes on the re-run, we will run it once more to confirm that the original failure was a fluke. Because software issues are cheaper to address the earlier they are caught, we will err on the side of over-testing and remediating rather than releasing software with errors that could cause issues or vulnerabilities in the field. In addition to addressing these two risks, we will release functionality upgrades and patches per contractual requirements, and we will release security patches to all customers regardless of contractual status.
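The retry policy above can be sketched as follows. This framing (a test step exposed as a pass/fail callable) is our assumption, not something the plan specifies:

```python
def classify_failure(run_step):
    """Apply the Section 7.2.2 policy to a test step that has just failed:
    rerun it once; a second failure triggers remediation, while a pass is
    confirmed with one more run before the original failure is written off
    as a fluke. `run_step` is any callable returning True (pass) or
    False (fail)."""
    if run_step():            # first rerun after the initial failure
        if run_step():        # confirmation run
            return "fluke"    # original failure not reproducible
        return "remediate"    # confirmation run failed
    return "remediate"        # failed twice in a row

# Illustrative flaky step: passes on the rerun and on the confirmation run.
outcomes = iter([True, True])
print(classify_failure(lambda: next(outcomes)))  # fluke
```

The "remediate" outcome feeds the remediation analysis described above; "fluke" lets testing proceed without consuming remediation resources on a false positive.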
TEST PLAN APPROVAL
The undersigned acknowledge that they have reviewed the WORLDWIDE SUPPLY CHAIN ANALYTICS Test Plan and agree with the approach it presents. Any changes to this Test Plan will be coordinated with, and approved by, the undersigned or their designated representatives.
Signature: Signature on File
Date: 10/22/18
Print Name: REDACTED - See Name on File
Title: Project Manager
Role: Customer Project Manager

Signature: Signature on File
Date: 10/22/18
Print Name: Adrian Cordero
Title: GS Engineer
Role: GS Project Manager

Signature: Signature on File
Date: 10/22/18
Print Name: Bennie Hill
Title: GS Engineer
Role: GS Engineering Team One Lead

Signature: Signature on File
Date: 10/22/18
Print Name: Steve Gomez
Title: GS Engineer
Role: GS Engineering Team Two Lead