Senior Design Documentation Library
System Test Plan
Department of Computer Science and Engineering UNIVERSITY OF TEXAS AT ARLINGTON
Team AVIAR
Child-Parent Pairing System
Team Members: Bhavin Mistry, Brandon Gallagher, Brandon Salahat, Eugen Eremia, Tony Kieu
Last Updated: 8 April 2013 @ 11:11:00 AM
Senior Design Documentation Library
8 April 2013 @ 11:11:00 AM 1 Table of Contents
Table of Contents
Table of Contents .......................................................................................................... 1
List of Tables ................................................................................................................. 3
List of Figures ............................................................................................................... 4
Document Revision History ......................................................................................... 5
1 Introduction ......................................................................................................... 6
1.1 Document Overview ............................................................................................. 6
1.2 Purpose ............................................................................................................... 7
1.3 Scope .................................................................................................................. 7
1.4 Definitions and Acronyms .................................................................................... 7
2 References .......................................................................................................... 8
2.1 Overview .............................................................................................................. 8
2.2 System Requirement Specification ...................................................................... 8
2.3 Architecture Design Specification ....................................................................... 10
2.4 Detailed Design Specification............................................................................. 14
3 Test Items .......................................................................................................... 17
3.1 Overview ............................................................................................................ 17
3.2 Relational Diagram ............................................................................................ 18
3.3 Hardware Tests ................................................................................................. 18
3.4 Unit Tests .......................................................................................................... 19
3.5 Component Tests ............................................................................................... 21
3.6 Integration Tests ................................................................................................ 22
3.7 System Verification Tests ................................................................................... 23
4 Risks .................................................................................................................. 24
4.1 Overview ............................................................................................................ 24
4.2 Risks Table ........................................................................................................ 24
5 Features to be tested ....................................................................................... 26
5.1 Overview ............................................................................................................ 26
5.2 Testable Features .............................................................................................. 26
6 Features Verified by Design ............................................................................ 29
6.1 Overview ............................................................................................................ 29
6.2 Packaging Requirements ................................................................................... 29
6.3 Performance Requirements ............................................................................... 29
6.4 Safety Requirements ......................................................................................... 29
6.5 Maintenance and Support Requirements ........................................................... 29
6.6 Other Requirements ........................................................................................... 29
7 Overall Test Strategy ........................................................................................ 30
7.1 Overview ............................................................................................................ 30
7.2 Configurations .................................................................................................... 30
7.3 Strategy ............................................................................................................. 30
7.4 Metrics ............................................................................................................... 31
7.5 Regression......................................................................................................... 31
8 Acceptance Criteria .......................................................................................... 32
8.1 Overview ............................................................................................................ 32
8.2 Hardware Tests ................................................................................................. 32
8.3 Unit Tests .......................................................................................................... 32
8.4 Component Tests ............................................................................................... 32
8.5 Integration Tests ................................................................................................ 33
8.6 System Verification Tests ................................................................................... 33
9 Test Deliverables .............................................................................................. 34
9.1 Overview ............................................................................................................ 34
9.2 Deliverables ....................................................................................................... 34
10 Test Schedule ................................................................................................... 35
10.1 Overview ............................................................................................................ 35
10.2 Test Schedule .................................................................................................... 35
11 Approvals .......................................................................................................... 36
List of Tables
Table 2-1: Data Elements Description ........................................................................... 13
Table 2-2: Requirements Mapping Table ...................................................................... 14
Table 2-3: Requirements Traceability Matrix ................................................................. 16
Table 3-1: Hardware Tests ............................................................................................ 19
Table 3-2: Unit Tests ..................................................................................................... 21
Table 3-3: Component Tests ......................................................................................... 22
Table 3-4: Integration Tests .......................................................................................... 23
Table 3-5: System Verification Tests ............................................................................. 23
Table 4-1: Risks Table .................................................................................................. 25
Table 10-1: Test Schedule ............................................................................................ 35
List of Figures
Figure 1-1: Child-Parent Pairing System Concept ........................................................... 6
Figure 2-1: Architecture Overview ................................................................................. 10
Figure 2-2: Data Flow between Subsystems ................................................................. 12
Figure 2-3: Data Flow between Modules and Subsystems ........................................... 15
Figure 3-1: Relational Diagram ..................................................................................... 18
Document Revision History
Revision # Revision Date Description
0.1 03/25/2013 Initial Draft
0.2 03/26/2013 Review 1 on the Initial Draft
0.3 03/27/2013 Review 2 on the Initial Draft
1.0 03/28/2013 Release Initial Draft
2.0 04/08/2013 Baseline
1 Introduction
Figure 1-1: Child-Parent Pairing System Concept
1.1 Document Overview
The System Test Plan Document (STPD) specifies the testing procedures for the Child-Parent Pairing System. Test methods were introduced in the Architectural Design Specification (ADS) and Detailed Design Specification (DDS); this document builds on the procedures laid out there and specifies testing in much greater detail. The STPD will detail the test procedures and the
metrics that will be used to ensure all requirements specified in the System Requirements Specification (SRS) have been met. The testing process will be composed of system verification testing, integration testing, component testing, unit testing, and hardware testing and shall be described in the following document body.
1.2 Purpose
The Child-Parent Pairing System (CPPS) will be a user-friendly system that mitigates the risk of a child being released to the wrong adult at child-care facilities. The system will verify the identity of the individuals dropping off and picking up children at the facility, then alert the child that it is time to be picked up via a wearable alert device. The CPPS will use radio communication to activate the wearable device once a parent or guardian has verified their identity against what is stored in the local database. The system shall consist of GUI software for the facility staff; a card reader, camera, and keypad with which the parent verifies their identity; and a wearable unit to alert the child that it is time to leave the facility.
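The verify-then-notify flow described above can be sketched in a few lines. This is a hypothetical illustration only: the function names (`verify_guardian`, `notify_unit`), the card/PIN pairing scheme, and the in-memory database stand-in are assumptions, not part of the CPPS design.

```python
import hashlib

# Local-database stand-in: card ID -> (salted PIN hash, child's unit ID).
# The salt scheme and SHA-256 are illustrative assumptions.
REGISTERED = {
    "CARD123": (hashlib.sha256(b"salt:4321").hexdigest(), 7),
}

def verify_guardian(card_id: str, pin: str):
    """Return the child's unit ID if the card/PIN pair matches, else None."""
    record = REGISTERED.get(card_id)
    if record is None:
        return None
    pin_hash, unit_id = record
    if hashlib.sha256(f"salt:{pin}".encode()).hexdigest() == pin_hash:
        return unit_id
    return None

def notify_unit(unit_id: int) -> str:
    """Stand-in for the radio transmission that lights the wearable's LED."""
    return f"unit {unit_id} notified"

unit = verify_guardian("CARD123", "4321")
if unit is not None:
    print(notify_unit(unit))  # unit 7 notified
```

The key property the sketch captures is that the wearable unit is only activated after the guardian's credentials match the stored record.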
1.3 Scope
The Child-Parent Pairing System is intended for use in facilities where childcare is provided: daycare centers, churches with children’s services or programs (e.g. Sunday school), or schools with after-school programs or weekend classes. The consumers will therefore generally be organizations or companies rather than individuals.
1.4 Definitions and Acronyms
Abbreviation Definition
H Hardware Testing
U Unit Testing
C Component Testing
I Integration Testing
S System Testing
CPPS Child-Parent Pairing System
GUI Graphical User Interface
ADS Architectural Design Specification
DDS Detailed Design Specification
SRS System Requirements Specification
I/O Input/Output
2 References
2.1 Overview
The STPD draws from elements of the SRS, ADS, and DDS documents to ensure that all aspects of the CPPS have been tested and verified to the best of the team’s ability.
2.2 System Requirement Specification
The SRS outlines the system requirements of the CPPS.
2.2.1 Customer Requirements
2.2.1.1 Child-Care Facility
2.2.1.2 Easy Pairing
2.2.1.3 Secure Pairing
2.2.1.4 Registration
2.2.1.5 Identification Pairing Registration
2.2.1.6 Wearable Unit
2.2.1.7 Wireless Wearable Unit
2.2.1.8 Age Limitations
2.2.1.9 Charging Rack
2.2.1.10 Data Security
2.2.1.11 Pick-up Notification
2.2.1.12 Visual Notification
2.2.1.13 Small Form Factor
2.2.1.14 Light Weight
2.2.1.15 Base Unit
2.2.1.16 Check-in
2.2.1.17 Check-out
2.2.1.18 Software Authentication
2.2.1.19 Graphical User Interface
2.2.2 Packaging Requirements
2.2.2.1 Hardware Component
2.2.2.2 Software Component
2.2.2.3 Power Source
2.2.3 Performance Requirements
2.2.3.1 Open Space Operation
2.2.3.2 Walled Space Operation
2.2.3.3 Test Connection
2.2.3.4 Type of Power Source
2.2.3.5 Operation Time
2.2.3.6 Durability
2.2.3.7 Call Time
2.2.3.8 Charge Time
2.2.4 Safety Requirements
2.2.4.1 Operational Frequency
2.2.4.2 Cutting Prevention
2.2.4.3 Material Toxicity
2.2.4.4 Eye Protection
2.2.4.5 Removability
2.2.5 Maintenance and Support Requirements
2.2.5.1 Testing
2.2.5.2 Code Documentation
2.2.5.3 System Maintenance
2.2.6 Other Requirements
2.2.6.1 Operating System
2.2.6.2 Wireless Communication Interference
2.2.6.3 Standard English Documentation
2.2.6.4 Standard US Time Stamp
2.3 Architecture Design Specification
The ADS outlines the planned architecture of the CPPS.
2.3.1 Layer Overview
The CPPS’ design will be layered as defined by Figure 2-1.
Figure 2-1: Architecture Overview
2.3.1.1 I/O Layer
The I/O Layer is responsible for handling staff/parent input and GUI output of the CPP System. The I/O Layer takes input from the staff or parent, performs minimal formatting on it, and passes it on to the Application Layer. The I/O Layer also takes messages from the Application Layer and passes them on to the GUI.
2.3.1.2 Application Layer
The Application Layer is the core framework that controls all activities in the CPPS. Data for the Application Layer comes from the I/O Layer and the Data Storage Layer. For system management, the Application Layer is responsible for storing staff, user, and device data to the Data Storage Layer. For pairing events, the Application Layer requests resources from the Data Storage Layer before communicating with the Communication Layer.
2.3.1.3 Data Storage Layer
The Data Storage Layer stores all relevant staff authentication data, parents’ data, children’s data, pairing data, and units’ data. It interacts with the Services Subsystem in the Application Layer. When any of the services requests data in order to perform its actions, the Data Storage Layer replies with the requested information. Similarly, when services from the Application Layer need to store data, the Data Storage Layer ensures that it is done.
2.3.1.4 Communication Layer
The Communication Layer of the CPPS will process the raw data packets received from the Application Layer for testing or notification of a particular child unit. This layer will encode the data packets and then transmit them securely to the Child Unit Layer.
2.3.1.5 Child Unit Layer
The Child Unit Layer will be responsible for securely and accurately notifying the right child of the parent’s arrival. This layer will receive encoded data packets from the Communication Layer and process them to notify the right child unit.
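As a rough sketch of how the Communication and Child Unit Layers might frame a unit ID, the following encodes an 8-bit unit ID into a 16-byte packet and decodes it back. The frame layout (preamble, complement check byte, zero padding) is an assumption for illustration; the actual encoding is defined in the DDS.

```python
# Assumed 2-byte preamble marking the start of a frame (not from the design).
PREAMBLE = b"\xAA\x55"

def encode_packet(unit_id: int) -> bytes:
    """Build a 16-byte frame carrying an 8-bit unit ID plus a check byte."""
    if not 0 <= unit_id <= 0xFF:
        raise ValueError("unit ID must fit in 8 bits")
    # ID followed by its bitwise complement as a simple integrity check.
    body = PREAMBLE + bytes([unit_id, unit_id ^ 0xFF])
    return body.ljust(16, b"\x00")  # pad to the 16-byte frame size

def decode_packet(frame: bytes):
    """Return the unit ID, or None if the frame fails validation."""
    if len(frame) != 16 or frame[:2] != PREAMBLE:
        return None
    unit_id, check = frame[2], frame[3]
    return unit_id if unit_id ^ 0xFF == check else None

assert decode_packet(encode_packet(42)) == 42  # round trip succeeds
```

A child unit would then compare the decoded ID against its own and illuminate its LED only on a match, which is the behavior tests U17 and U18 below exercise.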
2.3.2 Subsystem Overview
Each layer in the CPPS will further be divided into smaller subsystems as detailed in Figure 2-2.
Figure 2-2: Data Flow between Subsystems
2.3.3 Inter-Subsystem Data Flow
The data flows shown in Figure 2-2 are defined in Table 2-1.
# Description
I1 Digital Image Format from the Webcam
I2 Text String from the Keyboard and Action Events from the Mouse
I3 Text String from the Card Scanner
I4 Text String from the Keypad
I5 Input Objects that contain staff members or parents information
A1 Output Objects that contain status messages and system data
A2 Data Objects that contain staff members, parents and devices information
A3 Data Objects that contain staff members, parents and devices information
A4 Requested Data that is used for storing, matching, updating or editing
A5 Data Objects that contain devices and pairing information
A6 Unit ID
D1 Responses to the data request
D2 Query to the Database Subsystem
D3 Result query from the Database Subsystem
C1 Digital Data Packets generated from the USB Driver Subsystem
C2 Analog Data Packets that will be encoded for transmission
C3 Analog Data Packets that will be sent for Child Unit Layer
U1 Analog Data Packets that are received by Child Unit Layer
U2 Digital Data Packets used to display the notification on the Child Unit
Table 2-1: Data Elements Description
2.3.4 Requirements Mapping
# Requirement I/O Application Data Storage Communication Child Unit
3.1 Child-Care Facility
3.2 Easy Pairing
3.3 Secure Pairing
3.4 Registration
3.5 Identification Pairing Registration
3.6 Wearable Unit
3.7 Wireless Wearable Unit
3.8 Age Limitations
5.1 Open Space Operation
5.2 Walled Space Operation
5.3 Test Connection
5.4 Type of Power Source
5.5 Length of Operation Time
5.6 Durability
6.1 Operation Frequency
6.2 Cutting Prevention
6.3 Material Toxicity
6.4 Eye Protection
6.5 Removability
8.1 Operating System
Table 2-2: Requirements Mapping Table
2.4 Detailed Design Specification
2.4.1 Module Decomposition Chart
The subsystems detailed in the ADS have been divided into smaller modules with data flows as described in Figure 2-3.
Figure 2-3: Data Flow between Modules and Subsystems
2.4.2 Requirements Traceability Matrix
MODULES: Image Preprocessor, Text Preprocessor, Input Object Converter, Output Object Converter, Parser, User Object Handler, Pairing Object Handler, Abstract Request Builder, Data Decomposer, Unit ID Parser, Authenticator, Database, Packet Sender, Encoder, Converter, Radio Receiver, Decoder, Notification

REQUIREMENTS:
3.1 Child-Care Facility
3.2 Easy Pairing
3.3 Secure Pairing
3.4 Registration
3.5 Identification Pairing Registration
3.7 Wireless Wearable Unit
5.1 Open Space Operation
5.2 Walled Space Operation
5.3 Test Connection
5.4 Type of Power Source
5.5 Length of Operation Time
5.6 Durability
6.1 Operation Frequency
8.1 Operating System
Table 2-3: Requirements Traceability Matrix
3 Test Items
3.1 Overview
This section describes the testing strategy and approach team AVIAR shall use to verify that the CPPS meets the established requirements. The test plan consists of five phases:
1. Hardware Testing: tests performed on the vendor hardware
2. Unit Testing: tests performed on the subsystem modules
3. Component Testing: tests performed on all subsystems in a single layer
4. Integration Testing: tests performed on all the layers
5. System Testing: tests performed on the whole system against the established requirements
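The gating between phases can be expressed as a small ordering check: each phase runs only after its predecessor passes. This sketch is illustrative; the phase names follow the list above, but the pass/fail results are placeholders, not actual test outcomes.

```python
# Phases in dependency order, as listed in the plan above.
PHASES = ["Hardware", "Unit", "Component", "Integration", "System"]

def run_plan(results: dict) -> list:
    """Run phases in order, stopping at the first failing phase."""
    executed = []
    for phase in PHASES:
        executed.append(phase)
        if not results.get(phase, False):
            break  # downstream phases depend on this one passing
    return executed

# A failure in Component testing blocks Integration and System testing:
print(run_plan({"Hardware": True, "Unit": True, "Component": False}))
# ['Hardware', 'Unit', 'Component']
```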
Figure 3-1 represents the system test plan. Each test phase on the left is a dependency of the test phase on its right, and each phase must be successfully completed before moving on to the next. Each phase is decomposed into different units connected by the arrow paths. The subsystems and layers have been colored to match the prior documents, which include the System Requirement Specification (SRS), Architectural Design Specification (ADS), and Detailed Design Specification (DDS). The testing strategy and approach are described in table format; the table headers include inputs, expected outputs, risk level, and associated risks.
3.2 Relational Diagram
Figure 3-1: Relational Diagram
3.3 Hardware Tests
ID Hardware Input Expected Output Risk Level Associated Risks
H1 Webcam Connect the host computer to the Internet and use the online webcam testing software www.testmycam.com
The website shall provide a video buffer from the webcam.
High R8
H2 Keyboard/Mouse Press a random button on mouse or keyboard
Digital signal identifying button pressed.
Medium R10
H3 Card Reader Open any text editor in the host computer and slide a magnetic card through the card scanner. One beeping noise from the card scanner and string output of the magnetic card information. High R9
H4 Keypad Press a random button on keypad
Digital signal identifying button pressed.
Medium R10
H5 USB Driver Packet sent from the computer in 8 bit binary number
Correct packet of 8 bit binary number
Low R6
H6 Transmitter Packet sent as encoded analog data packet with 16 byte data
Test packet identical to the sent 16 byte data packet
High R1, R6
H7 Receiver Analog data packet 8 bit binary number containing unit ID
High R1, R2, R3
H8 LED 5 volt electricity LED illuminates Medium R2, R3, R4
Table 3-1: Hardware Tests
3.4 Unit Tests
ID Modules Input Expected Output Risk Level Associated Risks
U1 Image Preprocessor
Buffered image from the webcam
String objects that contain the image directory
Low R8
U2 Text Preprocessor Buffered text from keyboard/mouse, card scanner and keypad
String objects that contain the input text
High R9, R10
U3 Input Object Converter
String Objects from Image and Text Preprocessor
Object that contains Strings from Image and Text Preprocessor Modules
High R8, R9, R10
U4 Output Object Converter Output data objects for the GUI set Data in the form of displayed text or image on the GUI fields. Medium R7, R8, R9, R10
U5 Parser Object that contains both Strings from Image and Text Preprocessor Modules.
Object that contains data for user management or pairing events
Medium R8, R9, R10
U6 User Object Handler
Object that contains data for user management
Abstract requested command for user data
High R7
U7 Pairing Object Handler
Object that contains data for events
Abstract requested command for parent/child and device data
High R7
U8 Abstract Request Builder
Abstract requested command that contains either user or parent/child/device data
Abstract requested data
Medium R7
U9 Data Decomposer Responses Data Object Responses objects of user or device
Medium R7
U10 Unit ID Parser Unit Data Object that contains Child and Unit information
Unit ID in the form of Integer
High R7
U11 Authenticator Username and password combination or magnetic card ID and PIN combination
SQL query containing salted and hashed password or PIN
Medium R9, R10
U12 Database SQL queries Database records High R7
U13 Packet Sender Digital data packet sent with 8-bit binary number
Digital data packet of 8-bit binary number
Medium R1, R6
U14 Encoder Digital data packet of unit id
Binary packet that contains encoded unit ID in digital data format
Low R7
U15 Converter Digital data packet of unit ID
Analog data packet that contains unit ID.
High R6
U16 Radio Receiver Analog data packets of unit ID Packets converted to digital data packet of unit ID High R1, R2, R3, R6
U17 Decoder Encoded digital data packet that contains unit ID
Unit ID which is in format of 8-bit binary number.
High R1, R2, R3, R6
U18 Notification 8-bit binary number of the unit ID
LED illuminates when the right ID received
Medium R1, R2, R3, R4, R6
Table 3-2: Unit Tests
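Test U11 expects the Authenticator to emit a salted and hashed credential rather than a plaintext password or PIN. The SRS does not fix the algorithm, so the sketch below assumes SHA-256 with a random per-user salt; the function names are illustrative.

```python
import hashlib
import hmac
import os

def hash_pin(pin, salt=None):
    """Hash a PIN with a random 16-byte salt; returns (salt, hex digest)."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.sha256(salt + pin.encode()).hexdigest()
    return salt, digest

def verify_pin(pin, salt, stored_digest):
    """Recompute the salted hash and compare in constant time."""
    _, digest = hash_pin(pin, salt)
    # hmac.compare_digest avoids leaking digest prefixes via timing
    return hmac.compare_digest(digest, stored_digest)

salt, stored = hash_pin("4321")
assert verify_pin("4321", salt, stored)       # correct PIN verifies
assert not verify_pin("0000", salt, stored)   # wrong PIN is rejected
```

In the CPPS flow, only `stored` and `salt` would ever reach the Database subsystem, which is the property U11's expected output is checking.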
3.5 Component Tests
ID Subsystems Input Expected Output Risk Level Associated Risks
C1 Input Buffered text and image input
Object that contains both String objects from Image and Text Preprocessor Modules
High R8, R9, R10
C2 Output Output data objects for the GUI set
Data in the form of displayed text or image on the GUI fields.
Low R7
C3 Data Processing Object that contains data from Input Subsystem of the I/O Layer
Abstract requested commands for user data or pairing events
Medium R8, R9, R10
C4 Data Management Abstract requested commands for user data or pairing events
Abstract requested data
Medium R8, R9, R10
C5 Unit Unit data object that contains child and unit information
Unit ID in the form of Integer
High R7
C6 Authentication Instantiated objects and authentication details (username/password or magnetic card ID/PIN) Boolean value that indicates whether the user is authorized to perform the requested action(s) Medium R7
C7 Database Formatted, syntactically correct database queries
Database result set(s) containing accurate records that have been previously stored
High R7
C8 USB Driver Digital data packet sent with 8-bit binary number
Digital data packet of 8-bit binary number
Medium R6
C9 Encoding Digital data packet of unit id
Binary packet that contains encoded unit ID in digital data format
Low R6
C10 Transmission Digital data packet of unit ID
Analog data packet that contains unit ID.
High R6
C11 Radio Analog data packets of unit ID
Packets converted to digital data packet of unit ID
High R1
C12 Processing Encoded digital data packet that contains unit ID
Unit ID which is in format of 8-bit binary number.
High R2, R3
C13 Notification 8-bit binary number of the unit ID
LED illuminates when the right ID received
Medium R4
Table 3-3: Component Tests
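Test C7's store-then-retrieve check can be exercised with any relational store. The sketch below uses an in-memory SQLite database; the table layout is an assumption for illustration, since the CPPS schema is defined in the DDS rather than here.

```python
import sqlite3

# In-memory database stand-in; the schema below is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE children (unit_id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO children VALUES (?, ?)",
                 [(7, "Alice"), (9, "Bob")])

# C7: a syntactically correct query must return the previously stored record.
rows = conn.execute(
    "SELECT name FROM children WHERE unit_id = ?", (7,)).fetchall()
assert rows == [("Alice",)]  # stored record comes back intact
conn.close()
```

Parameterized queries (the `?` placeholders) also keep the Database subsystem safe from injection through the card reader or keypad strings.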
3.6 Integration Tests
ID Layers Input Expected Output Risk Level Associated Risks
I1 I/O Buffered text and image input
Displayed data on GUI set
High R8, R9, R10
I2 Application Object that contains data from Input Subsystem of the I/O Layer Abstract requested data and Unit ID High R7, R8, R9, R10
I3 Data Storage Staff, parent, child, or device storage or extraction requests
Storage confirmation or requested columns from existing database records
High R7
I4 Communication 8-bit binary number unit ID in digital data format
Analog data packet that contains the encoded unit ID in 8-bit binary number.
Medium R1, R6
I5 Child Unit Analog data packet that contains the unit ID If processing identifies the correct unit, the LED illuminates to notify the child Medium R1, R2, R3, R4
Table 3-4: Integration Tests
3.7 System Verification Tests
ID System Input Expected Output Risk Level Associated Risks
S1 Notification Admin clicks to notify the unit
LED illuminates on the unit to identify the child
High R1, R2, R3, R4, R5, R6
Table 3-5: System Verification Tests
4 Risks
4.1 Overview
The risks section shall identify any risks associated with the System Test Plan for the CPPS. The impact, severity and affected components shall be assessed in the risk table and Team AVIAR shall develop a management plan for each risk.
4.2 Risks Table
ID Risk Impact Severity Affected Components Management Plan
R1 Interference with the radio waves
Unit ID radio signals will not be transmitted
High Communication Layer
Change the operational frequency until an interference free one is found
R2 Child unit is not charged
Unit ID signal cannot be received
High Child Unit Make sure the unit is charged before putting it on a child
R3 Child unit does not hold a charge
Unit ID signals cannot be received
High Child Unit Have backup capacitors on hand
R4 Child unit’s LED malfunctions and stops working
Child will not be able to see the notification
High Child Unit Have backup LEDs on hand
R5 Child unit comes loose
Child loses the unit and will not be able to be notified
High Child Unit Replace the unit with one that has a better grip
R6 Base Unit malfunctions and stops working
Unit ID radio signals cannot be transmitted
High Communication Layer
Have a backup base unit on hand and make sure all components are connected properly
R7 Database malfunctions and stops working
Data cannot be loaded into the system
High Storage Layer Have a backup computer with the database software installed and backup data loaded
R8 Webcam malfunctions and stops working
Photos of parents cannot be taken
Medium I/O Layer Have a backup webcam on hand
R9 Card reader malfunctions and stops working
State ID card cannot be scanned
Medium I/O Layer Type the State ID card number on the keyboard
R10 Keypad malfunctions and stops working
PIN cannot be typed
Medium I/O Layer Type the PIN on the keyboard
Table 4-1: Risks Table
5 Features to be tested
5.1 Overview
Testable features are the requirements listed in the SRS; these features can be tested and verified by the users as well as by team AVIAR. There are three levels of risk associated with each test:
• High: Feature may be difficult to test
• Medium: Has been tested and may not work as expected
• Low: Will be implemented and work properly
5.2 Testable Features
5.2.1 Easy Pairing
5.2.1.1 Risk
Medium
5.2.1.2 Description
This test will verify whether or not a child unit and the system can be easily paired. Assigning a child unit to a child should be a simple process for the staff.
5.2.2 Secure Pairing
5.2.2.1 Risk
Medium
5.2.2.2 Description
This test will verify whether or not the child unit has been paired securely. This requires the Communication Layer to send data packets in a manner that ensures only the expected unit receives the notification.
5.2.3 Registration
5.2.3.1 Risk
Low
5.2.3.2 Description
This test will verify whether a parent can be registered in the system or not. The parent will be required to register their child into the system and it should be a simple process.
5.2.4 Wireless Wearable Unit
5.2.4.1 Risk
Medium
5.2.4.2 Description
This test will verify whether the wearable unit will be wireless or not. The wearable unit will receive the data packets wirelessly from the transmitter in order to notify the child.
5.2.5 Charging Rack
5.2.5.1 Risk
High
5.2.5.2 Description
This test will verify whether the charging rack for the system is working properly or not. Based on the number of created child units, team AVIAR will create a charging rack to charge the child units.
5.2.6 Data Security
5.2.6.1 Risk
Low
5.2.6.2 Description
This test will verify whether the data has been stored securely in the system or not. There will be personal information in the system about the parents and children, therefore the data should be securely stored.
5.2.7 Visual Notification
5.2.7.1 Risk
Medium
5.2.7.2 Description
This test will verify whether the child unit visual notification works. Visual notification will be an LED placed on the child unit; the test will verify that the LED turns on when the unit has been notified.
5.2.8 Software Authentication
5.2.8.1 Risk
Low
5.2.8.2 Description
This test will verify whether the software authenticates the user. A parent who has never been to the child care facility before should not be able to log in to the system, because they do not exist in the system.
5.2.9 Test Connection
5.2.9.1 Risk
Medium
5.2.9.2 Description
This test will verify whether the connection between the system and the child unit can be tested. The connection must be tested to confirm that a child unit works before that particular unit is assigned to a child.
5.2.10 Charge Time
5.2.10.1 Risk
Medium
5.2.10.2 Description
This test will verify that the child unit stays on for at least 6 hours. The child unit shall remain powered for this period so that the child can be notified even during a long stay at the facility.
6 Features Verified by Design
6.1 Overview
The features verified by design are requirements that cannot be tested because of constraints, testing difficulty level, or because they are non-functional requirements.
6.2 Packaging Requirements
Packaging requirements will not be tested, either due to the time constraints of the project or because they require visual inspection of physical parts.
6.3 Performance Requirements
6.3.1 Durability
Durability will not be tested, because of the time constraints of the project.
6.3.2 Power source replacement
The operating life of the power source will not be tested, because of the time constraints of the project.
6.4 Safety Requirements
6.4.1 Material Toxicity
The child unit's material toxicity will not be tested, due to a lack of toxicity-testing equipment.
6.5 Maintenance and Support Requirements
Maintenance and support requirements will not be tested because they consist of documentation and maintenance support, which cannot be tested properly.
6.6 Other Requirements
Other requirements will not be tested because they include documentation specifications (documents in American English, standard US time stamps) and wireless interference, which cannot be tested in an objective manner.
7 Overall Test Strategy
7.1 Overview
The overall test strategy describes how team AVIAR will approach the testing in terms of tools, procedures, and support. Team AVIAR shall test the CPPS to ensure that it meets the requirements and architecture specified in the SRS, ADS, and DDS.
7.2 Configurations
Team AVIAR shall test all configurations in the CPPS, from high to low priority. Future items shall not be tested.
7.3 Strategy
Team AVIAR shall approach the testing in the following order:
• Hardware Test
• Unit Test
• Component Test
• Integration Test
• System Verification Test
Each test will be documented in the following format:
• Test ID
• Tester Name
• Input(s)
• Expected Output(s)
• Actual Output(s)
• Result
• Tester Notes
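For illustration, the documentation format above can be captured as a simple record structure. This is a sketch only; the class name, field names, and sample values are assumptions, not part of the delivered test harness:

```python
from dataclasses import dataclass, field

@dataclass
class TestRecord:
    """One documented test, mirroring the fields listed above."""
    test_id: str
    tester_name: str
    inputs: list
    expected_outputs: list
    actual_outputs: list = field(default_factory=list)
    result: str = "inconclusive"  # pass, fail, or inconclusive
    tester_notes: str = ""

# Hypothetical example: record a hardware test, then grade it.
record = TestRecord("HW-01", "Tony Kieu", ["press pair button"], ["LED on"])
record.actual_outputs = ["LED on"]
record.result = "pass" if record.actual_outputs == record.expected_outputs else "fail"
print(record.result)  # pass
```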
7.4 Metrics
Team AVIAR shall set the level of each test according to the requirements established in the SRS and their effects on other components within the system’s domain. The tests shall have three levels of priority, as follows:
7.4.1 High
CPPS fails one or more high level requirements.
7.4.2 Medium
CPPS meets all high level requirements, but fails one or more medium level requirements.
7.4.3 Low
CPPS meets all high and medium level requirements, but fails one or more low level requirements.
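The three levels above can be sketched as a simple mapping from failed requirement levels to an overall priority. This is an illustrative sketch only; the function name and level labels are assumptions:

```python
def overall_priority(failed_levels):
    """Map the levels of failed requirements to the test priority above."""
    if "high" in failed_levels:
        return "High"
    if "medium" in failed_levels:
        return "Medium"
    if "low" in failed_levels:
        return "Low"
    return None  # no failed requirements: all levels met

print(overall_priority(["medium", "low"]))  # Medium
print(overall_priority(["high", "low"]))   # High
```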
7.5 Regression
Team AVIAR shall perform regression testing whenever a bug or malfunction is encountered after a new component or module is integrated into the system. Regression testing has two phases: first isolate the bug, then fix it and retest the system. For the first phase of regression testing, the tester shall follow these procedures:
• Filter the bug by layers
• Filter the bug by components
• Filter the bug by units
In the second phase of regression testing, the tester shall follow these procedures:
• Test the module that had the bug after fixing
• Test the component that had the bug after fixing
• Test the layer that had the bug after fixing
• Test the whole system after fixing
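The two phases above can be sketched as follows. The layer, component, and unit names are placeholders; the real CPPS structure comes from the ADS:

```python
# Hypothetical sketch of the two-phase regression procedure described above.

def isolate(symptom, layers):
    """Phase 1: narrow the bug down layer -> component -> unit."""
    for layer, components in layers.items():
        for component, units in components.items():
            for unit in units:
                if symptom in unit:
                    return layer, component, unit
    return None

def retest_order(layer, component, unit):
    """Phase 2: after the fix, retest from the unit outward to the system."""
    return [unit, component, layer, "whole system"]

# Placeholder structure standing in for the CPPS layer/component/unit tree.
layers = {"Communication": {"Transmitter": ["send_packet bug"]}}
found = isolate("bug", layers)
print(retest_order(*found))
# ['send_packet bug', 'Transmitter', 'Communication', 'whole system']
```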
8 Acceptance Criteria
8.1 Overview
Team AVIAR will use the acceptance criteria in order to grade each test on a pass or fail basis. The acceptance criteria are divided according to the different phases of testing: hardware, unit, component, integration, and system verification.
8.2 Hardware Tests
8.2.1 Pass
The hardware functions as expected and provides the expected output or action.
8.2.2 Fail
The hardware does not function as expected and/or does not provide the expected output or action.
8.3 Unit Tests
8.3.1 Pass
The module accepts valid inputs and produces valid outputs.
8.3.2 Fail
The module does not accept valid inputs and/or produces invalid outputs.
8.4 Component Tests
8.4.1 Pass
Each module works within a subsystem and the subsystem produces expected valid outputs or actions when given valid inputs.
8.4.2 Fail
One or more modules do not work within a subsystem and/or the subsystem does not produce expected valid outputs or actions when given valid inputs.
8.5 Integration Tests
8.5.1 Pass
Each subsystem works within a layer and the layer produces expected valid outputs or actions when given valid inputs.
8.5.2 Fail
One or more subsystems do not work within a layer and/or the layer does not produce expected valid outputs or actions when given valid inputs.
8.6 System Verification Tests
8.6.1 Pass
The CPPS performs as described in the SRS.
8.6.2 Fail
The CPPS does not perform as described in the SRS.
9 Test Deliverables
9.1 Overview
Team AVIAR shall keep a record of all tests performed while developing the CPPS and shall document the results of those tests. The test report shall include the test performed, the results, any errors or defects found, and the steps taken to correct those errors.
9.2 Deliverables
9.2.1 Test cases
Team AVIAR shall develop test cases in order to test the CPPS. All test cases shall be documented and shall include the following:
• Test ID
• Tester Name
• Input(s)
• Expected Output(s)
• Actual Output(s)
• Result of Test
• Tester Notes
9.2.2 Test results
Each test case shall have the results properly documented. Test results shall include a notation of pass, fail or inconclusive and shall be assembled and delivered in the final test report.
9.2.3 Bugs and Defects
Team AVIAR shall document any bugs and defects found in the CPPS. For each one, team AVIAR shall record the test ID, results, cause, impact on the system, and current status.
9.2.4 Test Code
Team AVIAR shall provide any test code used during the development process with the project closeout documentation.
10 Test Schedule
10.1 Overview
This section includes the schedule of the testing process for the CPPS. The schedule follows the planned schedule in team AVIAR’s Microsoft Project Plan.
10.2 Test Schedule
Task Number   Task Name             Planned Start   Planned Finish
1             Hardware Testing      03/26/2013      04/11/2013
2             Unit Testing          03/26/2013      04/11/2013
3             Component Testing     03/26/2013      04/16/2013
4             Layer Testing         03/26/2013      04/24/2013
5             Integration Testing   03/26/2013      04/24/2013
6             System Verification   04/17/2013      05/01/2013

Table 10-1: Test Schedule
11 Approvals
The system test plan and the procedures outlined herein have been approved by the following:
Name                Position          Signature   Date
Bhavin Mistry       Team AVIAR
Brandon Gallagher   Team AVIAR
Brandon Salahat     Team AVIAR
Eugen Eremia        Team AVIAR
Tony Kieu           Team AVIAR
Mike O’Dell         Department Head
David Levine        Project Sponsor