

Chicago Run Virtual Marathon Test Plan

1. Introduction: This document is a test plan for the Chicago Run Virtual Marathon, produced by the DePaul Chicago Run group. The project's mission is to provide increased online interactivity for the Chicago Run student participants.

1.1. Purpose: The objective of the test plan is to detect any errors in the system based on the proposed requirements. Any detected issues must then be corrected as necessary.

1.2. Background: This test plan was developed to test the new Chicago Run Virtual Marathon. The site will access existing Chicago Run systems and must be fully compatible with them.

1.3. Technical Architecture: The system is built with Ruby on Rails, the Google Maps API, and a database. All of the code for the original system was provided by PSC Listens via Google Code.
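The sketch below is a minimal, hypothetical illustration of how these pieces could fit together in a conventional Rails layout: a Route record lives in the database, keeps the path of an uploaded .KML file, and exposes a URL that client-side Google Maps code could load as an overlay. The class name, associations, columns, and helper method are assumptions made for the example, not part of the delivered system.

```ruby
# Hypothetical sketch only -- class, association, and column names are assumed.
class Route < ActiveRecord::Base
  has_many :landmarks    # "Landmark" pop-ups placed along the route
  has_many :incentives   # "Incentive" pop-ups placed along the route

  validates_presence_of :name, :kml_file

  # URL that client-side Google Maps code could fetch as a KML overlay.
  def kml_url(base_url)
    "#{base_url}/routes/#{id}.kml"
  end
end
```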

1.4. Specifications: The system itself requires no special hardware. Accessing the system requires a computer capable of displaying color and a web browser (Microsoft Internet Explorer or Mozilla Firefox).

1.5. Scope: The two primary resources necessary for testing are access to the system and human time. Joseph is responsible for testing the interaction of incentives and landmarks. Brian is responsible for testing the Google Maps code. Fernando and Matt are responsible for testing usability. Ben is responsible for testing the user interface.

1.6. Project Information: This project has a Requirements Document, a Usability Test, a Usability Test Report, and this Test Plan. All documents are available on the Chicago Run CD or through Google Code.

2. Requirements: This section of the test plan lists all requirements to be tested. Any requirement not listed is outside the scope of this test plan.

2.1. Functional Test Requirements:
2.1.1. Create new users
2.1.2. Edit existing users
2.1.3. Delete users
2.1.4. Access database
2.1.5. Access website
2.1.6. Provide visual map interface to display user progress
2.1.7. Access database for user information to display
2.1.8. Display “Landmark” pop-up windows
2.1.9. Display “Incentive” pop-up windows
2.1.10. Provide five static routes for users to “run”
2.1.11. Application routes must be expandable
2.2. Design Requirements:
2.2.1. Application needs to be visually exciting
2.2.2. Application should use Chicago Run color scheme
2.3. Integration Requirements:
2.3.1. Landmarks, incentives, and routes should be stored in the database (a hypothetical schema sketch follows this list)
2.3.2. “Pop-up” windows must access the database
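The following is a minimal, hypothetical schema sketch (table and column names are assumptions, not taken from the requirements) showing one way landmarks, incentives, and routes could be stored in the database per requirement 2.3.1. The mile_marker column anticipates the proximity check exercised in test case 3.4.14.

```ruby
# Hypothetical Rails migration sketch -- table, column names, and types are assumed.
class CreateVirtualMarathonTables < ActiveRecord::Migration
  def self.up
    create_table :routes do |t|
      t.string :name
      t.text   :description
      t.string :kml_file          # uploaded .KML file (see test case 3.4.10)
      t.timestamps
    end

    create_table :landmarks do |t|
      t.references :route
      t.string  :title
      t.text    :popup_text       # shown in the "Landmark" pop-up window
      t.decimal :mile_marker      # position along the route, in miles
      t.timestamps
    end

    create_table :incentives do |t|
      t.references :route
      t.string  :title
      t.text    :popup_text       # shown in the "Incentive" pop-up window
      t.decimal :mile_marker
      t.timestamps
    end
  end

  def self.down
    drop_table :incentives
    drop_table :landmarks
    drop_table :routes
  end
end
```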


3. Test Strategy: This test will be performed by the project team using a combination of web browsers (Internet Explorer and Firefox), operating systems (Windows, Linux), and hardware (PC, Mac) with test user data.

3.1. Objective: This is a complete system test designed to verify all requirements documented in the Requirements Document. It will also assess modifications that were made as a result of usability testing.

3.2. Technique: Test cases were designed based on the Requirements Document and the Usability Test Report. Tests will be performed in Internet Explorer and Firefox with test user data, and will be repeated cyclically after each code modification or error correction.
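As an illustration of this technique, the sketch below shows how the "create new users" requirement (2.1.1, test case 3.4.1) might be written as an automated Rails integration test. The paths, form fields, and "super" credentials are assumptions made for the example; the test cases in section 3.4 describe the corresponding manual steps.

```ruby
# Hypothetical integration test sketch -- routes, parameters, and credentials are assumed.
require 'test_helper'

class CreateUserTest < ActionController::IntegrationTest
  def test_super_can_create_a_new_user
    post '/session', :login => 'super', :password => 'secret'   # log in as "super"
    assert_response :redirect

    assert_difference 'User.count', 1 do                        # verify user was added to database
      post '/users', :user => { :login => 'test_student',
                                :password => 'changeme' }       # create new user
    end

    delete '/session'                                           # log out
  end
end
```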

3.3. Special Considerations: Actual routes, “Incentives,” and “Landmarks” must be accessible, along with test user data.

3.4. Test Cases:
3.4.1. Create new users:
3.4.1.1. Log in to the application as “super”
3.4.1.2. Access Users
3.4.1.3. Create a new user
3.4.1.4. Verify the user was added to the database
3.4.1.5. Log out
3.4.2. Edit existing users:
3.4.2.1. Log in to the application as “super”
3.4.2.2. Access Users
3.4.2.3. Modify an existing user
3.4.2.4. Verify the changes were saved to the database
3.4.2.5. Log out
3.4.3. Delete users:
3.4.3.1. Log in to the application as “super”
3.4.3.2. Access Users
3.4.3.3. Delete an existing user
3.4.3.4. Verify the deletion was saved to the database
3.4.3.5. Log out
3.4.4. Access database:
3.4.4.1. Verify the application accesses the database
3.4.5. Provide visual map interface to display user progress:
3.4.5.1. Log in to the application as a student
3.4.5.2. Verify the map displays
3.4.5.3. Log out
3.4.6. Access database for user information to display:
3.4.6.1. Log in to the application as a student
3.4.6.2. Verify user information is properly displayed on the map
3.4.6.3. Log out


3.4.7. Display “Landmark” pop-up windows:
3.4.7.1. Log in to the application as a student
3.4.7.2. Verify “Landmark” pop-up windows display
3.4.7.3. Access a “Landmark”
3.4.7.4. Log out
3.4.8. Display “Incentive” pop-up windows:
3.4.8.1. Log in to the application as a student
3.4.8.2. Verify “Incentive” pop-up windows display
3.4.8.3. Access an “Incentive”
3.4.8.4. Log out
3.4.9. Provide static routes for users to “run”:
3.4.9.1. Log in to the application as “super”
3.4.9.2. View Routes
3.4.9.3. Log out
3.4.10. Application routes must be expandable:
3.4.10.1. Log in to the application as “super”
3.4.10.2. Access Routes
3.4.10.3. Click “Add New Route”
3.4.10.4. Add a name and description
3.4.10.5. Upload a .KML file
3.4.10.6. Click “Create”
3.4.10.7. Verify the route was added to the database
3.4.10.8. Log out
3.4.11. Application needs to be visually exciting:
3.4.11.1. Log in to the application as a student
3.4.11.2. Verify the site is visually exciting
3.4.11.3. Verify icons, colors, and pop-up windows
3.4.11.4. Log out
3.4.12. Application should use Chicago Run color scheme:
3.4.12.1. Log in to the application as a student
3.4.12.2. Verify the site uses the Chicago Run color scheme
3.4.12.3. Log out
3.4.13. “Landmarks,” “Incentives,” and routes should be stored in the database:
3.4.13.1. Log in to the application as “super”
3.4.13.2. Access Routes
3.4.13.2.1. Verify new routes
3.4.13.2.2. Verify deleted routes
3.4.13.2.3. Verify modified routes
3.4.13.3. Access “Landmarks”
3.4.13.3.1. Verify new “Landmarks”
3.4.13.3.2. Verify deleted “Landmarks”
3.4.13.3.3. Verify modified “Landmarks”
3.4.13.4. Access “Incentives”
3.4.13.4.1. Verify new “Incentives”
3.4.13.4.2. Verify deleted “Incentives”
3.4.13.4.3. Verify modified “Incentives”
3.4.13.5. Log out


3.4.14. Pop-up windows must access the database (a sketch of this proximity rule follows the test cases):
3.4.14.1. Log in to the application as a student
3.4.14.2. Determine total mileage
3.4.14.3. Access the map
3.4.14.4. Verify only pop-up windows within proximity display
3.4.14.5. Log out
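The check in 3.4.14.4 rests on a simple proximity rule: a “Landmark” or “Incentive” pop-up should display only when its position along the route falls within the total mileage the student has logged. The sketch below is a minimal, framework-independent illustration of that rule; the Popup structure, mile_marker field, and sample landmarks are assumptions made for the example.

```ruby
# Hypothetical sketch of the proximity rule verified in test case 3.4.14.
Popup = Struct.new(:title, :mile_marker)

# Return only the pop-ups the student has already "run" past.
def visible_popups(popups, total_mileage)
  popups.select { |popup| popup.mile_marker <= total_mileage }
end

# Illustrative data only -- not taken from the Chicago Run routes.
landmarks = [
  Popup.new('Buckingham Fountain', 1.5),
  Popup.new('Museum Campus', 3.0),
  Popup.new('Soldier Field', 4.2)
]

puts visible_popups(landmarks, 3.1).map(&:title)
# => Buckingham Fountain
#    Museum Campus
```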

3.5. Completion Criteria: All tests must be completed successfully without critical or severe defects. Any test that fails will be treated as a severe or critical defect and will be corrected. Cosmetic and annoyance defects will be corrected by the project team if time allows.

3.6. Assumptions: The existing infrastructure was created by PSC Listens for Chicago Run. The project team will require support from the PSC liaison. Chicago Run will provide the project team with “Landmarks,” “Incentives,” and routes.

3.7. Tools:

Tool                 Support Contact
Mozilla Firefox 2    [Support]
Mozilla Firefox 3    [Support]
Internet Explorer 7  [Support]
Internet Explorer 8  [Support]
Google Maps          [Support]

4. Resources: This section describes the resource roles and responsibilities required for testing.

4.1. Project Plan: The project plan was updated to reflect resources and deadlines.

4.2.  Application Access: Access to the application is necessary for testing.

5. Schedule: The application will be available for testing continuously, except in the event of scheduled downtime. Builds will be provided on a regular basis during the testing cycle.

6. Deliverables: The test plan itself is available on Google Code, as well as on the Chicago Run CD. The test case results will be posted on Google Code and appended to this test plan.


7. Defect Tracking and Reporting: Defects will be tracked and recorded in the Chicago Run section of Google Code. Reports based on the project team's testing will be made available to Chicago Run on an as-needed basis. The project team will test with personal and DePaul equipment. The following is the scale for defect tracking:

7.1. Critical: Critical defects denote unusable functions with the highest impact. These defects cause problems in other areas of the system or shut the system down entirely.

7.2. Severe: Severe defects denote functions that do not work, but do not affect other areas or shut the system down.

7.3. Annoyance: Annoyance defects denote functions that work, but not as well as expected. This can be slow loading, an awkward work-around, or a poor user experience.

7.4. Cosmetic: Cosmetic defects are not critical to system performance: bad spelling, improper formatting, and vague or confusing error messages or warnings.

8. Approval: The test plan has been reviewed and approved by the project team.

9. Results: When the test effort is complete, the results will be documented. Any discrepancies between the plan and the actual implementation will be identified, along with how they were handled. Any defects found during testing should be fixed; uncovering problems is expected, since the point of testing is to find errors.
