International Conference On Software Test Automation
March 5-8, 2001, San Jose, CA, USA
PRESENTATION
Thursday, March 8, 2001, 4:30 PM
REQUIREMENTS-DRIVEN
AUTOMATED TESTING
Jeff Tatelman, Spherion Technology Architects
Requirements-Driven Automated Testing
Do you have your license to drive?
Jeff Tatelman, SQuAD, November 14th, 2000
Agenda
- Business challenges
- Business solution
- Building the requirements-driven testing process
- Building the automated regression testing process
- Questions & discussion
Testing can no longer be the last step... it must be an on-going part of the development process.
Iterative Development Challenge

App spec -> Build 1 -> Build 2 -> Build 3 -> Build 4
            Test       Test       Test       Test
Challenge Of Linking Requirements To Test Cases
Development Life Cycle
Test Automation
Test Management
Requirements Management
Changes
Challenge Of How To Get Started?
- How many test cases do I need?
- What type of test cases should I automate?
- What process should I use?
- How can I take advantage of reusability?
Building the Requirements-Driven Testing Process
Business Solution
Iterative Testing
Test Scope & Strategy
Test Planning
Test Case Construction
Design
Construction
Analysis
Implement
Test Execution/Evaluation
Object-Oriented Testing Approach
Requirements
Unit Testing
Integration Testing
Systems Testing
Regression Testing
Build The Testing Approach
Requirements
Business Events
Traceability Matrix
Develop Test Plan
Develop Test Cases
1. Determine Business Events
Requirements
Shipping
Inventory Maintenance
Order Processing
Customer Service
Example of an Inventory Control System
TC1 Add customer
TC2 Add Order
TC3 Modify Order
TC4 Change Address
TC5 Ship Order
TC6 Approve Order
TC7 Bill Order
Test Case Repository
Example of an Inventory Control System
2. Develop Traceability Matrix

Test Planning: Function Coverage Matrix

Requirements (from repository):
R1 Customer Service
R2 Order Processing
R3 Order Change
R4 Shipping

Test cases:
TC1 Add customer
TC2 Add order
TC3 Mod Order
TC4 Chg Addr
TC5 Ship
TC6 Approve order

Test runs: Run1, Run2, Run3
Traceability Matrix Advantages
- Validation of requirements
- Ensure coverage efficiency
- Knowledge of what is being tested
- Knowledge of what is not being tested
- Reduce the number of test cases
- Track changes:
  - requirements to testing
  - testing to requirements
- Reusability for regression/automation
Test Runs:
Test Run 1 Description: Add a customer and regular order. Do not approve the order.
Assumptions: Customer service, products, prices, records predefined. Order number will be system-generated.
Inputs:
- UI objects: Order Service, Customer Accounts
- Customer: John Smith; Products: X23, AR232
Output: Reports - Orders by Customer, Billing
Signoff
Run Date: Assigned:
Example of an Inventory Control System
3a. Develop Test Plan (Runs)
Day 1 (test cycle)
Objective: To add initial orders and new customers
Test Run 1 Description: Add a customer and regular order. Do not approve the
order.
Test Run 3 Description: Add an inter-company order with five items.
Approve the order and ship in the same day.
Example of an Inventory Control System
3b. Develop the Test Plan (Cycles)
4. Develop Test Case Detail
Design
Test Data
Test Assumptions
Test Flow
Expected Results
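The five elements above can be captured in one record per test case. A minimal Python sketch, with field names mirroring the slide rather than any particular test-management tool:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One test case carrying the five design elements from the slide."""
    case_no: int
    description: str
    assumptions: list = field(default_factory=list)  # test assumptions
    test_data: dict = field(default_factory=dict)    # inputs keyed by field
    flow: list = field(default_factory=list)         # ordered steps
    expected: list = field(default_factory=list)     # one result per step

# Example drawn from the specification form shown later in this section.
tc2 = TestCase(
    case_no=2,
    description="Add a customer.",
    assumptions=["Access to ABC Inventory ordering system website"],
    flow=["Click 'click here to continue'",
          "Click text field to enter Customer name"],
    expected=["ABC Search Customer Page becomes active page",
              "Cursor appears and text box is enabled"],
)
# A simple consistency check: every step has an expected result.
print(len(tc2.flow) == len(tc2.expected))
```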
4. Develop Test Cases: TEST CASE SPECIFICATION FORM
Project No: 1965 Project Name: ABC Inventory Ordering System Page 1 of 2
Test Case Description: Add a customer.
Build No: 1 Case No: 2 Execution Retry No:
Written By: Ruby Tuesday    Date: 7/25/00
Requirement No: 1, 6    Executed By: Amanda Jones    Date: 7/30/00
Setup For Test: Access to ABC Inventory ordering system website
Step | Action                                   | Expected Results                              | Pass/Fail | Action Taken if Step Fails
2.1  | Click on 'click here to continue'        | ABC Search Customer Page becomes active page. |           |
2.2  | Click text field to enter Customer name  | Cursor appears and text box is enabled        |           |
Business Solution
Building the Automated Regression Testing Process
Regression Testing Definition
- The testing of software after a modification has been made to ensure the reliability of each software release
- The objective is to prevent defects in production by building and executing an automated testing process that covers the critical business events.
How To Develop Automated Regression Testing
- Build a defined and repeatable process
- Use the matrix to help control the scope
- Set up reusable production test data
- Set up a separate test environment
- Create an automation test plan
  - use a subset of systems test cases
  - review systems test cases with users
  - separate test cases into cycles
- Document screens and test files
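Scoping the automation subset and separating it into cycles can be sketched as a filter over the test-case repository. In the Python sketch below, the `critical` flag and the cycle size are illustrative assumptions, not part of the slides:

```python
# Sketch: pick the regression subset from the systems test cases and
# group it into cycles. The 'critical' flag and cycle size are
# illustrative assumptions.
systems_cases = [
    {"id": "TC1", "name": "Add customer", "critical": True},
    {"id": "TC2", "name": "Add order",    "critical": True},
    {"id": "TC3", "name": "Modify order", "critical": False},
    {"id": "TC5", "name": "Ship order",   "critical": True},
    {"id": "TC7", "name": "Bill order",   "critical": True},
]

# Use the matrix/criticality to control scope: automate the subset of
# systems test cases that covers the critical business events.
regression_suite = [tc for tc in systems_cases if tc["critical"]]

def into_cycles(cases, size):
    """Separate test cases into fixed-size cycles for scheduled runs."""
    return [cases[i:i + size] for i in range(0, len(cases), size)]

cycles = into_cycles(regression_suite, size=2)
print(len(regression_suite), len(cycles))  # 4 cases split into 2 cycles
```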
How To Develop Automated Regression Testing (cont.)
Why Is The Event/Scenario Approach So Important For Test Automation?
- Managing your tests
- Business events to test automation
- Ease of maintenance
- Supports data-driven approach
- Reusability (& modularization)
  - Same test event "add customer" is reused in several automated test scenarios
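This reuse of one event across scenarios is the heart of the data-driven approach. The Python sketch below is a hypothetical illustration (all function and scenario names are invented): the same `add_customer` event, driven by different data, is composed into two automated scenarios:

```python
# Data-driven sketch: one reusable test event, parameterized by data,
# composed into several automated scenarios. All names are illustrative.
log = []

def add_customer(name):
    """Reusable test event: the same script, driven by different data."""
    log.append(f"add_customer:{name}")

def add_order(customer, items):
    log.append(f"add_order:{customer}:{len(items)} items")

def scenario_new_phone_order():
    add_customer("John Smith")                # event used here...
    add_order("John Smith", ["X23", "AR232"])

def scenario_new_customer_only():
    add_customer("Jane Doe")                  # ...and reused here

scenario_new_phone_order()
scenario_new_customer_only()
print(sum(e.startswith("add_customer") for e in log))  # event ran twice
```

When the "add customer" screen changes, only the one event script needs maintenance, and every scenario that composes it picks up the fix.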
Key Components To Receiving Your License To Drive
- Requirements management process
  - treating requirements as objects using a repository
- Test management process
  - testing repository
  - events/scenarios
- Automated testing standards
  - test plan
  - reusability
- Resources/Training
Summary
- Business challenges
  - Iterative development
  - Changes in life cycle
  - How to get started
- Business solution
  - Traceability matrix
  - Business events/scenarios
  - Reuse test cases for automated regression testing
Process Summary

1. Gather requirements - Gather business functions from the requirement repository, documents, or the application.
2. Develop trace - Develop a test case matrix using business functions/events to determine what to test.
3. Develop manual process - Develop manual test cases and a test plan.
4. Manage test cases - Enter test cases into a repository.
5. Brainstorm - Determine which test cases to automate.
6. Develop environment - Develop the automation test environment.
7. Analyze - Set up data files in the test environment.
8. Design - Formulate the design approach and identify required routines.
9. Develop - Create the data source and test scripts.
Test Automation Testing Conference Requirements Driven Automated Testing
Inventory Control System Traceability Matrix Example
Function coverage across test runs Run1 through Run9 (X marks in run order):

Customer Service
  New: X X X
  Existing: X X X X X
  Delete: X
Order Processing
  order type - phone: X X X X X X X; mail: X X
  order status - complete: X X X X X X; incomplete: X X X
  pricing - taxable: X X X X X X; tax exempt: X X X
Inventory Maint.
  in stock: X X X X X X X; out of stock: X X; replenish inventory: X X
Shipping
  bill customer - approved: X X X X X X X; past due: X X
Inventory Control System Test Run Example
Test Run Description: (Test Run #1)
Add a new customer, with an order by phone with in stock inventory for all items. The order is completed and billing is approved.
Assumptions:
1. Use default values unless specified here.
Input:
1. Customer number: D109166
2. ITEM ID      QTY
   P30KS80      10
   P30KB51      20
   3UA5400-2D   10
   Q220         2
   NR421        2
   SA1E025      2
   LN1E100      5
3. Attach input screens for verification.
OUTPUT:
1. Use Account Summary Inquiry and Item Inquiry for on-line validation.
2. Print screen data before ENTER of data. Attach batch reports printout, and screen data printout from inquiry screens.
SIGN OFF:
The above test case is:
_______________ Approved
_______________ Need to re-run
______________________ ____________________ _______________
(USER) (ANALYST) (DATE)
Inventory Control System Test Run Example
Test Run Description: (Test Run #2)
Use an existing customer, with an order by phone with in stock inventory for all items. The order is completed and billing is approved.
Assumptions:
1. Customer must exist on the test database.
Input:
1. Customer number: S122906
2. ITEM ID   QTY
   P30KS80   10
   P30KB51   20
   LN1E100   5
3. Attach input screens for verification.
OUTPUT:
1. Use Account Summary Inquiry for on-line validation.
2. Use Past Due Report for printed validation.
3. Print screen data before ENTER of data. Attach batch reports printout, and screen data printout from inquiry screens.
SIGN OFF:
The above test case is:
_______________ Approved
_______________ Need to re-run
______________________ ____________________ _______________ (USER) (ANALYST) (DATE)
Inventory Control System Test Run Example
Test Run Description: (Test Run #3)
Use an existing customer, with an order by phone with in stock inventory for all items. The order is incomplete and billing is past due.
Assumptions:
1. Customer must exist on the test database.
Input:
1. Customer number: G344109
2. ITEM ID   QTY
   SNE400    2
   RU3210    1
3. Attach input screens for verification
Output:
1. Use Account Summary Inquiry for on-line validation.
2. Use Back Order Report for printed validation.
3. Print screen data before ENTER of data. Attach batch reports printout, and screen data printout from inquiry screens.
SIGN OFF:
The above test case is:
_______________ Approved
_______________ Need to re-run
______________________ ____________________ _______________ (USER) (ANALYST) (DATE)
Inventory Control System Test Plan Example
DAY 1
OBJECTIVE: Ensure the accuracy of the data converted and loaded into the Inventory System. Sanity check that all programs and database changes are installed correctly. Backup the database for benchmarking.
Test Run 0 Description: Convert tax tables and history tables.
Test Run 2 Description: Use an existing customer, with an order by phone with in stock inventory for all items. The order is tax exempt, completed and billing is approved.

Test Run 5 Description: Use an existing customer, with an order by phone with in stock inventory for all items. The order is taxable, completed and billing is approved.
Verification Process:
• Account Summary Inquiry for on-line validation
• Back Order printed report
• Item Inquiry for on-line validation
• Past Due Report for printed validation
Inventory Control System Test Plan Example
DAY 2
OBJECTIVE: Ensure the accuracy of normal phone orders that are complete and taxable. Test approved in stock and approved out of stock orders. Test past due and in stock orders.
Test Run 1 Description: Add a new customer, with an order by phone with in stock inventory for all items. The order is completed and billing is approved.

Test Run 3 Description: Use an existing customer, with an order by phone with in stock inventory for all items. The order is incomplete and billing is past due.

Test Run 4 Description: Add a new customer, with an order by phone out of stock inventory for two items. The order is completed and billing is approved.
Verification Process:
• Account Summary Inquiry for on-line validation
• Back Order printed report
• Item Inquiry for on-line validation
• Past Due Report for printed validation
Requirements-Driven Automated Testing… A License to Drive
Abstract:
Studies have shown that over 50% of software defects are attributed to poorly defined requirements. From a process improvement perspective, it is imperative that project managers establish a more effective and efficient way of defining and tracking business requirements. At the same time, project managers must have a process that allows requirements to drive the automated testing cycle.

Many companies purchase automated testing tools without having a solid requirements-based testing (RBT) process for knowing what to test and when to test. These companies attempt to use the tools for unit testing when the application is still unstable, and in many cases, out of frustration, the tools end up "on the shelf." However, when business requirements drive the development of test cases and test scripts, companies realize the return on investment in their technology. This presentation demonstrates how RBT best practices coupled with automated software quality (ASQ) techniques solve the problems that plague a majority of software development projects.

This presentation will describe a "how-to" approach for the development of an automated regression testing process, which tests the key requirements-based business processes and the daily business activities. It will also explain how automated test scripts will be reused for every maintenance release of the company's application. In summary, the presentation will describe:

• Management and definition of business requirements
• Developing a matrix on what to test based on requirements
• Verification of requirements
• Reducing test cases and maximizing application coverage based upon requirements
• Generation of test plans/cases
• Creation of automated test scripts from those requirements
Attendees will gain an understanding of how to:
• Employ an automated repository for managing requirements
• Use business events (requirements) to identify test cases
• Develop a requirements-to-test matrix showing what to test
• Develop a test plan driven by requirements
• Implement test automation for regression testing
BIOGRAPHY OF PRESENTER
Jeff Tatelman CSTE, Managing Consultant, Spherion Technology Architects
Jeff Tatelman brings more than 20 years of quality assurance and consulting experience in the software development industry, including management of Quality Assurance teams and developing Software Quality processes. Jeff has implemented structured and automated testing, change management and tools to support the software quality process.
Jeff has been responsible for implementing requirements-based testing at several Fortune 1000 companies including NCR, Sprint, Coca-Cola, and SunTrust Bank.
Jeff is well known in the software quality industry and has an extensive background in public speaking at various quality organizations, as well as international and local companies. He is also the co-founder of the Software Quality Association in Denver.
Jeff Tatelman
Spherion Technology Architects
4500 Cherry Creek Drive South
Suite 1050
Denver, Colorado 80246
720-524-2517