www.go-tac.com Copyright ©2004 TAC

T-WorMS: Adding Sanity to Your Process

Jamie L. Mitchell

CTO, TAC


Do We Really Need Another Tool?

• Integrate our efforts

• Help all stakeholders in testing
  – QA Engineers
  – Analysts
  – Developers
  – Managers

• Process improvement


Growth Path

• Starting from totally ad hoc

• Moving towards
  – Planned testing
  – Test case development
  – Scripted test procedures
  – Feedback loop through metrics
  – Adding automation

• Process improvement


Throughout the Lifecycle

• Feature definition
• Risk assignment
• Test Case creation and tracking
• Test planning and estimation
• Multi-level defect tracking
• Automated and manual
  – Test execution support
  – Integrated results viewing
  – Automatic artifact storage
  – Reporting and metrics
• Power-assisted workflow tracking
• File Repository
• eXploratory testing support
• Test Data storage and manipulation


Feature Definition

• Hierarchical structure
  – Facilitates traceability to requirements
  – Add and delete as needed – no penalty for changes
  – Feature history shows evolution
    • Change reasons required
  – Feature ownership tracks responsibility
  – Feature global tasks
    • Test Suspension
    • Test Review
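
A minimal sketch of the idea, assuming hypothetical names (Feature, FeatureChange, add_sub_feature); this is not the T-WorMS data model, only an illustration of a feature hierarchy whose every change carries a required reason and a named owner:

    # Illustrative sketch only; names are assumptions, not T-WorMS code.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class FeatureChange:
        reason: str               # change reasons are required
        description: str

    @dataclass
    class Feature:
        name: str
        owner: str                # feature ownership tracks responsibility
        parent: Optional["Feature"] = None
        children: List["Feature"] = field(default_factory=list)
        history: List[FeatureChange] = field(default_factory=list)

        def add_sub_feature(self, name: str, owner: str, reason: str) -> "Feature":
            """Add a sub-feature; the change reason is mandatory."""
            child = Feature(name=name, owner=owner, parent=self)
            self.children.append(child)
            self.history.append(FeatureChange(reason, f"added sub-feature {name}"))
            return child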


Risk Assignment

• Risk tracked 4 ways
  – Risk to the business/customer (business risk)
  – Risk inherent in the software (technical risk)
  – Risk to the project as a whole (project risk)
  – Relative risk of the test case itself (test case risk)

• Risk review by stakeholders
• Allows
  – Prioritization of testing
  – Focused management
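
An illustrative sketch of the four-way risk record and one possible prioritization rule; the names and the simple additive scoring are assumptions, not the product's actual scheme:

    # Illustrative sketch only; scoring rule is an assumption.
    from dataclasses import dataclass

    @dataclass
    class Risk:
        business: int    # risk to the business/customer
        technical: int   # risk inherent in the software
        project: int     # risk to the project as a whole
        test_case: int   # relative risk of the test case itself

        def priority(self) -> int:
            """One simple (assumed) rule: higher combined risk runs first."""
            return self.business + self.technical + self.project + self.test_case

    # Tests can then be ordered so stakeholders review and run the riskiest first:
    # sorted(tests, key=lambda t: t.risk.priority(), reverse=True)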


Test Case Creation

• Test cases assigned to (Sub-)Features

• Risk is inherited by tests
  – Prioritized under (Sub-)Feature

• Common fields allow linking tests

• Specific fields allow for differences

• Categorization
  – Manual or automation candidate
  – Test Type
  – Intended environment
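
A hedged sketch of a test case record with risk inherited from its (Sub-)Feature plus the categorization fields above; all names are illustrative, not T-WorMS fields:

    # Illustrative sketch only; field names are assumptions.
    from dataclasses import dataclass
    from enum import Enum

    class ExecutionMode(Enum):
        MANUAL = "manual"
        AUTOMATION_CANDIDATE = "automation candidate"

    @dataclass
    class SubFeature:
        name: str
        risk: int                    # stands in for the four-way risk record above

    @dataclass
    class TestCase:
        name: str
        feature: SubFeature          # assigned to a (Sub-)Feature
        test_type: str               # categorization: Test Type
        environment: str             # categorization: intended environment
        mode: ExecutionMode          # categorization: manual or automation candidate

        @property
        def risk(self) -> int:
            # Risk is inherited from the owning (Sub-)Feature, not stored twice.
            return self.feature.risk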


Test Case (2)

• Time estimation (with feedback)

• Data integration with DiRT

• Specification of prerequisites

• Automatic run suspension, triggered multiple ways (see the sketch below)
  – Feature defect records
  – Test Defect Records
  – SUT defects
  – Runnable flag

• File attachment from Repository
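
A rough sketch of the suspension check described above, assuming hypothetical DefectRecord and RunnableTest types: any open, suspending defect record (feature, test, or SUT) or a cleared runnable flag blocks the run.

    # Illustrative sketch only; not the T-WorMS implementation.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DefectRecord:
        kind: str           # "feature", "test", or "SUT"
        suspends: bool      # should this defect suspend linked runs?
        open: bool = True

    @dataclass
    class RunnableTest:
        name: str
        runnable: bool = True                           # the runnable flag
        defects: List[DefectRecord] = field(default_factory=list)

        def is_suspended(self) -> bool:
            """True when the flag is cleared or any open, suspending defect is linked."""
            return (not self.runnable) or any(d.open and d.suspends for d in self.defects)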


Test Case (3)

• Test history via versioning

• Integrated script table (sketched below)
  – Task to perform
  – Data to use
  – Expected results

• Integrated review tool

• Recommended automation sets

• Maintenance flag
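
One way the integrated script table could be represented; the ScriptStep structure and the sample rows are purely illustrative:

    # Illustrative sketch only.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ScriptStep:
        task: str              # task to perform
        data: str              # data to use
        expected: str          # expected results

    script: List[ScriptStep] = [
        ScriptStep("Open the login dialog", "n/a", "Dialog appears with empty fields"),
        ScriptStep("Enter credentials and submit", "user=demo, pwd=demo", "Main window opens"),
    ]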


Test Planning

• Individual Test Planners
• Allow focused, planned testing effort
• Plan tests for
  – Project
  – Tester
  – Release
  – Cycle / Build (optional)
  – Environment (optional)

• Allow planned eXploratory testing
• Time estimation (what if?)
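
A small sketch of a planner filter and the "what if?" time estimate, with assumed field names (project, tester, release, cycle, environment, minutes); not the actual Test Planner:

    # Illustrative sketch only; field names are assumptions.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class PlannedTest:
        name: str
        project: str
        tester: str
        release: str
        minutes: int                        # time estimate (with feedback from past runs)
        cycle: Optional[str] = None         # optional
        environment: Optional[str] = None   # optional

    def plan(tests: List[PlannedTest], **criteria) -> List[PlannedTest]:
        """Select the tests matching every supplied criterion (project, tester, ...)."""
        return [t for t in tests
                if all(getattr(t, k) == v for k, v in criteria.items())]

    def what_if_estimate(tests: List[PlannedTest]) -> int:
        """Total planned minutes for the selected set: the 'what if?' answer."""
        return sum(t.minutes for t in tests)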


Defect Tracking

• Feature Defects
  – Optional suspension of multiple tests

• Test Defects
  – Optional suspension of single / linked tests

• SUT defects
  – Optional suspension of the test that found them

• Simplified metrics collection

• Helps focus maintenance efforts


Test Execution Support (Manual)

• Manual Assistant to Testing (MAT)
• Brings all test information to local workstation
• Allows focused execution of Test Planner
• Facilitates optional execution of suspended automated test cases
• Helps tester with
  – Time metrics and run results recording
  – Data collection
    • Screen snapshots
    • Memory snaps
  – Artifact preservation

• Provides structure for eXploratory testing


Test Execution Support (Automation)

• Allow anyone to
  – Select and run sets of tests

• Programming library for automator (see the sketch below)
  – Tool-independent logging
  – Collection of OS information
  – Screen / Memory snapshots
  – Command line tool usage

• Triage failures / blocked tests
• Re-run failed portions of suites
• Restart after catastrophic failures
• Review full results including artifacts

… WITHOUT KNOWING ANYTHING ABOUT THE AUTOMATION TOOL
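
A hedged sketch of the kind of tool-independent logging helper such an automator library might expose; nothing here is the actual T-WorMS API:

    # Illustrative sketch only; not the T-WorMS library.
    import datetime
    import platform

    class RunLog:
        """Tool-independent log: the automation tool only calls these methods,
        so results read the same no matter which tool produced them."""

        def __init__(self, test_name: str, path: str):
            self.test_name = test_name
            self.file = open(path, "a", encoding="utf-8")
            self._write("START", platform.platform())   # collect basic OS information

        def _write(self, level: str, message: str) -> None:
            stamp = datetime.datetime.now().isoformat(timespec="seconds")
            self.file.write(f"{stamp} {self.test_name} {level}: {message}\n")

        def step(self, message: str) -> None:
            self._write("STEP", message)

        def fail(self, message: str) -> None:
            self._write("FAIL", message)

        def close(self, passed: bool) -> None:
            self._write("END", "PASS" if passed else "FAIL")
            self.file.close()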


Integrated Results

• For automation
  – View results and artifacts from ANY test run, any time, on any workstation

• View manual and automated results together

• Pull up any artifacts from any run

• View history of test runs and all SUT defects found by the test case


Artifact Storage

• Artifact storage on server
  – Uses ZIP technology

• For any run (manual or automated)
  – Save artifacts from testing
    • Screen snapshots
    • Memory and OS information
    • Command line output files
    • Partial results (in file form)

• Record resultant data into DiRT
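
A minimal sketch of ZIP-based artifact storage for one run, using Python's standard zipfile module; the run_<id>.zip naming and folder layout are assumptions:

    # Illustrative sketch only; layout and names are assumptions.
    import zipfile
    from pathlib import Path

    def store_artifacts(run_id: str, files: list, archive_dir: str = "artifacts") -> Path:
        """Bundle a run's artifacts (screenshots, OS info, output files,
        partial results) into one ZIP archive on the server."""
        Path(archive_dir).mkdir(parents=True, exist_ok=True)
        archive = Path(archive_dir) / f"run_{run_id}.zip"
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
            for f in files:
                zf.write(f, arcname=Path(f).name)
        return archive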


Workflow Support

• Tickler
  – Automatically get task list when starting WorM
  – Includes
    • Tests assigned to user for review
    • Owned tests still awaiting review
    • Owned tests that have completed review
    • Owned tests waiting for maintenance
    • Unhandled automation failures and blockages
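
An illustrative sketch of how the Tickler list above might be assembled at startup; the dictionary keys and record fields are assumptions, not WorM's behavior:

    # Illustrative sketch only; record fields are assumptions.
    from typing import Dict, List

    def build_tickler(user: str, tests: List[dict], failures: List[dict]) -> Dict[str, list]:
        """Group the items one user should act on, mirroring the list above."""
        return {
            "assigned for review": [t for t in tests if t.get("reviewer") == user],
            "awaiting review":     [t for t in tests if t.get("owner") == user and not t.get("reviewed")],
            "review complete":     [t for t in tests if t.get("owner") == user and t.get("reviewed")],
            "needs maintenance":   [t for t in tests if t.get("owner") == user and t.get("maintenance")],
            "automation failures": [f for f in failures if not f.get("handled")],
        }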


File Repository

• Protected storage for any kind of files

• Industry-standard ZIP archives
  – Can be password protected

• Change control and serialization enforced

• All previous versions of files saved

• Files may be attached to test cases
  – Automatic download to workstation for execution (manual and automation)

• Where-used reports
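
A rough sketch of the repository idea: serialized version numbers with every earlier version kept. The on-disk layout and class names are assumptions, not the product's storage format:

    # Illustrative sketch only; layout is an assumption.
    from pathlib import Path
    import shutil

    class FileRepository:
        """Keeps every version of a file; version numbers are handed out serially."""

        def __init__(self, root: str):
            self.root = Path(root)
            self.root.mkdir(parents=True, exist_ok=True)

        def check_in(self, source: str) -> Path:
            name = Path(source).name
            folder = self.root / name
            folder.mkdir(exist_ok=True)
            version = len(list(folder.iterdir())) + 1      # serialized change control
            target = folder / f"v{version:04d}_{name}"
            shutil.copy2(source, target)                   # previous versions stay on disk
            return target

        def versions(self, name: str):
            return sorted((self.root / name).iterdir())    # full history is available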


eXploratory Testing

• Start putting rigor around ad hoc testing while still allowing it to be ad hoc
• Initiate test planning
• Assisted time recording
• Automatic storage of
  – Running notes
  – Screen and memory snapshots
  – Any desired files
• Record of all SUT defects found
• Allows process improvement – at end of cycle, build formal tests based on notes and defects found
• Allows stop and restart of a session
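
A small sketch of an exploratory session record that supports stop/restart and keeps the notes, defects, and artifacts needed to build formal tests later; all names are assumed:

    # Illustrative sketch only; names are assumptions.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class ExploratorySession:
        charter: str
        notes: List[str] = field(default_factory=list)       # running notes
        defects: List[str] = field(default_factory=list)     # SUT defects found
        artifacts: List[str] = field(default_factory=list)   # snapshots, files
        started: Optional[datetime] = None
        minutes: float = 0.0                                  # assisted time recording

        def start(self) -> None:
            self.started = datetime.now()

        def stop(self) -> None:
            """Sessions can be stopped and restarted; time accumulates across segments."""
            if self.started:
                self.minutes += (datetime.now() - self.started).total_seconds() / 60
                self.started = None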


Data Repository for Testing (DiRT)

• Integrated with WorM and MAT
• Define and store data without spreadsheet file problems
• Metadata on the data
  – Column info helps users understand the data
  – May be defined using GUI Map info
• Versioning and reporting of data for regulated testing
• Serialization and single location of data
• Heart of the upcoming keyword-driven automation
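
An illustrative sketch of a data table whose column metadata travels with the data, the kind of structure keyword-driven automation can consume; none of this is DiRT's actual schema:

    # Illustrative sketch only; not DiRT's schema.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ColumnInfo:
        name: str
        description: str        # helps users understand the data
        gui_map_ref: str = ""   # optionally tied to GUI Map info

    @dataclass
    class DataTable:
        name: str
        version: int                  # versioning for regulated testing
        columns: List[ColumnInfo]
        rows: List[list]

        def column(self, col: str) -> List:
            """Return one column's values, looked up by its metadata name."""
            i = [c.name for c in self.columns].index(col)
            return [row[i] for row in self.rows]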