TP PM Tutorial
10/14/2014 1:00:00 PM
"Introducing Keyword-Driven Test Automation"
Presented by:
Hans Buwalda
LogiGear Corporation
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com
Hans Buwalda
LogiGear
Hans Buwalda has been working with information technology since his high school years. In his thirty-year career, Hans has gained experience as a developer, manager, and principal consultant for companies and organizations worldwide. He was a pioneer of the keyword approach to testing and automation, now widely used throughout the industry. His approaches to testing, like Action Based Testing and Soap Opera Testing, have helped a variety of customers achieve scalable and maintainable solutions for large and complex testing challenges. Hans is a frequent speaker at STAR conferences and is lead author of Integrated Test Design and Automation: Using the Testframe Method.
9/11/2014
1
© 2014 LogiGear Corporation. All Rights Reserved
Hans Buwalda
LogiGear
hans @ logigear.com
STARWEST 2014, Tutorial TP
Anaheim, California
Tuesday, October 14, 1:00 PM – 4:30 PM
Introducing Keyword-Driven Test Automation
© 2014 LogiGear
Introduction
− industries
− roles in testing
© 2014 LogiGear
Who is your speaker
• Software testing company, around since 1994
• Testing and test automation services:
  − consultancy, training
  − test development and automation services
  − "test integrated" development services
• Products:
  − TestArchitect™, TestArchitect for Visual Studio™
  − integrating test development with test management and automation
  − based on modularized keyword-driven testing
• LogiGear Magazine:
  − themed issues, non-commercial
www.logigear.com
www.testarchitect.com
• Dutch guy, in California since 2001
• Background in math, computer science, management
• Since 1994 focusing on automated testing
  − keywords, agile testing, big testing
Hans Buwalda
LogiGear Corporation
hans @ logigear.com
www.happytester.com
© 2014 LogiGear
Topics for this tutorial
• Introduction to keyword-driven testing
  − including "Action Based Testing", my own flavor of it...
• Comparison to other techniques for automation
• Recommendations for a successful application of keyword-driven testing
  − test design
  − automation
  − organization
• Some ideas for specific situations:
  − data-driven testing
  − non-UI testing
  − multi-media
  − protocols
  − initial data
• Not everything will be equally interesting, or accessible, to everybody
© 2014 LogiGear
Testing Under Pressure
specification → development → test → DEADLINE
Develop tests in time:
• Test design
• Auditing, acceptance
• Preparations
• Automation
© 2014 LogiGear
The 5% Rules of Test Automation
• No more than 5% of all test cases should be executed manually
• No more than 5% of all efforts around testing should involve automating the tests
© 2014 LogiGear
Why a High Automation Degree?
• The best way to prepare for efficiency in the crunch zone
  − good manual test cases can help too, but marginally
• Buy time to do more "exploratory testing", and better test development
• Credible pay-off for the cost of introducing automation
  − initial costs are: tooling, learning curve, adaptation of existing tests
• Automation is better positioned to identify "bonus bugs"
  − on average 15% of fixes cause new bugs
  − many of these bugs are hard to find without integral testing
    • often a result of violating overall architectures
    • the bugs occur because data is left in an inconsistent state
• Automated tests have a better chance of being kept up to date if they form the majority of the testware
• Automation can be re-run, for example as part of the continuous integration process
  − either specific, based on code changes, or integral, to also catch bonus bugs
© 2014 LogiGear
Why < 5% Automation Efforts?
• Automation should not dominate testing
  − it is not a goal in itself
  − it should never be a bottleneck
• Testers should be able to focus on testing
  − better tests (higher ambition level)
  − communication with stakeholders
• High automation efforts can aggravate the "crunch zone", instead of relieving it
  − "invitation to Murphy's law"
automation should deliver, not dominate…
© 2014 LogiGear
Questions
• How is your test automation organized?
  − "we don't have it" is a good answer ☺
• Do you use keywords, or something similar (like frameworks, BDD)?
• What are the objectives for automated testing in your organization? Any timelines?
© 2014 LogiGear
Record and Playback
select window "Logon"
enter text "username", "administrator"
enter text "password", "testonly"
push button "Ok"
check window title "Welcome"
select window "Main"
push button "New Customer"
expect window "Customer Information"
select field "First Name"
type "Paul"
select field "Last Name"
type "Jones"
select field "Address"
type "54321 Space Drive"
. . .
© 2014 LogiGear
Scripting
Test Case Design: TEST DESIGNER
Test Case Automation: AUTOMATION ENGINEER
Test Case Execution: MR. PLAYBACK
© 2014 LogiGear
Example scripting
/// <summary>
/// AddItems - Use 'AddItemsParams' to pass parameters into this method.
/// </summary>
public void AddItems()
{
    #region Variable Declarations
    WinControl uICalculatorDialog =
        this.UICalculatorWindow.UICalculatorDialog;
    WinEdit uIItemEdit =
        this.UICalculatorWindow.UIItemWindow.UIItemEdit;
    #endregion
    Keyboard.SendKeys(uICalculatorDialog,
        this.AddItemsParams.UICalculatorDialogSendKeys,
        ModifierKeys.None);
    Keyboard.SendKeys(uIItemEdit,
        this.AddItemsParams.UIItemEditSendKeys,
        ModifierKeys.None);
}
• State of the art, but stuff for coders . . .
© 2014 LogiGear
Actions
Fragment from a test with actions: 4 actions, each with an action keyword and arguments, read from top to bottom.

               acc nr    first   last
open account   123123    John    Doe

               acc nr    amount
deposit        123123    10.11
deposit        123123    20.22

               acc nr    expected
check balance  123123    30.33

• The test developer creates tests using actions with keywords and arguments
• Checks are, as much as possible, explicit (specified expected values)
• The automation task focuses on automating the keywords; each keyword is automated only once
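This division of labor can be sketched in a few lines of Python. It is a minimal illustration with hypothetical function names, not code from any particular tool: each keyword maps to exactly one function, and the check compares against the explicit expected value from the test line.

```python
# Minimal sketch: one Python function per action keyword (names are hypothetical).
balances = {}

def open_account(acc_nr, first, last):
    # the keyword is automated once, here; every test line reuses it
    balances[acc_nr] = 0.0

def deposit(acc_nr, amount):
    balances[acc_nr] += float(amount)

def check_balance(acc_nr, expected):
    # explicit check: the expected value comes from the test line itself
    actual = round(balances[acc_nr], 2)
    return "PASS" if actual == float(expected) else "FAIL"

# the four test lines from the fragment above, read top to bottom
open_account("123123", "John", "Doe")
deposit("123123", "10.11")
deposit("123123", "20.22")
result = check_balance("123123", "30.33")
print(result)  # → PASS
```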
© 2014 LogiGear
Potential benefits of keywords
• More tests, better tests
  − more breadth
  − more depth
• Fast, results can be quickly available
  − the test design directly drives the automation
• Separates the tests from the technical scripting language
  − easier to involve business subject matter experts
  − the action format allows for easy readability
• Less effort for automation
  − even "script free" in many cases
• Automation more stable and maintainable
  − limited and manageable impact of changes in the system under test
• Develop tests earlier in the life cycle
  − deal with execution details later
© 2014 LogiGear
Risks of keyword approaches
• Often seen as a silver bullet; complications are underestimated
  − often treated as a technical "trick"
  − testers can get squeezed and marginalized
    • developers and users dictating tests
    • automation engineers dictating actions
  − testers can end up with an automation responsibility, thus becoming pseudo-programmers
• The method needs understanding and experience to be successful
  − pitfalls are many, and can have a negative effect on the outcome
• Lack of method and structure can risk manageability
  − maintainability not as good as hoped
  − results can be disappointing, and the approach will be blamed
© 2014 LogiGear
Combining Approaches . . .
• Use keywords for the automation-ready description of test cases
• Use scripting to set up structured automation for the actions
• Use record and playback to record keywords
© 2014 LogiGear
Comparing Formats

Classic format: most values are implicit, and the tester has to figure them out during execution. Execution instructions are repeated in multiple test cases.

Enter a user id that is greater than 10 characters, enter proper information for all other fields, and click on the "Continue" button.
→ There should be an error message stating that "User Id must be less than 10 characters".

Enter a User Id with special character(s), enter proper information for all other fields, and click on the "Continue" button.
→ An error message should be displayed indicating that "User Id cannot contain some special characters".

Enter the information, with a password of 4 characters, and click on the "Continue" button.
→ Check for an error message saying: "Password must contain at least 5 characters".

Keyword format:

                           user id       message
check registration dialog  aaaaabbbbbc   User Id must be less than 10 characters

                           user id       message
check registration dialog  résoudre      User Id cannot contain some special characters

                           password      message
check registration dialog  test          Password must contain at least 5 characters
© 2014 LogiGear
Keywords are not just test automation
• Can also be used for other purposes than testing:
  − data entry chores
  − training purposes
• Can also be used for manual testing
  − for example with a manual testing dialog
  − can even show instructions, with placeholders for values

Action: login <user> <password>
Instruction: Enter "<user>" in the user name field, and "<password>" in the password field.

Test line:
        user name   password
login   hansb       secret

What the manual tester would see:
Enter "hansb" in the user name field, and "secret" in the password field.
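Rendering a manual instruction from a test line amounts to simple placeholder substitution. A minimal sketch, assuming the `<name>` placeholder convention shown on the slide (the function name is mine, not a tool's API):

```python
def render_instruction(template, arguments):
    # substitute each <name> placeholder with the value from the test line
    for name, value in arguments.items():
        template = template.replace("<" + name + ">", value)
    return template

template = 'Enter "<user>" in the user name field, and "<password>" in the password field.'
test_line = {"user": "hansb", "password": "secret"}
print(render_instruction(template, test_line))
# → Enter "hansb" in the user name field, and "secret" in the password field.
```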
© 2014 LogiGear
Keywords need a method
• By themselves keywords don't provide much scalability
  − they can even backfire and make automation more cumbersome
  − a method can help tell you which keywords to use when, and how to organize the process
• Today we'll look at Action Based Testing (ABT)
  − addresses test management, test development and automation
  − large focus on test design as the main driver for automation success
• Central deliverables in ABT are the "Test Modules"
  − developed in spreadsheets
  − each test module contains "test objectives" and "test cases"
  − each test module is a separate (mini) project; each test module can involve different stakeholders
© 2014 LogiGear
Don't just automate manual testing
Good automated testing is not the same as automating good manual testing. . .
© 2014 LogiGear
Action Based Testing (ABT)
• Uses the keyword format as a basis for a method
  − covers test management, test development and automation
  − with a large focus on test design as the main driver for automation success
  − method is specific, but concepts are generic
• The central product in ABT is the "Test Module", not the test case
  − like chapters in a book
  − test cases are part of the test modules; they are typically the result (rather than the input) of test development
  − test development is seen as having both analytical and creative aspects
  − developed as spreadsheets, external from the automation, with a well-defined flow
  − easier to manage: each test module is a separate (mini) project; each test module can involve different stakeholders
© 2014 LogiGear
Overview Action Based Testing

High Level Test Design / Test Development Plan:
• define the "chapters" → Test Module 1, Test Module 2, ... Test Module N
• create the "chapters" → each test module has Objectives and Test Cases
• create the "words" → Actions
• make the words work → AUTOMATION

Example actions, interaction test:

                window   control     value
enter           log in   user name   jdoe
enter           log in   password    car guy

                window   control     property   expected
check property  log in   ok button   enabled    true

Example actions, business test:

              user    password
log in        jdoe    car guy

              first   last     brand   model
enter rental  Mary    Renter   Ford    Escape

              last     total
check bill    Renter   140.42
© 2014 LogiGear
Example of business level test module
• Consists of (1) an initial part, (2) test cases and (3) a final part
• Focus is on business functionality, with a clear business scope
• Navigation details are avoided

TEST MODULE  Car Rental Payments
               user
start system   john

TEST CASE TC 01  Rent some cars
          first name   last name   car
rent car  John         Doe         Ford Escape
rent car  John         Doe         Chevvy Volt

               last name   amount
check payment  Doe         140.4

FINAL
close application
© 2014 LogiGear
Example of an interaction level test module
• Layout is the same, with an initial part, test cases and a final part
• Interaction details that are the target of the test are not hidden
• Focus is not on business ("is the payment amount correct"), but on interaction ("do I see the payment amount")

TEST MODULE  Screen Flow
               user
start system   john

TEST CASE TC 01  Order button
       window   button
click  main     create order

                     window
check window exists  new order

FINAL
close application
© 2014 LogiGear
Variables and expressions with keywords
• This test does not need an absolute number for the available cars; it just wants to see if a stock is updated
• As a convention we denote an assignment with ">>"
• The "#" indicates an expression

TEST CASE TC 02  Rent some more cars
              car           available
get quantity  Chevvy Volt   >> volts

          first name   last name   car
rent car  John         Doe         Chevvy Volt
rent car  John         Doe         Chevvy Volt

                car           expected
check quantity  Chevvy Volt   # volts - 2
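The ">>" and "#" conventions could be interpreted roughly as follows. This is a simplified sketch of my own, not TestArchitect's actual implementation: a variable store plus expression evaluation over cells that start with "#".

```python
variables = {}

def store(name, value):
    # ">> volts" in a cell stores the retrieved value under a test variable name
    variables[name] = value

def evaluate(cell):
    # a cell starting with "#" is evaluated as an expression over the variables;
    # any other cell is taken as a literal value
    if cell.startswith("#"):
        return eval(cell[1:], {}, variables)
    return cell

store("volts", 7)                  # get quantity ... >> volts (7 is an assumed stock level)
# two "rent car" lines reduce the stock by 2 in the application under test
expected = evaluate("# volts - 2")
print(expected)  # → 5
```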
© 2014 LogiGear
Data driven testing with keywords
• The test lines will be repeated for each row in the data set
• The values represented by "car", "first" and "last" come from the selected row of the data set

DATA SET cars
car              first    last       value
Chevvy Volt      John     Doe        40000
Ford Escape      Mary     Kane       22500
Chrysler 300     Jane     Collins    29000
Buick Verano     Tom      Anderson   23000
BMW 750          Henry    Smyth      87000
Toyota Corolla   Vivian   Major      16000

TEST CASE TC 03  Check stocks
              data set
use data set  /cars

              car     available
get quantity  # car   >> quantity

          first name   last name   car
rent car  # first      # last      # car

                car     expected
check quantity  # car   # quantity - 1

repeat for data set
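Conceptually, the block between "use data set" and "repeat for data set" expands into one execution per row, with "# car", "# first" and "# last" bound to that row's cells. A minimal sketch with a trimmed, hypothetical two-row data set:

```python
data_set = [
    {"car": "Chevvy Volt", "first": "John", "last": "Doe"},
    {"car": "Ford Escape", "first": "Mary", "last": "Kane"},
]

executed = []

def rent_car(first, last, car):
    # stand-in for the real action; just record what was run
    executed.append((first, last, car))

# the test lines between "use data set" and "repeat for data set"
# run once per row, with the row's cells substituted for # car etc.
for row in data_set:
    rent_car(row["first"], row["last"], row["car"])

print(len(executed))  # → 2
```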
© 2014 LogiGear
Automating keyword tests

Keywords are useful, but technically not complex. It is not hard to make a simple keyword interpreter. Many test tools also have keyword options in some form or another.

Function Interpret
    While not end of test
        Read next line
        Split the line into arguments
        Look up the keyword in the "action list"
        Execute the function belonging to the keyword
        Report the results of this line
    Repeat for next line
End
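The interpreter loop above fits in a few lines of Python. A minimal sketch, not any specific tool; the two demo actions ("rent car", "check quantity") are hypothetical:

```python
stock = {"volt": 3}

def rent_car(car):
    stock[car] -= 1                        # act on the (simulated) system under test

def check_quantity(car, expected):
    assert stock[car] == int(expected)     # explicit check against the test line

def run(test_lines, action_list):
    results = []
    for line in test_lines:                        # read next line
        keyword, arguments = line[0], line[1:]     # split into keyword + arguments
        action = action_list.get(keyword)          # look up in the "action list"
        if action is None:
            results.append((keyword, "UNKNOWN KEYWORD"))
            continue
        try:
            action(*arguments)                     # execute the keyword's function
            results.append((keyword, "PASS"))      # report the result of this line
        except AssertionError:
            results.append((keyword, "FAIL"))
    return results

actions = {"rent car": rent_car, "check quantity": check_quantity}
results = run([["rent car", "volt"], ["check quantity", "volt", "2"]], actions)
print(results)  # → [('rent car', 'PASS'), ('check quantity', 'PASS')]
```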
© 2014 LogiGear
Example: script for an action "check sort order"

The following action script will verify whether the rows in a table are sorted: get the arguments from the test line, find the table in the UI, fail the test if a value is smaller than the one before it, and pass the test if all rows are ascending.

def action_checkSortOrder():
    # get table object, column number and column count
    windowName = LIBRARY.NamedArgument("window")
    tableName = LIBRARY.NamedArgument("table")
    columnName = LIBRARY.NamedArgument("column")
    table = ABT.OpenElement(windowName, tableName)
    column = table.GetColumnIndex(columnName)
    rowCount = table.GetRowCount()
    # check the sort order, row by row
    previous = table.GetCellText(0, column)
    for i in range(1, rowCount):
        current = table.GetCellText(i, column)
        if current < previous:
            LIBRARY.AdministerCheck("order", "sorted", "fails " + str(i+1), FAIL)
            return
        previous = current
    LIBRARY.AdministerCheck("order", "sorted", "all rows in order", PASS)
© 2014 LogiGear
Using the new action
• By keeping an action generic it can be applied in a variety of situations
• Some examples of using "check sort order":

                  window          table          column
check sort order  view orders     orders table   ID
check sort order  annual results  regions        revenue
check sort order  inventory       cars           price
check sort order  registration    students       last name
© 2014 LogiGear
Example application
© 2014 LogiGear
A Test Module for the application
• We click a tree node, and then do a check
• The actions here are built in to the framework
© 2014 LogiGear
Making a new "action"
• This action definition uses existing actions to create a new action called "check bitrate"
• Argument names can be used in cell expressions, which start with "#" and support the usual string and numeric operators
(In the screenshot: the name of the new action and the arguments of the new action; a node path is created from the first two arguments, and the expected value is given by the 3rd argument.)
© 2014 LogiGear
Using the action in a test
• These test lines don't care about the navigation in the UI of the application; the focus is functional: verify data
• Such functional tests are easier to read with high level actions, and the reduced dependency on navigation makes them (much) easier to maintain in the long term
© 2014 LogiGear
Scalability

2000 tests → 200 actions → 20 functions
4000 tests → 250 actions → 22 functions

In a good application of the keywords approach a large increase in test cases (like doubling the amount) should result in a modest increase in actions, and a minor increase, if any, in programmed action functions.
© 2014 LogiGear
Low-level, high-level, mid-level actions
• "Low level": detailed interaction with the UI (or API)
  − generic, do not show any functional or business logic
  − examples: "click", "expand tree node", "select menu"
• "High level": a business domain operation or check on the application under test
  − hide the interaction
  − examples: "enter customer", "rent car", "check balance"
• "Mid level": common sequences at a more detailed application level
  − usually to wrap a form or dialog
  − for use in high level actions
  − greatly enhance maintainability
  − example: "enter address fields"
(Diagram: "enter customer" uses "enter address fields", which uses "enter", "select", "set", ...)
© 2014 LogiGear
Identifying controls
• Identify windows and controls, and assign names to them
• These names encapsulate the properties that the tool can use to identify the windows and controls when executing the tests
© 2014 LogiGear
Mapping an interface
• An interface mapping (common in test tools) will map windows and controls to names
• When the interface of an application changes, you only have to update this in one place
• The interface mapping is a key step in your automation success; allocate time to design it well, in particular naming and choosing identifying properties

INTERFACE ENTITY library
interface entity setting   title   {.*Music Library}

                   name           class   label
interface element  title          text    Title:
interface element  artist         text    Artist:
interface element  file size      text    File size (Kb):

                   name           class   position
interface element  playing time   text    textbox 4
interface element  file type      text    textbox 5
interface element  bitrate        text    textbox 6

                   name    class      position
interface element  music   treeview   treeview 1
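Conceptually, an interface map is a lookup from logical names to identifying properties. A sketch of the idea only; the dictionary structure is my assumption, loosely modeled on the mapping table above, and not TestArchitect's real format:

```python
# logical names -> identifying properties (structure is an assumption)
interface_map = {
    "library": {
        "window":  {"title": ".*Music Library"},
        "title":   {"class": "text", "label": "Title:"},
        "artist":  {"class": "text", "label": "Artist:"},
        "bitrate": {"class": "text", "position": 6},
    }
}

def locate(entity, element):
    # automation code asks for controls by name only; when the UI changes,
    # only the one entry in this map needs updating
    return interface_map[entity][element]

print(locate("library", "bitrate"))  # → {'class': 'text', 'position': 6}
```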
© 2014 LogiGear
Stabilize Automation
• Test design
• Interface mapping
• Timing
• Application "testability"
• Test the automation
© 2014 LogiGear
Automation-friendly design: hidden properties
• Look for properties a human user can't see, but a test tool can
• This approach can lead to speedier and more stable automation
  − interface mapping is often a bottleneck, and a source of maintenance problems
  − with predefined identifying property values, the interface map can be created without "spy" tools
  − not sensitive to changes in the system under test
  − not sensitive to languages and localizations
• Examples:
  − "id" attribute for HTML elements
  − "name" field for Java controls
  − "AccessibleName" or "Automation ID" properties in .NET controls (see below)
© 2014 LogiGear
Mapping the interface using hidden identifiers
• Instead of positions or language-dependent labels, an internal property "automation id" has been used
• The interface definition will be less dependent on modifications in the UI of the application under test
• If the information can be agreed upon with the developers, for example in an agile team, it can be entered (or pasted) manually and early on

INTERFACE ENTITY library
interface entity setting   automation id   MusicLibraryWindow

                   ta name        ta class   automation id
interface element  title          text       TitleTextBox
interface element  artist         text       SongArtistTextBox
interface element  file size      text       SizeTextBox
interface element  playing time   text       TimeTextBox
interface element  file type      text       TypeTextBox
interface element  bitrate        text       BitrateTextBox

                   ta name   ta class   automation id
interface element  music     treeview   MusicTreeView
© 2014 LogiGear
Active Timing
• Passive timing
  − wait a set amount of time
  − in large scale testing, try to avoid passive timing altogether:
    • if the wait is too short, the test will be interrupted
    • if the wait is too long, time is wasted
• Active timing
  − wait for a measurable event
  − usually the wait is up to a, generous, maximum time
  − common example: wait for a window or control to appear (usually the test tool will do this for you)
• Even if not obvious, find something to wait for...
• Involve developers if needed
  − relatively easy in an agile team, but also in traditional projects, give this priority
• If using a waiting loop
  − make sure to use a "sleep" function in each cycle that frees up the processor (giving the AUT time to respond)
  − wait for an end time, rather than a set amount of cycles
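An active waiting loop with a deadline and a processor-friendly sleep could look like this. A generic sketch; the function and parameter names are mine, not from any tool:

```python
import time

def wait_until(condition, timeout_seconds=30.0, poll_seconds=0.2):
    # active timing: wait for a measurable event, up to a generous maximum;
    # note we wait for an end time, not for a fixed number of cycles
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll_seconds)   # free up the processor so the AUT can respond
    return False

# usage: wait for a (hypothetical) log file instead of sleeping blindly, e.g.
#   ok = wait_until(lambda: os.path.exists("run.log"), timeout_seconds=60)
print(wait_until(lambda: True, timeout_seconds=1))  # → True
```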
© 2014 LogiGear
Active Timing
• How much passive timing do you have in your scripts?
• If you're not sure, find out...
• ... and let me know

"First action I took upon my return was to evaluate the percentage of passive time in our code and found passive time 68% versus active time 32%. Needless to say our automation test cases were very expensive time operations and now I know why..."
Raed Atawneh, 2012 (extract)
© 2014 LogiGear
Things to wait for...
� Wait for a last control or elements to load− developers can help knowing which one that is
� Non-UI criteria− API function
− existence of a file
� Criteria added in development specifically for this purpose, like:− "disabling" big slow controls (like lists or trees) until they're done loading
− API functions or UI window or control properties
� Use a "delta" approach:− every wait cycle, test if there was a change; if no change, assume that the
loading time is over:
− examples of changes:• the controls on a window
• count of items in a list
• size a file (like a log file)
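The delta approach can be sketched as follows: re-measure something each cycle (an item count, a file size) and assume loading has finished after several unchanged measurements. A generic sketch of my own, with hedged defaults, not a tool API:

```python
import time

def wait_for_stability(measure, quiet_cycles=3, poll_seconds=0.5, timeout_seconds=60.0):
    # delta approach: each cycle, check whether the measurement changed;
    # after `quiet_cycles` unchanged cycles, assume loading is done
    deadline = time.monotonic() + timeout_seconds
    previous = measure()
    unchanged = 0
    while time.monotonic() < deadline:
        time.sleep(poll_seconds)           # free up the processor each cycle
        current = measure()
        if current == previous:
            unchanged += 1
            if unchanged >= quiet_cycles:
                return True
        else:
            unchanged = 0
            previous = current
    return False

# usage: `measure` could be a list's item count or a log file's size, e.g.
#   wait_for_stability(lambda: os.path.getsize("app.log"))
```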
© 2014 LogiGear
Testability, some key items
• Should be a "must have" requirement
  − first question in a development project: "how do we test this?"
• Identifying properties
• Hooks for timing
• White-box access to anything relevant:
  − input data (ability to emulate)
  − output data (what is the underlying data being displayed)
  − random generators (can I set a seed?)
  − states (like in a game)
  − objects displayed (like monsters in a game)
• Emulation features, like time-travel and fake locations
© 2014 LogiGear
Why Better Test Design?
• Quality and manageability of tests
  − many tests are often quite "mechanical" now, no surprises
  − one-to-one related to specifications, user stories or requirements, which often is ok, but lacks aggression
  − no combinations, no unexpected situations, lame and boring
  − such tests have a hard time finding (interesting) bugs
• Better automation
  − when unneeded details are left out of tests, they don't have to be maintained
  − avoiding "over checking": creating checks that are not in the scope of a test, but may fail after system changes
  − limit the impact of system changes on tests, making such impact more manageable

I have come to believe that successful automation is usually less a technical challenge than a test design challenge.
© 2014 LogiGear
Issues are not always obvious...
Downton Abbey
© 2014 LogiGear
The Three “Holy Grails” of Test Design
• Metaphor to depict three main steps in test design
• Using "grail" to illustrate that there is no single perfect solution, but that it matters to pay attention
• Organization of tests into test modules
• Right approach for each test module
• Proper level of detail in the test specification
© 2014 LogiGear
What's the trick...
• Have or acquire facilities to store and organize your content
• Select your stuff
• Decide where to put what
  − assign and label the shelves
• Put it there
• If the organization is not sufficient anymore, add to it or change it
© 2014 LogiGear
Breakdown Criteria
• Common Criteria
  − Functionality (customers, finances, management information, UI, ...)
  − Architecture of the system under test (client, server, protocol, subsystems, components, modules, ...)
  − Kind of test (navigation flow, negative tests, response time, ...)
• Additional Criteria
  − Stakeholders (like "Accounting", "Compliance", "HR", ...)
  − Complexity of the test (put complex tests in separate modules)
  − Execution aspects (special hardware, multi-station, ...)
  − Project planning (availability of information, timelines, sprints, ...)
  − Risks involved (extra test modules for high risk areas)
  − Ambition level (smoke test, regression, aggressive, ...)
© 2014 LogiGear
Approach 1: Workshop
• Convene a meeting with the relevant participants
  − test developers
  − domain experts
  − automation engineer (focus on efficiency of automation)
  − experienced moderator
  − also consider: developers, managers
• If necessary, provide training for participants before the discussion
© 2014 LogiGear
Approach 2: Design and Feedback
• One or two experienced test designers create a first draft
• The draft is delivered to and discussed with the relevant parties
• Ask the parties to verify:
  1. Structure: does it make sense?
  2. Completeness: are all relevant areas covered?
• Based on feedback, further modify the design
© 2014 LogiGear
Properties of a good Breakdown
• Reflects the level of the tests
• Well differentiated and clear in scope
• Balanced in size and amount
• Modules mutually independent
• Fitting the priorities and planning of the project
© 2014 LogiGear
"Thou Shalt Not Debug Tests..."
• Large and complex test projects can be hard to "get to run"
• If they are, however, start by taking another good look at your test design...
• Rule of thumb: don't debug tests. If tests don't run smoothly, make sure:
  − lower level tests have been successfully executed first → the UI flow in the AUT is stable
  − actions and interface definitions have been tested sufficiently with their own test modules → the automation can be trusted
  − your test modules are not too long and complex
© 2014 LogiGear
Breakdown examples
• CRUD tests (Create, Read, Update, Delete) for all entity types in the app
  − like "order", "customer", "well", etc.
  − for all: various types and situations
• Forms, value entry
  − does each form work (try to test form by form, not entity by entity)
  − mandatory and optional fields, valid and invalid values, etc.
  − UI elements and their properties and contents
  − function keys, tab keys, special keys, etc.
• Screen and transaction flows
  − like cancel an order, menu navigation, use of the browser back and forward buttons, etc.
  − is the data in the database correct after each flow
• Business transactions, business rules
  − identify situations that the tests need to try
• Function tests: do individual functions work
  − can I count orders, can I calculate a discount, etc.
• End-to-end tests
  − like enter a sales order, then check inventory and accounting
• Tests with specific automation needs
  − like multi-station tests
• Tests of non-UI functions
• High ambition tests (aggressive tests)
  − can I break the system under test
© 2014 LogiGear
Example Top Level Structure
Project
  <business object 1>
    Lifecycles
    Value entry
    Screen flows
    Dialogs
    . . .
  <business object 2>
  Functions and Features
  Integrations
  End-to-end, business
  Security, authorization
  Special tests
    Non-UI
    . . .
  Extensibility, customizing
    Custom controls
  . . .
© 2014 LogiGear
Identifying the modules
Step 1: top down → establish the main structure (and understanding)
• analyze what the business is and what the system does
• how is it technically organized?
• what is important for us to test?
• use the list in the "breakdown examples" slide as a starting point
• also look at the "secondary criteria", as far as applicable
• if the test is large, define main groups first, then detail them out into modules

Step 2: bottom up → refine, complete
• study individual functionalities and checks (like from existing test cases) and identify test modules for them if needed
• identify and discuss any additional criteria and needed testing situations
• review and discuss the resulting list(s) of test modules
• create some early drafts of test modules and adjust the list if needed
© 2014 LogiGear
What about existing tests?
• Compare it to moving house:
  − some effort can't be avoided
  − be selective, edit your stuff
    • look at the future, not the past
  − first decide where to put what, then put it there
  − moving is an opportunity; you may not get such a chance again soon
• Follow the module approach
  − define the modules and their scope as if from scratch
  − use the existing test cases in two ways:
    • verify completeness
    • harvest and re-use them for tests and for actions
  − avoid porting over "step by step"; in particular avoid over-checking
© 2014 LogiGear Corporation. All rights reserved.
Defining test runs using "test suites"
• Build Acceptance Test
• Smoke Test
• System Test
• Functional Acceptance Test
• Integration Test
Test Module | Scope | Prio | Status
Model Life Cycles | Create, store, delete Models (= formula + data), as part of SYS sessions | 1 | pass
Result Life Cycles | Create, store outputs. See them in the process store. | 1 | pass
Formula Life Cycles | Create, edit, manage, remove formulas | 2 | pass
Formula Editor | buttons, operations, undo | 3 | pass
Repository | display of the Modeler repository, presence of user formulas, drag and drop usage. Effect of changing repository folder (environment variable) | 1 | failed
Model Store in Repository | presence, re-run, delete | 1 | pass
Repository UI | example: selecting an item shows its description | 2 | errors
Formula Evaluation | Correctness of results, valid/invalid arguments, boundary analyses, special arguments | 1 | pass
Built-in Formulas | Presence, correctness, valid/invalid arguments, boundaries, special arguments, equivalence classes | 1 | pass
Data Table Association | Associate tables view, change and remove associations, data applicability, for existing and defined formulas | 2 | pass
Quick Access buttons | Life cycle of Quick Access buttons, correctness for the built-in ones | 3 | dev
Formula arguments | presence, argument types, argument entry, parameters, defaults | 2 | pass
arguments for Built-in Formulas | arguments, argument types and defaults for each pre-defined formula | 2 | failed
Area Of Interest Relations | defaulting, tree visibility, select/deselect, … | 1 | pass
Model Execution | Model times, start, stop (cancel), restart ("chunks", "timeboxes", ... needs more information) | 3 | pass
Graphics | graphical representation of various data types and data sets | 1 | pass
Graphics Viewing | zoom, select, drag and drop (no 3d now) | 1 | pass
Administration | users, projects, authorization | 1 | pass
Model results in central database | storing, removing, using, correctness, … (there are some other applications, mostly legacy, that can do the same Models to compare) | 1 | pass
Modeler UI | various controls, panels, tabs | 2 | pass
2009 LogiGear Corporation. All rights reserved.© 2014 LogiGear Corporation. All rights reserved.© 2011 LogiGear Corporation. All rights reserved.© 2014 LogiGear Corporation. All rights reserved.© 2014 LogiGear Corporation. All rights reserved.
Test Module | Scope | Prio | Build 1 | Build 2 | Build 3
Model Life Cycles | Create, store, delete Models (= formula + data), as part of SYS sessions | 1 | pass | pass | pass
Result Life Cycles | Create, store outputs; see them in the process store | 1 | pass | pass | pass
Formula Life Cycles | Create, edit, manage, remove formulas | 2 | pass | pass | pass
Formula Editor | Buttons, operations, undo | 3 | pass | pass | pass
Repository | Display of the Modeler repository, presence of user formulas, drag-and-drop usage; effect of changing the repository folder (environment variable) | 1 | failed | failed | failed
Model Store in Repository | Presence, re-run, delete | 1 | pass | pass | pass
Repository UI | Example: selecting an item shows its description | 2 | errors | pass | pass
Formula Evaluation | Correctness of results, valid/invalid arguments, boundary analyses, special arguments | 1 | pass | pass | pass
Built-in Formulas | Presence, correctness, valid/invalid arguments, boundaries, special arguments, equivalence classes | 1 | pass | pass | pass
Data Table Association | Associate tables view, change and remove associations, data applicability, for existing and defined formulas | 2 | pass | pass | pass
Quick Access Buttons | Life cycle of Quick Access buttons, correctness for the built-in ones | 3 | dev | errors | pass
Formula Arguments | Presence, argument types, argument entry, parameters, defaults | 2 | pass | pass | pass
Arguments for Built-in Formulas | Arguments, argument types and defaults for each pre-defined formula | 2 | failed | failed | pass
Area of Interest Relations | Defaulting, tree visibility, select/deselect, … | 1 | pass | pass | pass
Model Execution | Model times, start, stop (cancel), restart ("chunks", "timeboxes", … needs more information) | 3 | pass | pass | pass
Graphics | Graphical representation of various data types and data sets | 1 | pass | pass | pass
Graphics Viewing | Zoom, select, drag and drop (no 3D now) | 1 | pass | pass | pass
Administration | Users, projects, authorization | 1 | pass | pass | pass
Model Results in Central Database | Storing, removing, using, correctness, … (there are some other, mostly legacy, applications that can do the same, so Models can be compared) | 1 | pass | pass | pass
Modeler UI | Various controls, panels, tabs | 2 | pass | pass | pass
9/11/2014
35
Too detailed?

Step | Description | Expected
step 16 | Click the new formula button to start a new calculation. | The current formula is cleared. If it had not been saved, a message will show.
step 17 | Enter "vegas winner" in the name field. | The title will show "vegas winner".
step 18 | Open the formula editor by clicking the '+' button for the panel "formula editor". | The formula editor will show with an empty formula (only comment lines).
step 19 | Add some lines and enter "10*x;". | The status bar will show "valid formula". There is a "*" marker in the title.
step 20 | Click the Save formula button. | The formula is saved; the "*" will disappear from the title.
step 21 | Open the panel with the arguments by clicking the '+' button. | There are two lines, for 'x' and 'y'.
step 22 | Click on the value type cell and select "currency". | A button to select a currency appears, with default USD.
step 23 | Click on the specify argument values link. | The argument specification dialog is shown.
Grail 2: Approach per Test Module

• Plan the test module:
  − when to develop: do we have enough information? UI tests are usually the last ones to be developed
  − when to execute: make sure lower-level functionality works first. UI tests are usually the first ones to be executed
• Process:
  − do an intake: understand what is needed and devise an approach
  − analyze requirements, formulate "test objectives", create tests
• Don't just stick to "checking"; try to follow an exploratory approach:
  − see the test development as a "learning process" about the business domain, the application structure, the interaction, etc.
  − talk about your tests, make them strong
• Identify stakeholders and their involvement:
  − users, subject matter experts, developers, auditors, etc.
• Choose testing techniques if applicable:
  − boundary analysis, decision tables, state-transition diagrams, etc.
• A note about naming test modules and test cases: leave out buzz words like "Verify", "Test", "Can"; it makes trees and lists easier to read
9/11/2014
36
© 2014 LogiGear
Eye on the ball, Scope

• Always know the scope of the test module
• The scope should be unambiguous
• The scope determines many things:
  − what the test objectives are
  − which test cases to expect
  − what level of actions to use
  − what the checks are about, and which events should generate a warning or error (if "lower" functionality is wrong)
© 2014 LogiGear
State your Objectives . . .
...
TO-3.51 The exit date must be after the entry date
...
test objective TO-3.51
name entry date exit date
enter employment Bill Goodfellow 2016-10-02 2016-10-01
check error message The exit date must be after the entry date.
Linking through test objectives can make traceability easier:

[Diagram: a requirement or specification can relate to a test case directly, or indirectly via a test objective]
9/11/2014
37
© 2014 LogiGear
Test Objectives

• Keep test objectives short and simple
• Focus on what to test, not how
• Split longer texts into atomic sentences
• Typically test objectives will be like:
  − cause and effect ("clicking Clear clears all fields")
  − condition and effect ("if all fields are filled, 'OK' is enabled")
© 2014 LogiGear
Grail 3: Specification Level, choosing actions

• Scope of the test determines the specification level
• As high-level as appropriate, with as few arguments as possible
  − be generous with default values for arguments
• Clear names for actions
  − verb + noun usually works well
  − try to standardize both the verbs and the nouns, like "check customer" versus "verify client" (or vice versa)
• Avoid "engineer" styles for names of actions and arguments
  − tests are not source code
  − avoid conventions like missing spaces, uppercase, camel-case, or underscores
  − in other words: no "noha_RDT_oUnderS~tand" names, please
• Manage and document the Actions
  − a by-product of the test design
9/11/2014
38
© 2014 LogiGear
Example of using actions
In this real-world example the first "sequence number" for teller transactions on a given day is retrieved, using a search function:
• the "#" means an expression, in this case a variable with today's business date
• the ">>" means: assign to a variable for use later on in the test
(key)
type key              {F7}
type key              3

(page tab)
locate page tab       Scan Criteria

(text)
check breadcrumb      general functions > search

(window, control, value)
select                search    scan direction    Backward

(window, control, value)
enter value           search    business date match    # bus date

(source, control)
click                 search    go

(window control, variable)
get search results    sequence number    >> seq num

. . .
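The "#" (expression) and ">>" (assign-to-variable) conventions in the table above can be illustrated with a small parser. This is a minimal sketch, not how TestArchitect or any specific tool actually implements it; the variable names and the `resolve_argument`/`store_result` helpers are assumptions for illustration only.

```python
# Sketch of the "#" and ">>" argument conventions from the action table.
# Hypothetical helper names; real keyword tools implement this differently.

import datetime

# Variables collected during the test run, e.g. today's business date.
variables = {"bus date": datetime.date(2014, 9, 11).isoformat()}

def resolve_argument(arg):
    """A '#'-prefixed argument is an expression; here, a variable lookup."""
    if arg.startswith("#"):
        return variables[arg[1:].strip()]
    return arg

def store_result(target, value):
    """The '>> name' convention: save an action's output for later steps."""
    variables[target.strip()] = value

# "enter value  search  business date match  # bus date"
value = resolve_argument("# bus date")

# "get search results  sequence number  >> seq num"
store_result("seq num", "000123")
```

A later action in the same test could then pass `#seq num` as an argument and receive the stored sequence number.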
© 2014 LogiGear
9/11/2014
39
© 2014 LogiGear
Another example
TEST MODULE    Order processing

    start system

TEST CASE    TC 01 Order for tablets

    (user, password)
    login    jdoe    doedoe

    (window)
    check window exists    welcome

    (order id, cust id, article, price, quantity)
    create order    AB123    W3454X    tablet    198.95    5

    (order id, total)
    check order total    AB123    994.75

    . . .
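Under the hood, a keyword framework dispatches each row of such a test module to an implementing function. The sketch below runs the two order actions from the module above; the dispatcher, the action functions, and the order-total logic are illustrative assumptions, not the actual implementation of any tool.

```python
# Minimal keyword ("action word") dispatcher for the test module above.
# Action names mirror the table; the implementations are hypothetical.

orders = {}

def create_order(order_id, cust_id, article, price, quantity):
    # Store the order total; 198.95 * 5 = 994.75 for the sample data.
    orders[order_id] = float(price) * int(quantity)

def check_order_total(order_id, total):
    actual = round(orders[order_id], 2)
    expected = float(total)
    print("pass" if actual == expected else f"fail: {actual} != {expected}")

ACTIONS = {
    "create order": create_order,
    "check order total": check_order_total,
}

def run(test_lines):
    """Each line is an action name followed by its arguments."""
    for action, *args in test_lines:
        ACTIONS[action](*args)

run([
    ("create order", "AB123", "W3454X", "tablet", "198.95", "5"),
    ("check order total", "AB123", "994.75"),
])
# prints "pass" (198.95 * 5 = 994.75)
```

The point of the pattern is that testers write only the rows; the action implementations are maintained once, by the automation engineers.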
© 2014 LogiGear
Environments, configurations
• Many factors can influence details of automation:
  − language, localization
  − hardware
  − version of the system under test
  − system components, like OS or browser
• Test design can reflect these:
  − certain test modules are more general
  − others are specific, for example for a language
• But for tests that do not care about the differences, the automation just needs to "deal" with them:
  − shield them from the tests
9/11/2014
40
© 2014 LogiGear
Capture variations of the system under test in the actions and interface
definitions, rather than in the tests (unless relevant there).
Can be a feature in a test playback tool, or something you do with a global
variable or setting.
[Diagram: a "master switch" selects one of several "variations" within the actions and interface definitions]
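The "master switch" idea can be sketched in a few lines: tests call one action name, and a global setting picks the variation-specific implementation. All names here (`MASTER_SWITCH`, `do_action`, the login variants) are hypothetical illustrations, not the feature of any particular playback tool.

```python
# Sketch of a "master switch" selecting among action variations.
# Tests stay variation-agnostic; the switch resolves the implementation.

MASTER_SWITCH = "web"   # could come from a start-up dialog or global setting

def login_web(user, password):
    return f"web login for {user}"

def login_desktop(user, password):
    return f"desktop login for {user}"

VARIATIONS = {
    "web":     {"login": login_web},
    "desktop": {"login": login_desktop},
}

def do_action(name, *args):
    """Look up the action in the variation chosen by the master switch."""
    return VARIATIONS[MASTER_SWITCH][name](*args)

print(do_action("login", "jdoe", "doedoe"))   # prints "web login for jdoe"
```

Switching the whole suite to another variation is then a one-line change, with no edits to the test modules themselves.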
© 2014 LogiGear
Possible set-up of variations

Specify, for example, in a dialog when you start an execution:
− linked variation
− keyworded variation
9/11/2014
41
© 2014 LogiGear
Non-UI Testing
• Examples:
  − application programming interfaces (APIs)
  − embedded software
  − protocols
  − files, batches
  − databases
  − command line interfaces (CLIs)
  − multimedia
  − mobile devices
• Impact is mainly on the automation:
  − test design should in most cases be transparent towards the specific interfaces
• Often non-UI automation can speed up functional tests that do not address the UI
© 2014 LogiGear
Multiple System Access
[Diagram: test modules, driving either one or multiple interfaces, reach the system (part) under test through an automation scheme offering API access, protocol access, UI access, and database access]
9/11/2014
42
© 2014 LogiGear
Device Testing
[Diagram: the testing host runs the ABT automation; an agent on the (Android) device exchanges automation commands and interface info with it to drive the software under test]
© 2014 LogiGear
Multimedia: The "Play List" Approach
• Approach applicable for graphics, videos, sound fragments, etc.
• The test includes "questions":
  − what the tester should see or hear
  − like "Are the matching areas blue?"
  − actions like "check picture"
• The test tool keeps a "play list":
  − during the run, items are captured and stored
  − after the run, the tester is presented with the items and the matching questions
  − the tester acknowledges/falsifies
  − the system stores the passed items
  − if during the next run the items are the same as earlier passed ones, the tester is not asked again
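The play-list mechanism described above can be sketched as a small capture-and-review loop. This is an illustrative assumption of how such a tool might work, not the implementation of any real product; `check_picture`, `review`, and the fingerprint values are all hypothetical names.

```python
# Sketch of the "play list" approach for multimedia checks.
# Items are captured during the run; the tester is only asked about
# items that were not already approved in an earlier run.

approved = {}    # fingerprint -> True once a tester has passed the item
play_list = []   # (fingerprint, question) pairs captured during the run

def check_picture(fingerprint, question):
    """Test action: defer the visual judgment to the tester after the run."""
    play_list.append((fingerprint, question))

def review(answers):
    """After the run, resolve each captured item.

    answers maps fingerprints to the tester's yes/no for new items."""
    results = []
    for fingerprint, question in play_list:
        if approved.get(fingerprint):
            results.append((question, "pass (previously approved)"))
        elif answers.get(fingerprint):
            approved[fingerprint] = True
            results.append((question, "pass"))
        else:
            results.append((question, "fail"))
    return results

check_picture("hash-areas-blue", "Are the matching areas blue?")
print(review({"hash-areas-blue": True}))   # first run: tester answers yes
```

On the next run, an unchanged item carries the same fingerprint, so `review` passes it without asking the tester again.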
9/11/2014
43
© 2014 LogiGear
Performance Testing
• The topic is complex, but creating the tests can be quite straightforward:
  − actions like "generate load <how much>" and "check response time <max wait>"
  − use one tool to generate load (like JMeter), another to run the "normal" functional test
• Often performance testing isn't testing, but closer to research:
  − analyzing bottlenecks and hot spots (for example, discontinuities in response times mean buffers are full)
  − applying statistical techniques like queuing theory
  − realistically mimicking large-scale production situations in smaller test environments
• The three controls you can/should address:
  − hardware (equipment, infrastructure, data centers, etc.)
  − software (programs, database models, settings, etc.)
  − demands (a 1-second response may cost 10 times more than 2 seconds)

See also: "Load Testing for Dummies", Scott Barber, gomez.com
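The two performance actions named above can be sketched as ordinary keywords. This is a stand-in to show the shape of the actions only: real load generation would delegate to a tool like JMeter, and the function names and parameters here are illustrative assumptions.

```python
# Sketch of keyword-style performance actions: "generate load" and
# "check response time". The load generator is a placeholder; in practice
# this action would trigger an external tool such as JMeter.

import time

def generate_load(how_much):
    # Placeholder: start <how_much> virtual users via an external load tool.
    print(f"generating load: {how_much} virtual users")

def check_response_time(operation, max_wait_seconds):
    """Time one operation of the functional test against a maximum."""
    start = time.perf_counter()
    operation()
    elapsed = time.perf_counter() - start
    print("pass" if elapsed <= max_wait_seconds
          else f"fail: {elapsed:.3f}s > {max_wait_seconds}s")

generate_load(100)
check_response_time(lambda: time.sleep(0.01), max_wait_seconds=2.0)
```

This keeps the performance check readable in the test module, while the heavy lifting (load profiles, ramp-up, reporting) stays in the specialized tool.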
© 2014 LogiGear
Organization
• Much of the success is gained or lost in how you organize the process:
  − who is part of the teams
  − who does test design
  − who does automation
  − what to outsource, what to keep in-house
• Write a plan of approach for the test development and automation:
  − scope, assumptions, risks, planning
  − methods, best practices
  − tools, technologies, architecture
  − stakeholders, including roles and processes for input and approvals
  − team
  − . . .
• Assemble the right resources:
  − testers, lead testers
  − automation engineer(s)
  − managers, ambassadors, ...

Test design is a skill . . . Automation is a skill . . . Management is a skill . . .
. . . and those skills are different . . .
9/11/2014
44
© 2014 LogiGear
Product Life Cycles
• Product life cycles, rather than task life cycles
• The project planning and execution largely determine when the products are created

[Diagram: overlapping life cycles for system development, test development, and test automation]
© 2014 LogiGear
Typical Time Allocation
[Chart: effort over time for test development and for automation]
9/11/2014
45
© 2014 LogiGear
Keywords and ABT in Agile
• Keywords are suitable for agile projects:
  − tests are easier to create and understand, in particular for non-programmers
  − they allow test development without a need for details that haven't been defined yet
  − automated tests can quickly follow changes in the system under test
• Action Based Testing in itself is quite agile:
  − focused on products and cooperation
  − flexible in process; in fact, each test module can have its own process
  − test modules are usually very suitable to drive system development
• However, ABT relies on high-level test design for best results:
  − identifying test modules
  − in larger-scale projects this may require at least some overall test planning activities that are not necessarily easy to do in a single scrum team
© 2014 LogiGear
ABT in Agile
[Diagram: in the agile life cycle, the product backlog (user stories, documentation, domain understanding, acceptance criteria, PO questions, situations, relations) feeds test module definition (product owner, optional) and test module development (product owner & team); interface definition and action automation (team) support test execution and the sprint products; tests and automation are re-used across sprints; test development covers main-level, interaction, and cross-over test modules]
9/11/2014
46
© 2014 LogiGear
Using ABT in Sprints (1)
• Aim for "sprint + zero", meaning: try to get test development and automation "done" in the same sprint, not the next one
  − the next sprint means work clutters up, part of the team is not working on the same sprint, work is done double (manually and automated), ...
• Agree on the approach:
  − questions like: does "done" include tests developed and automated?
  − do we see testing and automation as distinguishable tasks and skill sets?
  − is testability a requirement for the software?
© 2014 LogiGear
Using ABT in Sprints (2)
• Just like for development, use discussions with the team and product owners:
  − deepen understanding for the whole team
  − help identify items like negative, alternate, and unexpected situations
• Start with the main test modules that address the user stories and acceptance criteria:
  − try to keep the main test modules at a similar level as those stories and criteria
  − test modules can double as a modeling device for the sprint
• Plan for additional test modules:
  − low-level testing of the interaction with the system under test (like UIs)
  − crossing over to other parts of the system under test
9/11/2014
47
© 2014 LogiGear
Using ABT in Sprints (3)
• To discuss the approach, consider daily "sit down" meetings with some or all members to coach and evaluate:
  − an end-of-day counterpart to the early-morning "stand up" meetings
  − short and friendly; not about progress and impediments, but about practices and experiences with them (like "what actions did you use?")
  − a few meetings may suffice
• Create good starting conditions for a sprint:
  − automation technology available (like hooks, calling functions, etc.)
  − how to deal with data and environments
  − understanding of subject matter, testing, automation, etc.
• Tip: do interface mapping by hand, using developer-provided identifications:
  − saves time by not having to use the viewer or other spy tools
  − recording of actions (not tests) will go better
© 2014 LogiGear
Summary
• Keywords are one of the techniques for automated testing, in addition to record & playback and scripting
• By themselves keywords are not a silver bullet; a good approach, careful planning, and good organization are needed to be successful
• Keywords can work for GUI testing, but equally well for a variety of other purposes
9/11/2014
48
© 2014 LogiGear
Some References

1. Testing Computer Software, Cem Kaner, Hung Nguyen, Jack Falk, Wiley
2. Lessons Learned in Software Testing, Cem Kaner, James Bach, Bret Pettichord, Wiley
3. Experiences of Test Automation, Dorothy Graham, Mark Fewster, Addison Wesley, 2012
4. Automating Software Testing, Dorothy Graham, Mark Fewster, Addison Wesley
5. Action Based Testing (overview article), Hans Buwalda, Better Software, March 2011
6. Action Figures (on model-based testing), Hans Buwalda, Better Software, March 2003
7. Integrated Test Design & Automation, Hans Buwalda, Dennis Janssen, Iris Pinkster, Addison Wesley
8. Soap Opera Testing (article), Hans Buwalda, Better Software Magazine, February 2005
9. Testing with Action Words, Abandoning Record and Playback, Hans Buwalda, Eurostar 1996
10. QA All Stars, Building Your Dream Team, Hans Buwalda, Better Software, September 2006
11. The 5% Solutions, Hans Buwalda, Software Test & Performance Magazine, September 2006
12. Happy About Global Software Test Automation, Hung Nguyen, Michael Hackett, et al., Happy About
13. Misconceptions About Test Automation, Hans Buwalda, LogiGear Magazine, April 29, 2013