
Hexawise Software Test Design Tool - "Vendor Meets User" at CAST Software Testing Conference 2011 - with speaker notes


DESCRIPTION

This presentation was given by Justin Hunter, Lanette Creamer, and Ajay Balamurugadas at CAST 2011. The focus of the presentation is using pairwise testing methods, as well as other, more sophisticated Design of Experiments-based software test design methods. The description of the presentation on the CAST site is:

"Vendor Meets User: The Hexawise Test Design Tool and a Tester who Tried to Use It in Real Life. Justin Hunter, Lanette Creamer, and Ajay Balamurugadas.

Dr. William G. Hunter helped manufacturers create small numbers of prototypes that were each carefully designed to reveal as much actionable information as possible. He did this using Design of Experiments methods that he taught as a professor of Applied Statistics. Five years ago, while working at Accenture, Hunter's son Justin began to apply some of these Design of Experiments-based methods to the software testing field. After seeing promising results from 17 pilot projects he helped manage at Accenture, Justin created Hexawise, a software test design tool that generates tests using Design of Experiments-based methods. Justin will introduce the tool. But this is not the typical vendor talk. Testers Lanette Creamer and Ajay Balamurugadas each recently used Hexawise for the first time on a real project. They will share their experiences, covering both where it helped and where they experienced limitations of the tool and the test design technique.

Justin Hunter, Founder and CEO of Hexawise, is a test design specialist who has enjoyed teaching testers on six continents how to improve the efficiency and effectiveness of their test case selection approaches. The improbably circuitous career path that led him into the software testing field included working as a securities lawyer based in London and launching Asia's first internet-based stock brokerage firm. The Hexawise test design tool is a web-based tool that is available for free to teams of 5 or fewer testers, as well as to non-profit organizations.

Lanette Creamer: After 10 years at Adobe, including working as a Quality Lead testing across the Creative Suites, Lanette is now a Senior Consultant with Sogeti, currently working as a Test Lead at Starbucks. Lanette has been evangelizing test collaboration and promoting advancement in human test ideas for the past 5 years. With a deep passion for collaboration as a way to increase test coverage, she believes it is a powerful solution when facing complex technical challenges. Lanette has presented at PNSQC, Better Software/Agile Development Practices, Writing About Testing, and STPCon in 2010. She'll be participating at CAST 2011 in her home city of Seattle. She actively participates in the testing community and has written two technical papers and a published article on testing in ST&P Mag January 2010 (now ST&QA magazine)."


Page 1

Vendor Meets User

The Hexawise Test Design Tool and Two Testers Who Tried to Use it in Real Life

Presented at CAST, 2011
Justin Hunter
Lanette Creamer
Ajay Balamurugadas

Page 2

Topics

2

Introduction to Hexawise “Inside the Mind of the Vendor”

Justin Hunter

Experiences: “Inside the Minds of Testers”

Lanette Creamer & Ajay Balamurugadas

Hexawise is a test case design tool used by testers to design their tests. In a context where test scripts are used, Hexawise can design detailed test scripts. In an Exploratory Testing context, Hexawise is used to generate “test ideas” that encourage the tester to explore, and even to design mini tests on the fly.

Page 3

My Dad - William G. Hunter

3

Why I created Hexawise has a whole lot to do with who my father was. He was a leading applied statistician who specialized in how to make experiments more efficient & effective.

Page 4

1960’s

4

In the 1960’s my dad brought my family to Singapore where he taught at a university and worked with local companies.

Page 5

1970’s

5

In the 1970’s my dad brought my family to Nigeria where, again, he taught at a university and worked with local companies. That’s me in the lower right hand corner. We were visiting a factory my father was helping.

Page 6

1980’s

6

In the 1980’s he did something that his colleagues thought was pretty crazy: they assumed his expertise and lessons learned probably wouldn’t transfer into the government sector. While a professor in Madison, Wisconsin, he started collaborating with local and state government agencies.

Page 7

Design of Experiments

7

Why did he uproot our family every few years and then start collaborating with government agencies?

It was because he passionately believed that sharing his expertise in Design of Experiments (a specialized field of applied statistics) would really help people, by giving them skills that would improve both quality and productivity. This is the cover of a book he co-wrote with George Box that has helped to increase awareness of what Design of Experiments is and how practitioners should use it.

Page 8

Design of Experiments

8

So what is Design of Experiments? It’s a specialized field of applied statistics that has been around since the 1930’s. Simply put, it is focused on answering this question.

Page 9

Design of Experiments

9

Where is Design of Experiments used?

It is used extensively in manufacturing, among other industries. If you’re a manufacturer trying to create a widget for a car part, for example, you don’t want to build 100,000 different prototypes of the widget before stumbling across the combination of temperature, pressure, and ingredients that achieves the desired characteristics. You’d want to build a small handful of prototypes, with the variables going into each prototype carefully varied from prototype to prototype, so that you learn as much as possible from as few experiments as possible. That’s what auto manufacturers regularly do.

Page 10

Design of Experiments

10

Where Else is Design of Experiments used?

Design of Experiments-based methods have also been commonly used in agriculture for decades. If you’re Monsanto and you want to grow a hardier seed that will grow in colder temperatures and mature more quickly, you’re going to use Design of Experiments methods to identify the combinations of variables to test together in each test you execute.

Page 11

Design of Experiments

11

Where Else is Design of Experiments used?

Many marketers also use Design of Experiments methods extensively. YouTube recently ran an experiment involving 1,024 different combinations of fonts, colors, messages, and button sizes and shapes to find an optimal combination that increased their sign-up rate by more than 15%. Jeff Fry, a tester at Microsoft, wrote a good article about this and posted a phenomenal video by a Design of Experiments expert who worked at Amazon before moving to Microsoft.

A/B testing is a very simple “watered down” DoE approach. Multi-Variate Testing (MVT) is “full-blown” DoE-based marketing.

Page 12

Design of Experiments

12

What about in Software Testing?

This is the question I’ve been focused on for the last 5 years. Seventeen pilot projects I conducted at my prior company convinced me that DoE-based methods consistently deliver improvements in efficiency and effectiveness over business-as-usual test cases.

Page 13

What is Hexawise?

13

Challenges Hexawise Addresses

Problems During Test Design -> Impact Felt During Test Execution
Manual Documentation -> Delayed Start
Largely Repetitive Tests -> Inefficient
Gaps in Coverage -> Missed Defects

Hexawise was created to address these common testing challenges.

Page 14

Mortgage Application Example

14

Let’s use this simple example to demonstrate how pairwise testing works. I’ve borrowed this idea from a presentation that Bernie Berger gave at StarEast.

Imagine you’re testing a mortgage application that has several sets of details. This is an “executive summary” view of the different options that could be selected for an application. We could make this example more complicated by including hardware and software configuration options, user types, etc. We’re intentionally keeping it simple here.

Page 15

Mortgage Application Example

15

If we just examine one of those three branches, we see that it has 27 possible test combinations associated with it (3 values for each of 3 parameters: 3 x 3 x 3 = 27). For example, one of the 27 possible tests would be:

Income = Low & Credit Rating = Medium & Customer Status = VIP

There are 26 other similar combinations.

Page 16

Mortgage Application Example

16

When we examine all three branches, we see they have equal complexity. Each of the three branches has 27 total possible combinations. How many total combinations are there? Hint: It is not 81.

Page 17

Which Tests Should You Choose?

17

27 x 27 x 27 = 19,683 Possible Tests

There are almost 20,000 possible combinations to choose from.
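The arithmetic is easy to sanity-check in a few lines of Python. Note the assumptions in this minimal sketch: the deck names only the Income / Credit Rating / Customer Status branch, so the other six parameter names below are hypothetical placeholders, and "Regular" is a guessed third Customer Status value.

```python
from math import prod

parameters = {
    "Income": ["Low", "Medium", "High"],
    "Credit Rating": ["Low", "Medium", "High"],
    "Customer Status": ["Regular", "VIP", "Employee"],  # "Regular" is assumed
}
# The other two branches have the same shape: three 3-value parameters each.
for i in range(4, 10):
    parameters[f"Parameter {i}"] = ["A", "B", "C"]  # hypothetical placeholders

total = prod(len(values) for values in parameters.values())
print(total)  # 3**9 = 19,683 possible complete tests
```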

Page 18

Prioritization

18

How many test inputs are needed to trigger defects in production?

1 test input: 51%
2 test inputs (“pairwise”): 33%
3 test inputs: 11%
4, 5, or 6 test inputs: 5%

Sources:
• Medical Devices: D.R. Wallace, D.R. Kuhn, Failure Modes in Medical Device Software: an Analysis of 15 Years of Recall Data, International Journal of Reliability, Quality, and Safety Engineering, Vol. 8, No. 4, 2001.
• Browser, Server: D.R. Kuhn, M.J. Reilly, An Investigation of the Applicability of Design of Experiments to Software Testing, 27th NASA/IEEE Software Engineering Workshop, NASA Goddard SFC, 4-6 December 2002.
• NASA database: D.R. Kuhn, D.R. Wallace, A.J. Gallo, Jr., Software Fault Interactions and Implications for Software Testing, IEEE Trans. on Software Engineering, vol. 30, no. 6, June 2004.
• Network Security: K.Z. Bell, Optimizing Effectiveness and Efficiency of Software Testing: a Hybrid Approach, PhD Dissertation, North Carolina State University, 2006.

In order to prioritize which specific combinations should be selected as high-priority tests from those ~20,000 possible tests, it is extremely important to understand that the vast majority of defects in production can be triggered by just two test inputs tested together (e.g., a test that includes Income = Low as the first test input and also includes Credit Rating = High as the second test input).

This fact has extremely important implications for software testers. Unfortunately, very few software testers are aware of (a) this fact or (b) its implications. The implication is that small sets of tests that ensure every possible PAIR of values gets tested together in at least one test case are extremely efficient and effective at finding defects.
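The deck does not show how Hexawise selects its tests (its algorithms are the vendor's own), but the underlying idea can be sketched in a few dozen lines. This is a minimal greedy all-pairs generator, assuming tests are represented as dicts of parameter name to value; the brute-force candidate scan only works for small models like this one, and real tools search far more cleverly:

```python
from itertools import combinations, product

def pairwise_tests(parameters, is_valid=lambda test: True):
    """Greedy 2-way (all-pairs) test selection sketch.

    parameters: dict of parameter name -> list of values.
    is_valid: optional predicate over a candidate test (dict of
    name -> value); return False to exclude invalid combinations.
    """
    names = list(parameters)
    index_pairs = list(combinations(range(len(names)), 2))

    # Every value pair, for every pair of parameters, that still needs
    # to appear together in at least one selected test.
    uncovered = {
        (i, va, j, vb)
        for i, j in index_pairs
        for va in parameters[names[i]]
        for vb in parameters[names[j]]
    }

    # Enumerate all complete candidate tests (feasible only for small models).
    candidates = [
        c for c in product(*parameters.values())
        if is_valid(dict(zip(names, c)))
    ]

    def gain(c):
        # How many not-yet-covered pairs this candidate would cover.
        return sum((i, c[i], j, c[j]) in uncovered for i, j in index_pairs)

    tests = []
    while uncovered and candidates:
        best = max(candidates, key=gain)
        if gain(best) == 0:
            break  # remaining pairs are unreachable under the constraints
        tests.append(dict(zip(names, best)))
        for i, j in index_pairs:
            uncovered.discard((i, best[i], j, best[j]))
    return tests

# The named mortgage branch ("Regular" is a guessed third value):
params = {
    "Income": ["Low", "Medium", "High"],
    "Credit Rating": ["Low", "Medium", "High"],
    "Customer Status": ["Regular", "VIP", "Employee"],
}
suite = pairwise_tests(params)
print(len(suite))  # typically 9 or 10 tests cover all 27 value pairs
```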

Page 19

Mortgage Application Example

19

Select a couple of pairs of test inputs from this mind map. Possible pairs of inputs could include pairs like those shown below, but select your own two pairs.

First example of a pair of values: Income = Low & Credit Rating = Medium

Second example of a pair of values: Income = High & Customer Status = Employee

Third example: Income = High & Credit Rating = Low

Page 20

20

These are the same test inputs, imported from the mind map into the Hexawise tool. When you click on the “Create Tests” icon at the top of the screen, you will see a pairwise testing solution. Any and all pairs of values you might have selected will be included in a surprisingly small number of tests.

Page 21

21

Only 17 tests are required (out of 19,683 possible tests) to test every single possible pair of test conditions together in the same test case at least once.

In other words, every single pair of test conditions will be tested together at least once. The pair of conditions we selected at random (Income = Low, Credit Rating = Medium) appears in test number 8. All the other pairs are also tested together at least once.

If we have done a thorough job of identifying test inputs, the vast majority of defects will be triggered by these 17 tests out of almost 20,000 possible tests. This is the lesson from Design of Experiments that has been learned and applied in so many other industries since the 1930’s and is now being applied by an increasing number of software testers.
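A claim like this is mechanical to verify. A small checker sketch, reusing the dict-of-name-to-value test representation from the earlier sketch:

```python
from itertools import combinations

def uncovered_pairs(tests, parameters):
    """Return every 2-way value combination that no test exercises."""
    missing = set()
    for a, b in combinations(parameters, 2):
        for va in parameters[a]:
            for vb in parameters[b]:
                if not any(t[a] == va and t[b] == vb for t in tests):
                    missing.add((a, va, b, vb))
    return missing

# For a valid pairwise suite (e.g., the suite from the earlier sketch),
# this returns an empty set:
# uncovered_pairs(suite, params) == set()
```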

Page 22

22

This is the same set of 17 tests, shown again in case the speaker-note callouts on the last slide covered up a pair of values you were trying to confirm was tested together.

Page 23

23

The tests “front-load” coverage. 87% of the pairs of values have already been tested together by the end of the 9th test.

This is simply a coverage chart showing, after each test, what percentage of all possible pairs of test inputs in the System Under Test have been tested together so far.
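That chart can be reproduced mechanically. A minimal sketch that computes, after each test, the percentage of all possible value pairs covered so far (same test representation as the earlier sketches):

```python
from itertools import combinations

def coverage_curve(tests, parameters):
    """Percent of all possible value pairs covered after each test."""
    total = sum(
        len(parameters[a]) * len(parameters[b])
        for a, b in combinations(parameters, 2)
    )
    covered, curve = set(), []
    for t in tests:
        for a, b in combinations(parameters, 2):
            covered.add((a, t[a], b, t[b]))
        curve.append(round(100 * len(covered) / total, 1))
    return curve

# A well-ordered pairwise suite climbs steeply at first, matching the
# "87% by the 9th test" shape described above.
```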

Page 24

User Experiences

24

Lanette Creamer’s Experiences

Now let’s hear from Lanette and Ajay...

Page 25

HEXAWISE

First used on June 08, 2009

Page 26

FIRST STEPS

Interested to try new software
Aware of allpairs.exe
Problem statement:

Multiple printers
Printer-specific charts: Chart 1 & Chart 2
Other settings

Sl. No   Printer   Chart 1     Chart 2     Settings
1        ABC       16 strips   64 strips   Borderless
15       LMN       64 strips   -------     Auto cut
45       XYZ       -------     16 strips   -------
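This model maps directly onto the representation used in the earlier sketches. Below is a hypothetical reconstruction; the deck shows only a few sample rows, so the value lists are assumptions:

```python
# Hypothetical reconstruction of the printer model; "-------" stands in
# for "not applicable," as in the sample rows above.
printer_model = {
    "Printer": ["ABC", "LMN", "XYZ"],
    "Chart 1": ["16 strips", "64 strips", "-------"],
    "Chart 2": ["16 strips", "64 strips", "-------"],
    "Settings": ["Borderless", "Auto cut", "-------"],
}
# suite = pairwise_tests(printer_model)  # greedy sketch from slide 18's notes
```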

Page 27

TESTED WITH ALLPAIRS

Excel to Notepad to Excel
Very useful when all pairs are valid
Unable to mention invalid pairs
Steps to be repeated based on cases
Maintained a common repository

Page 28

HEXAWISE

Easy to use
One user account, accessible anytime
Can specify invalid pairs
Multiple strength cases

Page 29

HEXAWISE WISH LIST

Desktop version – useful without internet too

Able to define invalid pairs after the cases are generated

Easy method to define invalid pairs (one simple approach is sketched after this list)

Need to try project sharing & excel import
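For what it’s worth, the greedy sketch from slide 18’s notes already accepts a validity predicate, which is one simple way to express invalid pairs. A hypothetical example using the printer model above:

```python
def is_valid(test):
    # Hypothetical invalid pair: printer ABC has no borderless mode.
    return not (test["Printer"] == "ABC" and test["Settings"] == "Borderless")

# suite = pairwise_tests(printer_model, is_valid=is_valid)
```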

Page 30

Disclaimer: Thinking Req’d

30


This is a photo I saw Lanette use in earlier presentations. It is absolutely spot on in this context. Designing DoE-based software tests is not a paint-by-numbers approach. You need to use your critical thinking skills. Without using them, there will be a garbage in / garbage out problem.

Page 31

Disclaimer - Imperfect Models

31

When you use a Design of Experiments-based test design tool, you effectively create a model that will generate your tests. Whenever you do so, there will be parts of the System Under Test that you will miss. Perhaps (probably?) you will miss important parts.

Page 32

Disclaimer - Which Inputs?

32


When creating DoE-based software tests, you will face the same kinds of test design considerations you always have... as well as new, DoE-specific considerations.

Page 33

“The Truffle Pig Problem”

33


Design of Experiments-based test design methods face a “truffle pig problem.” If software bugs were like leaves on your lawn that you wanted to get rid of, DoE-based test design methods would be much more popular than they are now. They would be the equivalent of a leaf blower: you’d be able to instantly see your productivity increase.

Unfortunately, bugs are not visible like leaves. They’re hiding, unseen, like truffles. In my experience, DoE-based test design methods are like an especially thorough and efficient truffle pig. The problem, of course, is that if someone gave you a super truffle pig that was twice as good at finding truffles as your regular truffle pig, you would probably have a hard time assessing how good it was. DoE-based test design methods face this same challenge.

Page 34

How Can You Know?

34


Here’s the best approach I’ve come up with to answer the question of “how can you know DoE-based test design methods are better than manually-selected test cases?”

Page 35

“Let’s test this hypothesis.”

35

Cereal Box

Toyota: entering the truck market in the U.S.

Even though we can complete a meaningful “bake-off” pilot project within just a couple of man-days of effort, this is the typical reaction I get from test teams to whom I propose this approach! (brief video of office mates diving under desks, hiding under plants, etc.)

It is amazing how quickly people tend to run and hide when they are given the opportunity to learn something that could fundamentally change their software testing effectiveness.

The irony is that teams will say “we’re too busy to execute a one or two day pilot project now.” Hello? In my experience, the findings from the pilot, on average, more than double the number of defects found per tester hour. The entire point of learning about Design of Experiments-based test design techniques, like pairwise, 3-way, and orthogonal array-based (OA) testing, is to improve your efficiency and effectiveness so you can get much more done with fewer resources. Saying “I’m too busy to learn how to do that” is... shortsighted is probably the most diplomatic word.

Page 36

Results: Less Test Design Time

36

Same: System Under Test, Test Ideas, Time
Different Test Design Approach: identify tests manually vs. generate tests using a Design of Experiments-based tool
Different Results: Time to Design Tests, Combinatorial Coverage, No. of Bugs Found, Time to Execute Tests

Time to Design Tests: ~30-40% less time (b/c many test generation steps are automated)

In my experience, teams that have agreed to pilot projects have seen these results. It takes far less time to generate tests using this approach because many steps in the test case selection process and test case documentation process get automated.

Page 37

Results: Better Coverage

37

Same: System Under Test, Test Ideas, Time
Different Test Design Approach: identify tests manually vs. generate tests using a Design of Experiments-based tool
Different Results: Time to Design Tests, Combinatorial Coverage, No. of Bugs Found, Time to Execute Tests

In my experience, it is easy to show that combinatorial coverage (e.g., how many pairs of values, how many triples of values, etc., are tested together) is far superior with this approach. In this actual example from a couple of months ago, we showed that 51 business as usual tests that were put together manually left more than 1,400 pairs of values untested.

A skeptic will probably say... “OK. Interesting, but what does that translate to in terms of actual defects found?”

Page 38

Results: More Bugs Found

38

Same: System Under Test, Test Ideas, Time
Different Test Design Approach: identify tests manually vs. generate tests using a Design of Experiments-based tool
Different Results: Time to Design Tests, Combinatorial Coverage, No. of Bugs Found, Time to Execute Tests

No. of Bugs Found will depend upon: (1) the System Under Test, (2) test designer skill, and (3) the coverage strength of the DoE-based tests.

My experience from dozens of projects: 2-way DoE-based tests consistently find more.

In my experience, 2-way (or pairwise) tests - using the same test ideas as used in business as usual tests - have consistently found more defects than the business as usual tests. This is true even when the business as usual tests are far higher in number than the pairwise set of tests.

If you used 3-way or 4-way sets of tests, the number of defects found by this Design of Experiments-based test design approach would be far higher than found using the business as usual approach.

Page 39

Results: ~2x Bugs / Hour

39

Same: System Under Test, Test Ideas, Time
Different Test Design Approach: identify tests manually vs. generate tests using a Design of Experiments-based tool
Different Results: Time to Design Tests, Combinatorial Coverage, No. of Bugs Found, Time to Execute Tests

Bugs found per hour will depend upon: (1) the System Under Test, (2) test designer skill, and (3) the coverage strength of the DoE-based tests.

My experience from dozens of projects: 2-way DoE-based tests consistently find MANY more bugs per hour (often double).

The number of defects is higher using Hexawise. The number of tests executed is lower using Hexawise. On average, in the dozens of pilot projects I have seen, the number of defects found per tester hour is often double the number of defects found per tester hour from business as usual sets of tests.

Page 40

How Can You Know?

40

I would strongly encourage you to try a simple one or two day pilot project. In fact, I’ll help you do it if you agree to publish the results (whether good or bad).

Page 41

Additional Information

41


James Bach - Pairwise Testing: A Best Practice that Isn’t

We’ve barely scratched the surface on the topic of what Design of Experiments-based test design is and how you could get started using it. Here are some good sources to find out more about it and how you can get started using it. I am happy to talk to you about it if you have any questions.

Page 42

Questions?

42


Thank you all for your time. Any questions?

Page 43

43

Appendix Slides

Page 44

Select Your Thoroughness Goal

Testing for every pair of input values is just a start. The test designer can generate plans with very different levels of testing thoroughness.

The 2-way test cases Hexawise generates have been consistently shown to be more thorough than standard test cases created by most test teams at Fortune 500 firms. Even so, Hexawise allows users to “turn up the coverage dial” dramatically and generate other, extraordinarily thorough, sets of tests. In this case, we see Hexawise can generate test set solutions for this simple insurance ratings engine example ranging in size from 28 test cases (for users who prioritize speed to market) all the way up to 3,925 test cases (for users who desire extremely thorough testing).

44
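To see why test counts climb so quickly as the coverage dial is turned up, it helps to count how many combinations a strength-t plan must cover. A minimal Python sketch, using a hypothetical 10-parameter, 4-values-each model (the deck does not list the insurance example's actual parameters):

```python
from itertools import combinations
from math import prod

def n_way_targets(parameters, t):
    """Count the t-way value combinations a strength-t plan must cover."""
    sizes = [len(values) for values in parameters.values()]
    return sum(prod(group) for group in combinations(sizes, t))

# Hypothetical model: 10 parameters, 4 values each.
model = {f"P{i}": ["a", "b", "c", "d"] for i in range(10)}
for t in (2, 3, 4):
    print(t, n_way_targets(model, t))
# t=2: C(10,2) * 4**2 = 720 targets
# t=3: C(10,3) * 4**3 = 7,680 targets
# t=4: C(10,4) * 4**4 = 53,760 targets
```

A single test covers only one value combination for each group of t parameters, so higher-strength plans necessarily need many more tests.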

Page 45

How Much is Enough Testing?

The “Analyze Coverage” screen shows you how much coverage is achieved at each point in the set of tests. In other words, what percentage of the targeted combinations have been tested for after each test?

This chart gives teams the ability to make fact-based decisions about “how much testing is enough?” Here, for example, 83% of the pairs of test inputs entered into this plan have been tested together after only 12 tests (out of 295,000 possible tests).

45

Page 46

Better Than Hand-Selected Tests

If you take a close look at any set of Hexawise-generated test cases you will notice that there is an enormous amount of variation from test case to test case (and the smallest amount of repetition that is mathematically possible to achieve).

In contrast, if you were to translate your existing manually-selected test cases into a similar format and analyze them, you would find that the manually-selected test cases have far more repeated test combinations and far less variation from test case to test case. This is a big part of the reason why Hexawise generates dramatic efficiency improvements.

In addition, if you were to graph the percent of the targeted 2-way combinations achieved by your existing manually-selected test cases, you would find that there are many pairs of test inputs that were never covered by your tests. The fact that Hexawise will ensure every pair of test inputs gets tested in at least one test case is a big part of the reason why Hexawise-generated tests result in superior coverage and more defects found during test execution.

46

Page 47

What is DoE-based testing?

Definition: Design of Experiments-based testing is a test design approach used to identify a small subset of tests (from many possible ones) in order to find as many defects as possible in as few tests as possible.

Why it Works

Test conditions are constructed to ensure:

• No combinations of conditions get accidentally omitted

• Unproductive repetition is minimized

“AKA”

“Design of Experiments-based testing” covers several closely-related subjects:

• Pairwise / AllPairs

• Orthogonal Array / OA / OATs

• 2-way, 3-way, ... t-way (generalized in the sketch below)

47
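The “2-way, 3-way, ... t-way” idea generalizes the pairwise coverage check directly. A minimal sketch, again assuming tests are represented as dicts of parameter name to value:

```python
from itertools import combinations, product

def t_way_covered(tests, parameters, t):
    """True if every t-way combination of parameter values appears in a test."""
    names = list(parameters)
    for group in combinations(names, t):
        for values in product(*(parameters[name] for name in group)):
            if not any(
                all(test[name] == value for name, value in zip(group, values))
                for test in tests
            ):
                return False
    return True

# t=2 is the pairwise check; t=3 and higher are the stronger coverage levels.
```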

Page 48

Software Testing Challenges

• Software applications are very complex; it is impossible to test every possibility

• Extraordinarily smart, pragmatically-oriented applied statisticians created the field of “Design of Experiments” to solve exactly this challenge; for the last 40+ years they have developed highly effective math-based covering array techniques and similar strategies which are now broadly used in many areas including manufacturing, advertising, and agriculture

• These proven Design of Experiments techniques, which are designed to find out as much information as possible in as few test cases as possible, also have direct applicability to the software testing field

• Unfortunately, the vast majority of software testers in the relatively young field of software testing have never heard of any Design of Experiments concepts like MFAT vs. OFAT, Orthogonal Array coverage, pairwise coverage, or even the existence of the “Design of Experiments” field

• Instead of using 40+ years of Design of Experiments-based knowledge to design tests that are as effective as possible, testers almost always manually select the combinations of test conditions they use in their tests, and as a result...

48

Page 49

Results without DoE / Hexawise

... the results from manual test case selection efforts are consistently far from optimal:

Wasteful repetition
Missed combinations

49

Page 50

Results with DoE / Hexawise

In contrast, Hexawise algorithms use Design of Experiments-based methods to generate tests. The result is that Hexawise-generated tests consistently find more defects in fewer tests. Hexawise-generated tests pack more coverage into each test.

50