VOLUME 6 • ISSUE 7 • JULY 2009 • $8.95 • www.stpcollaborative.com

THE GREAT AGILE DEBATE | STATE OF THE TEST INDUSTRY | WS-I INTEROPERABILITY TESTING | AGILE TESTING IN PRACTICE | PROGRAM REQUIREMENTS RIGMAROLE





contents

6 • Software Test & Performance

12 MAIN FEATURE

The Four Pillars of Agile
What's most important when implementing agile testing in your organization? Learn from a self-made vertical solution provider who brought agile methods to his own company.
By Robert Walsh

18 The Great Agile Debate
Boon to mankind or over-hyped fad? Read this point-counterpoint and decide for yourself.
By Rex Black and Bob Galen

24 The Test Industry's Big Tent
The test industry has no shortage of characters, nor accompanying points of view about how to test, train and advance careers. The industry's notables weigh in on the state of software testing.
By David Strom

30 Coloring Outside the Lines
Will your company's Web services work with others? Learn from a former WS-I working group leader about testing tools and techniques that can help you determine whether interoperability will be an issue for you.
By Chris Ferris

1 James Bach 2 Scott Barber 3 Jon Bach 4 Michael Bolton 5 Rex Black

DEPARTMENTS

8 President's Letter
Welcome to STP Collaborative.

9 Editorial
The test industry comprises diverse philosophies, but most with the same goals in mind.

10 ST&Pedia
The Agile Manifesto flew in the face of development theory. Now its processes are ubiquitous.
By Matt Heusser & Chris McMahon

AVAILABLE ONLINE

Best Practices
Requirements are a rope bridge spanning developers' knowledge and the analyst's understanding of business.
By Joel Shore
stpcollaborative.com/bp

FutureTest
QA pros: Grow into agile for new opportunities.
By Chuck Maples
stpcollaborative.com/ft

ON THE COVER

6 Bob Galen 7 Dan Bartow 8 Robert Walsh 9 David Saff

10 Kent Beck

GET MORE ONLINE AT

www.stpcollaborative.com

A Publication of Redwood Collaborative Media

Software Test & Performance (ISSN 1548-3460) is published monthly by Redwood Collaborative Media, 105 Maxess Avenue, Suite 207, Melville, NY 11747. Periodicals postage paid at Huntington, NY, and additional mailing offices. Software Test & Performance is a registered trademark of Redwood Collaborative Media. All contents copyrighted 2009 Redwood Collaborative Media. All rights reserved. POSTMASTER: Send changes of address to Software Test & Performance, 105 Maxess Road, Suite 207, Melville, NY 11747. To contact Software Test & Performance subscriber services, please send an e-mail to membership-services@stpcollaborative.com or call 631-393-6051 ext. 200.

Testing a product that was due for release yesterday? Learn the secrets of just-in-time testing from long-time consultant and testing guru ROB SABOURIN.
stpcollaborative.com/sabourin

Agile testing: boom or bust? BOB GALEN and REX BLACK duke it out in The Great Agile Debate, a LIVE WEB EVENT.
stpcollaborative.com/eseminars

Join us at STPCON 2009, STP Collaborative's first annual gathering. STPCon is the industry's leading event for software development managers, test/QA professionals and notable luminaries to network and learn. Oct. 19-23, Cambridge, MA.
stpcollaborative.com/conferences

CONTRIBUTORS

ROBERT WALSH is president and manager of application development at EnvisionWare, a provider of library software solutions. He has been working with agile methodologies since 2002.

With more than 25 years of software and systems engineering experience, REX BLACK has written numerous books on software test management. He is president and principal consultant of RBCS.

A Certified Scrum Practitioner (CSP) since 2004, BOB GALEN is the director of product development and agile architect for ChannelAdvisor, and the principal consultant for RGalen Consulting Group.

CHRIS FERRIS is an IBM distinguished engineer and CTO of Industry Standards in IBM's Software Group Standards Strategy organization. He's a former chair of the WS-I Basic Profile Working Group.

DAVID STROM, formerly editor-in-chief at TomsHardware.com, Network Computing Magazine and Digital Landing, has written two books and thousands of articles for trade publications.

Can you identify slow spots, failures and bottlenecks? Learn how with app-performance sleuth SCOTT BARBER.
stpcollaborative.com/barber

Our new resource directory showcases all the LATEST TESTER SOLUTIONS and technologies. Search by category, see demos, download trials, get members-only benefits and much more.
stpcollaborative.com/resources

July 2009

Dear Subscribers

In December 2008, Redwood Collaborative Media purchased the Software Test & Performance division of BZ Media. We had a vision of creating new opportunities for software testers to connect with and learn from test and QA colleagues around the globe through a new collaborative media model. Today we are pleased to realize that vision and unveil the new Software Test & Performance Collaborative.

STP Collaborative will continue to bring you the high-quality news, valuable information and face-to-face networking opportunities you've come to expect from Software Test & Performance magazine and the Software Test & Performance Conference. In addition, we're excited to announce the launch of an all-new Web site, which will serve as the cornerstone of a new membership model that enables community-building, education, training and peer-to-peer knowledge sharing, which are so vital to your profession. We're excited about our new programs, and we hope you will be, too. We invite you to visit www.stpcollaborative.com and join our community today.

Once again, we welcome you to STP Collaborative and invite you to use and comment on our content, contribute to our knowledge base and interact with each other to build a community that will benefit everyone in your growing profession. We look forward to working with you to build a global community of knowledge workers in the software test and QA profession.

Sincerely,

Andy Muns

President, Redwood Collaborative Media

Publisher, Software Test & Performance magazine

JAMES BACH
CEO, Satisfice

SCOTT BARBER
Chief Technologist, PerfTestPlus; Director, AST

KENT BECK
Founder and director, Three Rivers Institute

VLADIMIR BELORUSETS
SQA Manager, Xerox

REX BLACK
Founder and president, RBCS

Introducing the STP Collaborative Strategic Advisory Board


MICHAEL BOLTON
Founder, DevelopSense

ROSS COLLARD
Founder, Collard & Co.

JAN FISH
QA Manager, Phillips Lifeline

ROBERT GALEN
Principal Consultant, RGalen Consulting Group

MATT HEUSSER
Ind. software developer, tester & trainer, Socialtext

DAVID HOLCOMBE
Co-founder, president & CEO, eLearning Guild

ANDREW MUNS
President & CEO, Redwood Collaborative Media

RON MUNS
Chairman, Redwood Collaborative Media

BJ ROLLISON
Test Architect, Microsoft

DAVID STROM
Freelance columnist and speaker

IF I HAD TO DESCRIBE IN A SINGLE word the characters that make up today's cast of test industry thought leaders, one that fits well would be diverse. They're geographically diverse, stretching from Quebec to Kalamazoo, from Boston to Bulverde. Their clients are even more so. Engagements spread across the U.S. and Canada to China, New Zealand, Russia and Sweden, to name just a few.

Then there's the philosophical divide, as you'll discover in David Strom's fine piece in this issue, titled "We're All Part of the Story," in which he looks at the state of the testing industry and its major players. You'll hear from the Bach brothers, James and Jon, Scott Barber, Rex Black, Michael Bolton, Kent Beck and many others. Just to prove that we didn't simply take names from the B section of the tester's phone book, we also have Google's David Saff weighing in.

While opinions and philosophies vary, there's one thing on which they all seem to agree: With the ever-increasing complexity of software paired with ever-shrinking release timetables, there will always be a need to reduce testing time and increase efficiency. Is there agreement on how best to accomplish that? Maybe in cartoons.

Which brings me to the story behind this month's cover illustration concept. Credit Mr. Strom for the caricature idea during a brainstorming meeting with me and Redwood president Andy Muns. I knew that self-effacement would be okay with Jon Bach and his brother James, who recently recounted the story of how the two had received caricatures of themselves at a conference they were both attending. We asked permission anyway, of course, as we did with the remaining players. People in testing are exceedingly good natured, in my experience at least.

Varied opinions are further highlighted in this issue in "The Agile Process Creates Testing Puzzles," a lively point-counterpoint by consultant Rex Black and Scrum Master Bob Galen. Of course, even critics of agile methods acknowledge merits of the practices when executed correctly. After all, what can be bad about putting people over process? Of putting working software over comprehensive documentation?

You might say that agile principles were behind the design of our new Web site, which we're excited to launch this month. It's designed for individuals to interact with each other and with us. The site also features processes, tools and resources for testers, QA professionals and software managers. It's designed with the customer (you) in mind, and looks to add interactivity and collaboration to your favorite testing magazine. And we'll constantly add more features and content to make this new site a rich and valuable online resource and community.

Also beginning this month, the magazine has a fresh new look. Sincere thanks go to art director LuAnn Palazzo, without whose talent and infinite patience this project would not have happened. We hope you'll enjoy the magazine and participate in our new membership community and collaborative Web site. Please join us at www.stpcollaborative.com and tell us what you think.

ed notes

The Test Industry's Cartoonland Village

Edward J. Correia

"While opinions and philosophies vary, there will always be a need to reduce testing time and increase efficiency."


President Andrew Muns

Chairman Ron Muns

105 Maxess Road, Suite 207, Melville, NY 11747
+1-631-393-6051
+1-631-393-6057 fax
www.stpcollaborative.com

Cover Illustration: Caricatures by Steve Nyman

Redwood Collaborative Media

Editor

Edward J. Correia
ecorreia@stpcollaborative.com

Contributing Editors

Joel Shore, Matt Heusser, Chris McMahon

Art Director

LuAnn T. Palazzo
lpalazzo@stpcollaborative.com

Publisher

Andrew Muns
amuns@stpcollaborative.com

Associate Publisher

David Karp
dkarp@stpcollaborative.com

Director of Operations

Kristin Muns
kmuns@stpcollaborative.com

Chief Marketing Officer

Jennifer McClure
jmcclure@stpcollaborative.com

Marketing Coordinator

Teresa Cantwell
tcantwell@stpcollaborative.com

Reprints

Lisa Abelson
abelson@stpcollaborative.com

516-379-7097

Membership/Customer Service

membership-services@stpcollaborative.com

631-393-6051 x 200

Circulation and List Services

Lisa Fiske
lfiske@stpcollaborative.com


IN FEBRUARY OF 2001, 17 TECHNOLOGY professionals gathered for a peer conference in Park City, Utah, and crafted a document called the Agile Manifesto. That document is probably the single most influential statement on software development of the past decade, and its values led to a series of new ideas and concepts in software testing.

Agile Software Methods

XP: Short for Extreme Programming, XP was created by Kent Beck during his work at Chrysler. At the time, projects at the auto maker struggled with heavyweight specifications, specialized roles and enough turnover that projects would simply burn out mid-stream, delivering no working software at all before being canceled. Extreme Programming focuses on delivering software by first developing simple working features and then adding more features incrementally while improving the code constantly. Most XP techniques, such as Pair Programming, Test Driven Development and Refactoring, are now considered engineering practices.

Scrum: Popularized at (now defunct) Easel Corp. by Ken Schwaber and Jeff Sutherland. At Easel, requirements were changing so quickly that the developers could never get anything completely finished. Scrum focuses on stable requirements for each time box, commitments from team members and communication through daily stand-up meetings. Most of the focus of Scrum is on project management.

DSDM: Short for Dynamic Systems Development Method. Developed in the United Kingdom, DSDM advocates notice that many projects have fixed staff and a fixed release date. DSDM therefore focuses on techniques to allow the team to alter the project scope while maintaining high quality, thus hitting the original due date.

Agile Software Processes

Traditional software process generally breaks down work by phase or by type of work: all of the requirements first, then all of the coding, then all of the testing, and so on. Agile processes instead break down work by feature, one thin slice at a time of the final product, which the customer can use to actually and concretely do something new.

Agile teams then batch the work up into time boxes, periodically bringing the code to production quality. Agile teams release production code as often as every few days to as rarely as every few months. Because of that, teams need greater discipline and skill to deal with an increased regression testing burden.

Iteration: Used in Extreme Programming, an iteration is a time box (typically one to three weeks) starting with a requirements exercise and ending with new features at production quality.

Sprint: A short time box used in Scrum that allows the technical staff to create code ready for production without the requirements changing as they work. Technical staff are expected to own the deliverable and release production features at the end of the time box.

User Story: A form of requirement that describes a single, specific, meaningful piece of functionality that has value to the business. User stories are small and can be implemented by a pair of programmers in one or a few days. User stories line up roughly with a single scenario within a use case in traditional software development. A user story is not a specification; it is an example of what the software will allow a person to do.
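Because a user story is an example rather than a specification, it can often be restated directly as executable checks. A minimal sketch in Python (the overdue-fine story, the `fine_for` function and the dollar amounts are all invented for illustration, not taken from any real project):

```python
# Story: "A patron is charged $0.25 per day for an overdue book,
# capped at $5.00."

def fine_for(days_overdue, daily_rate=0.25, cap=5.00):
    """Compute an overdue fine: daily rate times days, capped."""
    return min(days_overdue * daily_rate, cap)

# The story's concrete examples, expressed as executable checks:
assert fine_for(0) == 0.00    # not overdue: no fine
assert fine_for(4) == 1.00    # four days at $0.25
assert fine_for(40) == 5.00   # long overdue: the fine is capped
```

The checks are the shared examples the team talks about; the function is whatever makes them pass.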

Story Card: Stories are often expressed on 3x5 index cards. Agile coach Ron Jeffries has described a story card as a placeholder for a conversation.

Story Tests: A series of tests expressed in plain English but eventually automated. Story tests provide examples for the programmers and testers and describe the minimum amount of functionality for the story to be called done. The goal of story tests is not to convey a comprehensive specification, but to provide examples so that the whole team can build a shared mental model of what is expected.

Story Kickoffs: A kickoff is the final meeting held before developers start writing code. The idea that testers should be involved early in the process has been around for decades. The agile twist is to have testers help as part of the design for the story. During the kickoff, developers, product owners and testers can discuss and ask questions about what the software will do; this can lead to a better description and prevent

ST&Pedia


Agile Testing in Practice

The encyclopedia for software testing professionals

Matt Heusser and Chris McMahon

Matt Heusser and Chris McMahon are career software developers, testers and bloggers. Matt also works at Socialtext, where he performs testing and quality assurance for the company's Web-based collaboration software.

"Good ideas can come from anywhere at any time."

continued on page 36 >

www.seapine.com/stpswift
Satisfy your quality obsession

© 2009 Seapine Software, Inc. All rights reserved.

Swiftcover.com cut their testing time in half with TestTrack Studio and QA Wizard Pro while still providing the quality their customers expect. Seapine's end-to-end Software Quality Assurance (SQA) solutions help you deliver quality products faster. Start with QA Wizard Pro for automated testing and add TestTrack Studio for issue tracking and test case management: integrated quality assurance solutions that together reduce testing time, saving you money and improving customer satisfaction.

Reduce quality assurance costs with automated functional and regression testing

Manage test case development, defect assignments and QA verification with one application

Track which test cases have been automated, schedule script runs and link test results with defects

Learn how to test faster while protecting quality. Visit www.seapine.com/stpswift.

Save time while protecting software quality

"So much of success boils down to time. QA Wizard Pro and TestTrack Studio allow us to be more profitable because we do more in less time." - Test Manager, Swiftcover

TestTrack Pro | TestTrack TCM | TestTrack Studio | Surround SCM | Seapine CM | QA Wizard Pro: Issue Management, Test Case Management, Test Planning & Tracking, Configuration Management, Change Management, Automated Testing

Illustration by Steve Nyman

THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

By Robert Walsh

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING
Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units or modules within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent, because for the developer it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary, because the programmer is speculating about what conditions might exist rather than starting from concrete test cases that highlight the true requirements.

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word interpretation is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed, and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the best known is JUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide

language elements for defining test cases, combining tests into suites, and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.
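In Python's member of the xUnit family, for example, those elements look roughly like this. (The module under test, `normalize_isbn`, is invented for illustration; under TDD, the test method would be written first and seen to fail before the function existed.)

```python
import unittest

def normalize_isbn(raw):
    """Strip hyphens and spaces from an ISBN string."""
    return raw.replace("-", "").replace(" ", "")

class TestNormalizeIsbn(unittest.TestCase):
    # A test case: each method asserts an exit condition.
    def test_strips_hyphens_and_spaces(self):
        self.assertEqual(normalize_isbn("978-0-13 235088-4"),
                         "9780132350884")

# Combining tests into a suite and running them, as a continuous
# build system would on every check-in:
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestNormalizeIsbn)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same shape, with different syntax, appears in JUnit and the other ports the article mentions.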

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
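With Python's standard mock library, for instance, canned data and simulated failures look like this. (The `patron_name` function and its database interface are invented for illustration.)

```python
import unittest
from unittest import mock

def patron_name(db, patron_id):
    """Look up a patron's name; the db object is injected."""
    row = db.fetch_patron(patron_id)
    if row is None:
        raise LookupError("no such patron")
    return row["name"]

class TestPatronName(unittest.TestCase):
    def test_returns_name_from_canned_data(self):
        # A fake database object returning canned data.
        fake_db = mock.Mock()
        fake_db.fetch_patron.return_value = {"name": "Ada"}
        self.assertEqual(patron_name(fake_db, 42), "Ada")
        fake_db.fetch_patron.assert_called_once_with(42)

    def test_simulated_connectivity_failure(self):
        # Simulate a failure that is hard to trigger on a real database.
        fake_db = mock.Mock()
        fake_db.fetch_patron.side_effect = ConnectionError("db down")
        with self.assertRaises(ConnectionError):
            patron_name(fake_db, 42)
```

The fake stands in for the database, so the test controls exactly which data comes back, including the error conditions.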

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING
An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with the help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out of sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained in addition to being executed. Both are

[With automated acceptance testing, the customer writes tests for the key features]

based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While the tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application, and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
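FIT and FitNesse fixtures are typically written in Java; the Python sketch below only illustrates the shape of the idea, with a decision table of inputs and expected outputs and a thin fixture translating between the table and the application. (The `discount` rule and the column names are invented for illustration, not part of FIT's actual API.)

```python
# Application code under test (a hypothetical business rule).
def discount(order_total):
    """Return the discount percentage for an order total."""
    return 5 if order_total >= 100 else 0

# A FIT-style decision table: an input column plus an expected-output
# column, the kind of table a customer could read and edit.
table = [
    {"order_total": 50,  "discount?": 0},
    {"order_total": 100, "discount?": 5},
    {"order_total": 250, "discount?": 5},
]

def run_fixture(rows):
    """Thin fixture: feed each row to the application and compare."""
    return [discount(row["order_total"]) == row["discount?"]
            for row in rows]

assert run_fixture(table) == [True, True, True]
```

The fixture contains no business logic of its own; it only translates between the table and the application, which is why its maintenance cost stays low.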

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING
Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanisms by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile iterations, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team, a cross-functional mixture of programmers, testers, business analysts and stakeholders, collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs. The business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test. All attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.

16 • Software Test & Performance JULY 2009

Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.
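As a toy illustration of the kind of lightweight custom framework such teams roll themselves, here is a Python sketch; the `RegressionSuite` class and the sample checks are invented for this example, not drawn from any real project:

```python
import traceback

class RegressionSuite:
    """A deliberately tiny custom regression harness."""
    def __init__(self):
        self.cases = []        # (name, function) pairs

    def case(self, func):
        """Decorator that registers a test function with the suite."""
        self.cases.append((func.__name__, func))
        return func

    def run(self):
        """Run every registered case; return (passed, failed) counts."""
        passed = failed = 0
        for name, func in self.cases:
            try:
                func()
                passed += 1
            except AssertionError:
                failed += 1
                print(f"FAIL: {name}\n{traceback.format_exc()}")
        return passed, failed

suite = RegressionSuite()

@suite.case
def login_accepts_known_user():
    assert "alice".isalnum()       # stand-in for a real application check

@suite.case
def totals_are_rounded_to_cents():
    assert round(19.999, 2) == 20.0
```

Calling `suite.run()` executes every registered case and reports the tally; new cases are added incrementally each iteration, which is the scaffolding-first approach described above.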

MANUAL EXPLORATORY TESTING
There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former. For a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in the production environment. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.
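For instance, once an exploratory session turns up a sequence-dependent crash like the A-then-C-then-B example, a small automated regression test can pin the finding down so it never returns. This sketch uses a stand-in `Workflow` class invented for illustration:

```python
class Workflow:
    """Stand-in for the application under test; invented for illustration.

    Mimics a program where performing A, then C, then B once crashed.
    The test below replays that exact sequence and documents the
    now-correct behavior, so a regression would be caught immediately.
    """
    def __init__(self):
        self.history = []

    def perform(self, task):
        if task not in ("A", "B", "C"):
            raise ValueError(f"unknown task: {task}")
        self.history.append(task)
        return "ok"

def test_a_then_c_then_b_does_not_crash():
    wf = Workflow()
    # Replay the exact sequence discovered during exploratory testing.
    for task in ("A", "C", "B"):
        assert wf.perform(task) == "ok"
    assert wf.history == ["A", "C", "B"]
```

The test is cheap to keep because it replays one recorded sequence, which is why keeping a record of actions during exploration matters so much.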

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results.

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-part approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. Framework for Integrated Test. October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case. November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob. October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds. July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860


LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can help you automate your testing fast. Visit www.stpcon.com

observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies,

especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and prop-

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles

Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


erly-done automation works best. What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value and working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance," all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out, for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small, incremental steps.
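As a sketch of that test-first rhythm, here is a minimal Python example; the `parse_version` function is invented for illustration, not drawn from any particular codebase. The test is written before the code it exercises, then just enough production code is written to make it pass:

```python
# Step 1: the test is written first. Run at this point -- before
# parse_version exists -- it fails, which is the TDD starting state.
def test_parse_version_splits_dotted_string():
    assert parse_version("2.7.1") == (2, 7, 1)

# Step 2: just enough production code is written to make it pass.
def parse_version(text):
    """Turn a dotted version string like '2.7.1' into a tuple of ints."""
    return tuple(int(part) for part in text.split("."))

# Step 3: with the test green, the code can be refactored safely,
# because the test now acts as the safety net described above.
```

Each small step repeats this cycle, which is how the tests end up doubling as a design aid.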

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, functional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect




work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests on a finely grained story level that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: The agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.



LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23, to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

ciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. "Testing is about providing information to those people so that better-informed decisions about quality are possible."

The future careers for the current crop of test engineers seem to be on the rise,

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple dis-

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization, and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

We're All Part of the Story



the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain.

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and

"Virtualization makes things easier and harder" – Michael Bolton

wwwstpcollaborativecom bull 25

of the Story

for applications such as our TaxCaster [tax estimation tool]generates a server-side request and the interface controlssuch as sliders need to respond quickly to give them a desk-top feel

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers," Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical to support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees. "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.


Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language) define the services for client tools.

• SOAP (Simple Object Access Protocol), a message format to invoke a service.

• XML (Extensible Markup Language), the lingua franca of the Internet.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
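To ground these definitions, here is a minimal sketch in Python (standard library only) of the kind of SOAP 1.1 envelope that travels over the wire when a service operation is invoked. The `getQuote` operation, `symbol` parameter and `example.com` namespace are hypothetical illustrations, not anything defined by WS-I or this article.

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params, service_ns):
    """Build a minimal SOAP 1.1 envelope invoking `operation` with `params`."""
    ET.register_namespace("soap", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    call = ET.SubElement(body, f"{{{service_ns}}}{operation}")
    for name, value in params.items():
        arg = ET.SubElement(call, f"{{{service_ns}}}{name}")
        arg.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical stock-quote service, for illustration only.
msg = build_soap_request("getQuote", {"symbol": "IBM"},
                         "http://example.com/stockquote")
print(msg)
```

The WSDL for such a service would describe the `getQuote` operation and its message parts; the envelope above is what a client tool generates from that description.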

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[Figure 1: WS-I tools in action. The Monitor's Interceptor and Logger sit in the normal message flow between requestor and Web service, capturing SOAP messages into a message log; the Analyzer evaluates the log, together with WSDL, XML schema and UDDI artifacts, against a test assertion document to produce a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems; the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[Figure 2: A new phase for testing. The Monitor's Interceptor and Logger capture SOAP/XML messages over HTTP, along with WSDL and XML schema metadata, into a Test Log file; the Analyzer then evaluates the log to produce a Test Report. Source: Web Services Interoperability Organization]

STaMPedia (continued from page 10)

...confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration. Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure in order to make CI straightforward and simple.
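The checkout-build-test cycle just described can be sketched in a few lines of Python. This is an illustrative toy, not any particular CI product; the `svn` and `make` commands are hypothetical placeholders, and the demo at the bottom substitutes a stub runner so nothing real is checked out or built.

```python
import subprocess
import time

# One CI cycle: checkout, then build, then automated test suites.
# These commands are hypothetical placeholders for a real project's steps.
STEPS = [
    ["svn", "checkout", "http://example.com/repo", "work"],
    ["make", "-C", "work"],
    ["make", "-C", "work", "test"],
]

def run_ci_cycle(steps, runner=subprocess.call):
    """Run one checkout-build-test cycle; True if every step exits 0."""
    return all(runner(step) == 0 for step in steps)

def ci_loop(steps, runner, cycles, interval_seconds=0):
    """'Constantly' re-run the cycle (bounded here so the sketch terminates)."""
    results = []
    for _ in range(cycles):
        results.append(run_ci_cycle(steps, runner))
        time.sleep(interval_seconds)  # or trigger on every check-in instead
    return results

# Demonstrate two passing cycles with a stub runner instead of real svn/make:
print(ci_loop(STEPS, runner=lambda step: 0, cycles=2))
```

A real CI server wraps this same loop with check-in triggers, build history and notifications.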

Balanced Breakfast. The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques, instead of one or two exclusively.

Exploratory Testing. Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing. If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD. Test Driven Development. A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
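A minimal sketch of that rhythm in Python's unittest, using a hypothetical `leap_year` function invented for illustration: the test class is written first and would fail (red) until the implementation below it exists (green).

```python
import unittest

# Step 1 (red): the expectations are written as code before the implementation.
# `leap_year` is a hypothetical example function, not from the article.
class TestLeapYear(unittest.TestCase):
    def test_divisible_by_four_is_leap(self):
        self.assertTrue(leap_year(1996))

    def test_century_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_four_hundred_is_leap(self):
        self.assertTrue(leap_year(2000))

# Step 2 (green): write just enough code to make the tests pass.
def leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```

Running this file with `python -m unittest` before step 2 existed would report failures; with the implementation in place, all three tests pass.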

ATDD. Acceptance Test Driven Development. A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor. A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD. Behavior Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations," and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
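One way that plain-English flavor might look, sketched as toy Python rather than a real BDD framework; the `ShoppingCart` domain object and the `expect` helper are invented for illustration, but the expectation style echoes what BDD tools popularized.

```python
# A minimal sketch of BDD-style "expectations" phrased in business terms.
# Illustrative pseudo-framework code, not an actual BDD library.

class ShoppingCart:  # hypothetical domain object
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    @property
    def total(self):
        return sum(price for _, price in self.items)

def expect(actual):
    """Tiny expectation helper so tests read like sentences."""
    class Expectation:
        def to_equal(self, expected):
            assert actual == expected, f"expected {expected}, got {actual}"
    return Expectation()

# The "spec" reads in the language of the business, not of xUnit:
def describe_a_newly_created_cart():
    cart = ShoppingCart()
    # it starts out empty
    expect(cart.total).to_equal(0)
    # it totals the prices of the items the shopper adds
    cart.add("tea", 3)
    cart.add("scone", 2)
    expect(cart.total).to_equal(5)

describe_a_newly_created_cart()
print("all expectations met")
```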

Developer Tests. Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests. These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings. With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.


INDEX TO ADVERTISERS

Advertiser | URL | Page
Hewlett-Packard | www.hp.com/go/alm | 38
Seapine | www.seapine.com/stpswift | 11
Software Test & Performance Collaborative | www.stpcollaborative.com | 3–5
Software Test & Performance Collaborative Vendor Membership | www.stpcollaborative.com | 35
Software Test & Performance eSeminars | www.stpcollaborative.com | 22
STPCon Fall 2009 | www.stpcon.com | 29, 34
Tech Excel | www.techexcel.com | 37




10 ST&Pedia: The Agile Manifesto flew in the face of development theory. Now its processes are ubiquitous. By Matt Heusser & Chris McMahon

AVAILABLE ONLINE

Best Practices: Requirements are a rope bridge spanning developers' knowledge and the analyst's understanding of business. By Joel Shore. stpcollaborative.com/bp

FutureTest: QA pros: Grow into agile for new opportunities. By Chuck Maples. stpcollaborative.com/ft

ON THE COVER

6 Bob Galen, 7 Dan Bartow, 8 Robert Walsh, 9 David Saff, 10 Kent Beck

GET MORE ONLINE AT www.stpcollaborative.com


A Redwood Collaborative Media Publication

Software Test & Performance (ISSN 1548-3460) is published monthly by Redwood Collaborative Media, 105 Maxess Avenue, Suite 207, Melville, NY 11747. Periodicals postage paid at Huntington, NY, and additional mailing offices. Software Test & Performance is a registered trademark of Redwood Collaborative Media. All contents copyrighted 2009 Redwood Collaborative Media. All rights reserved. POSTMASTER: Send changes of address to Software Test & Performance, 105 Maxess Road, Suite 207, Melville, NY 11747. To contact Software Test & Performance subscriber services, please send an e-mail to membership-services@stpcollaborative.com or call 631-393-6051 ext. 200.

Testing a product that was due for release yesterday? Learn the secrets of just-in-time testing from long-time consultant and testing guru ROB SABOURIN. stpcollaborative.com/sabourin

Agile testing: boom or bust? BOB GALEN and REX BLACK duke it out in The Great Agile Debate, a LIVE WEB EVENT. stpcollaborative.com/eseminars

Join us at STPCON 2009, STP Collaborative's first annual gathering. STPCon is the industry's leading event for software development managers, test/QA professionals and notable luminaries to network and learn. Oct. 19–23, Cambridge, MA. stpcollaborative.com/conferences

CONTRIBUTORS

ROBERT WALSH is president and manager of application development at EnvisionWare, a provider of library software solutions. He has been working with agile methodologies since 2002.

With more than 25 years of software and systems engineering experience, REX BLACK has written numerous books on software test management. He is president and principal consultant of RBCS.

A Certified Scrum Practitioner (CSP) since 2004, BOB GALEN is the director of product development and agile architect for ChannelAdvisor, and the principal consultant for RGalen Consulting Group.

CHRIS FERRIS is an IBM distinguished engineer and CTO of Industry Standards in IBM's Software Group Standards Strategy organization. He's a former chair of the WS-I Basic Profile Working Group.

DAVID STROM, formerly editor-in-chief at Tom's Hardware.com, Network Computing Magazine and Digital Landing, has written two books and thousands of articles for trade publications.

Can you identify slow spots, failures and bottlenecks? Learn how with app-performance sleuth SCOTT BARBER. stpcollaborative.com/barber

Our new resource directory showcases all the LATEST TESTER SOLUTIONS and technologies. Search by category, see demos, download trials, get members-only benefits and much more. stpcollaborative.com/resources

July 2009

Dear Subscribers,

In December 2008, Redwood Collaborative Media purchased the Software Test & Performance division of BZ Media. We had a vision of creating new opportunities for software testers to connect with and learn from test and QA colleagues around the globe through a new collaborative media model. Today we are pleased to realize that vision and unveil the new Software Test & Performance Collaborative.

STP Collaborative will continue to bring you the high-quality news, valuable information and face-to-face networking opportunities you've come to expect from Software Test & Performance magazine and the Software Test & Performance Conference. In addition, we're excited to announce the launch of an all-new Web site, which will serve as the cornerstone of a new membership model that enables community-building, education, training and peer-to-peer knowledge sharing, which are so vital to your profession. We're excited about our new programs, and we hope you will be, too. We invite you to visit www.stpcollaborative.com and join our community today.

Once again, we welcome you to STP Collaborative and invite you to use and comment on our content, contribute to our knowledge base and interact with each other to build a community that will benefit everyone in your growing profession. We look forward to working with you to build a global community of knowledge workers in the software test and QA profession.

Sincerely,

Andy Muns
President, Redwood Collaborative Media
Publisher, Software Test & Performance magazine

Introducing the STP Collaborative Strategic Advisory Board

JAMES BACH, CEO, Satisfice
SCOTT BARBER, Chief Technologist, PerfTestPlus; Director, AST
KENT BECK, Founder and director, Three Rivers Institute
VLADIMIR BELORUSETS, SQA Manager, Xerox
REX BLACK, Founder and president, RBCS
MICHAEL BOLTON, Founder, DevelopSense
ROSS COLLARD, Founder, Collard & Co.
JAN FISH, QA Manager, Philips Lifeline
ROBERT GALEN, Principal Consultant, RGalen Consulting Group
MATT HEUSSER, Independent software developer, tester & trainer, Socialtext
DAVID HOLCOMBE, Co-founder, president & CEO, eLearning Guild
ANDREW MUNS, President & CEO, Redwood Collaborative Media
RON MUNS, Chairman, Redwood Collaborative Media
BJ ROLLISON, Test Architect, Microsoft
DAVID STROM, Freelance columnist and speaker

IF I HAD TO DESCRIBE IN A SINGLE word the characters that make up today's cast of test industry thought leaders, one that fits well would be diverse. They're geographically diverse, stretching from Quebec to Kalamazoo, from Boston to Bulverde. Their clients are even more so. Engagements spread across the U.S. and Canada to China, New Zealand, Russia and Sweden, to name just a few.

Then there's the philosophical divide, as you'll discover in David Strom's fine piece in this issue, titled "We're All Part of the Story," in which he looks at the state of the testing industry and its major players. You'll hear from the Bach brothers, James and Jon, Scott Barber, Rex Black, Michael Bolton, Kent Beck and many others. Just to prove that we didn't simply take names from the B section of the tester's phone book, we also have Google's David Saff weighing in.

While opinions and philosophies vary, there's one thing on which they all seem to agree: With the ever-increasing complexity of software, paired with ever-shrinking release timetables, there will always be a need to reduce testing time and increase efficiency. Is there agreement on how best to accomplish that? Maybe in cartoons.

Which brings me to the story behind this month's cover illustration concept. Credit Mr. Strom for the caricature idea during a brainstorming meeting with me and Redwood president Andy Muns. I knew that self-effacement would be okay with Jon Bach and his brother James, who recently recounted the story of how the two had received caricatures of themselves at a conference they were both attending. We asked permission anyway, of course, as we did with the remaining players. People in testing are exceedingly good-natured, in my experience at least.

Varied opinions are further highlighted in this issue in "The Agile Process Creates Testing Puzzles," a lively point-counterpoint by consultant Rex Black and Scrum Master Bob Galen. Of course, even critics of agile methods acknowledge merits of the practices when executed correctly. After all, what can be bad about putting people over process? Of putting working software over comprehensive documentation?

You might say that agile principles were behind the design of our new Web site, which we're excited to launch this month. It's designed for individuals to interact with each other and with us. The site also features processes, tools and resources for testers, QA professionals and software managers. It's designed with the customer (you) in mind and looks to add interactivity and collaboration to your favorite testing magazine. And we'll constantly add more features and content to make this new site a rich and valuable online resource and community.

Also beginning this month, the magazine has a fresh new look. Sincere thanks go to art director LuAnn Palazzo, without whose talent and infinite patience this project would not have happened. We hope you'll enjoy the magazine and participate in our new membership community and collaborative Web site. Please join us at www.stpcollaborative.com and tell us what you think.

Ed Notes

The Test Industry's Cartoonland Village

Edward J. Correia

"While opinions and philosophies vary, there will always be a need to reduce testing time and increase efficiency."

VOLUME 6 • ISSUE 7 • JULY 2009

President: Andrew Muns
Chairman: Ron Muns

105 Maxess Road, Suite 207, Melville, NY 11747
+1-631-393-6051
+1-631-393-6057 fax
www.stpcollaborative.com

Cover Illustration: Caricatures by Steve Nyman

Redwood Collaborative Media

Editor: Edward J. Correia, ecorreia@stpcollaborative.com

Contributing Editors: Joel Shore, Matt Heusser, Chris McMahon

Art Director: LuAnn T. Palazzo, lpalazzo@stpcollaborative.com

Publisher: Andrew Muns, amuns@stpcollaborative.com

Associate Publisher: David Karp, dkarp@stpcollaborative.com

Director of Operations: Kristin Muns, kmuns@stpcollaborative.com

Chief Marketing Officer: Jennifer McClure, jmcclure@stpcollaborative.com

Marketing Coordinator: Teresa Cantwell, tcantwell@stpcollaborative.com

Reprints: Lisa Abelson, abelson@stpcollaborative.com, 516-379-7097

Membership/Customer Service: membership-services@stpcollaborative.com, 631-393-6051 x200

Circulation and List Services: Lisa Fiske, lfiske@stpcollaborative.com


IN FEBRUARY OF 2001, 17 TECHnology professionals gathered for a peer conference in Park City, Utah, and crafted a document called the Agile Manifesto. That document is probably the single most influential statement on software development of the past decade, and its values led to a series of new ideas and concepts in software testing.

Agile Software Methods

XP: Short for Extreme Programming, XP was created by Kent Beck during his work at Chrysler. At the time, projects at the auto maker struggled with heavyweight specifications, specialized roles and enough turnover that projects would simply burn out mid-stream, delivering no working software at all before being canceled. Extreme Programming focuses on delivering software by first developing simple working features and then adding more features incrementally while improving the code constantly. Most XP techniques, such as Pair Programming, Test-Driven Development and Refactoring, are now considered engineering practices.

Scrum: Popularized at (now defunct) Easel Corp. by Ken Schwaber and Jeff Sutherland. At Easel, requirements were changing so quickly that the developers could never get anything completely finished. Scrum focuses on stable requirements for each time box, commitments from team members and communication through daily stand-up meetings. Most of the focus of Scrum is on project management.

DSDM: Short for Dynamic Systems Development Method. Developed in the United Kingdom, DSDM advocates notice that many projects have fixed staff and a fixed release date. DSDM therefore focuses on techniques to allow the team to alter the project scope while maintaining high quality, thus hitting the original due date.

Agile Software Processes

Traditional software process generally breaks down work by phase or by type of work: all of the requirements first, then all of the coding, then all of the testing, and so on. Agile processes instead break down work by feature, one thin slice at a time of the final product, which the customer can use to actually and concretely do something new.

Agile teams then batch the work up into time boxes, periodically bringing the code to production quality. Agile teams release production code as often as every few days to as rarely as every few months. Because of that, teams need greater discipline and skill to deal with an increased regression testing burden.

Iteration: Used in Extreme Programming, an iteration is a time box (typically one to three weeks) starting with a requirements exercise and ending with new features at production quality.

Sprint: A short time box used in Scrum that allows the technical staff to create code ready for production without the requirements changing as they work. Technical staff are expected to own the deliverable and release production features at the end of the time box.

User Story: A form of requirement that describes a single, specific, meaningful piece of functionality that has value to the business. User stories are small and can be implemented by a pair of programmers in one or a few days. User stories line up roughly with a single scenario within a use case in traditional software development. A user story is not a specification; it is an example of what the software will allow a person to do.

Story Card: Stories are often expressed on 3x5 index cards. Agile coach Ron Jeffries has described a story card as a placeholder for a conversation.

Story Tests: A series of tests expressed in plain English but eventually automated. Story tests provide examples for the programmers and testers and describe the minimum amount of functionality for the story to be called done. The goal of story tests is not to convey a comprehensive specification but to provide examples, so that the whole team can build a shared mental model of what is expected.

Story Kickoffs: A kickoff is the final meeting held before developers start writing code. The idea that testers should be involved early in the process has been around for decades. The agile twist is to have testers help as part of the design for the story. During the kickoff, developers, product owners and testers can discuss and ask questions about what the software will do; this can lead to a better description and prevent

ST&Pedia


Agile Testing in Practice

The encyclopedia for software testing professionals

Matt Heusser and Chris McMahon

Matt Heusser and Chris McMahon are career software developers, testers and bloggers. Matt also works at Socialtext, where he performs testing and quality assurance for the company's Web-based collaboration software.

"Good ideas can come from anywhere at any time."

(continued on page 36)

Save time while protecting software quality

Satisfy your quality obsession. Swiftcover.com cut their testing time in half with TestTrack Studio and QA Wizard Pro, while still providing the quality their customers expect. Seapine's end-to-end Software Quality Assurance (SQA) solutions help you deliver quality products faster. Start with QA Wizard Pro for automated testing and add TestTrack Studio for issue tracking and test case management: integrated quality assurance solutions that together reduce testing time, saving you money and improving customer satisfaction.

• Reduce quality assurance costs with automated functional and regression testing
• Manage test case development, defect assignments and QA verification with one application
• Track which test cases have been automated, schedule script runs and link test results with defects

"So much of success boils down to time. QA Wizard Pro and TestTrack Studio allow us to be more profitable because we do more in less time." (Test Manager, Swiftcover)

Learn how to test faster while protecting quality. Visit www.seapine.com/stpswift

© 2009 Seapine Software, Inc. All rights reserved.


necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING

Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units or modules within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent because, for the developer, it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary, because the programmer is speculating about what conditions might exist rather than starting from concrete test cases that highlight the true requirements.
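That test-first rhythm can be sketched in miniature, here in Python's unittest; the renewal rule below is a hypothetical example, not drawn from the article:

```python
import unittest

# Test-Driven Development in miniature. The renewal rule is hypothetical,
# invented for illustration.

# Step 1: the failing test, written before any implementation exists.
class RenewalTest(unittest.TestCase):
    def test_item_can_be_renewed_below_the_limit(self):
        self.assertTrue(may_renew(times_renewed=1, limit=2))

    def test_renewal_stops_at_the_limit(self):
        self.assertFalse(may_renew(times_renewed=2, limit=2))

# Step 2: just enough code to satisfy the tests -- no speculative branches
# for conditions no test has asked for.
def may_renew(times_renewed, limit):
    return times_renewed < limit

# Run the suite without exiting the interpreter.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(RenewalTest))
```

Because the tests existed first, the implementation contains nothing a test did not demand.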

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word interpretation is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is jUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide

By Robert Walsh

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits

THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.


language elements for defining test cases, combining tests into suites and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how, to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
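A sketch of the fake-database idea, using Python's standard unittest.mock; the PatronService class and its fetch_patron call are hypothetical, invented for illustration:

```python
import unittest
from unittest import mock

# Hypothetical module under test: formats a patron's name from a row
# fetched out of a database.
class PatronService:
    def __init__(self, db):
        self.db = db  # a real database object in production, a fake here

    def display_name(self, patron_id):
        row = self.db.fetch_patron(patron_id)
        if row is None:
            return "Unknown patron"
        return f"{row['last']}, {row['first']}"

class PatronServiceTest(unittest.TestCase):
    def test_canned_data_from_a_fake_database(self):
        fake_db = mock.Mock()
        fake_db.fetch_patron.return_value = {"first": "Ada", "last": "Lovelace"}
        self.assertEqual(PatronService(fake_db).display_name(7), "Lovelace, Ada")

    def test_simulated_connectivity_failure(self):
        fake_db = mock.Mock()
        fake_db.fetch_patron.side_effect = ConnectionError("database unreachable")
        with self.assertRaises(ConnectionError):
            PatronService(fake_db).display_name(7)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(PatronServiceTest))
```

The fake returns canned data in one test and simulates a connectivity failure in the other, both without a real database anywhere in sight.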

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING

An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out of sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained, in addition to being executed. Both are

[With automated acceptance testing, the customer writes tests for the key features]

based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
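The table-plus-fixture split can be sketched in a few lines of Python, in the spirit of FIT though not using the real framework; the discount function and its membership rates are hypothetical, invented for illustration:

```python
# A toy, table-driven acceptance test in the spirit of FIT/FitNesse.
def discount(member_type, total):
    """Hypothetical application code under test."""
    rate = {"student": 0.10, "senior": 0.15}.get(member_type, 0.0)
    return round(total * (1 - rate), 2)

# The customer-readable table: each row is inputs plus the expected output.
TABLE = [
    # member_type, total,  expected
    ("student",    100.00, 90.00),
    ("senior",     100.00, 85.00),
    ("regular",    100.00, 100.00),
]

def run_fixture(table):
    """Thin fixture: feed each row to the application and compare outputs."""
    return [(member_type, discount(member_type, total) == expected)
            for member_type, total, expected in table]

results = run_fixture(TABLE)
# [("student", True), ("senior", True), ("regular", True)]
```

The table stays meaningful to the customer; only the thin run_fixture glue belongs to the developers.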

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING

Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanism by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile iterations, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team, a cross-functional mixture of programmers, testers, business analysts and stakeholders, collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs. The business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing

regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test. All attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.

Because automated regression

[Automated regression tests should cover as much of the application as possible and should use mocks sparingly]

16 • Software Test & Performance • JULY 2009

tests may serve a variety of purposes, there is no standard approach to implementing them. Often a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.
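A sketch of the kind of home-grown harness such teams build might look like the following Python fragment; the decorator name and sample tests here are invented for illustration, not taken from any real framework:

```python
# Minimal sketch of a home-grown regression harness of the kind teams
# build in Python, Ruby or Perl. All names are illustrative.
registry = []

def regression_test(fn):
    """Decorator: register a function as a regression test."""
    registry.append(fn)
    return fn

def run_all():
    """Run every registered test; return (passed, failed) counts."""
    passed = failed = 0
    for fn in registry:
        try:
            fn()
            passed += 1
        except AssertionError:
            failed += 1
    return passed, failed

@regression_test
def order_total_is_preserved():
    # Behavior established in an earlier iteration must keep working.
    assert sum([19, 23, 8]) == 50

@regression_test
def empty_order_totals_zero():
    assert sum([]) == 0
```

With scaffolding like this in place from the first iterations, each sprint adds new `@regression_test` functions incrementally while `run_all()` reconfirms prior behavior.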

MANUAL EXPLORATORY TESTING
There are two main types of defects: those in which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.
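The scenario above can be sketched in a few lines of Python; the App class and its hidden order-dependent crash are invented purely to illustrate why a scripted test of the expected sequence passes while only exhaustive (or exploratory) probing finds the A-then-C-then-B failure:

```python
import itertools

class App:
    """Illustrative application with a hidden order-dependent defect."""
    def __init__(self):
        self.done = []
    def perform(self, task):
        self.done.append(task)
        # The invented bug: doing A, then C, then B crashes the program.
        if self.done == ["A", "C", "B"]:
            raise RuntimeError("crash")

def run_sequence(order):
    app = App()
    for task in order:
        app.perform(task)

# The automated test for the expected sequence passes silently...
run_sequence(["A", "B", "C"])

# ...but only trying every ordering reveals the crash.
def crashing_orders():
    bad = []
    for order in itertools.permutations("ABC"):
        try:
            run_sequence(order)
        except RuntimeError:
            bad.append("".join(order))
    return bad
```

Enumerating all six orderings is feasible for three tasks, but the combinations explode as tasks multiply, which is exactly why a human explorer remains valuable.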

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in the production environment. These might be long series of tasks, odd combinations of events or prolonged use of the program. Often these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken; this is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.
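One possible way to combine the time box with the record-keeping described above is a small session recorder; this Python sketch is hypothetical (the class and method names are not from any real tool), but it shows how timestamped action logs make a session replayable:

```python
import time

class ExploratorySession:
    """Hypothetical sketch: records a time-boxed exploratory session so
    any problem found can be recreated from the logged actions."""
    def __init__(self, charter, minutes=60):
        self.charter = charter  # the section of the program to explore
        self.deadline = time.time() + minutes * 60
        self.log = []

    def record(self, action):
        # Refuse new actions once the time box has expired.
        if time.time() > self.deadline:
            raise RuntimeError("time box expired; end the session")
        self.log.append((time.time(), action))

    def replay_script(self):
        """Ordered action list, suitable for seeding a regression test."""
        return [action for _, action in self.log]

session = ExploratorySession("checkout flow", minutes=90)
session.record("add item to cart")
session.record("apply same coupon twice")
session.record("remove item; coupon remains applied (bug?)")
```

When a flaw surfaces, `replay_script()` hands the automation team the exact steps to encode as a new regression test.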

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

• Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

• Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

• Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

• Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-part approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860

[Manual exploratory testing is more than wandering randomly through the application; the tester must start with a plan]

LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com

observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed, or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate ongoing change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing coupled with reactive techniques like exploratory testing and prop-

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles
Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the riskiest areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, when things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen
Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


erly-done automation works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element have also proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint. There should be no work that is left out, for example, appropriate test automation development or maintenance of test "systems." It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge, and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/Watin are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, func-


[I expect testers to pair with developers to improve practices, and developers to deliver solid unit tests before they can be considered 'done']

tional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring: the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.
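The refactoring safety net can be illustrated with a minimal Python example; both implementations and the checks are invented for the purpose, but they show how the same unit tests run unchanged before and after a redesign:

```python
import statistics

def median_original(values):
    """First-pass implementation, written early in the project."""
    ordered = sorted(values)
    n = len(ordered)
    if n % 2 == 1:
        return ordered[n // 2]
    return (ordered[n // 2 - 1] + ordered[n // 2]) / 2

def median_refactored(values):
    """Redesigned implementation; behavior must not change."""
    return statistics.median(values)

def check(median):
    """The unit tests are the safety net: both versions must pass."""
    assert median([3, 1, 2]) == 2
    assert median([4, 1, 3, 2]) == 2.5

check(median_original)
check(median_refactored)
```

Because `check` exercises behavior rather than implementation details, the team can swap the function's internals and get an immediate pass/fail verdict on the refactoring.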

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.
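Assuming, purely as a back-of-envelope illustration, that the two stages miss defects independently, combining the figures quoted above yields roughly 90 percent effectiveness, which is why both are needed:

```python
# Rough combined-effectiveness estimate from the figures in the text,
# under the simplifying assumption that unit and system testing
# miss defects independently.
unit_effectiveness = 0.30    # Jones: unit testing ~25-30% effective
system_effectiveness = 0.85  # RBCS: good independent system testing ~85%

# Combined miss rate is the product of the individual miss rates.
combined = 1 - (1 - unit_effectiveness) * (1 - system_effectiveness)
print(f"combined effectiveness: {combined:.1%}")  # combined effectiveness: 89.5%
```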

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it or [they] deliver cursory attempts]


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals, and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirements document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests at a finely grained story level that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean that a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying: "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team: "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it, with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23, to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

ciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. The software tester job isn't something that shows up on the US Census. "We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise,

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple dis-

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

We're All Part of the Story

"We are a new profession; most of our first programmers are still alive." – Scott Barber

"A skilled tester is the product of a variety of experiences." – Jon Bach


the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available, and serve as internal tool developers and implementation consultants. No team, he says, can survive without high competence in each domain.

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people still take latency for granted: "Many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem."

Bartow agrees: "Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and

"Virtualization makes things easier, and harder." – Michael Bolton



for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex, with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow Cloud computing technology is goingto reshape production validation testing and challenge the way wedesign our performance engi-neering strategies By harness-ing the power of the cloud andworking with our partnerSOASTAcoms CloudTest wewere able to spin up hundredsof servers and generate anenormous amount of loadagainst our infrastructure tosimulate real tax users onTurboTax Online Thanks tothe cloud we are now able torealistically test in productionwith true concurrency

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison, "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing: "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks." – David Saff

Black agrees: "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here, between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training," because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases "agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees: "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


Testing for Conformance With Web Services Interoperability Specifications
By Chris Ferris

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and even the best Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools
• SOAP (Simple Object Access Protocol), a message format to invoke a service
• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
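To make the SOAP bullet above concrete, here is a minimal sketch of what one of those messages looks like on the wire and how its machine-readable structure can be picked apart. The `getQuote` operation, its namespace and the `symbol` parameter are invented for illustration; they belong to no real service.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# A minimal SOAP 1.1 request for a hypothetical getQuote operation.
# The service namespace and operation name are illustrative only.
REQUEST = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getQuote xmlns="http://example.com/stockquote">
      <symbol>IBM</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

def parse_operation(envelope_xml):
    """Return (operation_name, parameters) found in a SOAP envelope's Body."""
    root = ET.fromstring(envelope_xml)
    body = root.find(f"{{{SOAP_NS}}}Body")
    op = list(body)[0]                       # first child of Body names the operation
    name = op.tag.split("}")[-1]             # strip the namespace qualifier
    params = {child.tag.split("}")[-1]: child.text for child in op}
    return name, params

name, params = parse_operation(REQUEST)
print(name, params)   # getQuote {'symbol': 'IBM'}
```

The envelope/body nesting and the namespace URI are what the standards fix; everything inside the body is the service's own contract, which is exactly why a description language (WSDL) is needed on top.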

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
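The Monitor/Analyzer split described above can be illustrated with a toy sketch. This is emphatically not the WS-I tooling itself, just the shape of the man-in-the-middle idea: a monitor that relays each request unchanged while logging the exchange, and an analyzer that later replays the log against checks. The echo service and the single check are invented for the example.

```python
# Toy sketch of a man-in-the-middle monitor and an offline analyzer --
# not the WS-I tools themselves, just the pattern they embody.

def make_monitor(service, log):
    """Wrap a service callable so every request/response pair is logged."""
    def monitored(request):
        response = service(request)         # pass the message through unmodified
        log.append((request, response))     # capture the exchange for later analysis
        return response
    return monitored

def analyze(log, checks):
    """Replay each named check against every logged exchange; report failures."""
    failures = []
    for i, (request, response) in enumerate(log):
        for name, check in checks.items():
            if not check(request, response):
                failures.append((i, name))
    return failures

# Hypothetical service and check, for illustration only.
def echo_service(request):
    return "<soap:Envelope>" + request + "</soap:Envelope>"

log = []
service = make_monitor(echo_service, log)
service("<ping/>")

checks = {"response-is-enveloped":
          lambda req, resp: resp.startswith("<soap:Envelope>")}
print(analyze(log, checks))   # [] -- no failures
```

The point of the non-intrusive design is visible even at this scale: neither the requestor nor the service needs to change for the exchange to be captured and judged after the fact.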

[FIG. 1: WS-I TOOLS IN ACTION. The Monitor's Interceptor sits in the normal message flow between the requestor and the Web service, and its Logger writes the SOAP messages to a message log, guided by a monitor configuration file. The Analyzer, guided by its own configuration file and the test assertion document, evaluates the message log along with the WSDL, XML schema and UDDI artifacts, and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
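The shape of that idea, assertions with identifiers and testable predicates, applied to a captured artifact to yield a pass/fail report, can be sketched in a few lines. Real WS-I test assertion documents are XML and far richer; the assertion ids and predicates below are invented stand-ins.

```python
# Simplified sketch of applying test assertions to a captured message.
# The ids (TA0001, TA0002) and predicates are invented for illustration;
# real WS-I test assertion documents are XML and far more detailed.

ASSERTIONS = [
    ("TA0001", "message uses the SOAP envelope namespace",
     lambda msg: "http://schemas.xmlsoap.org/soap/envelope/" in msg),
    ("TA0002", "message declares itself as XML",
     lambda msg: msg.lstrip().startswith("<?xml")),
]

def conformance_report(message):
    """Return a list of (id, description, passed) for each assertion."""
    return [(tid, desc, bool(pred(message))) for tid, desc, pred in ASSERTIONS]

msg = '<?xml version="1.0"?><e xmlns="http://schemas.xmlsoap.org/soap/envelope/"/>'
for tid, desc, ok in conformance_report(msg):
    print(tid, "PASS" if ok else "FAIL", "-", desc)
```

As the article notes, a FAIL here would mean "does not conform to the profile," not "broken": the report's job is to point at the specific requirement that was missed.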

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). The testing approach used for the latest WS-I profiles differs from the one used in the past, although "overall, the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.

[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture the SOAP/XML-over-HTTP messages exchanged between client code and the Web service into a Test Log file, along with metadata such as the WSDL and XML schema; the Analyzer then evaluates the Test Log and produces a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of running a checkout, followed by a build, followed by the running of the automated test suites, constantly. CI can run after every check-in, or after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure in order to make CI straightforward and simple.
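At its heart, that checkout-build-test cycle is a loop that stops at the first failing step. A minimal sketch, with the steps injected as callables so that real commands (subprocess calls to a version-control or build tool) could be substituted; the stand-in steps below are invented:

```python
# Bare-bones sketch of the checkout -> build -> test cycle a CI server runs.
# Steps are injected as (name, callable) pairs returning True on success,
# so real commands can be swapped in; the lambdas below are stand-ins.

def run_build(steps):
    """Run named steps in order; stop and report at the first failure."""
    for name, step in steps:
        if not step():
            return f"FAILED at {name}"
    return "SUCCESS"

steps = [
    ("checkout", lambda: True),   # e.g. a version-control checkout
    ("build",    lambda: True),   # e.g. invoking the project's build tool
    ("test",     lambda: True),   # running the automated test suites
]
print(run_build(steps))   # SUCCESS
```

A real CI server wraps this loop with triggers (every check-in, every hour, or continuously) and with reporting, which is exactly the infrastructure the tools mentioned above provide.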

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
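The TDD cycle described in the two entries above can be shown in miniature. This sketch uses Python's xUnit-style `unittest` module (the glossary prescribes no particular language), and the function under test, `word_count`, is an invented example: the tests were conceived first and would fail (red) until the implementation below made them pass (green).

```python
# Red-green in miniature: the tests below express the expectations first;
# word_count is the minimal implementation written afterward to pass them.
# The example function and its tests are invented for illustration.

import unittest

def word_count(text):
    """Count whitespace-separated words. Written after the tests below."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    def test_counts_whitespace_separated_words(self):
        self.assertEqual(word_count("tested code is happy code"), 5)

    def test_empty_string_has_no_words(self):
        self.assertEqual(word_count(""), 0)

if __name__ == "__main__":
    # Run the suite programmatically; in a real project a CI server or
    # test runner would do this on every check-in.
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(WordCountTest)
    unittest.TextTestRunner().run(suite)
```

The refactor step would follow: restructure `word_count` however you like, and the two green tests guard against regressions.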

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
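Much of the BDD shift described above is vocabulary: the same assertions, but named as expectations that read like sentences about behavior. A small sketch, with an invented `Account` example (no BDD framework, just the naming convention):

```python
# BDD is largely a shift in vocabulary: tests become "expectations" whose
# names read as plain-English statements of behavior. The Account class
# and the figures are invented for illustration.

class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

# xUnit style might name this test_deposit; BDD names read as sentences.
def a_new_account_has_a_zero_balance():
    assert Account().balance == 0

def depositing_money_increases_the_balance():
    account = Account()
    account.deposit(100)
    assert account.balance == 100

for expectation in (a_new_account_has_a_zero_balance,
                    depositing_money_increases_the_balance):
    expectation()
    print(expectation.__name__.replace("_", " "), "- ok")
```

BDD frameworks add richer reporting and given/when/then structure on top, but the readable-specification idea is already visible here.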

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.
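The table-format style mentioned above (and the run-underneath-the-GUI approach from the ATDD entry) can be sketched as a table of customer-supplied examples driven directly against a business rule. The shipping-fee rule and its figures are invented for illustration:

```python
# A customer test in table form, run underneath the GUI. The business rule
# (free shipping at $50 or more) and the example figures are invented;
# the point is that the customer can read and maintain the example table.

def shipping_fee(order_total):
    """Business rule under test: orders of $50 or more ship free."""
    return 0 if order_total >= 50 else 5

# order total | expected fee  -- examples supplied by the customer
EXAMPLES = [
    (10, 5),
    (49, 5),
    (50, 0),
    (120, 0),
]

for total, expected in EXAMPLES:
    actual = shipping_fee(total)
    status = "pass" if actual == expected else "FAIL"
    print(f"total={total:>4}  expected={expected}  actual={actual}  {status}")
```

Tools in this space let the customer edit the example rows without touching the harness code, which is what makes the test meaningful to them.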

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.


Page 3: THE GREAT AGILE DEBATE STATE OF THE TEST INDUSTRY WS-I ... · The Four Pillars of Agile What's most important when implementing agile testing in your organization? Learn from a self-made

Contents
VOLUME 6 • ISSUE 7 • JULY 2009

12 MAIN FEATURE
The Four Pillars of Agile
What's most important when implementing agile testing in your organization? Learn from a self-made vertical solution provider who brought agile methods to his own company.
By Robert Walsh

18 The Great Agile Debate
Boon to mankind or over-hyped fad? Read this point-counterpoint and decide for yourself.
By Rex Black and Bob Galen

24 The Test Industry's Big Tent
The test industry has no shortage of characters, nor accompanying points of view about how to test, train and advance careers. The industry's notables weigh in on the state of software testing.
By David Strom

30 Coloring Outside the Lines
Will your company's Web services work with others? Learn from a former WS-I working group leader about testing tools and techniques that can help you determine whether interoperability will be an issue for you.
By Chris Ferris

1 James Bach  2 Scott Barber  3 Jon Bach  4 Michael Bolton  5 Rex Black

DEPARTMENTS

8 President's Letter
Welcome to STP Collaborative.

9 Editorial
The test industry comprises diverse philosophies, but most with the same goals in mind.

10 ST&Pedia
The Agile Manifesto flew in the face of development theory. Now its processes are ubiquitous.
By Matt Heusser & Chris McMahon

AVAILABLE ONLINE

Best Practices
Requirements are a rope bridge spanning developers' knowledge and the analyst's understanding of business.
By Joel Shore
stpcollaborative.com/bp

FutureTest
QA pros: Grow into agile for new opportunities.
By Chuck Maples
stpcollaborative.com/ft

ON THE COVER

6 Bob Galen 7 Dan Bartow 8 Robert Walsh 9 David Saff

10 Kent Beck

GET MORE ONLINE AT www.stpcollaborative.com

A Redwood Collaborative Media Publication

Software Test & Performance (ISSN 1548-3460) is published monthly by Redwood Collaborative Media, 105 Maxess Avenue, Suite 207, Melville, NY 11747. Periodicals postage paid at Huntington, NY, and additional mailing offices. Software Test & Performance is a registered trademark of Redwood Collaborative Media. All contents copyrighted 2009 Redwood Collaborative Media. All rights reserved. POSTMASTER: Send changes of address to Software Test & Performance, 105 Maxess Road, Suite 207, Melville, NY 11747. To contact Software Test & Performance subscriber services, please send an e-mail to membership-services@stpcollaborative.com or call 631-393-6051 ext. 200.

Testing a product that was due for release yesterday? Learn the secrets of just-in-time testing from long-time consultant and testing guru ROB SABOURIN.
stpcollaborative.com/sabourin

Agile testing: boom or bust? BOB GALEN and REX BLACK duke it out in The Great Agile Debate, a LIVE WEB EVENT.
stpcollaborative.com/eseminars

Join us at STPCON 2009, STP Collaborative's first annual gathering. STPCon is the industry's leading event for software development managers, test/QA professionals and notable luminaries to network and learn. Oct. 19-23, Cambridge, MA.
stpcollaborative.com/conferences

CONTRIBUTORS

ROBERT WALSH is president and manager of application development at EnvisionWare, a provider of library software solutions. He has been working with agile methodologies since 2002.

With more than 25 years of software and systems engineering experience, REX BLACK has written numerous books on software test management. He is president and principal consultant of RBCS.

A Certified Scrum Master Practicing (CSP) since 2004, BOB GALEN is the director of product development and agile architect for ChannelAdvisor, and the principal consultant for RGalen Consulting Group.

CHRIS FERRIS is an IBM distinguished engineer and CTO of Industry Standards in IBM's Software Group Standards Strategy organization. He's a former chair of the WS-I Basic Profile Working Group.

DAVID STROM, formerly editor-in-chief at Tom's Hardware.com, Network Computing Magazine and Digital Landing, has written two books and thousands of articles for trade publications.

Can you identify slow spots, failures and bottlenecks? Learn how with app-performance sleuth SCOTT BARBER.
stpcollaborative.com/barber

Our new resource directory showcases all the LATEST TESTER SOLUTIONS and technologies. Search by category, see demos, download trials, get members-only benefits and much more.
stpcollaborative.com/resources

July 2009

Dear Subscribers

In December 2008, Redwood Collaborative Media purchased the Software Test & Performance division of BZ Media. We had a vision of creating new opportunities for software testers to connect with and learn from test and QA colleagues around the globe through a new collaborative media model. Today we are pleased to realize that vision and unveil the new Software Test & Performance Collaborative.

STP Collaborative will continue to bring you the high-quality news, valuable information and face-to-face networking opportunities you've come to expect from Software Test & Performance magazine and the Software Test & Performance Conference. In addition, we're excited to announce the launch of an all-new Web site, which will serve as the cornerstone of a new membership model that enables community-building, education, training and peer-to-peer knowledge sharing, which are so vital to your profession. We're excited about our new programs, and we hope you will be too. We invite you to visit www.stpcollaborative.com and join our community today.

Once again, we welcome you to STP Collaborative and invite you to use and comment on our content, contribute to our knowledge base and interact with each other to build a community that will benefit everyone in your growing profession. We look forward to working with you to build a global community of knowledge workers in the software test and QA profession.

Sincerely,

Andy Muns
President, Redwood Collaborative Media
Publisher, Software Test & Performance magazine

Introducing the STP Collaborative Strategic Advisory Board

JAMES BACH, CEO, Satisfice
SCOTT BARBER, Chief Technologist, PerfTestPlus; Director, AST
KENT BECK, Founder and director, Three Rivers Institute
VLADIMIR BELORUSETS, SQA Manager, Xerox
REX BLACK, Founder and president, RBCS
MICHAEL BOLTON, Founder, DevelopSense
ROSS COLLARD, Founder, Collard & Co.
JAN FISH, QA Manager, Phillips Lifeline
ROBERT GALEN, Principal Consultant, RGalen Consulting Group
MATT HEUSSER, Ind. software developer, tester & trainer, Socialtext
DAVID HOLCOMBE, Co-founder, president & CEO, eLearning Guild
ANDREW MUNS, President & CEO, Redwood Collaborative Media
RON MUNS, Chairman, Redwood Collaborative Media
BJ ROLLISON, Test Architect, Microsoft
DAVID STROM, Freelance columnist and speaker

IF I HAD TO DESCRIBE IN A SINGLE word the characters that make up today's cast of test industry thought leaders, one that fits well would be diverse. They're geographically diverse, stretching from Quebec to Kalamazoo, from Boston to Bulverde. Their clients are even more so. Engagements spread across the U.S. and Canada to China, New Zealand, Russia and Sweden, to name just a few.

Then there's the philosophical divide, as you'll discover in David Strom's fine piece in this issue titled "We're All Part of the Story," in which he looks at the state of the testing industry and its major players. You'll hear from the Bach brothers, James and Jon, Scott Barber, Rex Black, Michael Bolton, Kent Beck and many others. Just to prove that we didn't simply take names from the B section of the tester's phone book, we also have Google's David Saff weighing in.

While opinions and philosophies vary, there's one thing on which they all seem to agree: With the ever-increasing complexity of software, paired with ever-shrinking release timetables, there will always be a need to reduce testing time and increase efficiency. Is there agreement on how best to accomplish that? Maybe in cartoons.

Which brings me to the story behind this month's cover illustration concept. Credit Mr. Strom for the caricature idea during a brainstorming meeting with me and Redwood president Andy Muns. I knew that self-effacement would be okay with Jon Bach and his brother James, who recently recounted the story of how the two had received caricatures of themselves at a conference they were both attending. We asked permission anyway, of course, as we did with the remaining players. People in testing are exceedingly good natured, in my experience at least.

Varied opinions are further highlighted in this issue in "The Agile Process Creates Testing Puzzles," a lively point-counterpoint by consultant Rex Black and Scrum Master Bob Galen. Of course, even critics of agile methods acknowledge merits of the practices when executed correctly. After all, what can be bad about putting people over process? Of putting working software over comprehensive documentation?

You might say that agile principles were behind the design of our new Web site, which we're excited to launch this month. It's designed for individuals to interact with each other and with us. The site also features processes, tools and resources for testers, QA professionals and software managers. It's designed with the customer (you) in mind, and looks to add interactivity and collaboration to your favorite testing magazine. And we'll constantly add more features and content to make this new site a rich and valuable online resource and community.

Also beginning this month, the magazine has a fresh new look. Sincere thanks go to art director LuAnn Palazzo, without whose talent and infinite patience this project would not have happened. We hope you'll enjoy the magazine and participate in our new membership community and collaborative Web site. Please join us at www.stpcollaborative.com and tell us what you think.

Ed Notes

The Test Industry's Cartoonland Village

Edward J. Correia


VOLUME 6 • ISSUE 7 • JULY 2009

President: Andrew Muns

Chairman: Ron Muns

105 Maxess Road, Suite 207, Melville, NY 11747
+1-631-393-6051
+1-631-393-6057 fax
www.stpcollaborative.com

Cover Illustration: Caricatures by Steve Nyman

Redwood Collaborative Media

Editor
Edward J. Correia
ecorreia@stpcollaborative.com

Contributing Editors
Joel Shore, Matt Heusser, Chris McMahon

Art Director
LuAnn T. Palazzo
lpalazzo@stpcollaborative.com

Publisher
Andrew Muns
amuns@stpcollaborative.com

Associate Publisher
David Karp
dkarp@stpcollaborative.com

Director of Operations
Kristin Muns
kmuns@stpcollaborative.com

Chief Marketing Officer
Jennifer McClure
jmcclure@stpcollaborative.com

Marketing Coordinator
Teresa Cantwell
tcantwell@stpcollaborative.com

Reprints
Lisa Abelson
abelson@stpcollaborative.com
516-379-7097

Membership/Customer Service
membership-services@stpcollaborative.com
631-393-6051 x200

Circulation and List Services
Lisa Fiske
lfiske@stpcollaborative.com


IN FEBRUARY OF 2001, 17 technology professionals gathered for a peer conference in Park City, Utah, and crafted a document called the Agile Manifesto. That document is probably the single most influential statement on software development of the past decade, and its values led to a series of new ideas and concepts in software testing.

Agile Software Methods

XP: Short for Extreme Programming, XP was created by Kent Beck during his work at Chrysler. At the time, projects at the automaker struggled with heavyweight specifications, specialized roles and enough turnover that projects would simply burn out mid-stream, delivering no working software at all before being canceled. Extreme Programming focuses on delivering software by first developing simple working features and then adding more features incrementally while improving the code constantly. Most XP techniques, such as Pair Programming, Test-Driven Development and Refactoring, are now considered engineering practices.

Scrum: Popularized at (now defunct) Easel Corp by Ken Schwaber and Jeff Sutherland. At Easel, requirements were changing so quickly that the developers could never get anything completely finished. Scrum focuses on stable requirements for each time box, commitments from team members and communication through daily stand-up meetings. Most of the focus of Scrum is on project management.

DSDM: Short for Dynamic Systems Development Method. Developed in the United Kingdom, DSDM advocates notice that many projects have fixed staff and a fixed release date. DSDM therefore focuses on techniques to allow the team to alter the project scope while maintaining high quality, thus hitting the original due date.

Agile Software Processes

Traditional software process generally breaks down work by phase or by type of work: all of the requirements first, then all of the coding, then all of the testing, and so on. Agile processes instead break down work by feature, one thin slice at a time of the final product, which the customer can use to actually and concretely do something new.

Agile teams then batch the work up into time boxes, periodically bringing the code to production quality. Agile teams release production code as often as every few days to as rarely as every few months. Because of that, teams need greater discipline and skill to deal with an increased regression testing burden.

Iteration: Used in Extreme Programming, an iteration is a time box (typically one to three weeks) starting with a requirements exercise and ending with new features at production quality.

Sprint: A short time box used in Scrum that allows the technical staff to create code ready for production without the requirements changing as they work. Technical staff are expected to own the deliverable and release production features at the end of the time box.

User Story: A form of requirement that describes a single, specific, meaningful piece of functionality that has value to the business. User stories are small and can be implemented by a pair of programmers in one or a few days. User stories line up roughly with a single scenario within a use case in traditional software development. A user story is not a specification; it is an example of what the software will allow a person to do.

Story Card: Stories are often expressed on 3x5 index cards. Agile coach Ron Jeffries has described a story card as "a placeholder for a conversation."

Story Tests: A series of tests expressed in plain English but eventually automated. Story tests provide examples for the programmers and testers and describe the minimum amount of functionality for the story to be called done. The goal of story tests is not to convey a comprehensive specification, but to provide examples so that the whole team can build a shared mental model of what is expected.

Story Kickoffs: A kickoff is the final meeting held before developers start writing code. The idea that testers should be involved early in the process has been around for decades. The agile twist is to have testers help as part of the design for the story. During the kickoff, developers, product owners and testers can discuss and ask questions about what the software will do; this can lead to a better description and prevent

ST&Pedia

Agile Testing in Practice

The encyclopedia for software testing professionals

Matt Heusser and Chris McMahon

Matt Heusser and Chris McMahon are career software developers, testers and bloggers. Matt also works at Socialtext, where he performs testing and quality assurance for the company's Web-based collaboration software.

continued on page 36 >


necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING

Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units, or modules, within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent because, for the developer, it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary because the programmer is speculating about what conditions might exist rather than starting from concrete test cases that highlight the true requirements.
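The test-first rhythm described above can be sketched in a few lines of Python. This is a hypothetical example (the ShoppingCart name and its prices are invented for illustration), written in the bare-assert style; an xUnit-family framework would express the same idea:

```python
# Step 1: write the failing test first. It states the requirement as a
# concrete example, before any production code exists.
def test_total_sums_item_prices():
    cart = ShoppingCart()
    cart.add(3.50)
    cart.add(1.25)
    assert cart.total() == 4.75

# Step 2: write just enough production code to make that one test pass --
# no speculative parameters, no unused branches to maintain later.
class ShoppingCart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

test_total_sums_item_prices()  # the test now passes
```

The test is deliberately written against behavior the code does not yet have; the class then contains only what that one example demands.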

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word "interpretation" is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed, and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

By Robert Walsh

THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is jUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide language elements for defining test cases, combining tests into suites and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.
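As a sketch of those building blocks, here is how Python's unittest module (a port in the jUnit family) defines test cases, combines them into a suite, and produces an overall result a continuous build could act on. The trivial tests are invented for illustration:

```python
import unittest

class ArithmeticTest(unittest.TestCase):
    """A test case class: each test_* method is one test."""

    def test_addition(self):
        self.assertEqual(2 + 2, 4)                      # exit condition: pass/fail

    def test_division(self):
        self.assertAlmostEqual(1 / 3, 0.333, places=3)  # tolerant comparison

# Combine the tests into a suite and run them, as a continuous build might.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ArithmeticTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)

# A CI script would fail the build (exit nonzero) when this is False.
build_ok = result.wasSuccessful()
```

In a real project the runner step is usually handled by the build system itself, which is what gives developers the instantaneous check-in feedback described above.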

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how, to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development, and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.
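The fake-database technique mentioned earlier might look like the following minimal sketch. All names here (FakeDatabase, count_overdue, the library-domain rows) are invented for illustration and not taken from any particular framework:

```python
class FakeDatabase:
    """Stands in for a real database: returns canned rows, and can
    simulate a connectivity failure on demand."""

    def __init__(self, rows, fail=False):
        self._rows = rows
        self._fail = fail

    def fetch_overdue_loans(self):
        if self._fail:
            raise ConnectionError("simulated network outage")
        return list(self._rows)

def count_overdue(db):
    """Module under test: degrades gracefully when the database is down."""
    try:
        return len(db.fetch_overdue_loans())
    except ConnectionError:
        return None  # callers treat None as "unknown"

# Happy path: canned data instead of a production database.
assert count_overdue(FakeDatabase(rows=[("patron1",), ("patron2",)])) == 2

# Failure path: trivial to trigger with the fake, hard with a real database.
assert count_overdue(FakeDatabase(rows=[], fail=True)) is None
```

The second assertion is the payoff: the error condition is exercised deterministically, with no real network or database involved.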

AUTOMATED ACCEPTANCE TESTING

An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out-of-sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained, in addition to being executed. Both are


based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While the tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
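To illustrate the idea only (this is not the actual FIT or FitNesse API; the function, table and domain rule are invented), a table supplies inputs and expected outputs, and a thin fixture translates each row into an application call and compares the results:

```python
def loan_period_days(item_type):
    """Application code under test (hypothetical library-domain rule)."""
    return {"book": 21, "dvd": 7, "magazine": 3}[item_type]

# The "table" a non-technical author might write: a header row, then
# one row per case, pairing an input with its expected output.
table = [
    ("item_type", "loan_period_days?"),
    ("book",      21),
    ("dvd",       7),
    ("magazine",  3),
]

def run_fixture(table):
    """Thin fixture: feed each row to the application, record pass/fail."""
    results = []
    for item_type, expected in table[1:]:  # skip the header row
        actual = loan_period_days(item_type)
        results.append((item_type, actual == expected))
    return results

results = run_fixture(table)
assert all(passed for _, passed in results)
```

The fixture contains no business logic of its own, which is the "thin translation layer" property described above.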

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING

Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanisms by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations, there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile iterations, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team (a cross-functional mixture of programmers, testers, business analysts and stakeholders) collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs. The business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test: all attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
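As a sketch of that point, a simple performance check can live alongside functional regression tests. The operation, workload and one-second budget below are invented for illustration; a real suite would calibrate the threshold to its own environment and hardware:

```python
import time

def search_catalog(records, term):
    """Stand-in for the operation whose speed matters to the business."""
    return [r for r in records if term in r]

def test_search_stays_within_budget():
    records = ["record %d" % i for i in range(50000)]
    start = time.perf_counter()
    hits = search_catalog(records, "4999")
    elapsed = time.perf_counter() - start
    # Functional expectation and performance budget, checked together:
    assert len(hits) > 0
    assert elapsed < 1.0, "search exceeded its 1-second budget"

test_search_stays_within_budget()
```

Running a check like this on every build turns a performance goal into a regression guard: a slowdown fails the suite the same way a lost feature would.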

[Automated regression tests should cover as much of the application as possible and should use mocks sparingly]

16 • Software Test & Performance JULY 2009

Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.
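A custom framework of the kind described need not be elaborate. The sketch below, in Python with hypothetical names (a real harness would add setup/teardown, reporting and tool integration), shows the core idea: a registry that lets testers add regression tests incrementally, iteration by iteration:

```python
REGISTRY = []


def regression_test(func):
    """Register a check so the suite grows incrementally each iteration."""
    REGISTRY.append(func)
    return func


def authenticate(user, password):
    # Stand-in for application behavior established in a prior iteration.
    return user == "alice" and password == "secret"


@regression_test
def login_accepts_valid_user():
    assert authenticate("alice", "secret")


@regression_test
def login_rejects_bad_password():
    assert not authenticate("alice", "wrong")


def run_suite():
    """Run every registered test; return the names of any failures."""
    failures = []
    for test in REGISTRY:
        try:
            test()
        except AssertionError:
            failures.append(test.__name__)
    return failures
```

Each sprint, testers decorate new checks with @regression_test, and the existing suite keeps reconfirming prior-iteration behavior without manual effort.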

MANUAL EXPLORATORY TESTING
There are two main types of defects: those in which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.
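To make that last step concrete, here is a hedged sketch (the Application class is hypothetical, not from the article) of how the earlier A-then-C-then-B discovery might be pinned down as an automated regression test once the defect has been fixed:

```python
class Application:
    """Toy stand-in for the system under test."""

    def __init__(self):
        self.history = []

    def perform(self, task):
        # Before the fix, performing A, then C, then B raised an
        # unhandled error here; the regression test below keeps the
        # fixed behavior from silently returning.
        self.history.append(task)


def test_a_then_c_then_b_does_not_crash():
    app = Application()
    for task in ("A", "C", "B"):
        app.perform(task)  # must not raise
    assert app.history == ["A", "C", "B"]
```

The exploratory session found the bug; the automated test makes sure it stays found.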

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results.

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860


LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing coupled with reactive techniques like exploratory testing and prop-


The Agile Process Creates Testing Puzzles
Which Are Solved When Agile Is Performed Correctly

By Bob Galen

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

erly-done automation works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life-cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints: two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software to gain customer and product quality feedback, and to ultimately deliver high-value and working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out, for example appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, func-



tional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.
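One minimal way to sketch that Sprint-by-Sprint, risk-based selection (the area names, the mapping and the "always rerun" set are invented for illustration, not taken from either author):

```python
# Map product areas to the automated tests that cover them.
TESTS_BY_AREA = {
    "checkout": ["test_cart_total", "test_payment_declined"],
    "search": ["test_basic_query", "test_empty_results"],
    "profile": ["test_edit_address"],
}

# High-risk areas get rerun every Sprint, regardless of what changed.
HIGH_RISK_AREAS = {"checkout"}


def select_tests(changed_areas):
    """Pick the tests touching this Sprint's changes plus standing risks."""
    selected = []
    for area in sorted(set(changed_areas) | HIGH_RISK_AREAS):
        selected.extend(TESTS_BY_AREA.get(area, []))
    return selected
```

The point of the sketch is the shape of the decision, not the data: what changed this Sprint plus what is always risky determines what must run before the increment counts as "done."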

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect




work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just-Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail, they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it, with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.



LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23, to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity and, instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. "Testing is about providing information to those people so that better-informed decisions about quality are possible."

The future careers for the current crop of test engineers seem to be on the rise,

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of the year, in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

[Virtualization makes things easier and harder. – Michael Bolton]

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has made for some new challenges, too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem."

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

David Strom is a freelance reporter and author living in St. Louis.

Having more applications in the cloud increases the difficulty of forecasting overall performance, as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, "by providing a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."



"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks." –David Saff

Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases "agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical and support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about



rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton.

Barber agrees. "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19–23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



At best, Web services testing is complex. And it's vastly different from testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications; exchange information securely across and beyond the enterprise; integrate existing applications in loose couplings that are both scalable and easily extensible; and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language), which define the services for client tools
• SOAP (Simple Object Access Protocol), a message format used to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
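To make these moving parts concrete, here is a minimal sketch of composing a SOAP 1.1 request envelope using only Python's standard library. The GetQuote operation, the urn:example:stocks namespace and the symbol parameter are invented for illustration; a real client would derive them from the service's WSDL.

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(operation: str, namespace: str, params: dict) -> bytes:
    """Build a minimal SOAP 1.1 request envelope for the given operation."""
    ET.register_namespace("soapenv", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    op = ET.SubElement(body, f"{{{namespace}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{namespace}}}{name}")
        child.text = str(value)
    # Serializing with an encoding yields bytes with an XML declaration
    return ET.tostring(envelope, encoding="utf-8")

# A hypothetical "GetQuote" operation in a made-up namespace:
msg = build_request("GetQuote", "urn:example:stocks", {"symbol": "IBM"})
```

In practice the envelope would be POSTed over HTTP with a text/xml Content-Type; the sketch stops at message construction, which is the part the WS-I profiles scrutinize most closely.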

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, which commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[FIG. 1: WS-I TOOLS IN ACTION. The Monitor, composed of an Interceptor and a Logger, sits in the normal message flow between the requestor and the Web service, capturing SOAP messages to a message log. The Analyzer, driven by its configuration file and the test assertion document, evaluates the message log together with the WSDL, XML schema and UDDI artifacts, and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
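As a rough sketch of the idea, not the actual WS-I tooling or its assertion IDs: a test assertion can be thought of as a small predicate over a logged message, and the Analyzer as a loop that applies every assertion to every captured entry and reports deviations. The checks and names below are invented, loosely modeled on Basic Profile requirements such as the use of a text/xml Content-Type for SOAP 1.1 over HTTP.

```python
def assert_content_type(entry):
    """SOAP 1.1 over HTTP is expected to use a text/xml Content-Type."""
    ctype = entry["headers"].get("Content-Type", "")
    return ctype.split(";")[0].strip() == "text/xml"

def assert_no_dtd(entry):
    """A SOAP message must not carry a document type declaration."""
    return "<!DOCTYPE" not in entry["body"]

# Illustrative assertion IDs, not real WS-I identifiers.
ASSERTIONS = {"BP-CT": assert_content_type, "BP-DTD": assert_no_dtd}

def analyze(log):
    """Run every assertion over every logged entry; collect deviations."""
    report = []
    for i, entry in enumerate(log):
        for name, check in ASSERTIONS.items():
            if not check(entry):
                report.append((i, name))
    return report  # an empty report means no deviations were found

log = [
    {"headers": {"Content-Type": "text/xml; charset=utf-8"},
     "body": "<Envelope/>"},
    {"headers": {"Content-Type": "application/json"},
     "body": "<!DOCTYPE x><Envelope/>"},
]
failures = analyze(log)  # only the second entry deviates
```

As with the real tools, a deviation here flags nonconformance with a profile rule, not a judgment that the service itself is broken.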

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems. The tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture SOAP/XML messages over HTTP between client code and the Web service into a Test Log file, along with WSDL and XML schema metadata; the Analyzer then evaluates the Test Log to produce a test report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
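The cycle reduces to a small loop: check out, build, run the suites, repeat. A minimal sketch, with echo commands standing in for a project's real checkout, build and test steps, and no claim to match any particular CI server:

```python
import subprocess
import time

def ci_cycle(steps):
    """Run each named step in order; stop at the first failing step."""
    for name, cmd in steps:
        result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        if result.returncode != 0:
            return (name, result.stdout + result.stderr)  # the build is broken
    return None  # a green build

# Placeholder commands; a real project would run its own VCS/build/test tools.
STEPS = [
    ("checkout", "echo checking out sources"),
    ("build",    "echo compiling"),
    ("test",     "echo running automated suites"),
]

def ci_loop(steps, interval=3600):
    """Repeat forever: hourly, immediately after finishing, or on check-in,
    depending on how the build machine is configured."""
    while True:
        ci_cycle(steps)
        time.sleep(interval)
```

Calling `ci_cycle(STEPS)` once returns `None` when every step succeeds; a real server would record the failing step's output and notify the team.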

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD: Test Driven Development. A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD: Acceptance Test Driven Development. A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD: Behavior Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior, in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
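A sketch of the difference in style, using an invented shopping-cart example: the expectations read as sentences about behavior rather than as method-by-method assertions.

```python
# The code under specification (invented for this example).
class ShoppingCart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    @property
    def total(self):
        return sum(price for _, price in self.items)

# BDD-style expectations: each test name describes a behavior in
# business terms, so the suite reads close to plain English.
def test_a_new_cart_should_be_empty():
    assert ShoppingCart().total == 0

def test_adding_an_item_should_increase_the_total():
    cart = ShoppingCart()
    cart.add("book", 25)
    assert cart.total == 25
```

Dedicated BDD tools go further with given/when/then phrasing, but the naming convention alone already shifts the suite from implementation detail toward expected behavior.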

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.
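A minimal sketch of a table-format customer test, in the spirit of FIT-style tools (the discount rule and fixture are invented): the customer reads and maintains the rows, while a small fixture executes them underneath the GUI.

```python
def discount(order_total):
    """Business rule under test: 10% off orders of $100 or more."""
    return round(order_total * 0.10, 2) if order_total >= 100 else 0.0

# order total | expected discount  -- the part a customer can read and edit
TABLE = [
    (50.0,   0.0),
    (100.0, 10.0),
    (250.0, 25.0),
]

def run_table(table):
    """Execute each row against the business rule, like a FIT fixture;
    return one pass/fail result per row."""
    return [discount(total) == expected for total, expected in table]

results = run_table(TABLE)  # every row passes for this rule
```

Because the table speaks in order totals and dollar amounts rather than method calls, the customer can verify, and extend, the expectations without reading code.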

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere, at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

–Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


(Continued from page 10)



GET MORE ONLINE AT


A Redwood Collaborative Media Publication

Software Test & Performance (ISSN 1548-3460) is published monthly by Redwood Collaborative Media, 105 Maxess Avenue, Suite 207, Melville, NY 11747. Periodicals postage paid at Huntington, NY, and additional mailing offices. Software Test & Performance is a registered trademark of Redwood Collaborative Media. All contents copyrighted 2009 Redwood Collaborative Media. All rights reserved. POSTMASTER: Send changes of address to Software Test & Performance, 105 Maxess Road, Suite 207, Melville, NY 11747. To contact Software Test & Performance subscriber services, please send an e-mail to membership-services@stpcollaborative.com or call 631-393-6051 ext. 200.

Testing a product that was due for release yesterday? Learn the secrets of just-in-time testing from long-time consultant and testing guru ROB SABOURIN. stpcollaborative.com/sabourin

Agile testing: boom or bust? BOB GALEN and REX BLACK duke it out in The Great Agile Debate, a LIVE WEB EVENT. stpcollaborative.com/eseminars

Join us at STPCON 2009, STP Collaborative's first annual gathering. STPCon is the industry's leading event for software development managers, test/QA professionals and notable luminaries to network and learn, Oct. 19–23, Cambridge, MA. stpcollaborative.com/conferences

CONTRIBUTORS

ROBERT WALSH is president and manager of application development at EnvisionWare, a provider of library software solutions. He has been working with agile methodologies since 2002.

With more than 25 years of software and systems engineering experience, REX BLACK has written numerous books on software test management. He is president and principal consultant of RBCS.

A Certified ScrumMaster Practicing (CSP) since 2004, BOB GALEN is the director of product development and agile architect for ChannelAdvisor, and the principal consultant for RGalen Consulting Group.

CHRIS FERRIS is an IBM distinguished engineer and CTO of Industry Standards in IBM's Software Group Standards Strategy organization. He's a former chair of the WS-I Basic Profile Working Group.

DAVID STROM, formerly editor-in-chief at Tom's Hardware.com, Network Computing Magazine and Digital Landing, has written two books and thousands of articles for trade publications.

Can you identify slow spots, failures and bottlenecks? Learn how with app-performance sleuth SCOTT BARBER. stpcollaborative.com/barber

Our new resource directory showcases all the LATEST TESTER SOLUTIONS and technologies. Search by category, see demos, download trials, get members-only benefits and much more. stpcollaborative.com/resources

July 2009

Dear Subscribers,

In December 2008, Redwood Collaborative Media purchased the Software Test & Performance division of BZ Media. We had a vision of creating new opportunities for software testers to connect with and learn from test and QA colleagues around the globe through a new collaborative media model. Today we are pleased to realize that vision and unveil the new Software Test & Performance Collaborative.

STP Collaborative will continue to bring you the high-quality news, valuable information and face-to-face networking opportunities you've come to expect from Software Test & Performance magazine and the Software Test & Performance Conference. In addition, we're excited to announce the launch of an all-new Web site, which will serve as the cornerstone of a new membership model that enables community-building, education, training and peer-to-peer knowledge sharing, which are so vital to your profession. We're excited about our new programs, and we hope you will be too. We invite you to visit www.stpcollaborative.com and join our community today.

Once again, we welcome you to STP Collaborative and invite you to use and comment on our content, contribute to our knowledge base and interact with each other to build a community that will benefit everyone in your growing profession. We look forward to working with you to build a global community of knowledge workers in the software test and QA profession.

Sincerely,

Andy Muns

President, Redwood Collaborative Media

Publisher, Software Test & Performance magazine

Introducing the STP Collaborative Strategic Advisory Board

JAMES BACH, CEO, Satisfice

SCOTT BARBER, Chief Technologist, PerfTestPlus; Director, AST

KENT BECK, Founder and director, Three Rivers Institute

VLADIMIR BELORUSETS, SQA Manager, Xerox

REX BLACK, Founder and president, RBCS

MICHAEL BOLTON, Founder, DevelopSense

ROSS COLLARD, Founder, Collard & Co.

JAN FISH, QA Manager, Phillips Lifeline

ROBERT GALEN, Principal Consultant, RGalen Consulting Group

MATT HEUSSER, Ind. software developer, tester & trainer, Socialtext

DAVID HOLCOMBE, Co-founder, president & CEO, eLearning Guild

ANDREW MUNS, President & CEO, Redwood Collaborative Media

RON MUNS, Chairman, Redwood Collaborative Media

BJ ROLLISON, Test Architect, Microsoft

DAVID STROM, Freelance columnist and speaker

ed notes

The Test Industry's Cartoonland Village

Edward J. Correia

IF I HAD TO DESCRIBE IN A SINGLE word the characters that make up today's cast of test industry thought leaders, one that fits well would be diverse. They're geographically diverse, stretching from Quebec to Kalamazoo, from Boston to Bulverde. Their clients are even more so. Engagements spread across the U.S. and Canada to China, New Zealand, Russia and Sweden, to name just a few.

Then there's the philosophical divide, as you'll discover in David Strom's fine piece in this issue, titled "We're All Part of the Story," in which he looks at the state of the testing industry and its major players. You'll hear from the Bach brothers, James and Jon, Scott Barber, Rex Black, Michael Bolton, Kent Beck and many others. Just to prove that we didn't simply take names from the B section of the tester's phone book, we also have Google's David Saff weighing in.

While opinions and philosophies vary, there's one thing on which they all seem to agree: With the ever-increasing complexity of software, paired with ever-shrinking release timetables, there will always be a need to reduce testing time and increase efficiency. Is there agreement on how best to accomplish that? Maybe in cartoons.

Which brings me to the story behind this month's cover illustration concept. Credit Mr. Strom for the caricature idea during a brainstorming meeting with me and Redwood president Andy Muns. I knew that self-effacement would be okay with Jon Bach and his brother James, who recently recounted the story of how the two had received caricatures of themselves at a conference they were both attending. We asked permission anyway, of course, as we did with the remaining players. People in testing are exceedingly good natured, in my experience at least.

Varied opinions are further highlighted in this issue in "The Agile Process Creates Testing Puzzles," a lively point-counterpoint by consultant Rex Black and Scrum Master Bob Galen. Of course, even critics of agile methods acknowledge merits of the practices when executed correctly. After all, what can be bad about putting people over process? Of putting working software over comprehensive documentation?

You might say that agile principles were behind the design of our new Web site, which we're excited to launch this month. It's designed for individuals to interact with each other and with us. The site also features processes, tools and resources for testers, QA professionals and software managers. It's designed with the customer (you) in mind, and looks to add interactivity and collaboration to your favorite testing magazine. And we'll constantly add more features and content to make this new site a rich and valuable online resource and community.

Also beginning this month, the magazine has a fresh new look. Sincere thanks go to art director LuAnn Palazzo, without whose talent and infinite patience this project would not have happened. We hope you'll enjoy the magazine and participate in our new membership community and collaborative Web site. Please join us at www.stpcollaborative.com and tell us what you think.

VOLUME 6 • ISSUE 7 • JULY 2009

President: Andrew Muns

Chairman: Ron Muns

105 Maxess Road, Suite 207, Melville, NY 11747
+1-631-393-6051 / +1-631-393-6057 fax
www.stpcollaborative.com

Cover Illustration: Caricatures by Steve Nyman

Redwood Collaborative Media

Editor

Edward J. Correia, ecorreia@stpcollaborative.com

Contributing Editors

Joel Shore, Matt Heusser, Chris McMahon

Art Director

LuAnn T. Palazzo, lpalazzo@stpcollaborative.com

Publisher

Andrew Muns, amuns@stpcollaborative.com

Associate Publisher

David Karp, dkarp@stpcollaborative.com

Director of Operations

Kristin Muns, kmuns@stpcollaborative.com

Chief Marketing Officer

Jennifer McClure, jmcclure@stpcollaborative.com

Marketing Coordinator

Teresa Cantwell, tcantwell@stpcollaborative.com

Reprints

Lisa Abelson, abelson@stpcollaborative.com, 516-379-7097

Membership/Customer Service

membership-services@stpcollaborative.com, 631-393-6051 x 200

Circulation and List Services

Lisa Fiske, lfiske@stpcollaborative.com


IN FEBRUARY OF 2001, 17 TECHNOLOGY professionals gathered for a peer conference in Park City, Utah, and crafted a document called the Agile Manifesto. That document is probably the single most influential statement on software development of the past decade, and its values led to a series of new ideas and concepts in software testing.

Agile Software Methods

XP: Short for Extreme Programming, XP was created by Kent Beck during his work at Chrysler. At the time, projects at the auto maker struggled with heavyweight specifications, specialized roles and enough turnover that projects would simply burn out mid-stream, delivering no working software at all before being canceled. Extreme Programming focuses on delivering software by first developing simple working features and then adding more features incrementally while improving the code constantly. Most XP techniques, such as Pair Programming, Test Driven Development and Refactoring, are now considered engineering practices.

Scrum: Popularized at (now defunct) Easel Corp by Ken Schwaber and Jeff Sutherland. At Easel, requirements were changing so quickly that the developers could never get anything completely finished. Scrum focuses on stable requirements for each time box, commitments from team members and communication through daily stand-up meetings. Most of the focus of Scrum is on project management.

DSDM: Short for Dynamic Systems Development Method. Developed in the United Kingdom, DSDM advocates notice that many projects have fixed staff and a fixed release date. DSDM therefore focuses on techniques to allow the team to alter the project scope while maintaining high quality, thus hitting the original due date.

Agile Software Processes

Traditional software process generally breaks down work by phase or by type of work: all of the requirements first, then all of the coding, then all of the testing, and so on. Agile processes instead break down work by feature, one thin slice at a time of the final product, which the customer can use to actually and concretely do something new.

Agile teams then batch the work up into time boxes, periodically bringing the code to production quality. Agile teams release production code as often as every few days to as rarely as every few months. Because of that, teams need greater discipline and skill to deal with an increased regression testing burden.

Iteration: Used in Extreme Programming, an iteration is a time box (typically one to three weeks) starting with a requirements exercise and ending with new features at production quality.

Sprint: A short time box used in Scrum that allows the technical staff to create code ready for production without the requirements changing as they work. Technical staff are expected to own the deliverable and release production features at the end of the time box.

User Story: A form of requirement that describes a single, specific, meaningful piece of functionality that has value to the business. User stories are small and can be implemented by a pair of programmers in one or a few days. User stories line up roughly with a single scenario within a use case in traditional software development. A user story is not a specification; it is an example of what the software will allow a person to do.
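A user story of this sort, and the concrete example it implies, might be sketched as follows (the story, the renewal rule, and the function name are invented for illustration, not taken from the article):

```python
# User story, as it might appear on a 3x5 card:
#   "A librarian can renew a checked-out book so the patron
#    gets another full loan period."

def renew(day_renewed, loan_period_days=21):
    """Hypothetical application code: renewing restarts the loan period."""
    return day_renewed + loan_period_days

# The concrete example behind the story: a book renewed on day 10
# is next due on day 31.
assert renew(10) == 31
```

The story stays small enough for a pair to finish in a day or two, and the example, not a formal specification, is what the team agrees the software will let a person do.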

Story Card: Stories are often expressed on 3x5 index cards. Agile coach Ron Jeffries has described a story card as "a placeholder for a conversation."

Story Tests: A series of tests expressed in plain English but eventually automated. Story tests provide examples for the programmers and testers and describe the minimum amount of functionality for the story to be called done. The goal of story tests is not to convey a comprehensive specification, but to provide examples so that the whole team can build a shared mental model of what is expected.

Story Kickoffs: A kickoff is the final meeting held before developers start writing code. The idea that testers should be involved early in the process has been around for decades. The agile twist is to have testers help as part of the design for the story. During the kickoff, developers, product owners and testers can discuss and ask questions about what the software will do; this can lead to a better description and prevent

stamp pedia

The encyclopedia for software testing professionals

Agile Testing in Practice

Matt Heusser and Chris McMahon

Matt Heusser and Chris McMahon are career software developers, testers and bloggers. Matt also works at Socialtext, where he performs testing and quality assurance for the company's Web-based collaboration software.

"Good ideas can come from anywhere, at any time."

continued on page 36 >


THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

By Robert Walsh

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING

Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units, or modules, within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent because, for the developer, it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary, because the programmer is speculating about what conditions might exist rather than starting from concrete test cases that highlight the true requirements.

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word interpretation is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed, and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is jUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide

language elements for defining test cases, combining tests into suites, and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.
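A minimal sketch of such a test, written in Python's built-in unittest (one of the many xUnit ports); the overdue-fine module and its rules are invented for illustration. Under TDD, the tests exist and fail before the function is written:

```python
import unittest

# Module under test (hypothetical): written only after the tests below
# failed, with just enough code to make them pass.
def overdue_fine(days_late, rate_per_day=0.25, cap=10.00):
    """Return the fine for a late library item, never exceeding the cap."""
    if days_late <= 0:
        return 0.0
    return min(days_late * rate_per_day, cap)

class OverdueFineTest(unittest.TestCase):
    def test_on_time_item_has_no_fine(self):
        self.assertEqual(overdue_fine(0), 0.0)

    def test_fine_accrues_per_day(self):
        self.assertEqual(overdue_fine(4), 1.00)

    def test_fine_is_capped(self):
        self.assertEqual(overdue_fine(365), 10.00)
```

Run with `python -m unittest` on every build, a failure here flags the breaking check-in immediately.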

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how, to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
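A fake database object of this kind might be sketched as follows (the class, method and field names are invented); it hands back canned rows and can simulate a connectivity failure that would be hard to trigger against a real database:

```python
class FakeCatalogDb:
    """Stands in for the real database layer during unit tests."""

    def __init__(self, rows, fail_connection=False):
        self.rows = rows
        self.fail_connection = fail_connection

    def find_by_author(self, author):
        # Canned data gives the test full control over what comes back.
        if self.fail_connection:
            raise ConnectionError("simulated network outage")
        return [row for row in self.rows if row["author"] == author]


def titles_by_author(db, author):
    """Code under test: degrades to an empty list if the database is down."""
    try:
        return sorted(row["title"] for row in db.find_by_author(author))
    except ConnectionError:
        return []


fake = FakeCatalogDb([
    {"author": "Beck", "title": "Test Driven Development"},
    {"author": "Beck", "title": "Extreme Programming Explained"},
    {"author": "Fowler", "title": "Refactoring"},
])
```

Constructing `FakeCatalogDb([], fail_connection=True)` lets a test exercise the error path with no network involved at all.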

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING

An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out of sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained in addition to being executed. Both are


based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application, and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
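The table-plus-fixture arrangement might be sketched as follows. This is a simplified stand-in for the idea, not the actual FIT or FitNesse API, and the loan-period rule is invented:

```python
# Application code under test (hypothetical).
def loan_period_days(patron_type):
    """Business rule: how long each kind of patron may keep an item."""
    return {"adult": 21, "child": 14, "teacher": 42}[patron_type]

# The acceptance-test table as the customer might write it:
#   | patron type | loan period? |
#   | adult       | 21           |
#   | child       | 14           |
#   | teacher     | 42           |
TABLE = [("adult", 21), ("child", 14), ("teacher", 42)]

def run_fixture(table):
    """Thin fixture: translate each table row into an application call
    and report pass/fail per row back to the test runner."""
    return [(patron, loan_period_days(patron) == expected)
            for patron, expected in table]
```

The fixture contains no business logic of its own; if a row fails, the table (the customer's expectation) and the application disagree, which is exactly the signal an acceptance test should give.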

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING

Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanisms by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations, there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile iterations, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team (a cross-functional mixture of programmers, testers, business analysts and stakeholders) collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs. The business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test. All attempts to determine how well the product meets the functional and non-functional requirements also attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
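Folding a performance check into the regression suite might look like this sketch; the search function and its half-second budget are invented for illustration:

```python
import time
import unittest

def search_catalog(term):
    """Stand-in for an application function under regression test."""
    catalog = ["agile testing", "exploratory testing", "unit testing"]
    return [title for title in catalog if term in title]

class RegressionSuite(unittest.TestCase):
    def test_search_still_finds_known_titles(self):
        # Functional regression: behavior established in a prior iteration.
        self.assertIn("agile testing", search_catalog("agile"))

    def test_search_meets_performance_budget(self):
        # Non-functional regression: fail the suite if search slows down.
        start = time.perf_counter()
        search_catalog("testing")
        self.assertLess(time.perf_counter() - start, 0.5)
```

Because both checks live in the same suite, a check-in that breaks either the behavior or the performance budget fails the same regression run.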

Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks, using Ruby, Java, Perl, Python or another language, to get the best match for their project.

MANUAL EXPLORATORY TESTING

There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in the production environments. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore, and time box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have a Place on Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860

[Manual exploratory testing is more than wandering randomly through the application; the tester must start with a plan]

LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and properly-done automation, works best.

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.

18 • Software Test & Performance JULY 2009

The Agile Process Creates Testing Puzzles
Which Are Solved When Agile Is Performed Correctly

By Bob Galen

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out, for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes this takes effort and courage, to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/Watin are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, functional acceptance, non-functional, regression, etc.)


[I expect testers to pair with developers to improve practices and developers to deliver solid unit tests before they can be considered 'done']

that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques, based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.
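One minimal way to sketch the risk-based selection described above (the scoring model, test names and weights are illustrative assumptions, not from the article) is to rank regression tests by business impact weighted by how likely this Sprint's changes are to have touched them, then run as many as the iteration's budget allows:

```python
# Each regression test carries a risk score: the business impact of the
# area it covers, times the estimated likelihood that this Sprint's
# changes touched that area. All values below are hypothetical.
TESTS = [
    {"name": "checkout_flow", "impact": 5, "touched": 0.9},
    {"name": "login",         "impact": 5, "touched": 0.1},
    {"name": "report_export", "impact": 2, "touched": 0.8},
    {"name": "help_pages",    "impact": 1, "touched": 0.2},
]

def select_for_sprint(tests, budget):
    """Pick the highest-risk tests that fit the Sprint's time budget."""
    ranked = sorted(tests, key=lambda t: t["impact"] * t["touched"],
                    reverse=True)
    return [t["name"] for t in ranked[:budget]]
```

With a budget of two test runs, the sketch picks the heavily-churned checkout flow and the report export, and defers the barely-touched login and help-page suites to a later full regression pass.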

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.
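For readers unfamiliar with the practice under discussion, the TDD rhythm looks roughly like this minimal sketch (the function and its behavior are hypothetical, not from the article): the unit test is written before the implementation and fails, then the simplest passing code follows, and the test remains as a regression guard during later refactoring.

```python
# Step 1 (red): write the test first. It fails until initials() exists,
# which is what drives the design of the function.
def test_initials():
    assert initials("Rex Black") == "RB"
    assert initials("Bob") == "B"

# Step 2 (green): the simplest implementation that passes. The test now
# protects this behavior through any subsequent refactoring.
def initials(full_name):
    """Return the upper-case first letter of each word in a name."""
    return "".join(word[0].upper() for word in full_name.split())

test_initials()  # runs clean once both steps are in place
```

The design choice TDD argues for is that the test, not an up-front specification, defines "correct" for each small increment; Black's objection is about what such tests miss, not how they are written.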

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it or [they] deliver cursory attempts]


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.
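A story-level acceptance test of the kind described above might be sketched as follows (the discount-code story, names and values are illustrative assumptions, not from the article): the Given/When/Then steps the customer agreed to are encoded directly as an executable check that must pass before the story is "done."

```python
# Story: "As a shopper, I can apply a discount code at checkout."
# The acceptance criteria below were (hypothetically) agreed with the
# customer before the Sprint; the implementation is a toy stand-in.

def apply_discount(total, code):
    """Return the total after applying a known discount code."""
    codes = {"SAVE10": 0.10}  # assumption: one known code for the sketch
    if code not in codes:
        raise ValueError("unknown code")
    return round(total * (1 - codes[code]), 2)

def acceptance_discount_story():
    """Executable form of the story's acceptance criteria."""
    # Given a cart totaling $50.00,
    # when the shopper applies SAVE10,
    # then the total becomes $45.00.
    assert apply_discount(50.00, "SAVE10") == 45.00
    # And an unknown code is rejected.
    try:
        apply_discount(50.00, "BOGUS")
        return False
    except ValueError:
        return True
```

Tools like FitNesse, mentioned elsewhere in this issue, express the same idea as customer-editable tables rather than code.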

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail, they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying: "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team: "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23, to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise,

We are a new profession; most of our first programmers are still alive. –Scott Barber

A skilled tester is the product of a variety of experiences. –Jon Bach


the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available, and serve as internal tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has made for some new challenges, too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem."

Bartow agrees. "Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

At Intuit, "QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year, in terms of their skill-sets and

Virtualization makes things easier and harder. –Michael Bolton



for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls such as sliders need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"
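Beck's permutation question can be attacked mechanically: rather than hand-picking cases, a test can enumerate every up/down combination of backing services and assert a "good enough" policy over each one. The sketch below is a toy model (the service names and the acceptability policy are illustrative assumptions, not Amazon's):

```python
from itertools import product

SERVICES = ["search", "recommendations", "reviews"]
CORE = {"search"}  # assumption: only search is essential in this sketch

def render_page(up):
    """Return which page sections rendered, given the set of live services."""
    return {s for s in SERVICES if s in up}

def page_acceptable(up):
    """The page is 'good' when every core section rendered."""
    return CORE <= render_page(up)

def run_matrix():
    """Evaluate the page policy over every up/down permutation."""
    results = {}
    for bits in product([True, False], repeat=len(SERVICES)):
        up = {s for s, b in zip(SERVICES, bits) if b}
        results[frozenset(up)] = page_acceptable(up)
    return results
```

The design point is that "success" becomes an explicit, testable policy over degraded states, instead of an implicit assumption that every service is always up.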

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."

26 • Software Test & Performance • JULY 2009

"Thanks to the cloud, we are able to test in production." – Dan Bartow

"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks." – David Saff

Black agrees: "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, N.Y. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here, between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, "in some cases agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product, too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. -Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team, and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees: "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about



rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, Ore., predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year." – Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



Testing for Conformance With Web Services Interoperability Specifications
By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources, and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language) define the services for client tools.
• SOAP (Simple Object Access Protocol), a message format to invoke a service.
• XML (Extensible Markup Language), the lingua franca of the Internet.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
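To give a rough feel for how these pieces fit together, the following sketch builds a minimal SOAP 1.1 request envelope by hand in Python. The `GetQuote` operation, the `http://example.com/stock` namespace and the `symbol` parameter are hypothetical placeholders, not taken from any real WSDL:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace, as used by the Basic Profile era of services.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_envelope(operation, params, target_ns="http://example.com/stock"):
    """Build a minimal SOAP 1.1 request envelope for the given operation."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{target_ns}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{target_ns}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# A hypothetical stock-quote request; a real client would POST this
# XML over HTTP to the endpoint named in the service's WSDL.
request = build_envelope("GetQuote", {"symbol": "IBM"})
```

In practice, client tooling generates this plumbing from the WSDL; the sketch only illustrates what travels on the wire.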

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as its loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations. WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL, HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[FIG. 1: WS-I TOOLS IN ACTION. The Monitor's Interceptor and Logger capture the SOAP messages flowing between requestor and Web service into a message log; the Analyzer evaluates the log, along with the WSDL, XML schema and UDDI artifacts, against a test assertion document to produce a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories, and is used by the Analyzer to identify interoperability issues.
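For a feel of what such assertions check, here is an illustrative Python sketch of two simplified checks loosely in the spirit of WS-I Basic Profile assertions (a text/xml Content-Type for SOAP 1.1 over HTTP, and a SOAP 1.1 Envelope root element). The function, its wording and its structure are a simplification for illustration, not the actual WS-I test assertion document or tooling:

```python
import xml.etree.ElementTree as ET

SOAP11_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def check_message(http_headers, envelope_xml):
    """Return a list of failure descriptions for one captured message.

    Two simplified, illustrative checks only -- not official WS-I
    assertion IDs or the WS-I Analyzer itself."""
    failures = []
    ctype = http_headers.get("Content-Type", "")
    if not ctype.startswith("text/xml"):
        failures.append("SOAP 1.1 over HTTP should use a text/xml Content-Type")
    root = ET.fromstring(envelope_xml)
    if root.tag != f"{{{SOAP11_ENV}}}Envelope":
        failures.append("root element is not a SOAP 1.1 Envelope")
    return failures

# A conforming message passes; a malformed one collects both failures.
good = check_message(
    {"Content-Type": "text/xml; charset=utf-8"},
    f'<e:Envelope xmlns:e="{SOAP11_ENV}"><e:Body/></e:Envelope>')
bad = check_message({"Content-Type": "application/json"}, "<NotSoap/>")
```

The real Analyzer runs hundreds of such assertions against the captured message log and the service's description artifacts.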

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems. The tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


The WS-I organizes the service of formally documenting the best use of standards, documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall, the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture the SOAP/XML messages and metadata exchanged between client code and the Web service into a Test Log file; in a separate analysis phase, the Analyzer evaluates the log against the WSDL and XML schema artifacts to produce a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, or after a specific period of time, such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure in order to make CI straightforward and simple.

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "automated testing," as opposed to "test automation," is more accurate. Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming, in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
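A minimal sketch of one TDD step, using Python's built-in unittest; the `word_count` function is a made-up example, written (in TDD order) only after its first test was seen to fail:

```python
import unittest

# Example code under test -- in TDD this body would be written only
# after the first test below had been run and seen to fail.
def word_count(text):
    return len(text.split())

class TestWordCount(unittest.TestCase):
    def test_counts_words_separated_by_whitespace(self):
        self.assertEqual(word_count("agile testing works"), 3)

    def test_empty_string_has_no_words(self):
        self.assertEqual(word_count(""), 0)

# Run with: python -m unittest <thisfile>
```

Each new expectation is added the same way: a failing test first, then just enough code to make it pass.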

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior, in that tests are referred to as 'expectations' and the test steps are described in real business terms, an extension of Domain Driven Design. BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
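To illustrate the difference in flavor, here is a sketch in plain Python rather than any particular BDD tool; the `Account` class and the expectation names are invented for the example:

```python
import unittest

# A hypothetical class under test, for illustration only.
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

# BDD-style expectations: each test name states a behavior in
# near-plain business English, rather than naming a method under test.
class NewAccountExpectations(unittest.TestCase):
    def test_a_new_account_starts_with_a_zero_balance(self):
        self.assertEqual(Account().balance, 0)

    def test_depositing_money_increases_the_balance(self):
        account = Account()
        account.deposit(50)
        self.assertEqual(account.balance, 50)

# Run with: python -m unittest <thisfile>
```

Dedicated BDD tools go further, letting the "given/when/then" wording live outside the code entirely, but the naming convention alone already shifts the emphasis from methods to behavior.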

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI, or in some cases be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere, at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

(Thanks to Markus Gaertner for his contributions to this article.)

ST&Pedia


(Continued from page 10)



July 2009

Dear Subscribers,

In December 2008, Redwood Collaborative Media purchased the Software Test & Performance division of BZ Media. We had a vision of creating new opportunities for software testers to connect with and learn from test and QA colleagues around the globe, through a new collaborative media model. Today, we are pleased to realize that vision and unveil the new Software Test & Performance Collaborative.

STP Collaborative will continue to bring you the high quality news, valuable information and face-to-face networking opportunities you've come to expect from Software Test & Performance magazine and the Software Test & Performance Conference. In addition, we're excited to announce the launch of an all-new Web site, which will serve as the cornerstone of a new membership model that enables community-building, education, training and peer-to-peer knowledge sharing, which are so vital to your profession. We're excited about our new programs, and we hope you will be, too. We invite you to visit www.stpcollaborative.com and join our community today.

Once again, we welcome you to STP Collaborative, and invite you to use and comment on our content, contribute to our knowledge base, and interact with each other to build a community that will benefit everyone in your growing profession. We look forward to working with you to build a global community of knowledge workers in the software test and QA profession.

Sincerely,

Andy Muns
President, Redwood Collaborative Media
Publisher, Software Test & Performance magazine

Introducing the STP Collaborative Strategic Advisory Board

JAMES BACH, CEO, Satisfice
SCOTT BARBER, Chief Technologist, PerfTestPlus; Director, AST
KENT BECK, Founder and director, Three Rivers Institute
VLADIMIR BELORUSETS, SQA Manager, Xerox
REX BLACK, Founder and president, RBCS
MICHAEL BOLTON, Founder, DevelopSense
ROSS COLLARD, Founder, Collard & Co.
JAN FISH, QA Manager, Phillips Lifeline
ROBERT GALEN, Principal Consultant, RGalen Consulting Group
MATT HEUSSER, Independent software developer, tester & trainer, Socialtext
DAVID HOLCOMBE, Co-founder, president & CEO, eLearning Guild
ANDREW MUNS, President & CEO, Redwood Collaborative Media
RON MUNS, Chairman, Redwood Collaborative Media
BJ ROLLISON, Test Architect, Microsoft
DAVID STROM, Freelance columnist and speaker

If I had to describe in a single word the characters that make up today's cast of test industry thought leaders, one that fits well would be "diverse." They're geographically diverse, stretching from Quebec to Kalamazoo, from Boston to Bulverde. Their clients are even more so. Engagements spread across the U.S. and Canada, to China, New Zealand, Russia and Sweden, to name just a few.

Then there's the philosophical divide, as you'll discover in David Strom's fine piece in this issue, titled "We're All Part of the Story," in which he looks at the state of the testing industry and its major players. You'll hear from the Bach brothers, James and Jon, Scott Barber, Rex Black, Michael Bolton, Kent Beck and many others. Just to prove that we didn't simply take names from the B section of the tester's phone book, we also have Google's David Saff weighing in.

While opinions and philosophies vary, there's one thing on which they all seem to agree: With the ever-increasing complexity of software, paired with ever-shrinking release timetables, there will always be a need to reduce testing time and increase efficiency. Is there agreement on how best to accomplish that? Maybe in cartoons.

Which brings me to the story behind this month's cover illustration concept. Credit Mr. Strom for the caricature idea, during a brainstorming meeting with me and Redwood president Andy Muns. I knew that self-effacement would be okay with Jon Bach and his brother James, who recently recounted the story of how the two had received caricatures of themselves at a conference they were both attending. We asked permission anyway, of course, as we did with the remaining players. People in testing are exceedingly good natured, in my experience at least.

Varied opinions are further highlighted in this issue in "The Agile Process Creates Testing Puzzles," a lively point-counterpoint by consultant Rex Black and Scrum Master Bob Galen. Of course, even critics of agile methods acknowledge merits of the practices when executed correctly. After all, what can be bad about putting people over process? Of putting working software over comprehensive documentation?

You might say that agile principles were behind the design of our new Web site, which we're excited to launch this month. It's designed for individuals to interact with each other, and with us. The site also features processes, tools and resources for testers, QA professionals and software managers. It's designed with the customer (you) in mind, and looks to add interactivity and collaboration to your favorite testing magazine. And we'll constantly add more features and content to make this new site a rich and valuable online resource and community.

Also beginning this month, the magazine has a fresh new look. Sincere thanks go to art director LuAnn Palazzo, without whose talent and infinite patience this project would not have happened. We hope you'll enjoy the magazine and participate in our new membership community and collaborative Web site. Please join us at www.stpcollaborative.com and tell us what you think.

Ed Notes

The Test Industry's Cartoonland Village

Edward J. Correia


VOLUME 6 • ISSUE 7 • JULY 2009

REDWOOD Collaborative Media
President: Andrew Muns
Chairman: Ron Muns
105 Maxess Road, Suite 207, Melville, NY 11747
+1-631-393-6051 / +1-631-393-6057 fax
www.stpcollaborative.com

Cover Illustration: Caricatures by Steve Nyman

Editor: Edward J. Correia, ecorreia@stpcollaborative.com
Contributing Editors: Joel Shore, Matt Heusser, Chris McMahon
Art Director: LuAnn T. Palazzo, lpalazzo@stpcollaborative.com
Publisher: Andrew Muns, amuns@stpcollaborative.com
Associate Publisher: David Karp, dkarp@stpcollaborative.com
Director of Operations: Kristin Muns, kmuns@stpcollaborative.com
Chief Marketing Officer: Jennifer McClure, jmcclure@stpcollaborative.com
Marketing Coordinator: Teresa Cantwell, tcantwell@stpcollaborative.com
Reprints: Lisa Abelson, abelson@stpcollaborative.com, 516-379-7097
Membership/Customer Service: membership-services@stpcollaborative.com, 631-393-6051 x200
Circulation and List Services: Lisa Fiske, lfiske@stpcollaborative.com


IN FEBRUARY OF 2001, 17 technology professionals gathered for a peer conference in Park City, Utah, and crafted a document called the Agile Manifesto. That document is probably the single most influential statement on software development of the past decade, and its values led to a series of new ideas and concepts in software testing.

Agile Software Methods

XP: Short for Extreme Programming, XP was created by Kent Beck during his work at Chrysler. At the time, projects at the auto maker struggled with heavyweight specifications, specialized roles, and enough turnover that projects would simply burn out mid-stream, delivering no working software at all before being canceled. Extreme Programming focuses on delivering software by first developing simple working features and then adding more features incrementally while improving the code constantly. Most XP techniques, such as Pair Programming, Test-Driven Development and Refactoring, are now considered engineering practices.

Scrum: Popularized at (now defunct) Easel Corp by Ken Schwaber and Jeff Sutherland. At Easel, requirements were changing so quickly that the developers could never get anything completely finished. Scrum focuses on stable requirements for each time box, commitments from team members, and communication through daily stand-up meetings. Most of the focus of Scrum is on project management.

DSDM: Short for Dynamic Systems Development Method. Developed in the United Kingdom, DSDM advocates notice that many projects have fixed staff and a fixed release date. DSDM therefore focuses on techniques that allow the team to alter the project scope while maintaining high quality, thus hitting the original due date.

Agile Software Processes: Traditional software process generally breaks down work by phase or by type of work: all of the requirements first, then all of the coding, then all of the testing, and so on. Agile processes instead break down work by feature, one thin slice at a time of the final product, which the customer can use to actually and concretely do something new.

Agile teams then batch the work up into time boxes, periodically bringing the code to production quality. Agile teams release production code as often as every few days to as rarely as every few months. Because of that, teams need greater discipline and skill to deal with an increased regression testing burden.

Iteration: Used in Extreme Programming, an iteration is a time box (typically one to three weeks) starting with a requirements exercise and ending with new features at production quality.

Sprint: A short time box used in Scrum that allows the technical staff to create code ready for production without the requirements changing as they work. Technical staff are expected to own the deliverable and release production features at the end of the time box.

User Story: A form of requirement that describes a single, specific, meaningful piece of functionality that has value to the business. User stories are small and can be implemented by a pair of programmers in one or a few days. User stories line up roughly with a single scenario within a use case in traditional software development. A user story is not a specification; it is an example of what the software will allow a person to do.

Story Card: Stories are often expressed on 3x5 index cards. Agile coach Ron Jeffries has described a story card as a placeholder for a conversation.

Story Tests: A series of tests expressed in plain English but eventually automated. Story tests provide examples for the programmers and testers and describe the minimum amount of functionality for the story to be called done. The goal of story tests is not to convey a comprehensive specification but to provide examples, so that the whole team can build a shared mental model of what is expected.

Story Kickoffs: A kickoff is the final meeting held before developers start writing code. The idea that testers should be involved early in the process has been around for decades. The agile twist is to have testers help as part of the design for the story. During the kickoff, developers, product owners and testers can discuss and ask questions about what the software will do; this can lead to a better description and prevent

ST&Pedia


Agile Testing in Practice

The encyclopedia for software testing professionals

Matt Heusser and Chris McMahon

Matt Heusser and Chris McMahon are career software developers, testers and bloggers. Matt also works at Socialtext, where he performs testing and quality assurance for the company's Web-based collaboration software.

"Good ideas can come from anywhere, at any time."

continued on page 36 >



necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING
Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units or modules within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent because, for the developer, it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary, because the programmer is speculating about what conditions might exist rather than starting from concrete test cases that highlight the true requirements.

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word interpretation is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is jUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide

By Robert Walsh

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits

THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.


language elements for defining test cases, combining tests into suites, and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.
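The xUnit pattern described above can be sketched with Python's unittest module, one of the many jUnit ports. The `apply_discount` function below is an invented stand-in for real application code, not anything from the article:

```python
import unittest

# Hypothetical unit under test, invented for this sketch.
def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    # Each method defines one test case; the class groups them into a suite.
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(100.0, 10), 90.0)

    def test_zero_discount_returns_original_price(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

    def test_invalid_percent_is_rejected(self):
        # Asserting the exit condition for bad input.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# A simple command-line style run; a continuous build server would hook in
# here so that a failing check-in is reported immediately.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same shape (test methods, suites, assertions, a runner) appears in jUnit and its other ports.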

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how, to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
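A minimal sketch of the fake-database idea, using Python's standard unittest.mock; `PatronService` and `fetch_loans` are hypothetical names invented for this illustration:

```python
from unittest import mock

# Hypothetical service layer, invented for this sketch.
class PatronService:
    def __init__(self, db):
        self.db = db  # in production, a real database gateway

    def overdue_count(self, patron_id):
        loans = self.db.fetch_loans(patron_id)  # would normally hit the database
        return sum(1 for loan in loans if loan["overdue"])

# The fake database object returns canned data resembling production data,
# giving the test full control over what the module under test sees.
fake_db = mock.Mock()
fake_db.fetch_loans.return_value = [
    {"title": "Refactoring", "overdue": True},
    {"title": "XP Explained", "overdue": False},
]
service = PatronService(fake_db)
count = service.overdue_count(42)

# The same technique simulates failures that are hard to trigger for real,
# such as a dropped connection.
flaky_db = mock.Mock()
flaky_db.fetch_loans.side_effect = ConnectionError("connection dropped")
```

A test would assert that `count` is 1 against the canned data, and that the application degrades gracefully when the flaky fake raises.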

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING
An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out-of-sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained, in addition to being executed. Both are


based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While the tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application, and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
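The table-plus-thin-fixture shape can be sketched outside of FIT itself. The Python example below pairs a table of inputs and customer-expected outputs with a fixture that feeds each row to a hypothetical `late_fee` function; this is not FIT's actual API, just the shape of the approach under invented names:

```python
# Hypothetical application function under test, invented for this sketch.
def late_fee(days_overdue, is_dvd):
    rate = 1.00 if is_dvd else 0.25
    return round(min(days_overdue * rate, 10.00), 2)

# The "table": rows pairing inputs with the customer's expected outputs,
# mirroring how FIT/FitNesse tables express acceptance tests.
FEE_TABLE = [
    # days_overdue, is_dvd, expected_fee
    (0,  False, 0.00),
    (4,  False, 1.00),
    (3,  True,  3.00),
    (99, True,  10.00),  # the fee is capped at 10.00
]

# Thin fixture: translate each table row into a call on the application and
# report any rows whose actual output differs from the expected one.
def run_fee_table(rows):
    failures = []
    for days, dvd, expected in rows:
        actual = late_fee(days, dvd)
        if actual != expected:
            failures.append({"inputs": (days, dvd),
                             "expected": expected, "actual": actual})
    return failures

failures = run_fee_table(FEE_TABLE)
```

The fixture contains no business logic of its own, which is what keeps its maintenance cost low.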

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING
Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanism by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations, there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories, often begin life in agile iterations as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team (a cross-functional mixture of programmers, testers, business analysts and stakeholders) collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs: the business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test: all attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
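As one sketch of folding a performance check into a regression suite, the snippet below fails the run if an operation exceeds a time budget. The `build_report` operation and the 2.0-second budget are invented for illustration; real budgets would come from the non-functional requirements:

```python
import time

# Hypothetical operation whose speed matters to the business; a real suite
# would call actual application code here.
def build_report(rows):
    return sorted(rows)

# Regression-style performance check: raise if the operation runs over its
# budget, so the continuous build flags a performance regression like any
# other failing test.
def run_within_budget(fn, budget_seconds):
    start = time.perf_counter()
    fn()
    elapsed = time.perf_counter() - start
    if elapsed >= budget_seconds:
        raise AssertionError(f"took {elapsed:.3f}s, budget was {budget_seconds}s")
    return elapsed

elapsed = run_within_budget(lambda: build_report(range(100_000)), 2.0)
```

Once wired into the suite, the check runs on every build, so a change that slows the operation past its budget surfaces immediately rather than in production.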

Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING
There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.
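The A-then-C-then-B scenario can be made concrete with a toy sketch; the tasks and the stale-cache bug below are invented solely to show the kind of order-dependent failure that requirements-derived automation tends to miss:

```python
# Toy application with a hidden order dependence, invented for this sketch.
class App:
    def __init__(self):
        self.cache = {}

    def task_a(self):  # start a session
        self.cache["session"] = {"user": "pat"}

    def task_b(self):  # read the session; quietly assumes A ran and C has not
        return self.cache["session"]["user"]

    def task_c(self):  # housekeeping that silently clears the cache
        self.cache.clear()

# The order everyone expected works fine...
app = App()
app.task_a(); app.task_b(); app.task_c()

# ...but the A, then C, then B sequence an explorer might try crashes,
# a failure no requirements-based test case would have listed.
app = App()
app.task_a(); app.task_c()
try:
    app.task_b()
    crashed = False
except KeyError:
    crashed = True
```

No requirement says "B must not run after C," so only a tester probing unusual sequences is likely to surface the defect.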

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in the production environment. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often, these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. Framework for Integrated Test. October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse Front Page. FitNesse. http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org: Resources for Test Driven Development. jUnit. http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have a Place on Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860


LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com

observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and prop-

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles
Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

Iwas asked by Software Test amp Performance magazine to provide a rebuttal to Rex Blackrsquos perspectives on how ldquoagile challenges testingrdquoRexrsquos views and reactions to agility are typical

By Bob GalenBob Galen is principal consultant of the RGalen Consulting Group anda founding member of the STP Collaborative Strategic Advisory Board


What's more, change-related challenges often arise from

changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element have also proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance," all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out: for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium, Watir and WatiN are

examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small, incremental steps.
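The test-first rhythm described above can be sketched in miniature. Everything here is a hypothetical illustration (the ShippingRule class, its threshold and the checks are invented for this example); a real agile team would express the checks with JUnit rather than a hand-rolled harness:

```java
// Hypothetical sketch of test-driven development in plain Java.
// ShippingRule and its threshold are invented for illustration.
class ShippingRule {
    static final double FREE_SHIPPING_THRESHOLD = 50.0;

    // Written second: the simplest code that makes the checks below pass.
    static boolean isFreeShipping(double orderTotal) {
        return orderTotal >= FREE_SHIPPING_THRESHOLD;
    }
}

class ShippingRuleChecks {
    public static void main(String[] args) {
        // Written first (the "red" step): state the expected behavior.
        check(ShippingRule.isFreeShipping(50.0));
        check(ShippingRule.isFreeShipping(120.0));
        check(!ShippingRule.isFreeShipping(49.99));
        System.out.println("all checks pass");
    }

    static void check(boolean condition) {
        if (!condition) throw new AssertionError("check failed");
    }
}
```

The checks are written first and fail ("red"); the production method is then written to make them pass ("green"), and the design is refined in small steps with the checks as a safety net.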

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, functional,


[I expect testers to pair with developers to improve practices, and developers to deliver solid unit tests before they can be considered 'done']

acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.
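The Sprint-level triage described above can be sketched as a simple scoring pass. The scheme, names and weights below are invented for illustration (risk-based approaches vary); the idea is just to rank candidate tests by likelihood of failure times impact, judged from what changed in the Sprint, and fill the available testing time from the top:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch of risk-based test selection within a Sprint:
// score each candidate test, then greedily run the highest-scoring
// tests that still fit the remaining testing time.
class RiskBasedSelection {
    static class Candidate {
        final String name;
        final int likelihood, impact, minutes; // likelihood/impact on a 1-5 scale
        Candidate(String name, int likelihood, int impact, int minutes) {
            this.name = name; this.likelihood = likelihood;
            this.impact = impact; this.minutes = minutes;
        }
        int risk() { return likelihood * impact; } // simple risk score, 1..25
    }

    static List<String> select(List<Candidate> candidates, int budgetMinutes) {
        List<Candidate> byRisk = new ArrayList<>(candidates);
        byRisk.sort(Comparator.comparingInt((Candidate c) -> c.risk()).reversed());
        List<String> chosen = new ArrayList<>();
        int used = 0;
        for (Candidate c : byRisk) {
            if (used + c.minutes <= budgetMinutes) {
                chosen.add(c.name);
                used += c.minutes;
            }
        }
        return chosen;
    }
}
```

For example, with a checkout test scored 5×5 at 60 minutes, a login test 4×4 at 45 minutes and a search test 2×3 at 30 minutes, a 120-minute budget selects checkout and login and defers search.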

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.
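The safety-net role of those tests can be sketched as follows. The OrderTotals class and both implementations are hypothetical; the point is only that the same automated check pins down behavior across a redesign:

```java
// Hypothetical sketch: an automated unit check acts as a regression net,
// letting the implementation be redesigned while behavior stays pinned down.
class OrderTotals {
    // Before refactoring: an explicit accumulation loop.
    static int totalBefore(int[] cents) {
        int sum = 0;
        for (int c : cents) sum += c;
        return sum;
    }

    // After refactoring: same observable behavior, different design.
    static int totalAfter(int[] cents) {
        return java.util.Arrays.stream(cents).sum();
    }
}
```

A check such as asserting that both versions total the line items {199, 250, 51} to 500 cents runs after every change; if the refactoring alters behavior, the check fails immediately.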

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.
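Back-of-the-envelope arithmetic shows why "both" matters. Using the figures above, and assuming (our simplification, not the article's) that the two passes catch defects independently, the combined removal rate works out to roughly 89.5 percent:

```java
class DefectRemoval {
    public static void main(String[] args) {
        double unit = 0.30;   // unit testing effectiveness, per the Jones figure cited
        double system = 0.85; // independent system testing, per the RBCS figure cited
        // Fraction of defects escaping both passes, assuming independence:
        double escaping = (1 - unit) * (1 - system); // 0.70 * 0.15 = 0.105
        double combined = 1 - escaping;              // about 0.895
        System.out.printf("combined removal: %.1f%%%n", combined * 100);
    }
}
```

Even under this optimistic independence assumption, neither pass alone approaches the combined figure, which is the sense in which the two are complementary.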

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it or [they] deliver cursory attempts]


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests on a finely grained story level that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a just-in-time and just-enough approach to their requirements

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers

false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail, but also wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool

to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those who can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity and, instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

"We are a new profession; most of our first programmers are still alive." – Scott Barber

"A skilled tester is the product of a variety of experiences." – Jon Bach

The future careers for the current crop of test engineers seem to be on the rise,

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. "Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain."

David Strom is a freelance reporter and author living in St. Louis.

"Virtualization makes things easier and harder." – Michael Bolton

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem."

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter

for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls such as sliders need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry, too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools and can be quite nimble, and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


"Thanks to the cloud, we are able to test in production." – Dan Bartow

"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks." – David Saff

Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training "because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases "agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product, too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical and support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams, or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about

"Get used to programming for a lot of CPUs." – Kent Beck


rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber

agrees The one place where virtualizationcan help us is in managing our test envi-ronment because itrsquos relatively easyto duplicate and start up a new virtu-al machine image with all applicationspre-installed and tools configuredfor a particular environment

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should be just one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."
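Beck's point is easy to demonstrate: once work is spread across many CPUs, correctness depends on synchronization that single-threaded testing never exercises. A minimal Python sketch (the counter, thread count and iteration count are invented for illustration) shows the kind of shared-state hazard multicore testing has to probe:

```python
import threading

def increment_many(counter, lock, times):
    # The lock keeps the read-modify-write of counter[0] atomic;
    # without it, concurrent updates can silently lose increments.
    for _ in range(times):
        with lock:
            counter[0] += 1

def run(threads=8, times=10_000):
    counter = [0]
    lock = threading.Lock()
    workers = [threading.Thread(target=increment_many,
                                args=(counter, lock, times))
               for _ in range(threads)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return counter[0]
```

With the lock in place the total is deterministic; remove the `with lock:` line and the same test becomes flaky, which is exactly the class of failure Beck warns about.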

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year." – Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing." – James Bach


Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools

• SOAP (Simple Object Access Protocol), a message format used to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
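In miniature, the Monitor/Analyzer split looks like the following Python sketch: a pass-through logger sits between requestor and service, and a separate analysis step checks the captured traffic afterward. The single well-formedness check here is only a stand-in for the much richer WS-I profile assertions.

```python
import xml.etree.ElementTree as ET

class Monitor:
    """Man-in-the-middle logger: relays messages unchanged, keeps a log."""
    def __init__(self, service):
        self.service = service   # a callable taking/returning XML strings
        self.log = []            # captured (request, response) pairs

    def __call__(self, request):
        response = self.service(request)
        self.log.append((request, response))
        return response

def analyze(log):
    # Stand-in for a profile assertion: every captured message must be
    # well-formed XML. Returns a list of (exchange index, side) failures.
    failures = []
    for i, (req, resp) in enumerate(log):
        for side, msg in (("request", req), ("response", resp)):
            try:
                ET.fromstring(msg)
            except ET.ParseError:
                failures.append((i, side))
    return failures
```

The key design point mirrors the real tools: the Monitor never alters or judges traffic, so the system under test behaves normally; all evaluation happens offline against the log.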

[FIG. 1: WS-I TOOLS IN ACTION. The Monitor's Interceptor and Logger sit in the normal message flow between requestor and Web service, capturing SOAP messages into a message log according to a monitor config file. The Analyzer, driven by its own config file and a test assertion document, evaluates the message log along with the WSDL, XML schema and UDDI artifacts, and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
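The shape of that arrangement, assertions described as data in XML and applied mechanically to captured artifacts, can be sketched in a few lines of Python. The assertion format below is a toy invented for illustration; real WS-I test assertion documents are far richer.

```python
import xml.etree.ElementTree as ET

# Toy stand-in for a test assertion document: each assertion names an
# artifact category and an element that must appear in the message.
ASSERTIONS = """
<testAssertions>
  <assertion id="BP-EXAMPLE-1" artifact="message">
    <required>Envelope</required>
  </assertion>
</testAssertions>
"""

def check_message(message_xml):
    doc = ET.fromstring(ASSERTIONS)
    root = ET.fromstring(message_xml)
    report = {}
    for a in doc.iter("assertion"):
        wanted = a.find("required").text
        # Pass if any element's local name matches the required one.
        ok = any(el.tag.split("}")[-1] == wanted for el in root.iter())
        report[a.get("id")] = "passed" if ok else "failed"
    return report
```

Because the assertions live in a document rather than in code, the same analyzer can be re-pointed at a new profile simply by swapping in a different assertion file, which is the design the WS-I tools follow.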

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards; documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is

invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture the SOAP/XML-over-HTTP messages and artifacts exchanged between client code and Web service, together with WSDL and XML schema metadata, into a test log file; the Analyzer then evaluates the log and produces a test report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
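One CI cycle boils down to a short pipeline of commands, any of which can break the build. A minimal Python sketch of that loop body, where the step names and shell commands are placeholders for a real checkout/build/test toolchain:

```python
import subprocess

def ci_cycle(steps):
    """Run one continuous-integration cycle.

    steps is a list of (name, shell_command) pairs, e.g. checkout,
    build, test. Stops at the first failing step and reports which
    one broke the build.
    """
    for name, cmd in steps:
        result = subprocess.run(cmd, shell=True)
        if result.returncode != 0:
            return f"FAILED: {name}"
    return "PASSED"
```

A CI server simply repeats `ci_cycle` on a trigger (every check-in, every hour, or immediately after the previous run), publishing the PASSED/FAILED result to the team.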

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
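As a sketch of the cycle (the function and its tax-calculation behavior are invented for illustration): the developer writes the expectation first, watches it fail because the code doesn't exist yet, then adds just enough implementation to pass.

```python
# Step 1: the test is written first. Running it now fails,
# because add_tax does not exist yet.
def test_add_tax():
    assert add_tax(100.0, rate=0.08) == 108.0

# Step 2: just enough implementation to make the failing test pass.
def add_tax(amount, rate):
    return round(amount * (1 + rate), 2)

# In practice a runner such as unittest or pytest collects and runs this.
test_add_tax()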

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior, in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
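The flavor is easiest to see in the naming. In this illustrative sketch (the shopping-cart domain and all names are invented), the test reads as an expectation about behavior rather than as a method-by-method check:

```python
class Cart:
    """A tiny domain object; names mirror the business vocabulary."""
    def __init__(self):
        self.items = []

    def add(self, name, price_cents):
        self.items.append((name, price_cents))

    @property
    def total_cents(self):
        return sum(price for _, price in self.items)

# BDD style: the test name states the expected behavior in plain terms.
def it_totals_the_prices_of_everything_the_shopper_added():
    cart = Cart()
    cart.add("apple", 40)
    cart.add("bread", 210)
    assert cart.total_cents == 250

it_totals_the_prices_of_everything_the_shopper_added()
```

BDD frameworks automate exactly this convention, reporting "it totals the prices of everything the shopper added" as a readable expectation rather than an xUnit method name.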

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.
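The table format is the key to keeping such tests customer-readable. A sketch of the idea in Python, where the discount rule and the example values are invented for illustration: the customer owns the table of examples, and a thin runner checks each row underneath the GUI.

```python
# The customer supplies the examples as a table of rows:
# (order_total, is_member, expected_discount)
table = [
    (100, True,  10),
    (100, False,  0),
    (500, True,  50),
]

# The business rule under test (an invented stand-in for real code).
def discount(order_total, is_member):
    return order_total // 10 if is_member else 0

def run_table(rows):
    # One boolean per row: did the actual discount match the example?
    return [discount(total, member) == expected
            for total, member, expected in rows]
```

Tools in the FIT/FitNesse family work this way: the customer edits the table, the team wires the thin fixture underneath, and everyone reads the same pass/fail grid.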

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

—Thanks to Markus Gaertner for his contributions to this article

ST&Ppedia

Index to Advertisers

Advertiser / URL / Page
Hewlett-Packard / www.hp.com/go/alm / 38
Seapine / www.seapine.com/stpswift / 11
Software Test & Performance Collaborative / www.stpcollaborative.com / 3-5
Software Test & Performance Collaborative Vendor Membership / www.stpcollaborative.com / 35
Software Test & Performance eSeminars / www.stpcollaborative.com / 22
STPCon Fall 2009 / www.stpcon.com / 29, 34
TechExcel / www.techexcel.com / 37

< continued from page 10



IF I HAD TO DESCRIBE IN A SINGLE word the characters that make up today's cast of test industry thought leaders, one that fits well would be "diverse." They're geographically diverse, stretching from Quebec to Kalamazoo, from Boston to Bulverde. Their clients are even more so; engagements spread across the U.S. and Canada to China, New Zealand, Russia and Sweden, to name just a few.

Then there's the philosophical divide, as you'll discover in David Strom's fine piece in this issue, titled "We're All Part of the Story," in which he looks at the state of the testing industry and its major players. You'll hear from the Bach brothers, James and Jon, Scott Barber, Rex Black, Michael Bolton, Kent Beck and many others. Just to prove that we didn't simply take names from the B section of the tester's phone book, we also have Google's David Saff weighing in.

While opinions and philosophies vary, there's one thing on which they all seem to agree: With the ever-increasing complexity of software, paired with ever-shrinking release timetables, there will always be a need to reduce testing time and increase efficiency. Is there agreement on how best to accomplish that? Maybe in cartoons.

Which brings me to the story behind this month's cover illustration concept. Credit Mr. Strom for the caricature idea, hatched during a brainstorming meeting with me and Redwood president Andy Muns. I knew that self-effacement would be okay with Jon Bach and his brother James, who recently recounted the story of how the two had received caricatures of themselves at a conference they were both attending. We asked permission anyway, of course, as we did with the remaining players. People in testing are exceedingly good-natured, in my experience at least.

Varied opinions are further highlighted in this issue in "The Agile Process Creates Testing Puzzles," a lively point-counterpoint by consultant Rex Black and Scrum Master Bob Galen. Of course, even critics of agile methods acknowledge merits of the practices when executed correctly. After all, what can be bad about putting people over process? Or putting working software over comprehensive documentation?

You might say that agile principles were behind the design of our new Web site, which we're excited to launch this month. It's designed for individuals to interact with each other, and with us. The site also features processes, tools and resources for testers, QA professionals and software managers. It's designed with the customer (you) in mind, and looks to add interactivity and collaboration to your favorite testing magazine. And we'll constantly add more features and content to make this new site a rich and valuable online resource and community.

Also beginning this month, the magazine has a fresh new look. Sincere thanks go to art director LuAnn Palazzo, without whose talent and infinite

patience this project would not have happened. We hope you'll enjoy the magazine and participate in our new membership community and collaborative Web site. Please join us at www.stpcollaborative.com and tell us what you think.

ed notes

The Test Industry's Cartoonland Village

Edward J. Correia

"While opinions and philosophies vary, there will always be a need to reduce testing time and increase efficiency."

VOLUME 6 • ISSUE 7 • JULY 2009

President: Andrew Muns
Chairman: Ron Muns

105 Maxess Road, Suite 207, Melville, NY 11747
+1-631-393-6051; +1-631-393-6057 fax
www.stpcollaborative.com

Cover Illustration: Caricatures by Steve Nyman

REDWOOD Collaborative Media

Editor: Edward J. Correia, ecorreia@stpcollaborative.com
Contributing Editors: Joel Shore, Matt Heusser, Chris McMahon
Art Director: LuAnn T. Palazzo, lpalazzo@stpcollaborative.com
Publisher: Andrew Muns, amuns@stpcollaborative.com
Associate Publisher: David Karp, dkarp@stpcollaborative.com
Director of Operations: Kristin Muns, kmuns@stpcollaborative.com
Chief Marketing Officer: Jennifer McClure, jmcclure@stpcollaborative.com
Marketing Coordinator: Teresa Cantwell, tcantwell@stpcollaborative.com
Reprints: Lisa Abelson, abelson@stpcollaborative.com, 516-379-7097
Membership/Customer Service: membership-services@stpcollaborative.com, 631-393-6051 x200
Circulation and List Services: Lisa Fiske, lfiske@stpcollaborative.com


IN FEBRUARY OF 2001, 17 TECHNOLOGY professionals gathered for a peer conference in Park City, Utah, and crafted a document called the Agile Manifesto. That document is probably the single most influential statement on software development of the past decade, and its values led to a series of new ideas and concepts in software testing.

Agile Software Methods

XP: Short for Extreme Programming. XP was created by Kent Beck during his work at Chrysler. At the time, projects at the auto maker struggled with heavyweight specifications, specialized roles and enough turnover that projects would simply burn out mid-stream, delivering no working software at all before being canceled. Extreme Programming focuses on delivering software by first developing simple working features, and then adding more features incrementally while improving the code constantly. Most XP techniques, such as Pair Programming, Test Driven Development and Refactoring, are now considered engineering practices.

Scrum: Popularized at (now defunct) Easel Corp. by Ken Schwaber and Jeff Sutherland. At Easel, requirements were changing so quickly that the developers could never get anything completely finished. Scrum focuses on stable requirements for each time box, commitments from team members, and communication through daily stand-up meetings. Most of the focus of Scrum is on project management.

DSDM: Short for Dynamic Systems Development Method. Developed in the United Kingdom, DSDM advocates notice that many projects have fixed staff and a fixed release date. DSDM therefore focuses on techniques to allow the team to alter the project scope while maintaining high quality, thus hitting the original due date.

Agile Software Processes

Traditional software process generally breaks down work by phase, or by type of work: all of the requirements first, then all of the coding, then all of the testing, and so on. Agile processes instead break down work by feature, one thin slice at a time of the final product, which the customer can use to actually and concretely do something new.

Agile teams then batch the work up into time boxes, periodically bringing the code to production quality. Agile teams release production code as often as every few days to as rarely as every few months. Because of that, teams need greater discipline and skill to deal with an increased regression testing burden.

Iteration: Used in Extreme Programming, an iteration is a time box (typically one to three weeks) starting with a requirements exercise and ending with new features at production quality.

Sprint: A short time box used in Scrum that allows the technical staff to create code ready for production without the requirements changing as they work. Technical staff are expected to own the deliverable and release production features at the end of the time box.

User Story: A form of requirement that describes a single, specific, meaningful piece of functionality that has value to the business. User stories are small and can be implemented by a pair of programmers in one or a few days. User stories line up roughly with a single scenario within a use case in traditional software development. A user story is not a specification; it is an example of what the software will allow a person to do.

Story Card: Stories are often expressed on 3x5 index cards. Agile coach Ron Jeffries has described a story card as "a placeholder for a conversation."

Story Tests: A series of tests expressed in plain English but eventually automated. Story tests provide examples for the programmers and testers, and describe the minimum amount of functionality for the story to be called done. The goal of story tests is not to convey a comprehensive specification, but to provide examples so that the whole team can build a shared mental model of what is expected.

Story Kickoffs: A kickoff is the final meeting held before developers start writing code. The idea that testers should be involved early in the process has been around for decades. The agile twist is to have testers help as part of the design for the story. During the kickoff, developers, product owners and testers can discuss and ask questions about what the software will do; this can lead to a better description and prevent

ST&Ppedia


Agile Testing in Practice

The encyclopedia for software testing professionals

Matt Heusser and Chris McMahon

Matt Heusser and Chris McMahon are career software developers, testers and bloggers. Matt also works at Socialtext, where he performs testing and quality assurance for the company's Web-based collaboration software.

"Good ideas can come from anywhere at any time."

continued on page 36 >


Test

Trac

kregPr

o T e

stTr

ackreg

TCM

T e

stTr

ackreg

Stud

io

Surr

ound

SCM

regSe

apin

e C M

regQ

A W

izar

dregPr

oIss

ue M

anag

emen

t Te

st C

ase

Man

agem

ent

Test

Pla

nnin

g amp

T rac

king

Co

nfi g

urat

ion

Man

agem

ent

Chan

ge M

anag

emen

t Au

tom

ated

Test

ing

Illus

trat

ion

by S

teve

Nym

an

necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING

Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units or modules within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent because, for the developer, it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written and not necessarily as intended. Further, there's a tendency to write more code than is necessary because the programmer is speculating about what conditions might exist rather than starting from concrete test cases that highlight the true requirements.
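To make the red-green rhythm concrete, here is a minimal sketch using Python's standard-library unittest (the same pattern applies in JUnit for Java). The `apply_discount` function and its rules are hypothetical, invented purely for illustration; in TDD the tests come first, and the function body is only as large as the failing tests demand.

```python
import unittest

# Hypothetical code under test. Written only after the tests below
# were seen to fail: just enough logic to make them pass, with no
# speculative branches for conditions no test demands.
def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_percent_is_identity(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_rejects_out_of_range_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(200.0, -5)

# Run the suite programmatically, as a continuous build might,
# without exiting the interpreter.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```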

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word interpretation is important there because, while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is JUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide

By Robert Walsh

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits

THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.


language elements for defining test cases, combining tests into suites and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
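The fake-database idea can be sketched with Python's standard-library unittest.mock. The `fetch_patron_name` function and its `query` interface are hypothetical, invented here to show how canned data and simulated failures work; they are not from the article.

```python
from unittest import mock

# Hypothetical code under test: looks up a patron record by id and
# degrades gracefully when the database is unreachable.
def fetch_patron_name(db, patron_id):
    try:
        row = db.query("SELECT name FROM patrons WHERE id = ?", patron_id)
    except ConnectionError:
        return None  # treat a lost connection as "not found" rather than crash
    return row["name"] if row else None

# A fake database serving canned data -- no real server required.
fake_db = mock.Mock()
fake_db.query.return_value = {"name": "Ada Lovelace"}
assert fetch_patron_name(fake_db, 7) == "Ada Lovelace"

# The same fake can simulate a failure that would be hard to trigger
# on demand against a production database.
fake_db.query.side_effect = ConnectionError("connection lost")
assert fetch_patron_name(fake_db, 7) is None
```

The test dictates exactly what the "database" returns, which is the control the article describes: interesting situations, including error conditions, become trivial to reproduce.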

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING

An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out of sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained, in addition to being executed. Both are

[With automated acceptance testing, the customer writes tests for the key features]

based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
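The table-plus-thin-fixture arrangement can be sketched in a few lines of Python. This is not FIT or FitNesse itself, only an illustration of the pattern they share; the `late_fee` rule and its numbers are hypothetical application logic invented for the example.

```python
# Hypothetical application logic under test: 25 cents per overdue
# day, capped at a five-dollar maximum.
def late_fee(days_overdue):
    return min(days_overdue * 0.25, 5.00)

# The table a customer might author: (days overdue, expected fee).
table = [
    (0, 0.00),
    (4, 1.00),
    (40, 5.00),  # fee is capped
]

def run_fixture(rows):
    """Thin fixture: translate each table row into a call on the
    application and compare the output against the expected value."""
    results = []
    for days, expected in rows:
        actual = late_fee(days)
        results.append((days, expected, actual, actual == expected))
    return results

for days, expected, actual, passed in run_fixture(table):
    print(f"days={days:3d} expected={expected:.2f} actual={actual:.2f} "
          f"{'pass' if passed else 'FAIL'}")
```

Note how little the fixture does: it carries values between the table and the application, exactly the thinness the article recommends, so maintaining it costs developers almost nothing.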

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING

Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanism by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations, there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile methodologies, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team, a cross-functional mixture of programmers, testers, business analysts and stakeholders, collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs. The business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test. All attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
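Folding a performance check into the regression suite can be as simple as asserting against a time budget established in an earlier iteration. In this Python sketch, the catalog search, the data size and the two-second budget are all illustrative assumptions, not measurements from the article.

```python
import time

# Hypothetical operation whose speed matters to the business.
def search_catalog(records, term):
    return [r for r in records if term in r]

# Regression-style performance check: fail the suite if the operation
# exceeds the budget agreed in a prior iteration. The data volume and
# threshold here are stand-ins chosen for illustration.
records = [f"record-{i}" for i in range(100_000)]

start = time.perf_counter()
hits = search_catalog(records, "record-9999")
elapsed = time.perf_counter() - start

# Functional expectation: the search still finds matches.
assert len(hits) >= 1
# Non-functional expectation: it still finds them within budget.
assert elapsed < 2.0, f"search regressed: {elapsed:.3f}s exceeds 2.0s budget"
```

Because the check is just another automated test, a performance regression breaks the build the same way a functional one does, which is how it earns its place in the suite.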

Because automated regression

[Automated regression tests should cover as much of the application as possible and should use mocks sparingly]


tests may serve a variety of purposes, there is no standard approach to implementing them. Often, a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING

There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former. For a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.

Manual exploratory testing gives the tester an opportunity to truly demonstrate his value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events or prolonged use of the program. Often, these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results.

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-part approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse front page. http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org: Resources for Test Driven Development. http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860

[Manual exploratory testing is more than wandering randomly through the application; the tester must start with a plan]

LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com

observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE

One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and prop-

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles, Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE

The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


erly-done automation works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING

In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE

Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING

If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value and working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance," all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out, for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes this takes effort and courage, to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE

First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, func-


[I expect testers to pair with developers to improve practices and developers to deliver solid unit tests before they can be considered 'done']

tional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK

There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES

I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect

JULY 2009 • www.stpcollaborative.com • 21



work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals, and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases, or use traditional requirement forms. Many even add a construct called acceptance tests on a finely grained story level that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.
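A story-level acceptance test of the kind described above might look like the following sketch; the story, discount codes and function names are invented for illustration:

```python
# Hypothetical story: "As a shopper, I can apply a percentage discount code."
# The acceptance criteria become executable checks the customer can review
# at the end of the Sprint. DISCOUNT_CODES and apply_discount() are
# illustrative assumptions, not anything described in the article.

DISCOUNT_CODES = {"SAVE10": 0.10, "SAVE25": 0.25}

def apply_discount(total, code):
    """Return the total after discount; unknown codes leave it unchanged."""
    return round(total * (1 - DISCOUNT_CODES.get(code, 0.0)), 2)

def test_valid_code_reduces_total():
    # Acceptance criterion 1: a known code applies its percentage.
    assert apply_discount(100.0, "SAVE10") == 90.0

def test_unknown_code_leaves_total_unchanged():
    # Acceptance criterion 2: an unrecognized code is ignored.
    assert apply_discount(100.0, "BOGUS") == 100.0
```

Each criterion maps to one small check, so "done" for the story is something the customer can see pass.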

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean that a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail, but also wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context based and powerful. They take a holistic view to testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity not only to test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.



LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

ciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. The software tester job isn't something that shows up on the U.S. Census. "We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise,

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple dis-

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

We're All Part of the Story



the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance

engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain," so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year, in terms of their skill-sets and




for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex, with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing a nice counterpoint to the commercial test tools that are out there. "Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."



Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product, too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about



rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap, single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



best Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications; exchange information securely across and beyond the enterprise; integrate existing applications in loose couplings that are both scalable and easily extensible; and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools
• SOAP (Simple Object Access Protocol), a message format used to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
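As a rough illustration of the SOAP and XML pieces listed above, the following sketch builds and parses a minimal SOAP 1.1 envelope using Python's standard library; the service namespace, operation and element names are hypothetical, not taken from any WS-I material:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/stockquote"  # hypothetical service namespace

def build_request(symbol):
    """Build a minimal SOAP 1.1 request envelope as an XML string."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetQuote")  # invented operation
    ET.SubElement(op, f"{{{SVC_NS}}}Symbol").text = symbol
    return ET.tostring(env, encoding="unicode")

def extract_symbol(xml_text):
    """Parse the envelope and pull the requested symbol back out."""
    root = ET.fromstring(xml_text)
    return root.find(f".//{{{SVC_NS}}}Symbol").text
```

In a real deployment the operation and its message parts would be dictated by the service's WSDL; the round trip here simply shows the envelope/body structure SOAP messages share.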

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.
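To make the idea concrete, here is a rough sketch of the shape such a test assertion entry might take. The element names, identifier and requirement reference below are invented for illustration, not copied from an actual WS-I artifact:

```xml
<!-- Hypothetical test assertion: structure is illustrative only -->
<testAssertion id="HYP-0002" type="messageInput" enabled="true">
  <context>Candidate message in the message log</context>
  <assertionDescription>
    The SOAP envelope uses the http://schemas.xmlsoap.org/soap/envelope/
    namespace required by the profile.
  </assertionDescription>
  <failureMessage>Incorrect SOAP envelope namespace.</failureMessage>
  <profileReference>Hypothetical profile requirement R9999</profileReference>
</testAssertion>
```

Each such assertion gives the tools a single, mechanically checkable interpretation of one profile requirement.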

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES

WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

FIG. 1: WS-I TOOLS IN ACTION. The Monitor (an Interceptor plus a Logger, driven by a configuration file) sits in the normal message flow between requestor and Web service, capturing SOAP messages into a message log. The Analyzer, driven by its own configuration file and the test assertion document, evaluates the log along with the WSDL, XML schema and UDDI artifacts, and produces a conformance report.
Source: Web Services Interoperability Organization

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
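The Analyzer's job can be pictured, in miniature, as evaluating each logged message against a list of such assertions and recording a pass/fail result for each. The following Java sketch is not WS-I code; the assertion IDs and rules are invented for illustration (the SOAP 1.1 envelope namespace itself is real):

```java
import java.util.ArrayList;
import java.util.List;

// A toy "analyzer" in the spirit of the WS-I tools: it evaluates a captured
// message against two invented, profile-like assertions and reports each
// result. Real WS-I test assertion documents are far more extensive.
public class MiniAnalyzer {

    public static List<String> analyze(String contentType, String body) {
        List<String> report = new ArrayList<>();

        // Invented assertion HYP-0001: the HTTP Content-Type must be text/xml.
        boolean ctOk = contentType != null && contentType.startsWith("text/xml");
        report.add("HYP-0001 " + (ctOk ? "PASS" : "FAIL"));

        // Invented assertion HYP-0002: the message must use the SOAP 1.1
        // envelope namespace.
        boolean nsOk = body.contains("http://schemas.xmlsoap.org/soap/envelope/");
        report.add("HYP-0002 " + (nsOk ? "PASS" : "FAIL"));

        return report;
    }

    public static void main(String[] args) {
        String body = "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                + "<soap:Body/></soap:Envelope>";
        // Prints one PASS/FAIL line per assertion, like a tiny conformance report.
        analyze("text/xml; charset=utf-8", body).forEach(System.out::println);
    }
}
```

As in the real tools, a failed line would not mean the service is broken, only that it strayed from the profiled subset.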

COLORING OUTSIDE THE LINES

By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can in fact warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED

A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture SOAP/XML messages over HTTP, together with message artifacts and metadata (client code, Web service, WSDL, XML schema), into a Test Log file; the Analyzer then evaluates the log and produces a Test Report.
Source: Web Services Interoperability Organization

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, or after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure in order to make CI straightforward and simple.

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation," rather than "automated testing," is more accurate. Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
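One pass of the cycle might look like the following sketch, written in plain Java with a hand-rolled assertion rather than a real xUnit framework. The discount rule and names are invented for illustration:

```java
// A single red-green-refactor pass, sketched without an xUnit framework.
public class RedGreenRefactor {

    // Step 2 (green): just enough production code to make the test pass.
    // Step 3 (refactor) might later rework this conditional; the test
    // below acts as the safety net for that rework.
    static double discountedTotal(double total) {
        final double DISCOUNT_RATE = 0.10;  // invented rule: $100+ orders get 10% off
        return total >= 100.0 ? total * (1 - DISCOUNT_RATE) : total;
    }

    // Step 1 (red): this test is written first, and fails until
    // discountedTotal is implemented.
    static void testLargeOrderGetsDiscount() {
        double result = discountedTotal(200.0);
        if (result != 180.0) {
            throw new AssertionError("expected 180.0 but was " + result);
        }
    }

    public static void main(String[] args) {
        testLargeOrderGetsDiscount();
        System.out.println("green: all tests pass");
    }
}
```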

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


&lt; continued from page 10



IN FEBRUARY OF 2001, 17 TECHNOLOGY professionals gathered for a peer conference in Park City, Utah, and crafted a document called the Agile Manifesto. That document is probably the single most influential statement on software development of the past decade, and its values led to a series of new ideas and concepts in software testing.

Agile Software Methods

XP: Short for Extreme Programming. XP was created by Kent Beck during his work at Chrysler. At the time, projects at the auto maker struggled with heavyweight specifications, specialized roles, and enough turnover that projects would simply burn out mid-stream, delivering no working software at all before being canceled. Extreme Programming focuses on delivering software by first developing simple working features, and then adding more features incrementally while improving the code constantly. Most XP techniques, such as Pair Programming, Test Driven Development and Refactoring, are now considered engineering practices.

Scrum: Popularized at (now defunct) Easel Corp. by Ken Schwaber and Jeff Sutherland. At Easel, requirements were changing so quickly that the developers could never get anything completely finished. Scrum focuses on stable requirements for each time box, commitments from team members, and communication through daily stand-up meetings. Most of the focus of Scrum is on project management.

DSDM: Short for Dynamic Systems Development Method. Developed in the United Kingdom, DSDM's advocates notice that many projects have fixed staff and a fixed release date. DSDM therefore focuses on techniques to allow the team to alter the project scope while maintaining high quality, thus hitting the original due date.

Agile Software Processes

Traditional software process generally breaks down work by phase, or by type of work: all of the requirements first, then all of the coding, then all of the testing, and so on. Agile processes instead break down work by feature: one thin slice at a time of the final product, which the customer can use to actually and concretely do something new.

Agile teams then batch the work up into time boxes, periodically bringing the code to production quality. Agile teams release production code as often as every few days to as rarely as every few months. Because of that, teams need greater discipline and skill to deal with an increased regression testing burden.

Iteration: Used in Extreme Programming, an iteration is a time box (typically one to three weeks) starting with a requirements exercise and ending with new features at production quality.

Sprint: A short time box used in Scrum that allows the technical staff to create code ready for production without the requirements changing as they work. Technical staff are expected to own the deliverable and release production features at the end of the time box.

User Story: A form of requirement that describes a single, specific, meaningful piece of functionality that has value to the business. User stories are small and can be implemented by a pair of programmers in one or a few days. User Stories line up roughly with a single scenario within a use case in traditional software development. A user story is not a specification; it is an example of what the software will allow a person to do.

Story Card: Stories are often expressed on 3x5 index cards. Agile Coach Ron Jeffries has described a story card as a placeholder for a conversation.

Story Tests: A series of tests expressed in plain English, but eventually automated. Story tests provide examples for the programmers and testers, and describe the minimum amount of functionality for the story to be called done. The goal of story tests is not to convey a comprehensive specification, but to provide examples so that the whole team can build a shared mental model of what is expected.

Story Kickoffs: A kickoff is the final meeting held before developers start writing code. The idea that testers should be involved early in the process has been around for decades. The agile twist is to have testers help as part of the design for the story. During the kickoff, developers, product owners and testers can discuss and ask questions about what the software will do; this can lead to a better description and prevent



Agile Testing in Practice

The encyclopedia for software testing professionals

Matt Heusser and Chris McMahon

Matt Heusser and Chris McMahon are career software developers, testers and bloggers. Matt also works at Socialtext, where he performs testing and quality assurance for the company's Web-based collaboration software.

"Good ideas can come from anywhere at any time."

continued on page 36 &gt;


THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

By Robert Walsh

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING

Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units, or modules, within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent, because for the developer it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary, because the programmer is speculating about what conditions might exist rather than starting from concrete test cases that highlight the true requirements.

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word "interpretation" is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed, and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is jUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide

language elements for defining test cases, combining tests into suites, and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment, and lets them dictate which data is returned and how, to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
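A fake database of this sort can be sketched as follows. The interface, class names and data are invented for illustration; in real code the fake would be substituted for the production implementation via constructor injection or a similar mechanism:

```java
import java.util.List;
import java.util.Map;

// A fake "database" standing in for the real data layer during unit tests.
public class FakeDatabaseExample {

    // Invented data-access interface the code under test depends on.
    interface PatronDatabase {
        List<String> titlesOnLoan(String patronId) throws Exception;
    }

    // The fake returns canned data and can simulate a connectivity failure,
    // something that is hard to trigger on demand with a real database.
    static class FakePatronDatabase implements PatronDatabase {
        private final Map<String, List<String>> canned;
        private final boolean failConnection;

        FakePatronDatabase(Map<String, List<String>> canned, boolean failConnection) {
            this.canned = canned;
            this.failConnection = failConnection;
        }

        @Override
        public List<String> titlesOnLoan(String patronId) throws Exception {
            if (failConnection) {
                throw new Exception("simulated connection failure");
            }
            return canned.getOrDefault(patronId, List.of());
        }
    }

    // Code under test: formats a loan summary from whatever database it is given.
    static String loanSummary(PatronDatabase db, String patronId) {
        try {
            return patronId + " has " + db.titlesOnLoan(patronId).size()
                    + " item(s) on loan";
        } catch (Exception e) {
            return "unable to reach loan records";
        }
    }

    public static void main(String[] args) {
        PatronDatabase ok = new FakePatronDatabase(
                Map.of("p1", List.of("Dune", "Emma")), false);
        PatronDatabase down = new FakePatronDatabase(Map.of(), true);
        System.out.println(loanSummary(ok, "p1"));
        System.out.println(loanSummary(down, "p1"));
    }
}
```

Because the fake controls both the canned data and the failure mode, the error-handling path can be exercised as easily as the happy path.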

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development, and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING

An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out-of-sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained, in addition to being executed. Both are based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While the tests themselves are defined using a combination of narrative text and tabular data, they're executed by "fixture" code that accepts the input from the tables, calls the appropriate methods in the application, and provides the output back to the test runner. These fixtures are generally written in the language of the application, and must be implemented and maintained by developers. However, the goal is that they be thin, and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
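The table-plus-fixture idea can be sketched in miniature. This is not FIT or FitNesse code; the application method, fixture and shipping rule below are invented to show the shape of the translation layer:

```java
// A toy FIT-style table runner: each row supplies inputs and an expected
// output; a thin "fixture" translates the row into a call on the
// application code and checks the result.
public class TableFixtureExample {

    // Application code under test (invented for illustration).
    static double shippingCost(double weightKg, boolean express) {
        double base = 2.0 + weightKg * 1.5;   // flat fee plus per-kilogram rate
        return express ? base * 2 : base;     // express shipping doubles the cost
    }

    // Thin fixture: maps one table row (strings, as a customer would write
    // them) onto the application call and compares against the expected value.
    static boolean runRow(String[] row) {
        double weight = Double.parseDouble(row[0]);
        boolean express = Boolean.parseBoolean(row[1]);
        double expected = Double.parseDouble(row[2]);
        return Math.abs(shippingCost(weight, express) - expected) < 0.001;
    }

    public static void main(String[] args) {
        // The "table" a customer might write: weight, express?, expected cost.
        String[][] table = {
            {"1.0", "false", "3.5"},
            {"1.0", "true", "7.0"},
            {"4.0", "false", "8.0"},
        };
        for (String[] row : table) {
            System.out.println(String.join(" | ", row) + " -> "
                    + (runRow(row) ? "pass" : "FAIL"));
        }
    }
}
```

Note how little logic lives in the fixture itself: it only parses and delegates, which is what keeps its maintenance cost low.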

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING

Responsible, thorough testing must verify not only the functional requirements, but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanism by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile iterations, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team, a cross-functional mixture of programmers, testers, business analysts and stakeholders, collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs: the business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression

tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing

regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.
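One way such scaffolding might look in practice is a minimal sketch like the following (the registry and the placeholder checks are invented for illustration): the framework is built first, and testers append checks iteration by iteration as features appear.

```python
# Illustrative sketch: a minimal regression-test registry built in an early
# iteration, so testers can add new checks incrementally as features appear,
# while prior iterations' expectations keep being re-verified.

REGRESSION_SUITE = []

def regression_test(func):
    """Decorator that registers a check with the suite."""
    REGRESSION_SUITE.append(func)
    return func

def run_suite():
    """Run every registered check; collect failures instead of stopping."""
    failures = []
    for test in REGRESSION_SUITE:
        try:
            test()
        except AssertionError as exc:
            failures.append((test.__name__, str(exc)))
    return len(REGRESSION_SUITE), failures

# Iteration 1: only login exists, so only login is covered.
@regression_test
def login_accepts_known_user():
    assert "alice".isalnum()  # placeholder for a real application call

# Iteration 2: a new feature landed; a new check is simply appended.
@regression_test
def report_totals_are_non_negative():
    assert sum([3, 4]) >= 0  # placeholder for a real application call

total, failed = run_suite()
```

The suite grows with the application, so each sprint's run re-confirms everything established in prior sprints at no extra authoring cost.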

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test. All attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
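A hedged sketch of folding a performance check into the regression suite, as suggested above; the budget and the function under test are invented here, and a real suite would exercise the application itself and tune the threshold empirically:

```python
# Illustrative sketch: a latency budget enforced as a regression check.
import time

PERFORMANCE_BUDGET_SECONDS = 0.5  # invented threshold for the sketch

def search_catalog(query):
    # Stand-in for an application call whose latency matters.
    time.sleep(0.01)
    return [item for item in ["apple", "apricot", "banana"] if query in item]

def perf_regression_search():
    start = time.perf_counter()
    results = search_catalog("ap")
    elapsed = time.perf_counter() - start
    # The functional result AND the timing are both part of the regression check.
    assert elapsed < PERFORMANCE_BUDGET_SECONDS, f"too slow: {elapsed:.3f}s"
    return results

hits = perf_regression_search()
```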

Because automated regression

[Automated regression tests should cover as much of the application as possible and should use mocks sparingly]


tests may serve a variety of purposes, there is no standard approach to implementing them. Often, a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING
There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.
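To make the point concrete, here is a hypothetical sketch of such an order-dependent defect. Scripted, requirements-derived tests rarely enumerate task orderings, but brute-forcing the permutations, which is cheap only for tiny task sets, exposes the crashing sequence:

```python
# Hypothetical sketch: tasks A, B and C each pass their own functional tests,
# yet one particular ordering crashes the program. We enumerate every
# permutation to expose the sequence-dependent defect.
from itertools import permutations

class App:
    def __init__(self):
        self.history = []

    def do(self, task):
        self.history.append(task)
        # The latent defect: nobody expected these tasks to interact.
        if self.history == ["A", "C", "B"]:
            raise RuntimeError("crash")

crashing_orders = []
for order in permutations("ABC"):
    app = App()
    try:
        for task in order:
            app.do(task)
    except RuntimeError:
        crashing_orders.append(order)
```

For realistic applications the ordering space explodes combinatorially, which is exactly why a thinking explorer beats exhaustive scripting here.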

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events or prolonged use of the program. Often, these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done

effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.
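A minimal sketch of such record-keeping (the recorder, charter and actions are invented for illustration): each exploratory action is logged as it happens, so any failure found mid-session comes with the steps needed to reproduce it.

```python
# Illustrative sketch: a tiny exploratory-session recorder. Wrapping each
# action gives the tester a replayable record, so a failure found while
# exploring can be reproduced and later turned into an automated regression test.

class SessionRecorder:
    def __init__(self, charter):
        self.charter = charter  # the plan: what area to explore, and the time box
        self.log = []

    def act(self, description, func, *args):
        """Perform one action and keep a record of it."""
        self.log.append((description, args))
        return func(*args)

    def replay_script(self):
        """Emit the numbered steps needed to reproduce the session."""
        return [f"{i + 1}. {desc} {args}" for i, (desc, args) in enumerate(self.log)]

session = SessionRecorder(charter="explore checkout, 60 minutes")
session.act("open cart", lambda: "cart")
session.act("apply coupon", lambda code: code.upper(), "save10")
steps = session.replay_script()
```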

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle and is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860

[Manual exploratory testing is more than wandering randomly through the application; the tester must start with a plan]


By Rex Black

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and prop-

18 bull Software Test amp Performance JULY 2009

The Agile Process Creates Testing Puzzles
Which Are Solved When Agile Is Performed Correctly

By Bob Galen

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,


erly-done automation, works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out, for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/Watin are

examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, func-


[I expect testers to pair with developers to improve practices, and developers to deliver solid unit tests before they can be considered 'done']

tional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.
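The risk-based selection Galen describes might be sketched as follows. The scores, test names and time budget are invented for the sketch; a real team would derive likelihood from what changed in the Sprint and impact from business value.

```python
# Hedged sketch: when a full regression pass cannot fit in a sprint, rank
# tests by (likelihood of breakage x impact) and run the top slice that
# fits the available time.

def select_tests(tests, budget):
    """Pick the highest-risk tests that fit the sprint's time budget."""
    ranked = sorted(tests, key=lambda t: t["likelihood"] * t["impact"], reverse=True)
    chosen, used = [], 0
    for t in ranked:
        if used + t["minutes"] <= budget:
            chosen.append(t["name"])
            used += t["minutes"]
    return chosen

catalog = [
    {"name": "checkout_flow", "likelihood": 5, "impact": 5, "minutes": 30},  # touched this sprint
    {"name": "login",         "likelihood": 1, "impact": 5, "minutes": 10},
    {"name": "report_styles", "likelihood": 2, "impact": 1, "minutes": 20},
]

plan = select_tests(catalog, budget=45)
```

With a 45-minute budget, the sketch runs the recently changed, high-impact checkout tests first and drops the low-risk report checks.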

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.
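Assuming, as a simplification, that the two activities catch defects independently (an assumption the cited figures do not establish), Black's numbers can be combined in a back-of-the-envelope way that illustrates why he recommends both:

```python
# Rough sketch: combined defect-removal effectiveness of stacked test stages,
# under an independence assumption. 0.30 and 0.85 come from the figures quoted
# in the article; the independence assumption is ours.

def combined_effectiveness(*stages):
    escape = 1.0
    for eff in stages:
        escape *= (1.0 - eff)  # fraction of defects that slip past each stage
    return 1.0 - escape

both = combined_effectiveness(0.30, 0.85)  # unit testing plus system testing
```

Under that simplification, roughly 89.5 percent of defects would be removed by the two stages together, versus 30 or 85 percent by either alone.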

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it or [they] deliver cursory attempts]


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirements document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just-Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail, they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework]



ciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. The software tester job isn't something that shows up on the U.S. Census. "We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current cropof test engineers seem to be on the rise

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple dis-

The software testing industry is at a historic crossroads Rich Internet applica-tions cloud computing the rise of virtualization and more capable mobiledevices have all made the job of producing reliable software more complexwhile at the same time enabling more sophistication for automated test tools

Wersquore All Part of the Story

We are a new profession most of our first programmers are

still alivendash Scott Barber

A skilledtester is theproduct of avariety of

experiencesndash Jon Bach

24 • Software Test & Performance

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. "Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but also made for some new challenges too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem."

Bartow agrees. "Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

David Strom is a freelance reporter and author living in St. Louis.

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."



Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases "agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like set-up and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees. "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language), which define the services for client tools
• SOAP (Simple Object Access Protocol), a message format to invoke a service

• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
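To make the SOAP bullet above concrete, here is a minimal SOAP 1.1 envelope assembled with Python's standard XML library. This is only an illustrative sketch: the getQuote operation and the example.com namespace are invented, not part of any real service.

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace (the "message format to invoke a service").
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# Build Envelope -> Body -> operation: the minimal shape of a SOAP request.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
operation = ET.SubElement(body, "{http://example.com/stocks}getQuote")  # hypothetical operation
ET.SubElement(operation, "symbol").text = "IBM"

# Serialize for sending; the receiving service parses the Body to find
# which operation to invoke and with what arguments.
message = ET.tostring(envelope, encoding="unicode")
parsed = ET.fromstring(message)
```

A WSDL document would describe this message's shape for client tools, so they can generate exactly this XML without hand-assembly.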

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
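The division of labor between the capture and analysis halves can be sketched in a few lines of Python. This is a toy illustration of the non-intrusive idea only, not WS-I's actual tooling: the message shape and the fake service are invented, and the point is simply that the message reaches the service unchanged while a copy lands in a log for a later, separate analysis step.

```python
# Toy sketch of a man-in-the-middle monitor (not the real WS-I Monitor).
message_log = []  # stands in for the Monitor's message log file

def intercept(message, forward):
    """Log a copy of a message, then pass it through unchanged."""
    message_log.append(message)   # the "Logger" half records traffic
    return forward(message)       # the "Interceptor" half stays non-intrusive

def fake_web_service(request):
    # Placeholder for the real endpoint being monitored.
    return {"status": 200, "echo": request}

# A later "Analyzer" step would replay message_log against profile assertions.
response = intercept("<Envelope>...</Envelope>", fake_web_service)
```

Because the interceptor never modifies the traffic, the service under test behaves exactly as it would without the monitor in place, which is what makes the approach "black box."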

[Figure 1: WS-I Tools in Action. The Monitor's Interceptor sits in the normal message flow between the requestor and the Web service, while its Logger records the SOAP messages to a message log, guided by a Monitor config file. The Analyzer reads the message log, the WSDL, the XML schema and the UDDI entries and, using the test assertion document and an Analyzer config file, produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can in fact warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards, documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


[Figure 2: A New Phase for Testing. The Monitor's Interceptor and Logger capture the SOAP/XML-over-HTTP messages exchanged between client code and the Web service into a Test Log file, along with message artifacts and metadata (WSDL definitions, XML schema). The Analyzer then evaluates the Test Log and produces a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure in order to make CI straightforward and simple.
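The checkout-build-test cycle reduces to a small sketch like the following. The shell commands passed in are placeholders for whatever a real project uses; this is an illustration of the cycle, not any particular CI product.

```python
import subprocess

def ci_cycle(checkout_cmd, build_cmd, test_cmd):
    """Run one checkout -> build -> test pass; True only if every step succeeds."""
    for cmd in (checkout_cmd, build_cmd, test_cmd):
        if subprocess.run(cmd, shell=True).returncode != 0:
            return False  # a failing step fails the whole build
    return True

# A CI server wraps this cycle in a loop, triggered by each check-in,
# by a timer, or immediately after the previous cycle finishes.
green = ci_cycle("true", "true", "true")    # placeholder commands that succeed
broken = ci_cycle("false", "true", "true")  # first step fails, so the build is red
```

The value is in the loop and the feedback: the team learns within minutes, not weeks, that a check-in broke the build or the tests.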

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
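A minimal xUnit-style illustration of the cycle, with an invented parse_version function standing in for real production code: the test is written first, run, and watched fail before the function exists.

```python
import unittest

# Step 2: the implementation, written only after the test below had failed.
def parse_version(tag):
    """Turn a release tag like 'v1.2' into a comparable tuple (1, 2)."""
    return tuple(int(part) for part in tag.lstrip("v").split("."))

class ParseVersionTest(unittest.TestCase):
    # Step 1: written first and watched fail ("red") before any code existed.
    def test_strips_leading_v(self):
        self.assertEqual(parse_version("v1.2"), (1, 2))

    def test_plain_tag(self):
        self.assertEqual(parse_version("10.0.3"), (10, 0, 3))
```

Running the suite (for example with `python -m unittest`) turns red into green once the implementation satisfies the expectations.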

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.
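In table terms, those customer-supplied examples might be captured and executed like the sketch below, with an invented discount rule standing in for a real business requirement; note it runs directly against the business rule, underneath any GUI.

```python
# ATDD sketch: customer-supplied examples drive the implementation.
# The discount rule and the numbers are invented for illustration.

def order_total(unit_price, quantity):
    """Business rule under test: orders of 10 or more get 10% off."""
    total = unit_price * quantity
    return total * 9 / 10 if quantity >= 10 else total

# Each row is an example from the requirements: (price, qty, expected total).
# These start out failing and pass once the rule above is implemented.
examples = [
    (2.0, 5, 10.0),   # small order, full price
    (2.0, 10, 18.0),  # threshold order, 10% discount
    (3.0, 20, 54.0),  # larger order, same rule
]

results = [order_total(price, qty) == expected for price, qty, expected in examples]
```

Because the examples come from the customer, a failing row is a conversation about the requirement, not just a bug report.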

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
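A compact sketch of the refactor leg, with a hypothetical function: both versions satisfy the same tests, so the cleaner rewrite can be swapped in with confidence:

```python
# Hypothetical example: the first version reached "green"; the second
# is the refactoring, verified against the same tests.
def total_price_v1(prices):
    # Works, but clumsy.
    total = 0.0
    for p in prices:
        total = total + p
    return total

def total_price(prices):
    # Refactored: same behavior, clearer code.
    return sum(prices)

def run_tests(fn):
    # The suite that guards the refactoring.
    assert fn([]) == 0.0
    assert fn([1.5, 2.5]) == 4.0
    return True

assert run_tests(total_price_v1)   # red -> green happened here
assert run_tests(total_price)      # the refactor kept the suite green
```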

BDD: Behavior Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
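A sketch of the expectation style, with Python's unittest standing in for a dedicated BDD framework; the shopping-cart behavior is hypothetical:

```python
import unittest

# Hypothetical object whose behavior is being described.
class Cart:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

# Expectations phrased in business language, so the suite reads much
# closer to plain English than a typical xUnit test name would.
class DescribeAnEmptyCart(unittest.TestCase):
    def setUp(self):
        self.cart = Cart()

    def test_it_contains_no_items(self):
        self.assertEqual(len(self.cart.items), 0)

    def test_it_holds_one_item_after_one_is_added(self):
        self.cart.add("book")
        self.assertEqual(self.cart.items, ["book"])

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(DescribeAnEmptyCart))
assert result.wasSuccessful()
```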

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term "Developer Test," as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


36 bull Software Test amp Performance JULY 2009


THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

By Robert Walsh

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING

Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units, or modules, within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent because, for the developer, it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary because the programmer is speculating about what conditions might exist, rather than starting from concrete test cases that highlight the true requirements.

Automated unit testing falls almost solely on the shoulders of the developers and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word "interpretation" is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed, and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is jUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide language elements for defining test cases, combining tests into suites, and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.
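Those elements can be sketched with Python's unittest, one of the ports mentioned above; the two test cases are hypothetical stand-ins:

```python
import unittest

# The xUnit elements the article lists: test cases, a suite that
# combines them, and assertions that decide pass or fail.
class ParsingTests(unittest.TestCase):
    def test_int_parsing(self):
        self.assertEqual(int("42"), 42)

class FormattingTests(unittest.TestCase):
    def test_zero_padding(self):
        self.assertEqual("%03d" % 7, "007")

# Combine the individual cases into one suite, as a continuous build
# system would before running them on every check-in.
suite = unittest.TestSuite()
suite.addTests(unittest.defaultTestLoader.loadTestsFromTestCase(ParsingTests))
suite.addTests(unittest.defaultTestLoader.loadTestsFromTestCase(FormattingTests))

result = unittest.TextTestRunner(verbosity=0).run(suite)
# A failing result here is the instantaneous signal that a recent
# check-in broke the build.
assert result.wasSuccessful()
```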

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how, to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
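A sketch of the fake-database idea with a hand-rolled fake (rather than any particular mocking library); the patron-lookup code and its error handling are hypothetical:

```python
# Stand-in for the real database layer: canned rows plus a scripted
# failure mode that would be hard to trigger against a real database.
class FakeDatabase:
    def __init__(self, rows, fail=False):
        self.rows = rows
        self.fail = fail

    def find_patron(self, patron_id):
        if self.fail:
            raise ConnectionError("simulated outage")
        return self.rows.get(patron_id)

# Hypothetical code under test: it must survive a database outage.
def patron_greeting(db, patron_id):
    try:
        patron = db.find_patron(patron_id)
    except ConnectionError:
        return "service temporarily unavailable"
    return f"hello, {patron}" if patron else "unknown patron"

# Canned data gives full control over the test environment...
db = FakeDatabase({7: "Ada"})
assert patron_greeting(db, 7) == "hello, Ada"
assert patron_greeting(db, 8) == "unknown patron"

# ...and failures are trivial to recreate on demand.
assert patron_greeting(FakeDatabase({}, fail=True), 7) == "service temporarily unavailable"
```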

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING

An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with the help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out-of-sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained, in addition to being executed. Both are


based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While the tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application, and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
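As a sketch of that division of labor (the discount rule, the table, and the fixture are all hypothetical, with Python standing in for the application's language):

```python
# Hypothetical application code under test.
def discounted_total(subtotal, member):
    return round(subtotal * (0.9 if member else 1.0), 2)

# A FIT-style table: a header row, then input columns and an
# expected-output column.
table = [
    ("subtotal", "member", "total?"),
    (100.00, True, 90.00),
    (100.00, False, 100.00),
    (19.99, True, 17.99),
]

def run_fixture(table):
    # The thin fixture: it only translates table rows into calls on
    # the application and hands results back to the runner.
    header, *rows = table
    failures = []
    for subtotal, member, expected in rows:
        actual = discounted_total(subtotal, member)
        if actual != expected:
            failures.append((subtotal, member, expected, actual))
    return failures

assert run_fixture(table) == []   # every row in the table passes
```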

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING

Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers.


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanism by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as "stories" in some agile methodologies, often begin life as a single phrase or short sentence describing, at a very high level, what the application must do. As part of the iteration or release planning process, the project team (a cross-functional mixture of programmers, testers, business analysts and stakeholders) collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs: the business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can nor should be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

Automated regression tests should attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test: all attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
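A minimal sketch of folding a performance check into the regression suite; the operation and its one-second budget are hypothetical, and a real budget would come from the application's non-functional requirements:

```python
import time

# Hypothetical operation whose speed matters to the business.
def build_index(n):
    return {i: str(i) for i in range(n)}

def test_index_builds_within_budget(budget_seconds=1.0):
    # Time the operation and fail the suite if it regresses past the
    # (deliberately generous) budget.
    start = time.perf_counter()
    build_index(50_000)
    elapsed = time.perf_counter() - start
    assert elapsed < budget_seconds, f"performance regression: {elapsed:.3f}s"
    return elapsed

# Run as part of the regression suite.
test_index_builds_within_budget()
```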



Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often, a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING

There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often, these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken; this is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.
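The record-then-automate step can be sketched as follows; the action log, the toy application, and its scripted defect are all hypothetical:

```python
# Toy application with a planted defect of the kind exploration
# finds: an unanticipated interaction among tasks A, B and C.
class App:
    def __init__(self):
        self.state = set()

    def do(self, task):
        self.state.add(task)
        # The defect the exploratory session uncovered: performing B
        # once A and C have both been done crashes the program.
        if self.state == {"A", "B", "C"} and task == "B":
            raise RuntimeError("crash")

# The tester's written record of the failing session.
recorded_session = ["A", "C", "B"]

def session_completes(actions):
    # Replay the recorded actions as an automated regression check.
    app = App()
    try:
        for task in actions:
            app.do(task)
    except RuntimeError:
        return False   # the defect is still present
    return True

# While the bug exists, the new automated test exposes it; once the
# defect is corrected, the same test guards against its return.
assert session_completes(recorded_session) is False
assert session_completes(["A", "B"]) is True
```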

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results.

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING

• Cunningham, Ward. "Framework for Integrated Test," October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs," Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs," January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design," BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860



THE GREAT AGILE DEBATE: THE AGILE PROCESS CREATES TESTING PUZZLES... WHICH ARE SOLVED WHEN AGILE IS PERFORMED CORRECTLY

By Rex Black

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE

One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and properly-done automation, works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING

In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative lifecycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE

Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

By Bob Galen

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE

The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING

If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software to gain customer and product quality feedback, and to ultimately deliver high-value and working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance," all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint. There should be no work that is left out, for example, appropriate test automation development or maintenance of test "systems." It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE

First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage, and as being a safety net for the teams’ refactoring efforts. We want developers to be responsible for contributing to test automation—not just testers. And if a team is using TDD as a practice, we’ve learned that developing tests can be a powerful aid as one designs software in small, incremental steps.
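The safety net idea can be sketched in a few lines. The example below is hypothetical, using Python's built-in unittest module as a stand-in for JUnit; the ShoppingCart class is invented for illustration and written test-first, so the suite protects later refactoring:

```python
import unittest

# Hypothetical xUnit-style example (Python's unittest standing in for
# JUnit). The class under test is written test-first; the suite below
# is the "safety net" that lets the team redesign the implementation
# between sprints while proving behavior is unchanged.

class ShoppingCart:
    def __init__(self):
        self._items = []

    def add(self, name, price):
        self._items.append((name, price))

    def total(self):
        return sum(price for _, price in self._items)

class ShoppingCartTest(unittest.TestCase):
    def test_empty_cart_totals_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add("book", 10.0)
        cart.add("pen", 2.5)
        self.assertEqual(cart.total(), 12.5)

# Run the suite programmatically, as an automated build would:
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ShoppingCartTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # -> True
```

Because the suite is automated, it can run on every check-in, which is what makes refactoring affordable inside a short iteration.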

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering “done” software increments, they must decide on the set of tests (unit, functional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

20 • Software Test & Performance JULY 2009

To Rex’s last point, that programmers don’t commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered “done.”

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it’s true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what’s been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering “done” software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.
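The risk-based selection described here can be sketched simply. This is a hypothetical illustration, not a method from the article: each regression test carries an invented impact-times-likelihood score, and only tests touching this Sprint's changed modules that clear a threshold are run:

```python
# Hypothetical sketch of risk-based regression selection. Test names,
# modules and the scoring scheme (impact x likelihood) are invented
# for illustration only.

def select_regression_tests(tests, changed_modules, threshold=6):
    """Return tests that touch changed modules and clear the risk
    threshold, ordered highest-risk first."""
    selected = [
        t for t in tests
        if t["module"] in changed_modules
        and t["impact"] * t["likelihood"] >= threshold
    ]
    return sorted(selected,
                  key=lambda t: t["impact"] * t["likelihood"],
                  reverse=True)

tests = [
    {"name": "checkout_total", "module": "billing", "impact": 3, "likelihood": 3},
    {"name": "login_flow", "module": "auth", "impact": 3, "likelihood": 2},
    {"name": "profile_edit", "module": "profile", "impact": 1, "likelihood": 2},
]

for t in select_regression_tests(tests, {"billing", "profile"}):
    print(t["name"])  # -> checkout_total
```

The point is not the scoring formula, which a real team would calibrate, but that the selection is driven by what changed in the Sprint rather than by rerunning everything.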

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it’s helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk—to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit

tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense

Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer’s job, I have found that many programmers just don’t do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as “bug finder of last resort,” as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback—and to collaborate across disciplines. From this, they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals, and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well—ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I’m lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will “test” as part of their acceptance at the end of a Sprint or Iteration.

I’d guess that Rex has seen poor or missing requirements in his agile experience, and I’d say those were not mature or solidly performing agile teams. “Good” teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement—at the point of attack.

To capture a key point here: the agile methods try to adopt a just-in-time and just-enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value “working software over comprehensive documentation” can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change. Second, agile espouses that “the most efficient and effective method of conveying information to and within a development team is face-to-face conversation.”

From what I’ve seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers

false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean that a test team following a requirements-based testing strategy not only can’t say what it means for a particular test to pass or fail, but also wouldn’t have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can’t report test results accurately if the requirements are missing or poor. Testers can’t report the percentage of the test basis covered by passed tests, because the requirements won’t provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A “face-to-face conversation” is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, “I’m surprised at the name agile—it should be called couch potato. There are too many meetings. There’s too much jawboning. I find it really ironic that there are all these books explaining how simple it is.”

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, “We’re going to continue to have these meetings until I find out why nothing is getting done around here.” The senior manager was, I am told, the only one in the room who didn’t get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of “no requirements” does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it’s the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, “I didn’t ask for that,” even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn’t agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today’s software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool

to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality—reinforcing the two as a whole. Testers in this model have a powerful opportunity not only to test, but to champion and guide the overall team’s focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project’s unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I’m looking forward to encouraging Rex to try and gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative’s Strategic Advisory Board. –Ed.


LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach “Agile and High-Speed Software Testing Techniques,” an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about “Test Management: A Risk-Based Approach,” the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We’re All Part of the Story

David Strom is a freelance reporter and author living in St. Louis.

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. The software tester job isn’t something that shows up on the U.S. Census. “We are still very much a new profession,” says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. “Most of our very first programmers are still alive; we aren’t thousands of years old like other disciplines.”

Perhaps the best rundown of the industry’s evolution is summed up by Michael Bolton, who lives in Toronto. “Testing has historically been confirmation, validation and verification,” he says. “I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you’ve learned.” Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. “I like to encourage people to recognize and manage uncertainty and ambiguity and, instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work.”

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn’t about how to improve code quality. “That is the role of the engineering people who do the coding work, and of management,” says Bolton. “Testing is about providing information to those people so that better-informed decisions about quality are possible.”

The future careers for the current crop of test engineers seem to be on the rise, particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

“At Intuit, QA engineers are held to the exact same standards and expectations as product developers,” says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. “In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and the business outcomes they’ve delivered,” he says. He goes on to describe the best mix in his development teams: “A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes.”

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who’s based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. “Any team can’t survive without high competence in each domain.”

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has made for some new challenges too. “Cloud computing adds layers of complexity to testing, but it also makes things more interesting,” says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that “too many people take latency for granted still,” and many application developers “still take the entire network infrastructure for granted and don’t consider it part of their domain, so programmers just assume it is working unless their code has a problem.”

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. “Every value that you enter


for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls such as sliders need to respond quickly to give them a desktop feel.”

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. “Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier,” says Kent Beck, creator of Extreme Programming. “For example, parts of Amazon’s home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?”

According to Bartow, “Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com’s CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency.”

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. “We have certainly seen our company’s opportunities related to Web-based training just explode over the last year,” says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren’t without controversy within the software test community. “It is only one form of credential,” says Jon Bach, a test manager in Seattle. “A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren’t certified and were fantastic testers, because they had a history of being great learners, not great test-takers.” Bach goes on to say, by way of comparison: “Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice.”

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. “Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job.”

Organizations from nearly 50 countries meet four

times a year to build consensus on software testing terms on an international level. “Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon and the exam questions must come from this material.”

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. “If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It’s how we all learned to drive, and testing is a lot harder than driving.”

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. “Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all.” For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing “a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger.”


Black agrees. “You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools.”

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that “we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences.” His biggest surprise with implementing Rich Internet Applications is “that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications.”

AGILE ASCENSION
Then there’s the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. “Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We’re working with a number of clients to help them achieve a good balance here, between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods.” Visovich is trying to combine his QA and development teams for more agile programming, but has had to do “a lot more coaching and training, because developers initially found it disconcerting that they didn’t have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly.”

As another benefit, Bolton notes, in some cases “agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that’s great. It helps when many people are paying attention and using many sensory modes to evaluate a product too.” (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the

most successful testers? As Visovich alluded to, the best testers don’t always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. “The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision,” says Barber of PerfTestPlus. “By doing this, the testers can learn which functional parts of the application are critical and support the business priorities.”

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. “QA engineers usually span multiple functional areas and are a cornerstone for our business,” says Intuit’s Bartow. “This

gives them greater breadth and depth, making them more valuable to the organization than some product engineers.”

“Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren’t teaching them how to work on cross-functional teams or how to test their code,” says Barber. “Most CS programs still don’t [require] a single mandatory testing course to graduate, and many don’t have any courses on networking, so most of these students graduate and all they know is how to write code in several languages.”

DevelopSense’s Bolton agrees. “If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it’s awesome. We all are our own best semantic search tools. I am not sure that we’re teaching enough about


rapid learning, critical thinking, human factors and user experience as we train our future programmers.”

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. “Virtualization makes certain things easier and cheaper, like set-up and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine,” says Bolton. Barber agrees: “The one place where virtualization can help us is in managing our test environment, because it’s relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment.”

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn’t everything, and should just be one of many parts of a good testing strategy. “We can’t trust our tools to make our decisions for us,” says Bolton.

“I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers,” said Bartow. “With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills.”

“The tedious parts of the tester’s job are definitely going away, which changes the structure of the profession,” says Beck, who with Erich Gamma created the JUnit

unit testing framework, and has authored numerous books on computer programming and test-driven development. “But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts.”

“We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments,” says Black. “We’re still struggling as an industry with effective and maintainable automated testing at the graphical user interface.” He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that “we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities.”

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. “When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind,” says Saff. But the question remains whether software testers will ever achieve parity—in income and stature—with their counterparts over the wall.


CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They’ll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.


at best, Web services testing is complex. And it’s vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools

• SOAP (Simple Object Access Protocol), a message format to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
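To make the nuts and bolts concrete, here is a minimal sketch, using only Python's standard library, of the kind of SOAP 1.1 envelope a client sends to invoke a service. The stock-quote service, namespace and operation names are hypothetical; a real endpoint's WSDL would dictate the actual names and types:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params, service_ns="http://example.com/stock"):
    """Build a minimal SOAP 1.1 envelope invoking `operation` with `params`."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
    op = ET.SubElement(body, "{%s}%s" % (service_ns, operation))
    for name, value in params.items():
        child = ET.SubElement(op, name)
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical operation: ask a stock service for a quote
request = build_soap_request("GetQuote", {"Symbol": "IBM"})
```

In practice, client tools generate this envelope automatically from the WSDL, which describes the operation and the types of its parameters.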

The complexities of developing and deploying systems potentially based on varying independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as loose coupling, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
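The intercept-and-analyze pattern can be sketched in miniature. The code below is not the WS-I tools' actual implementation, just a toy Python illustration of the idea: an interceptor wraps the normal message flow, a logger records each exchange, and a separate analysis pass checks the log against test assertions (the echo service and the single assertion shown are hypothetical):

```python
import time

class MessageLog:
    """Stores intercepted messages for later analysis (the Logger)."""
    def __init__(self):
        self.entries = []

    def record(self, direction, payload):
        self.entries.append({"time": time.time(),
                             "direction": direction,
                             "payload": payload})

def make_interceptor(service, log):
    """Wrap a callable service so every exchange is captured without
    disturbing the normal request/response flow (the Interceptor)."""
    def intercepted(request):
        log.record("request", request)
        response = service(request)
        log.record("response", response)
        return response
    return intercepted

def analyze(log, assertions):
    """Toy stand-in for the Analyzer: run each named test assertion
    against every logged message and collect the failures."""
    return [(name, entry["direction"])
            for name, check in assertions.items()
            for entry in log.entries
            if not check(entry["payload"])]

# Hypothetical service, standing in for a real SOAP endpoint
echo_service = lambda req: req.replace("Request", "Response")
log = MessageLog()
proxied = make_interceptor(echo_service, log)
proxied("<Envelope><Body>Request</Body></Envelope>")
failures = analyze(log, {"has-soap-body": lambda msg: "<Body>" in msg})
```

The key property, as with the real Monitor, is that capture is non-intrusive: the requestor and the service exchange exactly the messages they would have exchanged anyway.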

[FIG. 1: WS-I TOOLS IN ACTION. In the normal message flow between requestor and Web service, the Monitor's Interceptor and Logger capture SOAP messages to a message log; the Analyzer, driven by the test assertion document and its own configuration file, evaluates the logged messages along with the WSDL, XML schema and UDDI artifacts and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories, and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when defined extensibility points that might affect interoperability have been utilized.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems; the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards; documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture SOAP/XML message artifacts and metadata from the HTTP exchange between client code and Web service into a Test Log file, along with the WSDL and XML schema; the Analyzer then evaluates the log and produces a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
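Stripped to its essentials, one pass of that loop is just a sequence of commands run in order, stopping at the first failure. Here is a minimal sketch in Python; the demo pipeline steps are hypothetical stand-ins for your own checkout, build and test commands:

```python
import subprocess
import sys

def run_step(cmd):
    """Run one pipeline step; True means the step succeeded (exit code 0)."""
    return subprocess.run(cmd).returncode == 0

def ci_cycle(steps):
    """Run the pipeline steps in order (checkout, build, test...),
    stopping at the first failure -- one pass of a CI loop."""
    for name, cmd in steps:
        if not run_step(cmd):
            return "FAILED: %s" % name
    return "PASSED"

# Hypothetical pipeline; real steps would be e.g. ["svn", "update"], ["make", "test"]
demo_pipeline = [
    ("checkout", [sys.executable, "-c", "pass"]),
    ("test", [sys.executable, "-c", "pass"]),
]
status = ci_cycle(demo_pipeline)
```

CI servers such as CruiseControl or Hudson wrap this same cycle with check-in triggers, scheduling and reporting.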

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques, instead of one or two exclusively.

Exploratory Testing: Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test-Driven Development): A relative of Extreme Programming, in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
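A minimal illustration of that cycle in Python, using the xUnit-style unittest module; the shipping-cost rule is invented for the example. The test class is written first, fails while `shipping_cost` doesn't exist, and passes once just enough code is written:

```python
import unittest

# Step 1: write the expectation first. Run it, watch it fail ("red").
class TestShippingCost(unittest.TestCase):
    def test_free_shipping_at_or_over_threshold(self):
        self.assertEqual(shipping_cost(order_total=60.0), 0.0)

    def test_flat_rate_below_threshold(self):
        self.assertEqual(shipping_cost(order_total=20.0), 4.95)

# Step 2: write just enough code to make the failing tests pass ("green").
def shipping_cost(order_total, threshold=50.0, flat_rate=4.95):
    """Hypothetical business rule: orders at or over the threshold ship free."""
    return 0.0 if order_total >= threshold else flat_rate
```

The third step, refactoring, is described under Red-Green-Refactor below: with the tests green, the code's design can be improved safely.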

ATDD (Acceptance Test-Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior-Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations," and the test steps are described in real business terms, an extension of Domain-Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life-cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere, at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.
– Thanks to Markus Gaertner for his contributions to this article.

stamppedia

Index to Advertisers

Advertiser | URL | Page
Hewlett-Packard | www.hp.com/go/alm | 38
Seapine | www.seapine.com/stpswift | 11
Software Test & Performance Collaborative | www.stpcollaborative.com | 3-5
Software Test & Performance Collaborative Vendor Membership | www.stpcollaborative.com | 35
Software Test & Performance eSeminars | www.stpcollaborative.com | 22
STPCon Fall 2009 | www.stpcon.com | 29, 34
TechExcel | www.techexcel.com | 37

(continued from page 10)



Illustration by Steve Nyman

necessary for the others to succeed. They also provide a system of checks and balances that allows the strengths of one to compensate for the weaknesses of the others. Each pillar is the responsibility of a different group within the agile development team, and when each group does its part, my experience has shown that the result is usually a successful project.

AUTOMATED UNIT TESTING
Unit testing is perhaps the easiest type of testing to automate. Automated unit testing is a practice through which nearly every line and branch of code is covered with an automated test that proves the code works as intended. Automated unit tests focus on the discrete units, or modules, within the application, each performing its specific tasks in isolation from the rest. Automated unit tests may be written before or after the code under test is written, but it's most effective to write them beforehand (a process known as Test-Driven Development, or TDD).

With TDD, the code tends to more closely represent the developer's intent, because for the developer, it's easier to write just enough code to satisfy the single new failing test. Less code written means less code to maintain, and fewer places where bugs can hide. When the unit tests are written after the code, the tests can sometimes show only that the code works as written, and not necessarily as intended. Further, there's a tendency to write more code than is necessary, because the programmer is speculating about what conditions might exist, rather than starting from concrete test cases that highlight the true requirements.

Automated unit testing falls almost solely on the shoulders of the developers, and helps to ensure that the code they are writing behaves according to their interpretation of the requirements. The word interpretation is important there, because while a module might achieve 100 percent code coverage and pass all of its unit tests, it might still fail to meet the needs of the application. If the programmer misunderstands the requirement, the tests he writes will be flawed, and the resulting code will misbehave. That's among the reasons that automated unit testing is not the only form of automated testing employed on an agile project.

Automated unit tests are generally written in the same language as the application being tested. A family of testing tools exists to facilitate automated unit testing. Among the most well-known is jUnit, created for Java by Kent Beck and Erich Gamma, with ports now available for most languages. These tools provide

By Robert Walsh

Successful agile testing relies on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing and manual exploratory testing. As with columns that hold up a building, no single pillar can support the entire load. The four are interdependent, and each provides benefits

THE BUILDING BLOCKS OF AGILE: AUTOMATED UNIT, ACCEPTANCE AND REGRESSION TESTING PAVE THE WAY TO MANUAL EXPLORATORY TESTING

Robert Walsh is president and manager of application development at EnvisionWare, which provides software solutions for public and academic libraries.


language elements for defining test cases, combining tests into suites, and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests, to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system, and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment, and lets them dictate which data is returned and how, to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
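Here is a sketch of that technique using Python's unittest and unittest.mock; the library-loans code under test is invented for the example. The fake database returns canned data in one test, and simulates a connectivity failure in the other:

```python
import unittest
from unittest import mock

def overdue_titles(db):
    """Hypothetical application code under test: asks the database for overdue loans."""
    return [row["title"] for row in db.query("SELECT * FROM loans WHERE due < now")]

class TestOverdueTitles(unittest.TestCase):
    def test_returns_titles_from_canned_data(self):
        # The fake database hands back canned rows, like production data
        fake_db = mock.Mock()
        fake_db.query.return_value = [{"title": "Moby Dick"}, {"title": "Dune"}]
        self.assertEqual(overdue_titles(fake_db), ["Moby Dick", "Dune"])

    def test_connectivity_failure_propagates(self):
        # Simulate a failure that would be hard to trigger on a real database
        fake_db = mock.Mock()
        fake_db.query.side_effect = ConnectionError("db down")
        with self.assertRaises(ConnectionError):
            overdue_titles(fake_db)
```

Because the fake is fully scripted, the "database down" case runs in milliseconds and needs no real server.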

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near-100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development, and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING
An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with the help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out of sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests provide the primary measure of progress as the product evolves over the course of an iteration, or even across an entire release plan. The acceptance tests are written early in (or even before) the iteration, and running them clearly illustrates how many of the stories have been completed. As part of an iteration review meeting, the tests are run again to demonstrate that the scope selected at the start of the iteration was implemented according to the customer's own definition of what is correct. This process is mainly a formality; it should be no surprise to anyone whether the tests will pass or fail when run during the iteration review meeting.

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool, and provides a mechanism by which tests may be created and maintained, in addition to being executed. Both are

[With automated acceptance testing, the customer writes tests for the key features]

based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application, and provides the output back to the test runner. These fixtures are generally written in the language of the application, and must be implemented and maintained by developers. However, the goal is that they be thin, and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
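The shape of such a fixture can be sketched in a few lines of Python. This is not FIT's actual API, just an illustration of the pattern: each table row supplies inputs plus an expected output, and a thin fixture maps the columns onto a call into the application (the discount rule is invented for the example):

```python
def discount(total, member):
    """Hypothetical application code under test: members get 10 percent off."""
    return round(total * (0.10 if member else 0.0), 2)

def run_table(fixture, table):
    """Thin, FIT-style fixture runner: for each row, feed the inputs to the
    application call and compare its output to the row's expected value."""
    results = []
    for row in table:
        inputs = dict(row)                    # copy so the table isn't modified
        expected = inputs.pop("expected")
        actual = fixture(**inputs)
        results.append("pass" if actual == expected
                       else "fail (got %r)" % actual)
    return results

# A decision table, as a customer might write it in a wiki page
table = [
    {"total": 100.0, "member": True,  "expected": 10.0},
    {"total": 100.0, "member": False, "expected": 0.0},
]
outcomes = run_table(discount, table)
```

The fixture stays thin: all it does is translate table columns into a method call and the result back into a pass/fail cell, which is why maintaining it costs developers little.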

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING
Responsible, thorough testing must verify not only the functional requirements, but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanisms by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations, there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile methodologies, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team, a cross-functional mixture of programmers, testers, business analysts and stakeholders, collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs. The business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably, and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

Automated regression tests should attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.
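The trade-off above can be sketched in a few lines. This is an illustrative example only (the `TaxCalculator` and `checkout_total` names are invented), showing why a test against the real collaborator tracks production behavior while a mocked test keeps passing even if the real rule changes:

```python
from unittest.mock import Mock

# Hypothetical collaborator for illustration: the real business rule (8% tax).
class TaxCalculator:
    def tax_for(self, amount):
        return round(amount * 0.08, 2)

def checkout_total(amount, calculator):
    """Amount owed: the purchase amount plus whatever tax the calculator reports."""
    return amount + calculator.tax_for(amount)

def test_total_with_real_collaborator():
    # Preferred: exercise the real component, so the test mirrors production.
    assert checkout_total(100.0, TaxCalculator()) == 108.0

def test_total_with_mock():
    # The mock pins the collaborator's answer. This test stays green even if
    # the real tax rule changes, which is why mocks belong at the edges (slow
    # or unavailable dependencies), not throughout the suite.
    fake_calculator = Mock()
    fake_calculator.tax_for.return_value = 8.0
    assert checkout_total(100.0, fake_calculator) == 108.0
    fake_calculator.tax_for.assert_called_once_with(100.0)

test_total_with_real_collaborator()
test_total_with_mock()
```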

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality, rather than constantly reconfirming that existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test: all attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.

16 • Software Test & Performance JULY 2009

Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often, a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING

There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.
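A toy version of this scenario makes the gap concrete. The `App` class below is invented purely for illustration: each task passes its own automated check, yet only an unanticipated ordering trips the hidden defect.

```python
# Illustrative toy: tasks A, B and C each work in isolation, but an
# unanticipated interaction lurks in one specific ordering.
class App:
    def __init__(self):
        self.history = []

    def perform(self, task):
        if task not in ("A", "B", "C"):
            raise ValueError("unknown task: %s" % task)
        self.history.append(task)
        # Hidden defect: nobody expected A, C and B to be related.
        if self.history[-3:] == ["A", "C", "B"]:
            raise RuntimeError("crash: unexpected task interaction")
        return "%s done" % task

# The automated checks cover each expected input/output pair, and all pass.
app = App()
for task in ("A", "B", "C"):
    assert app.perform(task) == "%s done" % task

# Only an exploratory "what if" stumbles onto the failing order.
app = App()
app.perform("A")
app.perform("C")
try:
    app.perform("B")
    found_crash = False
except RuntimeError:
    found_crash = True
assert found_crash
```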

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often, these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken; this is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

• Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

• Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

• Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

• Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING

• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse Front Page. http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• JUnit.org: Resources for Test Driven Development. http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860


LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com

The Agile Process Creates Testing Puzzles
...Which Are Solved When Agile Is Performed Correctly

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.

VOLUME AND SPEED OF CHANGE

One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and properly done automation, works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING

In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE

Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

By Bob Galen

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.

VOLUME AND SPEED OF CHANGE

The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time where things are well understood, without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING

If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software to gain customer and product quality feedback, and to ultimately deliver high-value and working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint. There should be no work that is left out (for example, appropriate test automation development or maintenance of test "systems").

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE

First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, the team must decide on the set of tests (unit, functional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK

There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES

I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring: the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK

Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this, they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS

I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.
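A story-level acceptance test of the kind described here can be written as executable Given/When/Then steps. The story, the `ShoppingCart` class and its rules below are all invented for this sketch; the point is only the shape such a test takes:

```python
# Invented application code for the sketch.
class ShoppingCart:
    def __init__(self):
        self.items = {}

    def add(self, sku, qty=1):
        self.items[sku] = self.items.get(sku, 0) + qty

    def remove(self, sku):
        self.items.pop(sku, None)

    def count(self):
        return sum(self.items.values())

# Story: "As a shopper, I can remove an item so my cart reflects what I'll buy."
def acceptance_remove_item():
    # Given a cart holding two different items
    cart = ShoppingCart()
    cart.add("SKU-1", 2)
    cart.add("SKU-2")
    # When the shopper removes one item
    cart.remove("SKU-2")
    # Then only the remaining item counts toward the total
    assert cart.count() == 2
    # And removing something not in the cart is not an error (the kind of
    # "corner" case a tester adds to the story)
    cart.remove("SKU-404")
    assert cart.count() == 2

acceptance_remove_item()
```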

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just-Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES

Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change. Second, agile espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS

Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS

A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE

In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS

The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE

Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view to testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

ciplines and draw on a variety of skills to do their jobs And whilethere are many unsung testing heroes inside software develop-ment shops there are the beginnings in many organizations of agrowing recognition that test and performance engineers shouldbe considered peers of their development counterparts

We asked several testing gurus to share their thoughts onthe state of the testing industry and give us some insights on wherethey see things evolving One thing for anyone to realize is thatwhile testing as an industry is maturing it is still relatively new Thesoftware tester job isnt something that shows up on the US CensusWe are still very much a new profession says Scott Barber thefounder and chief technologist of PerfTestPlus in Palm Bay Fla Mostof our very first programmers are still alive we arent thousands ofyears old like other disciplinesrdquo

Perhaps the best rundown of the industryrsquos evolution is summed up by Michael Boltonwho lives in Toronto Testing has historically been confirmation validation and verificationhe says I believe that testing must become far more about exploration discovery investi-gation learning and communicating what youve learnedrdquo Bolton runs test consultancyDevelopSense and teaches rapid software testing methods ldquoI like to encourage people torecognize and manage uncertainty and ambiguity and instead of trying to eliminate them get-ting them down to levels that are suffi-ciently low for people to do use-ful workrdquo

While any gathering ofsoftware testers will find dis-agreement on overall pur-pose one thing is cleartesting isnt about how toimprove code qualityndashthat isthe role of the engineeringpeople who do the coding workand of management says BoltonTesting is about providing informationto those people so that better-informed decisionsabout quality are possible

The future careers for the current cropof test engineers seem to be on the rise

All this activity has made software testing a very big tent The industry has matured and grownto encompass numerous approaches to methods training and tools Because of this diver-sity there is a growing consensus that the best testers are those that can span multiple dis-

The software testing industry is at a historic crossroads Rich Internet applica-tions cloud computing the rise of virtualization and more capable mobiledevices have all made the job of producing reliable software more complexwhile at the same time enabling more sophistication for automated test tools

Wersquore All Part of the Story

We are a new profession most of our first programmers are

still alivendash Scott Barber

A skilledtester is theproduct of avariety of

experiencesndash Jon Bach

24 bull Software Test amp Performance

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

David Strom is a freelance reporter and author living in St. Louis.

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has made for some new challenges, too. Cloud computing adds layers of complexity to testing, "but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people still take latency for granted, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter


for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison, "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools and can be quite nimble, and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases "agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about


rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."
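One concrete shape those new testing challenges take: code that behaves correctly on a single thread can silently lose updates when many CPUs run it at once, so tests have to exercise the concurrent path deliberately. The sketch below (the `Counter` class and thread counts are invented for illustration, using Python's standard `threading` module) shows the shape such a test can take; the assertion holds only because the increment is lock-protected.

```python
import threading

class Counter:
    """A shared counter guarded by a lock; without the lock, increments can be lost."""
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:
            self.value += 1

def test_counter_under_concurrency(workers=8, bumps=10_000):
    counter = Counter()
    threads = [
        threading.Thread(target=lambda: [counter.increment() for _ in range(bumps)])
        for _ in range(workers)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Passes only because increment() is properly synchronized.
    assert counter.value == workers * bumps
    return counter.value

print(test_counter_under_concurrency())  # → 80000
```

Remove the lock and the same test fails intermittently, which is exactly the kind of nondeterminism multi-core testing has to hunt down.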

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.


Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language) define the services for client tools.

• SOAP (Simple Object Access Protocol), a message format to invoke a service.

• XML (Extensible Markup Language), the lingua franca of the Internet.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
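To make those nuts and bolts concrete, here is a minimal sketch that builds and re-parses a SOAP 1.1 envelope using only the Python standard library. The `GetQuote` operation and the `http://example.com/stocks` namespace are invented for illustration; a real service's operations and namespaces come from its WSDL.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/stocks"  # hypothetical service namespace

def build_soap_request(symbol):
    """Build a minimal SOAP envelope invoking a hypothetical GetQuote operation."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetQuote")
    ET.SubElement(op, f"{{{SVC_NS}}}Symbol").text = symbol
    return ET.tostring(envelope, encoding="unicode")

xml_text = build_soap_request("IBM")
# A receiving service (or a testing tool) parses the message back out.
root = ET.fromstring(xml_text)
symbol = root.find(f".//{{{SVC_NS}}}Symbol").text
print(symbol)  # → IBM
```

The machine-readable contract is the key idea: both ends agree on the envelope structure and namespaces, not on each other's programming language or platform.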

The complexities of developing and deploying systems potentially based on varying independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as its loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations. WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
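The capture-then-analyze split at the heart of that approach can be sketched in a few lines: an interceptor sits between requestor and service, passes every message through unchanged, and logs a copy for a later analysis pass. This toy version (plain Python callables standing in for real SOAP endpoints) only illustrates the general idea, not the actual WS-I implementation.

```python
import datetime

class Interceptor:
    """Forwards each request to the real service unchanged, logging a copy."""
    def __init__(self, service):
        self.service = service
        self.log = []  # captured (timestamp, direction, message) tuples

    def call(self, request):
        self.log.append((datetime.datetime.now(), "request", request))
        response = self.service(request)  # the normal message flow is untouched
        self.log.append((datetime.datetime.now(), "response", response))
        return response

def echo_service(message):
    """Stand-in for a deployed Web service."""
    return "echo:" + message

monitor = Interceptor(echo_service)
monitor.call("ping")

# A separate analysis pass inspects the captured traffic afterward.
directions = [direction for _, direction, _ in monitor.log]
print(directions)  # → ['request', 'response']
```

Because the interceptor never alters the messages, the system under test behaves exactly as it would without the monitor, which is what "non-intrusive" means here.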

[FIG. 1: WS-I TOOLS IN ACTION (diagram not reproduced). Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
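The relationship between a test assertion document and the Analyzer can be illustrated with a toy version: a small XML document lists assertions, and an analysis pass checks a captured artifact against each one, reporting deviations. The element names and the substring-matching "logic" here are invented for brevity; the real WS-I documents and tools are far richer.

```python
import xml.etree.ElementTree as ET

# A toy test assertion document; the real WS-I format is far richer.
TAD = """
<assertions>
  <assertion id="TA0001" description="message must declare a SOAP envelope"
             requires="Envelope"/>
  <assertion id="TA0002" description="message must carry a Body element"
             requires="Body"/>
</assertions>
"""

def analyze(message, tad_xml):
    """Report (id, description) for each assertion the captured message fails."""
    failures = []
    for a in ET.fromstring(tad_xml).findall("assertion"):
        if a.get("requires") not in message:
            failures.append((a.get("id"), a.get("description")))
    return failures

captured = "<Envelope><Body/></Envelope>"
print(analyze(captured, TAD))       # → [] (no deviations)
print(analyze("<Envelope/>", TAD))  # → fails TA0002
```

The important property this preserves is that the report names the specific requirements not met, rather than issuing a bare pass/fail verdict.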

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can in fact warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards, documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios, since the WS-I test materials are intended to test all aspects of a Profile, but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING (diagram not reproduced). Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, or after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure in order to make CI straightforward and simple.
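The checkout-build-test cycle described above reduces to a small skeleton. In this sketch the three stage functions are stand-ins for whatever version control, build and test commands a real CI server would shell out to; only the loop structure is the point.

```python
import time

def continuous_integration(checkout, build, run_tests, cycles, pause_seconds=0):
    """Run the checkout -> build -> test cycle repeatedly, restarting when done."""
    results = []
    for _ in range(cycles):
        checkout()                           # pull the latest sources
        artifact = build()                   # compile/package them
        results.append(run_tests(artifact))  # run the automated suites
        time.sleep(pause_seconds)            # or: block until the next check-in
    return results

# Stand-in stages for illustration; real stages invoke scm/build/test tools.
history = continuous_integration(
    checkout=lambda: None,
    build=lambda: "app-1.0",
    run_tests=lambda artifact: ("PASS", artifact),
    cycles=3,
)
print(history)  # → [('PASS', 'app-1.0'), ('PASS', 'app-1.0'), ('PASS', 'app-1.0')]
```

Whether the loop restarts immediately, on a timer, or on each check-in is exactly the policy choice the glossary entry describes.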

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" to be more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
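One turn of that cycle can be sketched like this (the `word_count` example is invented for illustration): the test exists before the code, would fail without it, and passes once the smallest sufficient implementation is in place.

```python
def test_word_count():
    # Written first: this expectation fails until word_count exists and is right.
    assert word_count("to be or not to be") == {"to": 2, "be": 2, "or": 1, "not": 1}

def word_count(text):
    """Smallest implementation that satisfies the test."""
    counts = {}
    for word in text.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

test_word_count()
print("test passed")
```

In practice the test would live in an xUnit-family framework (JUnit, unittest and the like) so the whole suite runs on every build; the plain assert keeps the sketch self-contained.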

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then to improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior, in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
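Much of that plain-English feel comes from naming and structure alone, which this bare sketch tries to show (the shopping-cart domain is hypothetical, and real BDD frameworks dress the same idea in much richer syntax):

```python
class DescribeShoppingCart:
    """Expectations read as behavior: 'it is empty when created', not 'testCart1'."""

    def it_is_empty_when_created(self):
        cart = []
        assert len(cart) == 0

    def it_totals_the_prices_of_its_items(self):
        cart = [("apple", 3), ("bread", 2)]
        assert sum(price for _, price in cart) == 5

# A BDD runner would discover and report these by their descriptive names.
spec = DescribeShoppingCart()
spec.it_is_empty_when_created()
spec.it_totals_the_prices_of_its_items()
print("2 expectations met")
```

Read the method names aloud and you get something close to a business-facing specification, which is the point of the style.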

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

–Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


36 • Software Test & Performance JULY 2009


language elements for defining test cases, combining tests into suites and asserting the various exit conditions that determine whether the tests pass or fail. Many also provide a run-time environment in which the tests execute. These range from simple command-line reports showing the results of the tests to complete GUIs that integrate directly with the developer's programming environment. Automated unit tests often are integrated with the continuous build system and are run each and every time the project is built. This helps to ensure that the developers know instantaneously when a recent check-in has caused the suite of unit tests to fail.
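As a sketch of the xUnit pattern described above (the discount function is a made-up stand-in for application code), a test case, a suite and a command-line runner in Python's unittest might look like:

```python
import unittest

def apply_discount(price, percent):
    # Toy application code under test (an assumption for illustration).
    return round(price * (100 - percent) / 100, 2)

class DiscountTests(unittest.TestCase):
    # Each method is a test case; assertions decide pass/fail.
    def test_ten_percent(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_percent(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

def suite():
    # Combining individual cases into a suite, as xUnit frameworks allow.
    return unittest.TestLoader().loadTestsFromTestCase(DiscountTests)

if __name__ == "__main__":
    # A continuous build would run this on every check-in and fail the
    # build if any assertion fails, giving instantaneous feedback.
    unittest.TextTestRunner(verbosity=2).run(suite())
```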

When working with automated unit tests, developers commonly find it necessary to use mock objects, fake objects or method stubs to stand in for other parts of the application. For example, if a module obtains information from a database, it might be advantageous to use a fake database object to provide canned data that is similar to the real data available in the production environment. This approach gives the programmer more control over the test environment and lets them dictate which data is returned and how to represent a variety of interesting situations. Further, the technique allows developers to simulate problems and failures that may be difficult to trigger when using a real database. These might include connectivity issues, buffer overruns and other error conditions.
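A minimal illustration of the fake-object technique, with all names hypothetical: a hand-rolled fake database returns canned data and can simulate a connectivity failure on demand:

```python
class ConnectionError(Exception):
    pass

class FakeCustomerDatabase:
    # A fake standing in for the real database layer.
    def __init__(self, fail=False):
        self.fail = fail
        self.rows = {42: {"name": "Acme Corp", "credit_limit": 5000}}

    def find_customer(self, customer_id):
        if self.fail:
            # Simulate a connectivity problem that would be hard to
            # trigger against a real production database.
            raise ConnectionError("connection lost")
        return self.rows.get(customer_id)

def credit_ok(db, customer_id, amount):
    # Application code under test; it neither knows nor cares that
    # the database handed to it is fake.
    customer = db.find_customer(customer_id)
    return customer is not None and amount <= customer["credit_limit"]

# The programmer controls exactly which data comes back:
assert credit_ok(FakeCustomerDatabase(), 42, 1000) is True
assert credit_ok(FakeCustomerDatabase(), 42, 9999) is False

# And failures that are rare in production become trivial to recreate:
try:
    credit_ok(FakeCustomerDatabase(fail=True), 42, 100)
except ConnectionError:
    print("application surfaced the simulated outage")
```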

Automated unit tests provide several key benefits in an agile environment. First, they make it possible to achieve high levels of code coverage almost as a side-effect. If a programmer is disciplined about TDD, then near 100 percent coverage just happens. Second, automated unit tests help to provide an early warning indicator that shows when behavior within a module has changed. This can be used to identify unintended side-effects of other development and to serve as a kind of safety net for later refactorings. If, after reworking a module, all of the existing unit tests pass, the developer can be confident that the module will function properly in the context of the full application. Finally, automated unit tests also help to raise the initial quality of the application delivered to testers. By using tests as part of the coding process, developers are able to catch many of the minor errors and trivial mistakes that would otherwise fall to testers. Reducing obvious coding errors through unit testing helps free up testers for the important job of manual exploratory testing.

AUTOMATED ACCEPTANCE TESTING
An essential element to the success of an application project is how well it meets the needs of its users. A product might be absolutely flawless, perfect in every way, and still not do the job. One way to ensure that the business needs are met is with automated acceptance testing.

With automated acceptance testing, the customer, often with help of QA, writes tests for the key features that are scheduled to be implemented in the upcoming iteration. These tests are written in a natural language that non-technical people can read, and as a result, they become executable specifications that cannot get out-of-sync with the application being tested.

Automated acceptance tests tend to focus on successful use cases; they often ignore the sad paths and exceptional situations. For this reason, a test strategy built entirely on automated acceptance testing will be inadequate to thoroughly test the application. Sometimes the testers will augment the test suite created by the customer with additional acceptance tests that do cover more of the corner cases.

Automated acceptance tests pro-vide the primary measure of progress asthe product evolves over the course ofan iteration or even across an entire

release plan The acceptance tests arewritten early in (or even before) the iter-ation and running them clearly illustrateshow many of the stories have been com-pleted As part of an iteration reviewmeeting the tests are run again todemonstrate that the scope selected atthe start of the iteration was implement-ed according to the customerrsquos own def-inition of what is correct This process ismainly a formality it should be no sur-prise to anyone whether the tests willpass or fail when run during the iterationreview meeting

Two approaches to automated acceptance testing are the Framework for Integrated Test (FIT), developed by Ward Cunningham, and FitNesse, a wiki originally written in Java by Micah Martin, Robert C. Martin and Michael Feathers. FitNesse extends the principles established by Cunningham's tool and provides a mechanism by which tests may be created and maintained in addition to being executed. Both are based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
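In a much-simplified, hypothetical form (real FIT fixtures live in wiki tables and framework classes), the fixture idea reduces to translating table rows into calls and comparing outputs:

```python
def shipping_cost(weight_kg, express):
    # Application logic under test (an assumption for illustration).
    base = 5.0 + 2.0 * weight_kg
    return base * 2 if express else base

# The "table" a customer might write: inputs and an expected-output column.
TABLE = [
    # weight_kg, express, expected_cost
    (1.0, False, 7.0),
    (1.0, True, 14.0),
    (3.0, False, 11.0),
]

def run_fixture(table):
    # The thin fixture: translate each row into a call on the
    # application and compare the actual output with the expected column.
    results = []
    for weight, express, expected in table:
        actual = shipping_cost(weight, express)
        results.append("pass" if actual == expected else "fail")
    return results

print(run_fixture(TABLE))  # a minimal command-line report of row results
```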

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests, and as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.

AUTOMATED REGRESSION TESTING
Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

CHALLENGES OF TESTING IN AGILE ENVIRONMENTS
The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and providing rapid feedback to the programmers. However, regression testing is still important, and some mechanisms by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in. In other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile iterations, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team, a cross-functional mixture of programmers, testers, business analysts and stakeholders, collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs. The business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should be, automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test. All attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
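As one hedged sketch of that idea (the function, data size and timing budget are all assumptions for illustration), a performance check can ride along in the same unittest suite as the functional regression tests:

```python
import time
import unittest

def build_report(rows):
    # Stand-in for the operation whose speed matters to the business.
    return "\n".join(",".join(map(str, r)) for r in rows)

class PerformanceRegressionTest(unittest.TestCase):
    def test_report_built_within_budget(self):
        rows = [(i, i * i) for i in range(10_000)]
        start = time.perf_counter()
        report = build_report(rows)
        elapsed = time.perf_counter() - start
        # Functional check: behavior established in prior iterations...
        self.assertTrue(report.startswith("0,0"))
        # ...plus a (generous) timing budget, so a slowdown introduced
        # in a later sprint fails the regression run.
        self.assertLess(elapsed, 2.0)

if __name__ == "__main__":
    unittest.TextTestRunner(verbosity=2).run(
        unittest.defaultTestLoader.loadTestsFromTestCase(
            PerformanceRegressionTest))
```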

Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often, a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING
There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former. For a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in the production environments. These might be long series of tasks, odd combinations of events or prolonged use of the program. Often, these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected and ensure that it does not return.

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. Framework for Integrated Test. October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case. November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse Front Page. FitNesse. http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. Resources for Test Driven Development. jUnit. http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob. October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds. July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860


LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com

observed with clients using agile methodologies typi-cally Scrum While my associates and I have helpedclients overcome these challenges I will not present allof our suggested solutions here Instead Software Testamp Performance magazine has given test consultant andcertified Scrum Master Bob Galen an opportunity tooffer solutions

What I describe here are challenges we haveobserved or discussed with clients who told us they werefollowing an agile life cycle Some of these challengesmay be inherent in agile methodologies However somemay in fact actually arise from improper understanding orimplementation of the chosen methodology resulting inavoidable problems

VOLUME AND SPEED OF CHANGE One of the principles of agile development is that proj-ect teams should welcome changing requirementseven late in development Many testing strategies

especially analytical require-ments-based testing becomequite inefficient in such situa-tions While automated test-ing can help accommodateon-going change automationcan be fragile and high-main-tenance if not implementedproperly We find that risk-based testing coupled withreactive techniques likeexploratory testing and prop-

By Rex Black

Every software development model presents itsown testing implications Some ease the testingprocess others make it more challenging Whatfollows is a look at some challenges I have

Rex Black is founder and president of RBCS a software hardware andsystems consultancy and is a founding member of the STP CollaborativeStrategic Advisory Board

18 bull Software Test amp Performance JULY 2009

The AgileProcessCreatesTestingPuzzles

Which AreSolved When

Agile IsPerformedCorrectly

of traditional testing consultants I donrsquot believe theywere gathered from the perspective of an actual agiletester one with serious experience working in matureand high performing agile teams

Such views often come from the perspective of ascribe or observer of teams who at least from my per-spective arenrsquot practicing agility very well Or theyrsquoreanecdotal reactions to the Agile Manifesto or other writ-ings without trying to understand the underlying intentionsWhatrsquos discouraging is that superficial reactions from wellrespected sources can give the wrong impression

VOLUME AND SPEED OF CHANGEThe point of embracing change is the realization thatmost software projects are volatile Business conditionsare incredibly dynamic Customers may not actuallyknow what they need until they see the application startto coalesce as working software The developers maynot have understood the challenges associated with thearchitecture without actuallybuilding parts of it nor mighttesters understand the mostrisk-based areas to test with-out seeing the application run

The point of trying to freezerequirements at a historicalinstant in time where thingsare well understood withoutwriting a lick of software is afoolrsquos errand Agile methods tryto embrace change react to it

Iwas asked by Software Test amp Performance magazine to provide a rebuttal to Rex Blackrsquos perspectives on how ldquoagile challenges testingrdquoRexrsquos views and reactions to agility are typical

By Bob GalenBob Galen is principal consultant of the RGalen Consulting Group anda founding member of the STP Collaborative Strategic Advisory Board

JULY 2009 wwwstpcollaborativecom bull 19

erly-done automation works bestWhatrsquos more change-related challenges often arise from

changes in the definition of the product and its correct behav-ior When the test team is not kept informed of thesechanges or when the rate of change is very high this canimpose inefficiencies on the development execution andmaintenance of tests

SHORT ITERATIONS SQUEEZE TESTINGIn sequential life cycles if the organization involves the testteam early in the project the testers have lots of time todevelop and maintain their tests and can do so in parallel withthe development of the system and prior to the start of sys-tem test execution Some more formal iterative lifecycle mod-els such as Rapid Application Development and the RationalUnified Process often allow substantial periods of timebetween test execution periods for each iteration Theseintervals allow test teams develop and maintain their test sys-tems

Most teams using agile life cycles progress more quicklyeg Scrum uses sprints two to four week periods of shortfast-paced development iterations I have seen this pacesqueeze the test teamrsquos ability to develop and maintain testsystems Testing strategies that include an automation ele-ment also have proven particularly sensitive to this challenge

UNIT TESTING IS OFTEN INADEQUATE Most agile proponents stress good automated unit testing andframeworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakehold-ers while guiding the software toward the customersrsquo true needsIf the traditional testing approaches that Rex alludes to can notadapt to change effectively then Irsquod argue that they need tochange

SHORT ITERATIONS SQUEEZE TESTINGIf your parochial goal is simply testing then I agree that shortiterations can be quite challenging But if your goal is to deliversmall chunks of working software to gain customer and prod-uct quality feedback and to ultimately deliver high value andworking products applications then small iterations rock

To Rexrsquos point regarding ldquosqueezing test system mainte-nancerdquo all of the agile methods adhere to the notion of com-pleting a small set of features that meet rdquodonenessrdquo criteriaat the end of each Iteration or Sprint

There should be no work that is left outmdashfor exampleappropriate test automation development or maintenance oftest ldquosystemsrdquo

Itrsquos up to the testers to simply place this work on theProduct and Sprint Backlogs and plan its completion as partof each Iteration or Sprint

Sometimes this takes effort and courage to defend this workagainst feature pressure However all team members encounterthe same challenge and need to ensure that they deliver ldquodonerdquosoftware from each of their functional perspectives

UNIT TESTING IS OFTEN INADEQUATEFirst agile methods stress automated testing of all kindsmdashnotsimply unit tests FitNesse Selenium and Watir Watin are

examples of open source automation tools that are widelyused in the agile community and that focus on non-unit testing

We do stress unit tests as being an important part of theoverall coverage and as being a safety net for the teamsrsquo refac-toring efforts We want developers to be responsible for con-tributing to test automationmdashnot just testers And if a team isusing TDD as a practice wersquove learned that developing testscan be a powerful aid as one designs their software in smallincremental steps

No agile team to my knowledge views unit tests as theend-all of testing In fact as part of delivering ldquodonerdquo softwareincrements they must decide on the set of tests (unit func-

20 bull Software Test amp Performance JULY 2009

[I expect testers to pair with

developers to improve practices anddevelopers

to deliver solid unit tests before they

can be considered lsquodonersquo]

tional acceptance non-functional regression etc) thatshould be run within the Sprint in order to meet a customer-ready deliverable Every team member should be definingthese testing tasks with the testers taking the lead

To Rexrsquos last point that programmers donrsquot commit to doingeffective unit testing I expect the testers to pair with the devel-opers to improve these practices and I expect developers todeliver solid unit tests before they can be considered ldquodonerdquo

MANAGING THE INCREASED REGRESSION RISKThere is a continuing misperception that unit tests are theonly tests run on a Sprint-by-Sprint basis While itrsquos true thatthe team might not be able to run a full regression test with-in each Sprint it is their responsibility to apply risk-basedtechniques based upon whatrsquos been added or changed andto run appropriate functional non-functional exploratory andregression tests as part of delivering ldquodonerdquo software at theend of the Sprint

Clearly automation development at all levels helps withthis so the team should be investing time in improving theiroverall ability to automate all of the testing types I mention asappropriate

POOR OR MISSING TEST ORACLESI hate to break the news to Rex but the best test oracles arethe people within the team not the documentation Documen-tation almost always loses its relevance and accuracy Yes itrsquoshelpful but should never be considered the single source forall knowledge

What agility encourages is for the team to talkmdashto plan their

the costs Such automated unit tests allow what agile propo-nents call refactoring Refactoring is the redesign of majorchunks of code or even entire objects The automated unit

tests provide for quick regression testing of refactored codeSome agilists recommend substituting automated unit tests fordesign as in Test Driven Development While this soundsgood in theory there are two problems that I have observedwith this approach in practice

First unit testing has limited bug-finding utility In researchpublished in June 2002 in CrossTalk the Journal of Defense

Software Engineering author Capers Jones found unit testingto be only 25 to 30 percent effective at finding and removingdefects In RBCS assessments we have found that good sys-tem testing by an independent test team averages around 85percent effective at finding defects So organizations that careabout quality should have both good unit testing and good sys-tem testing

Second while unit testing is often thought of as thedeveloperrsquos job I have found that many programmers justdonrsquot do it or deliver cursory attempts This can create com-plications and delays for the system test team in its role asldquobug finder of last resortrdquo as they deal with buggy code dur-ing short test execution periods in the sprints This can alsolead to code churn during testing since so much code mustchange to fix the bugs

The amount of change can ultimately outstrip even theability of the best automated regression test system to keepup which then leads to lower defect detection effectivenessfor the test team

MANAGING THE INCREASED REGRESSION RISKCapers Jones also reported that regression accounts forabout seven percent of bugs In iterative life cycles such asScrum however code that worked in previous sprints getschurned by new features in each subsequent sprint Thisincreases the risk of regression

Agilists emphasize good automated unit testing in part tomanage the regression risk inherent in such churn But with eventhe best unit testing topping out at about 50 percent defect

JULY 2009 wwwstpcollaborativecom bull 21

[While unit testing is often thought

of as the developerrsquos job I have found that

they just donrsquot do it or [they] deliver

cursory attempts]

22 bull Software Test amp Performance JULY 2009

work together to cluster in small groups working on specific fea-tures to work with the customer gaining valuable feedbackmdashandto collaborate across disciplines From this they can decidewhere when and to what degree documentation is necessary

We expect team members to be professionals and to beresponsible and accountable We trust their judgment overblindly filling in a plan template or a requirement document orblindly running the same tests over and over again to meetsome perceived risk

All oracles have a cost We engage the customers in thesedecisions as wellmdashensuring that those oracles that are providedare actually used and kept current

THE SHIFTING TEST BASISIrsquom lost here Agile teams define requirements Some usePost-it Notes and cards or define them as User StoriesOthers describe use cases or use traditional requirementforms Many even add a construct called acceptance tests ona finely grained story level that helps define the critical ele-ments of each feature or requirement These are things thatthe customer will ldquotestrdquo as part of their acceptance at the endof a Sprint or Iteration

Irsquod guess that Rex has seen poor or missing requirements inhis agile experience and Irsquod say those were not mature or solid-ly performing agile teams ldquoGoodrdquo teams define requirementsat a level that assures the team will build what the customerneeds They ensure collaboration surrounding each require-mentmdashat the point of attack

To capture a key point here The agile methods try to adopta Just-in-Time and Just Enough approach to their requirement

removal effectiveness according to Jones automated regressiontesting via unit tests will miss at least half of the regression bugs

POOR OR MISSING TEST ORACLESAgile favors working software over written documentationand in my observations special scorn is reserved for specifi-cations The suggestion by the Agile Manifesto (wwwagile-manifesto org) that people should value ldquoworking softwareover comprehensive documentationrdquo can create real chal-lenges for a test team when taken too far Testers userequirements specifications and other documents as testoracles ie the means to determine correct behavior undera given test condition We have seen testers in agile situa-tions given documents with insufficient detail and in somecases given no documents at all

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.
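To illustrate the measurement point (the requirement IDs and outcomes below are invented), coverage of the test basis is a simple ratio, but it requires an enumerable test basis as its denominator:

```python
# Invented requirement IDs and outcomes, for illustration only.
requirements = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]
results = {"REQ-1": True, "REQ-2": True, "REQ-3": False}  # REQ-4 never tested

def coverage_by_passed_tests(test_basis, outcomes):
    """Percent of the test basis covered by passed tests."""
    if not test_basis:
        # No requirements means no denominator: coverage is undefined.
        raise ValueError("no test basis to measure against")
    covered = sum(1 for req in test_basis if outcomes.get(req) is True)
    return 100.0 * covered / len(test_basis)

print(coverage_by_passed_tests(requirements, results))  # 50.0
```

With no requirements list at all, the function cannot even be called, which is the point being made above.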

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context based and powerful. They take a holistic view to testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.

JULY 2009 www.stpcollaborative.com • 23


LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. The software tester job isn't something that shows up on the U.S. Census. "We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. "Testing is about providing information to those people so that better-informed decisions about quality are possible."

The future careers for the current crop of test engineers seem to be on the rise,



particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

David Strom is a freelance reporter and author living in St. Louis.

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but also made for some new challenges too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem."

Bartow agrees. "Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter



for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison, "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools and can be quite nimble, and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."



Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical and support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about



rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.
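As a minimal sketch of the kind of automated unit testing described here (the ShoppingCart class is invented for illustration, not drawn from any product mentioned in this article), a suite like this can run unattended inside an automated build:

```python
import unittest

class ShoppingCart:
    """Hypothetical class under test, invented for this example."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

class ShoppingCartTest(unittest.TestCase):
    def test_empty_cart_totals_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add("book", 12.50)
        cart.add("pen", 2.50)
        self.assertEqual(cart.total(), 15.00)

if __name__ == "__main__":
    # A build server would typically invoke this via "python -m unittest".
    unittest.main(exit=False)
```

What such tests cannot do, per the point above, is exercise the graphical user interface; they verify units of logic, which is why they are blended with reactive techniques.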

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language) define the services for client tools.

• SOAP (Simple Object Access Protocol) is a message format to invoke a service.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language) is the lingua franca of the Internet.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
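To ground the definitions, here is a hedged sketch using only Python's standard library. The GetQuote operation and the http://example.com/stock namespace are invented; only the SOAP 1.1 envelope namespace is real. It shows a minimal SOAP message and code that extracts which operation it invokes.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope

# A minimal SOAP message; the operation and stock namespace are hypothetical.
SOAP_MESSAGE = """\
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.com/stock">
      <Symbol>IBM</Symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

def invoked_operation(message: str) -> str:
    """Return the local name of the first child of the SOAP Body."""
    envelope = ET.fromstring(message)
    body = envelope.find(f"{{{SOAP_NS}}}Body")
    first_child = next(iter(body))
    return first_child.tag.split("}")[-1]  # drop the namespace portion

print(invoked_operation(SOAP_MESSAGE))  # GetQuote
```

In a real deployment, the WSDL (not shown here) is the machine-readable contract that tells a client how to build such a message in the first place.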

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating it along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL, HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[Figure 1: WS-I tools in action. In the normal message flow, a requestor exchanges SOAP messages with the Web service; the Monitor's Interceptor captures those messages and its Logger writes them to a message log. The Analyzer reads the message log along with the WSDL, XML schema and UDDI metadata, applies the test assertion document, and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can in fact warn you when your implementation has ventured into the use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems. The tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF has goals and objectives complementary to those of WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

www.stpcollaborative.com • 33

service of formally documenting the best use of standards, documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios; the WS-I test materials are intended to test all aspects of a Profile. WSTF, however, can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's interceptor and logger capture SOAP/XML messages over HTTP, along with metadata from the client code and Web service (WSDL, XML schema), into a Test Log file; the Analyzer then evaluates the log to produce a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant number of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
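As a sketch of the cycle (the leap_year function here is an invented example, not from any particular project): the test is written first and fails while the function is unimplemented, then the code is improved until the suite passes.

```python
import unittest

# Hypothetical TDD example: the tests below were written before
# leap_year() existed (red), then the function was implemented until
# the suite passed (green).

def leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTest(unittest.TestCase):
    def test_century_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_fourth_year_is_leap(self):
        self.assertTrue(leap_year(2008))

    def test_fourth_century_is_leap(self):
        self.assertTrue(leap_year(2000))

# Run the suite programmatically and confirm it is green.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LeapYearTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```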

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
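The flavor is easiest to see in a small sketch (not tied to any specific BDD framework; the shopping-cart names are invented): each expectation is named so it reads like a plain-English business statement.

```python
# Illustrative BDD-style expectations: the test names read as business
# statements rather than xUnit-style method names.
class ShoppingCart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

# Expectation: "a new cart should total zero"
def a_new_cart_should_total_zero():
    assert ShoppingCart().total() == 0

# Expectation: "a cart should total the sum of its item prices"
def a_cart_should_total_the_sum_of_its_item_prices():
    cart = ShoppingCart()
    cart.add(3.50)
    cart.add(1.25)
    assert cart.total() == 4.75

# Run every expectation.
for expectation in (a_new_cart_should_total_zero,
                    a_cart_should_total_the_sum_of_its_item_prices):
    expectation()
```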

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


< continued from page 10



based on creating tables of input data that are fed into the application under test. The program's output is captured and compared to the expected results, which are also provided in the tables.

While tests themselves are defined using a combination of narrative text and tabular data, they're executed by fixture code that accepts the input from the tables, calls the appropriate methods in the application, and provides the output back to the test runner. These fixtures are generally written in the language of the application and must be implemented and maintained by developers. However, the goal is that they be thin and serve only to translate inputs and outputs between the test harness and the application. Therefore, the overall impact that working with the fixture code has on the developers' productivity should be minimal.
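A minimal sketch of the idea, with an invented discount rule standing in for the application code, might look like this: the table holds the customer-authored inputs and expected outputs, and the thin fixture only shuttles values between the two.

```python
# FIT-style table-driven test sketch: each row supplies an input and the
# expected output; a thin "fixture" translates rows into calls on the
# application code. The discount rule is hypothetical.

def discount(order_total):
    """Application code under test: 10% off orders of $100 or more."""
    return 0.10 if order_total >= 100 else 0.0

# Table of (input, expected) pairs, as a customer might write them.
TABLE = [
    (50.0,  0.0),
    (100.0, 0.10),
    (250.0, 0.10),
]

def run_fixture(table):
    """Thin fixture: feed each row to the app and compare the output."""
    failures = []
    for order_total, expected in table:
        actual = discount(order_total)
        if actual != expected:
            failures.append((order_total, expected, actual))
    return failures

assert run_fixture(TABLE) == []  # every row passes
```

Note how the fixture contains no business logic of its own; if the discount rule changes, only the application code and the table change.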

As with automated unit tests, automated acceptance tests are likely to be run against an incomplete application. Not much functionality exists in early iterations, so it may be necessary to test one part before another has been built. Again, simulators are often used to provide the missing behaviors. It is up to the project team to decide whether the simulators are replaced with real implementations as they become available in later iterations. Each approach has its advantages. Using real components may provide more confidence that the application works properly. However, using simulators may provide more control over the system during the tests and, as with the mocks and fakes used in unit testing, it's sometimes easier to recreate failures to verify that the application handles them properly.
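As a sketch of that simulator approach, here Python's unittest.mock stands in for whatever mocking library or hand-rolled simulator a team actually uses, and the payment-gateway names are invented for illustration:

```python
# Sketch: replacing a not-yet-built payment gateway with a simulator so
# acceptance tests can run against an incomplete application.
from unittest import mock

def checkout(cart_total, gateway):
    """Application code under test: charge the gateway, report the result."""
    response = gateway.charge(cart_total)
    return "confirmed" if response["status"] == "ok" else "declined"

# The real gateway does not exist yet; simulate both behaviors.
gateway = mock.Mock()
gateway.charge.return_value = {"status": "ok"}
assert checkout(25.0, gateway) == "confirmed"

gateway.charge.return_value = {"status": "error"}  # recreate a failure
assert checkout(25.0, gateway) == "declined"
gateway.charge.assert_called_with(25.0)
```

Recreating the failure case is a one-line change to the simulator, which is exactly the kind of control over the system that would be hard to get from a real payment provider.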

AUTOMATED REGRESSION TESTING
Responsible, thorough testing must verify not only the functional requirements but non-functional requirements as well. Automated regression testing, which seeks to verify functions that have worked in the past, can be an effective way to provide this additional coverage and fill in the gaps left by the acceptance tests.

Unlike unit and acceptance testing, which most heavily involve developers and customers, automated regression testing is performed by testers. Automated regression tests should


CHALLENGES OF TESTING IN AGILE ENVIRONMENTS

The agile environment exaggerates many of the challenges faced in more conventional approaches to testing. First, the pace of the development effort requires that the application be tested within each iteration, in parallel with the actual coding. Conventional approaches that focus on full regression during a time when the code is frozen are ineffective in agile environments. Agile testing tends to focus on the new functionality and on providing rapid feedback to the programmers. However, regression testing is still important, and some mechanism by which the entire application may be exercised end-to-end must be implemented.

Testing on agile projects begins almost immediately. However, in the early iterations there may not be much of the application to test. Conventional approaches that focus on running the completed application do not work well in agile environments. Even when automated efforts are employed in conventional environments, the emphasis often is on exercising the application from the outside in; in other words, scripts drive the application through the UI. In a methodology where the UI evolves with the rest of the application, though, it is not possible to test the application in this manner.

Requirements in agile environments tend to be less formal than in conventional methodologies. Requirements, also referred to as stories in some agile methodologies, often begin life as a single phrase or short sentence describing at a very high level what the application must do. As part of the iteration or release planning process, the project team (a cross-functional mixture of programmers, testers, business analysts and stakeholders) collaborates to flesh out the stories. The stakeholders, often referred to collectively as the customer, focus on the observable "happy-path" outcomes, while the testers add details to cover the odd, exceptional or "corner" cases. In some agile methodologies, these details are captured in the form of acceptance tests. Conventional approaches that use requirements documents as the basis of test cases tend to struggle in agile settings.

Agile depends heavily on effective collaboration among the programmers, testers and stakeholders. It is not uncommon in agile environments to find testers and developers working closely together on a daily basis. This tight coordination of the various groups often is absent in more conventional methodologies, where the product transitions through the development life cycle as a series of handoffs: the business analysts deliver the requirements to the architects, who in turn deliver a design to the programmers, who deliver an application to the testers, who test it according to the initial requirements. People working in conventional settings sometimes are uncomfortable collaborating with those in other groups.

Successful agile efforts must find ways to overcome each of these challenges. Automation is essential to addressing the issue of pace. By building automation infrastructure early in the project, when there is little real functionality available to test, it becomes far easier to run more tests later in the project. Further, this approach helps to provide the regression testing necessary to ensure the product is evolving stably and not losing functionality once working. However, not all tests can, nor should, be automated. Skilled and experienced professional testers can add significant value through effective manual testing. By automating what can be automated, testers have more time to apply their talents through manual testing efforts.

attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test. All attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.
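Folding a performance check into the regression suite can be as simple as asserting on elapsed time. The sketch below is illustrative only; the search function and the half-second budget are invented stand-ins, not recommendations.

```python
# Sketch: a simple performance check that runs as part of the regression
# suite and fails if the operation exceeds its time budget.
import time

def search(items, target):
    """Stand-in for the operation under test."""
    return target in items

def assert_fast_enough(fn, budget_seconds):
    """Fail the suite if fn() takes longer than its budget."""
    start = time.perf_counter()
    fn()
    elapsed = time.perf_counter() - start
    assert elapsed < budget_seconds, "performance regression: %.3fs" % elapsed

# Arbitrary illustrative budget: the search must finish within half a second.
assert_fast_enough(lambda: search(list(range(100000)), -1), 0.5)
```

Because the check is just another automated assertion, a performance regression fails the build the same way a functional regression would.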

Because automated regression tests may serve a variety of purposes, there is no standard approach to implementing them. Often a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING
There are two main types of defects: those in which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.

Manual exploratory testing gives testers an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken. This is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.
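Returning to the earlier A/C/B crash scenario, that last step might look like the following sketch. The App class and task names are hypothetical stand-ins for the real application driver.

```python
# Sketch: once exploratory testing finds that performing tasks in the
# order A, C, B crashed the application, capture that exact sequence as
# an automated regression test.
import itertools

class App:
    """Hypothetical driver for the application under test."""
    def __init__(self):
        self._done = []

    def perform(self, task):
        # Before the fix, the sequence A, C, B raised an exception here.
        self._done.append(task)

def run_sequence(order):
    """Replay a sequence of tasks; raises if the defect ever returns."""
    app = App()
    for task in order:
        app.perform(task)
    return app

# The regression test pins down the exact sequence that was found:
run_sequence(["A", "C", "B"])

# Going further, sweep every ordering of the three tasks:
for order in itertools.permutations(["A", "B", "C"]):
    run_sequence(order)
```

The permutation sweep turns a one-off exploratory discovery into systematic coverage of all six orderings, cheap insurance against the same class of defect.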

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results.

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case (blog), November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have A Place On Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860



observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate ongoing change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and properly done automation, works best.

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles
Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-laden areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen
Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


What's more, change-related challenges often arise from

changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out; for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs their software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, functional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

20 • Software Test & Performance JULY 2009

[I expect testers to pair with developers to improve practices, and developers to deliver solid unit tests, before they can be considered 'done']

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques, based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.
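The risk-based selection described above can be sketched as a simple filter: given the components touched in a Sprint, pick only the tests tagged against those components, riskiest first, up to the time budget. The test names, component tags and risk weights here are invented for illustration, not taken from either author.

```python
# Each test is tagged with the component it covers and a risk weight (1 = low, 3 = high).
# Both the tags and the weights are illustrative assumptions.
TEST_CATALOG = [
    {"name": "test_checkout_total",  "component": "checkout", "risk": 3},
    {"name": "test_checkout_layout", "component": "checkout", "risk": 1},
    {"name": "test_login_lockout",   "component": "login",    "risk": 3},
    {"name": "test_report_export",   "component": "reports",  "risk": 2},
]

def select_regression_tests(changed_components, budget):
    """Return up to `budget` tests covering changed components, riskiest first."""
    relevant = [t for t in TEST_CATALOG if t["component"] in changed_components]
    relevant.sort(key=lambda t: t["risk"], reverse=True)
    return [t["name"] for t in relevant[:budget]]

# This Sprint touched checkout and login, and there is time for three tests.
print(select_regression_tests({"checkout", "login"}, budget=3))
# → ['test_checkout_total', 'test_login_lockout', 'test_checkout_layout']
```

The untouched reports tests are skipped entirely – exactly the trade-off a team makes when a full regression pass does not fit in the Sprint.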

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk – to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effectiveness at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it, or [they] deliver cursory attempts]


work together, to cluster in small groups working on specific features, to work with the customer, gaining valuable feedback – and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement – at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just-Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.
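The coverage reporting referred to here can be sketched as a simple roll-up of requirements to the tests that ran against them. The requirement IDs and results below are invented for illustration; the point is that without a requirements list, there is nothing to divide by.

```python
# Map each requirement in the test basis to the results of the tests that cover it.
# IDs and results are illustrative, not from any real project.
results_by_requirement = {
    "REQ-01": ["pass", "pass"],
    "REQ-02": ["pass", "fail"],
    "REQ-03": [],            # no test written yet
    "REQ-04": ["pass"],
}

def basis_coverage(results):
    """Percentage of requirements whose tests all ran and passed."""
    covered = sum(1 for runs in results.values()
                  if runs and all(r == "pass" for r in runs))
    return 100.0 * covered / len(results)

print(f"{basis_coverage(results_by_requirement):.0f}% of the test basis covered by passed tests")
# → 50% of the test basis covered by passed tests
```

With vague or missing requirements, both the numerator and the denominator of this metric disappear, which is exactly the reporting gap described above.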

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile – it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it, with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality – reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. "Testing is about providing information to those people so that better-informed decisions about quality are possible."

The future careers for the current crop of test engineers seem to be on the rise,

["We are a new profession; most of our first programmers are still alive" – Scott Barber]

["A skilled tester is the product of a variety of experiences" – Jon Bach]


the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable, or more capable, than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. "Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain," so programmers just assume it is working unless their code has a problem.

Bartow agrees. "Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and

["Virtualization makes things easier and harder" – Michael Bolton]



for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance, as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


["Thanks to the cloud, we are able to test in production" – Dan Bartow]

["Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks" – David Saff]

Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about

["Get used to programming for a lot of CPUs" – Kent Beck]


rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework, and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end," where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


["We have seen our opportunities related to Web-based training explode over the last year" – Rex Black]

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

["Testing is problem-solving, not button-mashing" – James Bach]

30 bull Software Test amp Performance JULY 2009

best web services testing is complexAnd itrsquos vastly different than testingtraditional software applications

Properly designed and deployedWeb services can take advantage ofexisting assets and reuse applicationsexchange information securely acrossand beyond the enterprise integrateexisting applications in loose couplings

that are both scalable and easily extensi-ble and reduce the cost of developingand integrating computing assets Buthow can something as ephemeral asinteroperability be tested when servicesyoursquore interoperating with might not beeven be there tomorrow

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language) defines the services for client tools.

• SOAP (Simple Object Access Protocol), a message format to invoke a service.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
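As a rough sketch of these building blocks (the service namespace and the GetQuote operation here are invented for illustration, not taken from any real service), Python's standard library can assemble and parse a minimal SOAP 1.1 envelope:

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"
SVC = "http://example.com/stockquote"  # hypothetical service namespace

# Build a minimal SOAP 1.1 request: an Envelope whose Body carries one call.
ET.register_namespace("soap", SOAP_ENV)
envelope = ET.Element("{%s}Envelope" % SOAP_ENV)
body = ET.SubElement(envelope, "{%s}Body" % SOAP_ENV)
call = ET.SubElement(body, "{%s}GetQuote" % SVC)
ET.SubElement(call, "{%s}symbol" % SVC).text = "IBM"

message = ET.tostring(envelope, encoding="unicode")
print(message)

# The receiving side (or an intercepting monitor) parses the same text back.
parsed = ET.fromstring(message)
symbol = parsed.find(".//{%s}symbol" % SVC)
print(symbol.text)  # -> IBM
```

In a real deployment the corresponding WSDL, not hand-built XML, would describe this operation, and a client library would generate the envelope from it.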

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that itself underlies a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to those underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
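In miniature, that capture-then-analyze flow might look like the following sketch (the single test assertion and the sample messages are invented for illustration; real WS-I test assertion documents are far more extensive):

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def monitor(messages):
    """Stand-in for the Monitor: capture each message verbatim into a log.
    (The real tool's Interceptor/Logger write captures to a log file.)"""
    return list(messages)

def analyze(message_log):
    """Stand-in for the Analyzer: check every logged message against one
    toy test assertion: 'the root element is a SOAP 1.1 Envelope'."""
    report = []
    for i, msg in enumerate(message_log):
        try:
            passed = ET.fromstring(msg).tag == "{%s}Envelope" % SOAP_ENV
        except ET.ParseError:
            passed = False
        report.append((i, "pass" if passed else "fail"))
    return report

log = monitor([
    '<env:Envelope xmlns:env="%s"><env:Body/></env:Envelope>' % SOAP_ENV,
    '<notsoap/>',
])
print(analyze(log))  # -> [(0, 'pass'), (1, 'fail')]
```

The separation matters: because capture and analysis are decoupled by the log, the same traffic can be re-evaluated later against a different profile without re-running the service.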

[FIG. 1: WS-I TOOLS IN ACTION. The Monitor's Interceptor sits in the normal message flow between requestor and Web service, and its Logger records the SOAP messages in a message log; the Analyzer evaluates the log, along with the WSDL, XML schema and UDDI artifacts, against a test assertion document and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when defined extensibility points that might affect interoperability have been utilized.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture SOAP/XML messages over HTTP from client code and the Web service into a Test Log file, along with metadata (WSDL, XML schema); the Analyzer then evaluates the log and produces a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time (such as every hour), or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
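A toy sketch of that checkout-build-test loop (the step names, commands and fake runner are invented; a real setup would shell out to your VCS and build commands, or simply use one of the CI tools mentioned above):

```python
def run_pipeline(steps, runner):
    """Run the CI steps (checkout, build, test) in order.
    Stop at the first failing step; a clean pass is 'green'."""
    for name, cmd in steps:
        if runner(cmd) != 0:
            return (name, "failed")
    return (None, "green")

# Placeholder commands -- substitute whatever your VCS and build actually use.
steps = [
    ("checkout", "git pull"),
    ("build", "make"),
    ("test", "make test"),
]

# A fake runner stands in for real subprocess calls so the loop can be shown;
# against a real checkout you would invoke the commands via subprocess instead.
def fake_runner(cmd):
    return 1 if cmd == "make test" else 0  # pretend the test suite fails

print(run_pipeline(steps, fake_runner))  # -> ('test', 'failed')
```

Wrapping this loop in a timer or a post-commit hook gives the "after every check-in or every hour" behavior the definition describes.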

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD: Test Driven Development. A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD: Acceptance Test Driven Development. A business-level implementation of TDD where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
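A minimal sketch of one such cycle (the slugify function and its test are invented for illustration):

```python
import re

# Red: the expectation is written first, as code. Running test_slug()
# before slugify exists fails with a NameError -- that failing run is "red."
def test_slug():
    assert slugify("Agile Testing!") == "agile-testing"

# Green: write just enough implementation to make the test pass.
def slugify(title):
    """Lowercase the title, drop punctuation, join words with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

test_slug()  # now passes: the "green" step
print("green")

# Refactor: with the test as a safety net, restructure the implementation
# freely, then rerun test_slug() to confirm the behavior is unchanged.
```

In practice an xUnit runner, not a bare function call, drives the red and green runs, but the rhythm is the same.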

BDD: Behavior Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
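To show the flavor (the Account class and both test names are invented; real BDD frameworks add plain-English reporting and given/when/then structure on top of this naming style):

```python
class Account:
    """Toy domain object for the expectation below (invented for illustration)."""
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            return "rejected"
        self.balance -= amount
        return "ok"

# xUnit style: named after the method under test, reads as code.
def test_withdraw():
    acct = Account(balance=10)
    assert acct.withdraw(5) == "ok"

# BDD style: the same kind of check phrased as a behavioral expectation,
# named in business terms a stakeholder could read aloud.
def an_overdrawn_account_rejects_further_withdrawals():
    acct = Account(balance=10)
    acct.withdraw(10)
    assert acct.withdraw(1) == "rejected"

test_withdraw()
an_overdrawn_account_rejects_further_withdrawals()
print("expectations met")
```

The assertion logic is identical in both; only the vocabulary changes, which is the point.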

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.
Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


(continued from page 10)



attempt to cover as much of the application as possible, and they should rely on mocks and simulators sparingly. This allows the tests to provide a fairly realistic view of the state of the program and to more accurately predict how it will behave in a production environment.

Creating and maintaining a useful suite of automated regression tests may be the most difficult aspect of testing in an agile environment. In early iterations, not much of the application exists, so there are few meaningful regression tests that can be written and executed. As the program grows, the scope and complexity of the project may become overwhelming, and it may be challenging to keep up with the pace of development. It is usually helpful to spend early iterations creating a framework for automated regression tests and preparing outlines and test scripts for functionality that will be added over time. By investing in the scaffolding in the beginning, it becomes possible for the testers to add new regression tests incrementally. Also, the existing regression tests help to ensure that the application continues to perform according to the expectations established in prior iterations. This makes it possible for the testers to continue to focus on new functionality rather than constantly reconfirming existing behavior is correct.

In conventional development methodologies, there are many categories of tests intended to exercise the application end-to-end. These include performance tests, load tests, stress tests, integration tests, smoke tests and others. In the end, each of these may be thought of as a type of regression test: all attempt to determine how well the product meets the functional and non-functional requirements, all attempt to exercise the complete application, and all help to ensure that functionality is never lost once it has been implemented.

One of the key components of agile methodologies is a focus on business value. Therefore, any of these types of tests that provide valuable and meaningful feedback about the state of the project should be given attention in agile environments. For example, if performance is important to the application, then performance tests should be automated and made a part of the regression test suite. The same is true for any of the other types of tests mentioned here.

Because automated regression

[Automated regression tests should cover as much of the application as possible and should use mocks sparingly]


tests may serve a variety of purposes, there is no standard approach to implementing them. Often a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING
There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.
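A toy reconstruction of that scenario (invented code, not from the article) shows why scripted tests of each task in isolation never see the defect:

```python
class App:
    """Toy program whose tasks interact through hidden shared state."""
    def __init__(self):
        self.cache = None

    def a(self):
        self.cache = []          # task A initializes a cache

    def b(self):
        self.cache.append("b")   # task B assumes the cache exists

    def c(self):
        self.cache = None        # task C quietly invalidates it

app = App()
app.a(); app.b(); app.c()        # A, B, C: the scripted "happy path" passes

app2 = App()
app2.a(); app2.c()
try:
    app2.b()                     # A, C, B: the "unrelated" tasks collide
    outcome = "ok"
except AttributeError:
    outcome = "crash"
print("A, C, B ->", outcome)     # -> A, C, B -> crash
```

No requirement mentions the ordering, so no scripted test exercises it; a tester exploring odd sequences is who finds it.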

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken; this is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING
• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Dunkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have a Place on Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860

[Manual exploratory testing is more than wandering randomly through the application; the tester must start with a plan]

LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com.

observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate ongoing change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing coupled with reactive techniques like exploratory testing and prop-

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles, Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen
Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


erly-done automation works best.
What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element also have proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software to gain customer and product quality feedback, and to ultimately deliver high-value and working products/applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance," all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out: for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes this takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATEFirst agile methods stress automated testing of all kindsmdashnotsimply unit tests FitNesse Selenium and Watir Watin are

examples of open source automation tools that are widelyused in the agile community and that focus on non-unit testing

We do stress unit tests as being an important part of theoverall coverage and as being a safety net for the teamsrsquo refac-toring efforts We want developers to be responsible for con-tributing to test automationmdashnot just testers And if a team isusing TDD as a practice wersquove learned that developing testscan be a powerful aid as one designs their software in smallincremental steps

No agile team to my knowledge views unit tests as theend-all of testing In fact as part of delivering ldquodonerdquo softwareincrements they must decide on the set of tests (unit func-

20 bull Software Test amp Performance JULY 2009

[I expect testers to pair with

developers to improve practices anddevelopers

to deliver solid unit tests before they

can be considered lsquodonersquo]

tional acceptance non-functional regression etc) thatshould be run within the Sprint in order to meet a customer-ready deliverable Every team member should be definingthese testing tasks with the testers taking the lead

To Rexrsquos last point that programmers donrsquot commit to doingeffective unit testing I expect the testers to pair with the devel-opers to improve these practices and I expect developers todeliver solid unit tests before they can be considered ldquodonerdquo

MANAGING THE INCREASED REGRESSION RISKThere is a continuing misperception that unit tests are theonly tests run on a Sprint-by-Sprint basis While itrsquos true thatthe team might not be able to run a full regression test with-in each Sprint it is their responsibility to apply risk-basedtechniques based upon whatrsquos been added or changed andto run appropriate functional non-functional exploratory andregression tests as part of delivering ldquodonerdquo software at theend of the Sprint

Clearly automation development at all levels helps withthis so the team should be investing time in improving theiroverall ability to automate all of the testing types I mention asappropriate

POOR OR MISSING TEST ORACLESI hate to break the news to Rex but the best test oracles arethe people within the team not the documentation Documen-tation almost always loses its relevance and accuracy Yes itrsquoshelpful but should never be considered the single source forall knowledge

What agility encourages is for the team to talkmdashto plan their

the costs Such automated unit tests allow what agile propo-nents call refactoring Refactoring is the redesign of majorchunks of code or even entire objects The automated unit

tests provide for quick regression testing of refactored codeSome agilists recommend substituting automated unit tests fordesign as in Test Driven Development While this soundsgood in theory there are two problems that I have observedwith this approach in practice

First unit testing has limited bug-finding utility In researchpublished in June 2002 in CrossTalk the Journal of Defense

Software Engineering author Capers Jones found unit testingto be only 25 to 30 percent effective at finding and removingdefects In RBCS assessments we have found that good sys-tem testing by an independent test team averages around 85percent effective at finding defects So organizations that careabout quality should have both good unit testing and good sys-tem testing

Second while unit testing is often thought of as thedeveloperrsquos job I have found that many programmers justdonrsquot do it or deliver cursory attempts This can create com-plications and delays for the system test team in its role asldquobug finder of last resortrdquo as they deal with buggy code dur-ing short test execution periods in the sprints This can alsolead to code churn during testing since so much code mustchange to fix the bugs

The amount of change can ultimately outstrip even theability of the best automated regression test system to keepup which then leads to lower defect detection effectivenessfor the test team

MANAGING THE INCREASED REGRESSION RISKCapers Jones also reported that regression accounts forabout seven percent of bugs In iterative life cycles such asScrum however code that worked in previous sprints getschurned by new features in each subsequent sprint Thisincreases the risk of regression

Agilists emphasize good automated unit testing in part tomanage the regression risk inherent in such churn But with eventhe best unit testing topping out at about 50 percent defect

JULY 2009 wwwstpcollaborativecom bull 21

[While unit testing is often thought

of as the developerrsquos job I have found that

they just donrsquot do it or [they] deliver

cursory attempts]

22 bull Software Test amp Performance JULY 2009

work together to cluster in small groups working on specific fea-tures to work with the customer gaining valuable feedbackmdashandto collaborate across disciplines From this they can decidewhere when and to what degree documentation is necessary

We expect team members to be professionals and to beresponsible and accountable We trust their judgment overblindly filling in a plan template or a requirement document orblindly running the same tests over and over again to meetsome perceived risk

All oracles have a cost We engage the customers in thesedecisions as wellmdashensuring that those oracles that are providedare actually used and kept current

THE SHIFTING TEST BASISIrsquom lost here Agile teams define requirements Some usePost-it Notes and cards or define them as User StoriesOthers describe use cases or use traditional requirementforms Many even add a construct called acceptance tests ona finely grained story level that helps define the critical ele-ments of each feature or requirement These are things thatthe customer will ldquotestrdquo as part of their acceptance at the endof a Sprint or Iteration

Irsquod guess that Rex has seen poor or missing requirements inhis agile experience and Irsquod say those were not mature or solid-ly performing agile teams ldquoGoodrdquo teams define requirementsat a level that assures the team will build what the customerneeds They ensure collaboration surrounding each require-mentmdashat the point of attack

To capture a key point here The agile methods try to adopta Just-in-Time and Just Enough approach to their requirement

removal effectiveness according to Jones automated regressiontesting via unit tests will miss at least half of the regression bugs

POOR OR MISSING TEST ORACLESAgile favors working software over written documentationand in my observations special scorn is reserved for specifi-cations The suggestion by the Agile Manifesto (wwwagile-manifesto org) that people should value ldquoworking softwareover comprehensive documentationrdquo can create real chal-lenges for a test team when taken too far Testers userequirements specifications and other documents as testoracles ie the means to determine correct behavior undera given test condition We have seen testers in agile situa-tions given documents with insufficient detail and in somecases given no documents at all

Even when testers are given adequate documents twoother agile development principles keep the test oracle chal-lenge alive First agile requires teams to embrace changeand espouses that ldquothe most efficient and effective methodof conveying information to and within a development team isface-to-face conversationrdquo

From what Irsquove seen these two principles allow the proj-ect team to change the definition of correct behavior at anytime including after testers have created tests to confirm aparticular behavior and even after testers have reported bugsagainst a particular behavior

Further the definition of correct behavior can change in ameeting or a discussion where a tester is not present If thatchange is not communicated to the tester the project suffers

false positives when testers report bugs against behaviornow defined as correct

THE SHIFTING TEST BASISRequirements-based testing strategies cannot handle vague ormissing requirements specifications Missing requirementsspecifications would mean a test team following a requirements-based testing strategy not only canrsquot say what it means for aparticular test to pass or fail they wouldnrsquot have a test basisupon which to base their tests

The test basis also provides a means to measure theresults In a requirements-based test strategy testers canrsquotreport test results accurately if the requirements are missing orpoor Testers canrsquot report the percentage of the test basis cov-ered by passed tests because the requirements wonrsquot provideenough detail for meaningful coverage analysis

TOO MANY MEETINGSA ldquoface-to-face conversationrdquo is just another way to describe ameeting A client once jokingly described Scrum as a heavy-weight process saying ldquoIrsquom surprised at the name agilemdashitshould be called couch potato There are too many meetingsTherersquos too much jawboning I find it really ironic that there are allthese books explaining how simple it isrdquo

Perhaps it was intentional not to use the dreaded word meet-ing in the Agile Manifesto but the result is often the same In someorganizations everyone gets invited to every meeting meetingsballoon in number and duration and managers and leads are notavailable to manage and lead their team because they are in meet-

ings much of the day Effectiveness and efficiency drop Anotherclient once told me of a senior manager who said in response toa complaint from a line manager about how attending meetingswas negatively impacting his ability to lead his team ldquoWersquore goingto continue to have these meetings until I find out why nothing isgetting done around hererdquo The senior manager was I am told theonly one in the room who didnrsquot get the irony

THE BOTTOM LINEIn RBCS client engagements we typically recommend blend-ed testing strategies that align well with Scrum and otheragile methodologies What seems most likely to work in manycases is a blend of risk-based testing maintainable automat-ed testing (at both the unit and system level) and a percent-age of effort spent on reactive testing (eg exploratory) Insome cases our methods will help mitigate the testing risksand reduce the testing challenges associated with thesemethodologies

Clearly there is no single way to perform effective soft-ware testing and as the industry evolves so too will the mosteffective methods that do the most good yacute

development An option of ldquono requirementsrdquo does not exist

TOO MANY MEETINGSThe point of face-to-face collaboration is that itrsquos the most effec-tive and richest way to communicate in technical teams Has anydeveloper ever implemented a requirement had a tester test itand a customer receive it with perfect alignment That wouldtruly be exceptional More oftenthe tester will say the softwareperforms differently than the testsandor the customer says I didnrsquotask for that even though theywere all working off of the exactsame requirement

The point is that technicalteams need to rely on written aswell as face-to-face collabora-tion to effectively reduce themisunderstandings and reworkthat written requirements alone often drive

Beyond this agile teams should be conducting incrediblyeffective meetings something perhaps lacking in the exam-ples Rex has seen I suspect this problem isnrsquot agile-centricand that those teams simply need a good facilitator

THE BOTTOM LINEEffectively testing todayrsquos software is incredibly challenging andwe all need to stop thinking of agile testing as somethingopposed to traditional testing Testers need every possible tool

to perform testing well Agile methods have emerged as a set ofpractices and a mindset that is context based and powerful Theytake a holistic view to testing and qualitymdashreinforcing the two asa whole Testers in this model have a powerful opportunity to notonly test but to champion and guide the overall teamrsquos focus ondelivering quality software

We need to stop our history of opposition to new testing app-roaches and begin embracing abroader set of testing practicesthat work in each projectrsquos uniquecontext I also believe we that weneed to stop quoting CapersJones as I suspect his data andstudies do not reflect the dynam-ics of mature agile teams Irsquom look-ing forward to encouraging Rex totry and gain deeper understandingof agile testing practices yacute

Bob will have ample oppor-tunity to do that in October when he and Rex meet for thefirst time as fellow members of STampP CollaborativersquosStrategic Advisory Board ndashEd
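Both debaters lean on the same mechanism: automated unit tests that pin down current behavior so refactored code can be regression-tested quickly. As a concrete illustration, here is a minimal sketch of that kind of test, written with Python's standard unittest module as a close analogue of the JUnit framework Rex mentions. The apply_discount function is an invented stand-in for code under refactoring, not an example from either author.

```python
import unittest

def apply_discount(subtotal: float, percent: float) -> float:
    """Pure function under test; stands in for code being refactored."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(subtotal * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    """Pins down current behavior so a refactoring that changes it fails fast."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 15), 170.0)

    def test_zero_discount_is_identity(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 120)

if __name__ == "__main__":
    # exit=False keeps the runner from calling sys.exit, so this can be
    # embedded in a larger script or build step.
    unittest.main(argv=["refactoring-safety-net"], exit=False, verbosity=2)
```

If a refactoring changes the rounding or validation behavior, one of these pinned-down expectations fails immediately; that is the "quick regression testing of refactored code" Rex describes, and the "safety net for the team's refactoring efforts" Bob describes.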



LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those who can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is the role of the engineering people who do the coding work, and of management, says Bolton. "Testing is about providing information to those people so that better-informed decisions about quality are possible."

The future careers for the current crop of test engineers seem to be on the rise, particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of the year, in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available, and serve as internal tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job." Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."

Black agrees: "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, "in some cases agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. -Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. By doing this, the testers can learn which functional parts of the application are critical and support the business priorities.

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should be just one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework, and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.

David Strom is a freelance reporter and author living in St. Louis.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing." – James Bach


at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS

Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language), which define the services for client tools
• SOAP (Simple Object Access Protocol), a message format used to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as the Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
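To make the pieces concrete, here is what a minimal SOAP 1.1 request envelope might look like, sanity-checked in Python. This is an illustrative sketch only: the `getQuote` operation, the `urn:example:quote-service` namespace and the payload are invented, and a real service's WSDL would dictate them.

```python
import xml.etree.ElementTree as ET

# A minimal SOAP 1.1 request envelope for a hypothetical getQuote
# operation. The namespace, operation and payload are invented for
# illustration; a real service's WSDL would dictate them.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
soap_request = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getQuote xmlns="urn:example:quote-service">
      <symbol>IBM</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

# Sanity-check that the envelope is well-formed XML before it is ever
# posted to an endpoint (encode first: ElementTree rejects str input
# that carries an encoding declaration).
root = ET.fromstring(soap_request.encode("utf-8"))
body = root.find("{%s}Body" % SOAP_NS)
print(body[0].tag)  # → {urn:example:quote-service}getQuote
```

The operation element inside the Body is what the service dispatches on; everything outside it is standardized envelope plumbing.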

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS

To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, the test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within the test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that itself underlies a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS AND SUITES

WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates the messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
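The division of labor between the two tools can be sketched in miniature. The following Python sketch is not the WS-I implementation itself: the log format and the single sample test assertion (responses must carry a text/xml content type) are simplified inventions, but the pipeline shape matches: a non-intrusive interceptor logs each exchange, and a separate analyzer later checks the log and emits a report.

```python
# Illustrative sketch of the Monitor/Analyzer division of labor.
# Not the WS-I tool itself; the log format and the single sample
# test assertion below are simplified inventions.

message_log = []  # the Monitor's persistent message log, in miniature

def monitor(send, request):
    """Man-in-the-middle wrapper: forward the request, log both sides."""
    response = send(request)
    message_log.append({"request": request, "response": response})
    return response  # non-intrusive: the caller sees the same response

def analyze(log):
    """Check each captured exchange against one sample assertion:
    responses must carry a text/xml content type (a stand-in for a
    real profile requirement)."""
    report = []
    for i, entry in enumerate(log):
        ok = entry["response"].get("Content-Type", "").startswith("text/xml")
        report.append((i, "passed" if ok else "failed"))
    return report

# Simulated Web service endpoint standing in for the real provider.
def fake_service(request):
    return {"Content-Type": "text/xml", "body": "<soap:Envelope/>"}

monitor(fake_service, {"body": "<soap:Envelope/>"})
print(analyze(message_log))  # → [(0, 'passed')]
```

Separating capture from analysis is what lets the real tools run against live traffic without perturbing it, then evaluate the log offline against the full test assertion document.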

FIG. 1: WS-I TOOLS IN ACTION (Source: Web Services Interoperability Organization). The diagram shows the normal message flow between a requestor and a Web service, with the Monitor (an interceptor plus a logger, driven by a monitor config file) capturing the SOAP messages into a message log. The Analyzer, driven by an analyzer config file and the test assertion document, evaluates the message log along with the WSDL, XML schema and UDDI artifacts, and produces a conformance report.

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES

By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into the use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED

A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards, documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile) but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall, the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


FIG. 2: A NEW PHASE FOR TESTING (Source: Web Services Interoperability Organization). The diagram shows the Monitor's interceptor and logger capturing SOAP/XML message artifacts over HTTP between client code and the Web service, consolidating the messages and metadata (WSDL, XML schema) into a test log file, which the Analyzer evaluates to produce a test report.

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in or after a specific period of time, such as every hour, or it can simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
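In outline, one CI cycle is just checkout, build, test, stopping at the first failure. The sketch below is a minimal illustration of that control flow, with stand-in stages; a real CI server wraps such a loop with triggers and shells out to the version control system, build tool and test runner.

```python
# Minimal sketch of one CI cycle: run each stage in order and stop
# at the first failure. Real CI servers wrap this loop with triggers
# (every check-in, every hour, or continuously).

def ci_cycle(stages):
    """stages: ordered list of (name, zero-arg callable returning bool)."""
    for name, run in stages:
        if not run():
            return (name, "failed")  # report the stage that broke the build
    return (None, "passed")

# Stand-in stages; a real pipeline would invoke the VCS, the build
# tool and the automated test suites here.
stages = [
    ("checkout", lambda: True),
    ("build", lambda: True),
    ("automated tests", lambda: True),
]
print(ci_cycle(stages))  # → (None, 'passed')
```

The essential property is the same as in any CI tool: a failing stage halts the cycle and names the breakage, so the team finds out within one cycle of the offending check-in.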

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we prefer the term "test automation" to "automated testing" as the more accurate one. Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
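One turn of that cycle can be sketched in a few lines of Python. The leap-year example is invented for illustration: the test is written first and fails (red) until the function is implemented (green), and the passing suite then guards any later refactoring.

```python
# One turn of red-green-refactor, in miniature. The test exists
# before the implementation; it fails (red) until is_leap_year is
# written (green), and then guards future refactoring.

def is_leap_year(year):
    # Minimal implementation written to make the test pass.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def test_is_leap_year():
    assert is_leap_year(2008)      # divisible by 4
    assert not is_leap_year(1900)  # century year, not divisible by 400
    assert is_leap_year(2000)      # divisible by 400

test_is_leap_year()
print("green")  # the suite passes; safe to refactor
```

In practice the test would live in a framework such as JUnit or unittest so CI can run it on every check-in.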

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
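The difference in surface style shows up mostly in naming: where an xUnit test might be called testWithdraw3, a BDD expectation reads like a sentence about behavior. A hypothetical Python sketch (the account example and the function names are invented):

```python
# BDD-flavored sketch: expectations named in business language.
# The account example and names are invented for illustration.

class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

def an_account_in_credit_lets_the_customer_withdraw_cash():
    account = Account(balance=100)
    account.withdraw(30)
    assert account.balance == 70

def an_overdrawn_withdrawal_is_refused():
    account = Account(balance=10)
    try:
        account.withdraw(30)
        assert False, "expected the withdrawal to be refused"
    except ValueError:
        pass  # the expected behavior

# Each expectation reads as a sentence about behavior, not a unit label.
an_account_in_credit_lets_the_customer_withdraw_cash()
an_overdrawn_withdrawal_is_refused()
print("expectations met")
```

BDD frameworks add reporting and given/when/then structure on top, but the naming discipline alone already makes failures readable to non-programmers.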

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

(Thanks to Markus Gaertner for his contributions to this article.)

ST&Pedia





tests may serve a variety of purposes, there is no standard approach to implementing them. Often, a combination of tools must be employed to address each specific situation. Both commercial and free tools are available for the different types of testing, and each may be best suited to a particular type of application. In addition, some teams build custom frameworks using Ruby, Java, Perl, Python or another language to get the best match for their project.

MANUAL EXPLORATORY TESTING

There are two main types of defects: those under which the application fails to perform according to expectations (as defined by its requirements), and those that cause the application to behave improperly in response to an unexpected event. Automated tests can be effective at finding the former: for a given set of expected inputs, the application should produce a specific output. However, automation is not particularly good at finding the unexpected.

Consider, for example, an application that permits a user to perform tasks A, B and C. As the product was being developed, no one expected that A, B and C were in any way related. However, when the user performs these tasks in a specific order, the program crashes. It is unlikely that the functional requirements include a statement like "The system shall not crash when the user performs A, then C, then B." As a result, it is unlikely that an automated test exists to simulate the situation. This is where manual exploratory testing can be beneficial.
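A toy sketch shows how such a defect hides from per-task automated checks. The tasks and the shared-state fault below are invented for illustration: each task works in isolation and in the scripted order, but the untested sequence A, C, B crashes.

```python
# Toy illustration: tasks A, B and C each pass their own automated
# checks and the scripted A-B-C order, but the untested sequence
# A, C, B crashes. The shared-state fault is invented.

class App:
    def __init__(self):
        self.cache = {}

    def task_a(self):
        self.cache["session"] = {"user": "alice"}

    def task_b(self):
        # Latent fault: assumes task C has not cleared the cache.
        return self.cache["session"]["user"]

    def task_c(self):
        self.cache.clear()

app = App()
app.task_a(); app.task_b(); app.task_c()  # scripted order: all fine

app = App()
app.task_a(); app.task_c()
try:
    app.task_b()                           # A, then C, then B: crash
except KeyError as crash:
    print("crashed:", crash)
```

No per-task assertion would have flagged this; it takes someone asking "what if the user does C before B?" to stumble into the failing sequence.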

Manual exploratory testing gives the tester an opportunity to truly demonstrate their value to a project. Testers tend to be good at coming up with unusual "what if" scenarios that represent what actual users will do with the application in production environments. These might be long series of tasks, odd combinations of events, or prolonged use of the program. Often, these scenarios are too difficult or costly to automate.

Manual exploratory testing is more than the tester wandering randomly through the application. To be done effectively, the tester must start with a plan. He needs to isolate a section of the program to explore, and time-box the exploration. Once the tester begins the exploration, he should create a record of the actions taken; this is crucial to the ability to recreate any problems found. Screen recording tools also might prove useful for documenting exactly what the tester does during an exploratory testing session. When flaws are found during this process, new acceptance or regression tests should be written (and automated) to expose the bad behavior, provide a verification that it has been corrected, and ensure that it does not return.

For software testing to be successful in an agile development environment, a four-part strategy comprising complementary approaches has given me the best results:

Automated unit testing puts the onus of removing obvious errors on developers, where it belongs, and results in higher initial quality delivered to testers.

Automated acceptance testing solves the critical problem of engaging stakeholders to identify key functionality and to define what it means for the program to work correctly, and provides confidence that the core functionality is being properly implemented.

Automated regression testing provides thorough coverage, ensures that the application meets functional and non-functional requirements, and guarantees that functionality, once working, is not disabled in later iterations.

Manual exploratory testing exposes unexpected behavior by letting testers exercise the application in ways too difficult or costly to automate.

This four-phase approach to testing enables a project team to effectively and thoroughly determine an application's readiness for release. Consistent with agile practices, testing begins early in the software development life cycle, and it is done constantly throughout the project. Every team member participates in some way in the testing effort, and the mixture of testing types minimizes the risks associated with any single technique. By automating what can be automated, it becomes possible to provide testers with sufficient time to fully apply their unique skills and experiences.

REFERENCES AND FURTHER READING

• Cunningham, Ward. "Framework for Integrated Test." October 12, 2007. http://fit.c2.com
• Donkin, Robert Burrell. "The Language of Testing: Dummies, Mocks, Fakes and Stubs." Weirdest Undreamt Use Case, November 29, 2008. http://www.jroller.com/robertburrelldonkin/entry/tdd_mocks_fakes_stubs
• FitNesse. "Front Page." http://www.fitnesse.org
• Fowler, Martin. "Mocks Aren't Stubs." January 2, 2007. http://www.martinfowler.com/articles/mocksArentStubs.html
• jUnit.org. "Resources for Test Driven Development." http://www.junit.org
• Porter, Robert. "Mock Objects in Test Driven Design." BlogOfBob, October 7, 2007. http://www.rp2c.com/blogofbob/MockObjectsInTestDrivenDesign.aspx
• Rothman, Johanna. "Does Exploratory Testing Have a Place on Agile Teams?" StickyMinds, July 7, 2008. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=COL&ObjectId=13860

[Manual exploratory testing is more than wandering randomly through the application... the tester must start with a plan]

LEARN TEST AUTOMATION from Robert Walsh at the Software Test & Performance Conference this fall. He'll be with us in Cambridge, MA, Oct. 19-23, teaching "First Steps to Test Automation," an in-depth course that can automate your testing fast. Visit www.stpcon.com.

observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed, or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact actually arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE

One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and prop-

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles, Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE

The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen

Bob Galen is principal consultant of RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


erly-done automation, works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING

In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element have also proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE

Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING

If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value and working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out: for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge, and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE

First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage, and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, func-


[I expect testers to pair with developers to improve practices, and developers to deliver solid unit tests, before they can be considered 'done']

tional acceptance non-functional regression etc) thatshould be run within the Sprint in order to meet a customer-ready deliverable Every team member should be definingthese testing tasks with the testers taking the lead

To Rexrsquos last point that programmers donrsquot commit to doingeffective unit testing I expect the testers to pair with the devel-opers to improve these practices and I expect developers todeliver solid unit tests before they can be considered ldquodonerdquo

MANAGING THE INCREASED REGRESSION RISKThere is a continuing misperception that unit tests are theonly tests run on a Sprint-by-Sprint basis While itrsquos true thatthe team might not be able to run a full regression test with-in each Sprint it is their responsibility to apply risk-basedtechniques based upon whatrsquos been added or changed andto run appropriate functional non-functional exploratory andregression tests as part of delivering ldquodonerdquo software at theend of the Sprint

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES

I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.
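As a rough back-of-the-envelope illustration (my arithmetic, not Jones's or RBCS's, and it assumes the two stages catch defects independently), the two effectiveness figures compose like this:

```python
def combined_effectiveness(*stages):
    """Fraction of defects removed by a series of stages, assuming each stage
    catches its stated fraction of whatever escaped the stages before it."""
    escaped = 1.0
    for effectiveness in stages:
        escaped *= (1.0 - effectiveness)  # fraction still slipping through
    return 1.0 - escaped

# Unit testing at ~30 percent plus independent system testing at ~85 percent:
print(round(combined_effectiveness(0.30, 0.85), 3))  # 0.895
```

Under that independence assumption, the two together would remove roughly 90 percent of defects, which is why the argument is for both, not either one alone.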

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK

Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it, or [they] deliver cursory attempts]


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals, and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS

I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases, or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES

Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS

Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail, they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.
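A sketch of what that measurement looks like when a test basis does exist (the requirement IDs and the pass/fail mapping here are hypothetical, invented for illustration):

```python
def basis_coverage(basis):
    """basis: dict mapping requirement ID -> list of 'pass'/'fail' test outcomes.
    Returns the percentage of the test basis covered by passed tests."""
    if not basis:
        raise ValueError("no test basis: coverage is undefined")
    covered = sum(
        1 for outcomes in basis.values()
        if outcomes and all(o == "pass" for o in outcomes)
    )
    return 100.0 * covered / len(basis)

results = {
    "REQ-1": ["pass", "pass"],  # fully covered by passed tests
    "REQ-2": ["pass", "fail"],  # has a failing test: not counted as covered
    "REQ-3": [],                # no test traces to it at all
}
print(round(basis_coverage(results), 1))  # 33.3
```

The point of the argument is the denominator: with no requirements to enumerate, there is nothing meaningful to divide by, and the metric disappears.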

TOO MANY MEETINGS

A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying: "I'm surprised at the name agile. It should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE

In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.
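A minimal sketch of the risk-based piece of such a blend (the scoring scheme and feature names are my illustrative assumptions, not RBCS's actual method): score each test area by likelihood and impact, then spend test effort highest-risk first.

```python
def prioritize(areas):
    """areas: list of (name, likelihood 1-5, impact 1-5) tuples.
    Returns the areas ordered by risk score (likelihood * impact), highest first."""
    return sorted(areas, key=lambda a: a[1] * a[2], reverse=True)

areas = [
    ("checkout", 4, 5),   # churned this sprint, revenue-critical
    ("reporting", 2, 3),  # stable code, low business impact
    ("login", 3, 5),      # unchanged, but failures are severe
]
print([name for name, _, _ in prioritize(areas)])
# ['checkout', 'login', 'reporting']
```

The automated regression suite and the reactive/exploratory budget then get allocated down that list until the time box runs out.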

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS

The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE

Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context based and powerful. They take a holistic view to testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. - Ed


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

ciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. The software tester job isn't something that shows up on the US Census. "We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity and, instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people, so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise,

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple dis-

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

We're All Part of the Story

"We are a new profession; most of our first programmers are still alive" - Scott Barber

"A skilled tester is the product of a variety of experiences" - Jon Bach


the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance

engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available, and serve as internal

tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING

The move toward cloud computing has helped software testing overall, but also made for some new challenges too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people take latency for granted still, and many

application developers still take the entire network infrastructure for granted, and don't consider it part of their domain,

so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

At Intuit, "QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year, in terms of their skill-sets and

"Virtualization makes things easier and harder" - Michael Bolton



for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel.

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN

Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


"Thanks to the cloud, we are able to test in production" - Dan Bartow

"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks" - David Saff

Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION

Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases "agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. - Ed)

SCHOOL DAZE

What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This

gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about

"Get used to programming for a lot of CPUs" - Kent Beck


rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX

Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments

that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber

agrees. "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION

Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework, and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year" - Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing" - James Bach


best. Web services testing is complex, and it's vastly different than testing traditional software applications.

Properly designed and deployed Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a

primer of those resources, and how to use them to test Web services being developed in your organization.

DEFINITIONS

Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language) define the services for client tools
• SOAP (Simple Object Access Protocol), a message format to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications

JULY 2009 • www.stpcollaborative.com • 31

By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
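The nuts and bolts above can be made concrete with a small sketch. The envelope namespace below is SOAP 1.1's actual one; the GetQuote operation, its http://example.com/stocks namespace and the stock symbol are invented for illustration.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# A minimal SOAP 1.1 request invoking a hypothetical GetQuote operation.
message = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.com/stocks">
      <symbol>IBM</symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

root = ET.fromstring(message)
# The envelope and body are qualified by the SOAP namespace.
assert root.tag == "{%s}Envelope" % SOAP_NS
body = root.find("{%s}Body" % SOAP_NS)
symbol = body.find(
    "{http://example.com/stocks}GetQuote/{http://example.com/stocks}symbol")
print(symbol.text)  # the payload the service would act on
```

The contract that tells a client to expect exactly this shape of message is the WSDL; profile conformance testing checks, among other things, that messages on the wire match such contracts.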

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that itself underlies a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
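The Monitor's pass-through design can be sketched in a few lines. The class and echo service below are illustrative stand-ins, not the WS-I tools' actual interfaces: the point is that the interceptor forwards each request unchanged while the logger records both sides of the exchange for later, offline analysis.

```python
# Sketch of the monitor idea: an interceptor that forwards each request
# to the real service unchanged, while recording the exchange for a
# separate analysis step. Names here are invented for illustration.

class Monitor:
    def __init__(self, service):
        self.service = service   # callable standing in for the real endpoint
        self.log = []            # captured request/response pairs

    def __call__(self, request):
        response = self.service(request)      # non-intrusive pass-through
        self.log.append((request, response))  # capture for the Analyzer
        return response

def echo_service(request):
    # Trivial stand-in for a deployed Web service.
    return "<ack>%s</ack>" % request

monitor = Monitor(echo_service)
monitor("<ping/>")
print(monitor.log)  # [('<ping/>', '<ack><ping/></ack>')]
```

Because the Monitor never alters traffic, the requestor and service behave exactly as they would in production, which is what makes the approach non-intrusive.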

[FIG. 1: WS-I TOOLS IN ACTION. The Monitor's Interceptor sits in the normal message flow between requestor and Web service, and its Logger captures the SOAP messages to a Message Log; the Analyzer then evaluates the log, along with the WSDL, XML schema and UDDI entries, against the Test Assertion Document to produce a Conformance Report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
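The analyzer-plus-assertion-document structure can be sketched as follows. The assertion IDs and checks below are invented for illustration, not taken from a real WS-I test assertion document; only the first check's namespace string is SOAP 1.1's actual envelope namespace.

```python
# Illustrative sketch: an analyzer walks each logged message past a list
# of test assertions and emits a pass/fail report. Assertion IDs and
# rules are made up for this example.

def envelope_is_soap11(msg):
    return "http://schemas.xmlsoap.org/soap/envelope/" in msg

assertions = [
    ("TA0001", "Message uses the SOAP 1.1 envelope namespace", envelope_is_soap11),
    ("TA0002", "Message is not empty", lambda msg: bool(msg.strip())),
]

def analyze(log):
    report = []
    for msg in log:
        for aid, desc, check in assertions:
            report.append((aid, "passed" if check(msg) else "failed"))
    return report

log = ['<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"/>']
print(analyze(log))  # [('TA0001', 'passed'), ('TA0002', 'passed')]
```

As in the real tools, a "failed" entry here would not mean the service is broken, only that the captured artifact deviates from what the profile prescribes.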

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, as defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). The testing approach used for the latest WS-I profiles differs from the one used in the past, although "overall, the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture SOAP/XML messages over HTTP between client code and the Web service into a Test Log file, along with message artifacts and metadata (WSDL, XML schema); the Analyzer then evaluates the log to produce a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time (such as every hour), or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
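The checkout-build-test cycle described above can be sketched as a tiny function. The three steps are injected as callables so the loop itself stays tool-agnostic; the stand-in lambdas are purely illustrative, and real CI servers add scheduling, notification and history on top of this core cycle.

```python
# Bare-bones sketch of one CI cycle: check out, build, run the suites.
# Each step is a callable so any version-control or build tool can be
# plugged in; these stand-ins are for illustration only.

def ci_cycle(checkout, build, run_tests):
    checkout()
    if not build():
        return "build failed"      # stop early: nothing to test
    failures = run_tests()         # number of failing tests
    return "green" if failures == 0 else "%d test(s) failed" % failures

result = ci_cycle(
    checkout=lambda: None,   # e.g., a call out to the VCS
    build=lambda: True,      # e.g., compile; True means success
    run_tests=lambda: 0,     # e.g., run the automated suites
)
print(result)  # green
```

Looping this function after every check-in, or on a timer, is the "constantly" in the definition above.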

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
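The cycle reads well in miniature, with Python's unittest standing in for the JUnit-style frameworks mentioned above; the leap-year rule is a stock teaching example, not from this article.

```python
# TDD in miniature: the expectations below were (in the narrative of the
# cycle) written first and run against a stub that failed; the function
# is the code "improved until the test passes."
import unittest

def leap_year(year):
    # Divisible by 4, except centuries, except every 400 years.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTest(unittest.TestCase):
    def test_divisible_by_four_is_leap(self):
        self.assertTrue(leap_year(2008))

    def test_century_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_every_four_hundred_years_is_leap(self):
        self.assertTrue(leap_year(2000))

suite = unittest.defaultTestLoader.loadTestsFromTestCase(LeapYearTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Each new requirement (say, the century exception) arrives as a failing test first, which is what keeps the design driven by the tests rather than the other way around.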

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations," and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
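The expectation style can be sketched without any framework at all: each check reads as a plain-English statement about behavior rather than a method-by-method unit check. The account example is invented; dedicated BDD tools layer given/when/then structure on top of this idea.

```python
# BDD-flavored sketch: the test name states the expected behavior in
# business terms. The Account class is a hypothetical domain object.

class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

def an_account_cannot_be_overdrawn():
    account = Account(balance=10)
    try:
        account.withdraw(20)
        return False                     # expectation violated
    except ValueError:
        return account.balance == 10     # balance untouched on refusal

print(an_account_cannot_be_overdrawn())  # True
```

Compare the function name with a typical xUnit name like test_withdraw_2: the former tells a business reader what the system promises, which is the point of the style.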

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.






observed with clients using agile methodologies, typically Scrum. While my associates and I have helped clients overcome these challenges, I will not present all of our suggested solutions here. Instead, Software Test & Performance magazine has given test consultant and certified Scrum Master Bob Galen an opportunity to offer solutions.

What I describe here are challenges we have observed, or discussed with clients who told us they were following an agile life cycle. Some of these challenges may be inherent in agile methodologies. However, some may in fact arise from improper understanding or implementation of the chosen methodology, resulting in avoidable problems.

VOLUME AND SPEED OF CHANGE
One of the principles of agile development is that project teams should welcome changing requirements, even late in development. Many testing strategies, especially analytical requirements-based testing, become quite inefficient in such situations. While automated testing can help accommodate on-going change, automation can be fragile and high-maintenance if not implemented properly. We find that risk-based testing, coupled with reactive techniques like exploratory testing and prop-

By Rex Black

Every software development model presents its own testing implications. Some ease the testing process; others make it more challenging. What follows is a look at some challenges I have

Rex Black is founder and president of RBCS, a software, hardware and systems consultancy, and is a founding member of the STP Collaborative Strategic Advisory Board.


The Agile Process Creates Testing Puzzles... Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


erly-done automation, works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints: two-to-four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element have also proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out; for example, appropriate test automation development, or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

It sometimes takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge, and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium, and Watir and WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage, and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, func-


[I expect testers to pair with developers to improve practices, and developers to deliver solid unit tests before they can be considered 'done.']

tional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques, based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.
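The risk-based, change-driven selection described above can be sketched as a mapping from tests to the modules they exercise. The suite, module names and mapping below are invented for illustration; a real team would derive them from coverage data or architectural knowledge.

```python
# Sketch of risk-based regression selection: rather than running the
# full suite every Sprint, pick the tests that touch modules changed
# in this Sprint. Suite contents are hypothetical.

suite = {
    "test_login":    {"auth"},
    "test_checkout": {"cart", "payment"},
    "test_search":   {"catalog"},
}

def select_regression_tests(changed_modules):
    # A test is selected if it exercises any changed module.
    return sorted(name for name, touches in suite.items()
                  if touches & changed_modules)

print(select_regression_tests({"payment", "auth"}))
# ['test_checkout', 'test_login']
```

The full suite still runs periodically; the selection only decides what must run inside the Sprint to call the increment "done."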

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it, or (they) deliver cursory attempts.]


work together, to cluster in small groups working on specific features, to work with the customer, gaining valuable feedback, and to collaborate across disciplines. From this, they can decide where, when, and to what degree documentation is necessary.

We expect team members to be professionals, and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirements document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases, or use traditional requirement forms. Many even add a construct called acceptance tests, at a finely grained, story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a just-in-time and just-enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean that a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word "meeting" in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting; meetings balloon in number and duration, and managers and leads are not available to manage and lead their team, because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.
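The "maintainable automated testing" leg of such a blend can be sketched in a few lines. The example below is illustrative only: the `discount` function and its business rules are hypothetical, invented here to show how one small, clearly named test per documented behavior keeps a suite maintainable when the definition of correct behavior shifts.

```python
import unittest


def discount(subtotal, customer_years):
    """Hypothetical business rule: 5% off after 2 years, 10% after 5."""
    if subtotal < 0:
        raise ValueError("subtotal must be non-negative")
    if customer_years >= 5:
        return round(subtotal * 0.90, 2)
    if customer_years >= 2:
        return round(subtotal * 0.95, 2)
    return subtotal


class DiscountTest(unittest.TestCase):
    # One behavior per test: when a requirement changes, a single
    # clearly named test fails and gets updated, not the whole suite.
    def test_new_customer_pays_full_price(self):
        self.assertEqual(discount(100.0, 0), 100.0)

    def test_two_year_customer_gets_five_percent(self):
        self.assertEqual(discount(100.0, 2), 95.0)

    def test_five_year_customer_gets_ten_percent(self):
        self.assertEqual(discount(100.0, 5), 90.0)

    def test_negative_subtotal_is_rejected(self):
        with self.assertRaises(ValueError):
            discount(-1.0, 3)


if __name__ == "__main__":
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTest)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

Risk-based prioritization then decides which behaviors earn a test first, while exploratory sessions cover what the suite cannot anticipate.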

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context based and powerful. They take a holistic view to testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.

JULY 2009 • www.stpcollaborative.com • 23


LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is "the role of the engineering people who do the coding work, and of management," says Bolton. "Testing is about providing information to those people so that better-informed decisions about quality are possible."

The future careers for the current crop of test engineers seem to be on the rise, particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. "Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but also made for some new challenges too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still," and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls such as sliders need to respond quickly to give them a desktop feel."

David Strom is a freelance reporter and author living in St. Louis.

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job." Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."



Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training" because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, "in some cases agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical, support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework, and who has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap, single-threaded performance improvements, where we can just buy our way out of problems, coming to an end. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."
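The testing challenge Beck describes can be made concrete with a small sketch (the shared counter below is hypothetical, not code from any product discussed here). Without the lock, the interleaving of threads is exactly the kind of permutation a tester cannot enumerate; with it, the outcome is deterministic and assertable.

```python
import threading


class Counter:
    """Hypothetical shared state; the lock makes increments atomic."""

    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:  # remove this lock and the final count may drop
            self.value += 1


def hammer(counter, n_threads=8, per_thread=10_000):
    """Drive the counter from many threads, as a multi-CPU test would."""
    threads = [
        threading.Thread(
            target=lambda: [counter.increment() for _ in range(per_thread)]
        )
        for _ in range(n_threads)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter.value


if __name__ == "__main__":
    total = hammer(Counter())
    # Deterministic only because of the lock.
    print("final count:", total)
```

A test can then assert the exact final count, something impossible to guarantee for the unsynchronized version.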

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language), which define the services for client tools
• SOAP (Simple Object Access Protocol), a message format to invoke a service
• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
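To make those nuts and bolts concrete, here is a minimal sketch of the kind of SOAP 1.1 message they describe, built with Python's standard library. The `GetQuote` operation and its `urn:example:stockquote` namespace are hypothetical, invented for illustration; only the SOAP envelope namespace is real.

```python
import xml.etree.ElementTree as ET

# Real SOAP 1.1 envelope namespace.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace, for illustration only.
SVC_NS = "urn:example:stockquote"

ET.register_namespace("soap", SOAP_NS)


def build_get_quote(symbol):
    """Build a SOAP envelope invoking a hypothetical GetQuote operation."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetQuote")
    ET.SubElement(op, f"{{{SVC_NS}}}Symbol").text = symbol
    return ET.tostring(envelope, encoding="unicode")


if __name__ == "__main__":
    print(build_get_quote("IBM"))
```

A WSDL for such a service would declare the `GetQuote` operation and the schema of its `Symbol` element, which is exactly the contract client tools consume.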

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
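The Monitor's non-intrusive role can be sketched conceptually: an interceptor sits in the normal message flow, passes traffic through untouched, and appends each exchange to a log for the Analyzer to evaluate offline. This is an illustrative sketch of the man-in-the-middle idea, not the WS-I tool itself; `send` stands in for whatever transport the client actually uses.

```python
def make_interceptor(send, log):
    """Wrap a transport callable, recording each exchange untouched."""
    def intercepted(request):
        response = send(request)          # normal message flow continues
        log.append({"request": request,   # captured for later analysis
                    "response": response})
        return response
    return intercepted


if __name__ == "__main__":
    # A fake endpoint standing in for a real Web service transport.
    def fake_service(request):
        return "<soap:Envelope><soap:Body/></soap:Envelope>"

    message_log = []
    send = make_interceptor(fake_service, message_log)
    send("<soap:Envelope><soap:Body>...</soap:Body></soap:Envelope>")
    # An "Analyzer" could now inspect message_log offline.
    print(len(message_log), "message(s) captured")
```

Because the wrapper returns the response unchanged, the system under test behaves exactly as it would without the monitor, which is the point of the non-intrusive design.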

[FIG. 1: WS-I TOOLS IN ACTION. The Interceptor sits in the normal message flow between requestor and Web service; the Logger records the SOAP messages to a message log; and the Analyzer evaluates that log, along with WSDL, XML schema and UDDI artifacts, against the test assertion document to produce a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors includeextra features to differ-entiate themselves butoften they do not differ-entiate between what is

interoperable and what is proprietary intheir development tooling Developers stillneed to know whether their systems willinteroperate securely and consistently

WS-I test tools let developers makeinformed decisions about their systemsndash the tools can capture the messagesbetween provider and consumer forexample and raise a flag if any require-ments have not been met Again such aflag doesnrsquot necessarily mean the Webservice is ldquobrokenrdquo but rather that itdoes not conform to WS-I Profiles WS-I test tools donrsquot deliver value state-ments on commercial products
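The capture-then-check flow these tools implement can be sketched in miniature: each captured message is evaluated against a list of profile test assertions, and deviations are reported rather than treated as fatal. The assertion ID, message fields and values below are invented for illustration; they are not actual WS-I artifacts.

```python
# Sketch of an analyzer pass over captured messages. Assertion IDs and
# message fields are invented; a real Analyzer works from the XML test
# assertion document and a Test Log of captured SOAP messages.

def check_profile(messages, assertions):
    """Return (message id, assertion id) pairs for failed assertions."""
    report = []
    for msg in messages:
        for assertion in assertions:
            if not assertion["check"](msg):
                report.append((msg["id"], assertion["id"]))
    return report

SOAP11_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# Invented assertion: the message must use the SOAP 1.1 envelope namespace.
assertions = [{"id": "BP-ENV", "check": lambda m: m["envelope_ns"] == SOAP11_NS}]

messages = [
    {"id": 1, "envelope_ns": SOAP11_NS},       # conforms
    {"id": 2, "envelope_ns": "urn:other-ns"},  # deviates; flagged, not "broken"
]

print(check_profile(messages, assertions))  # [(2, 'BP-ENV')]
```

As in the real tools, a non-empty report signals deviation from the profile, not a defect verdict on the service itself.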

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web services are encouraged to join WS-I. For more information, go to www.ws-i.org.

www.stpcollaborative.com • 33

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


FIG. 2: A NEW PHASE FOR TESTING. (Diagram: an Interceptor and Logger form a Monitor that captures SOAP/XML message artifacts and metadata, such as WSDL and XML Schema definitions, exchanged over HTTP between client code and the Web service; the captured artifacts go into the Test Log file, which the Analyzer evaluates to produce a Test Report. Source: Web Services Interoperability Organization.)

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in or after a specific period of time, such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
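That checkout-build-test loop can be sketched in a few lines; the step commands below are placeholders for the real VCS, build tool and test runner, and a CI server adds check-in triggers, scheduling, notification and history on top of this core.

```python
# Skeleton of one CI cycle: checkout, build, then the automated test
# suites, stopping at the first failing step. Step commands here are
# stand-ins, not real tool invocations.
import subprocess
import sys

def ci_cycle(steps):
    """Run each (name, argv) step in order; report the first failure."""
    for name, argv in steps:
        if subprocess.run(argv).returncode != 0:
            return f"FAILED at {name}"
    return "PASSED"

ok = [sys.executable, "-c", "pass"]  # placeholder command that succeeds
steps = [("checkout", ok), ("build", ok), ("test", ok)]
print(ci_cycle(steps))  # PASSED
```

Running this after every check-in, on a timer, or in a continuous restart loop gives the three triggering styles described above.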

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: Exploratory testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well exploratory testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD: Test-Driven Development. A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
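In miniature, the cycle looks like this; the slugify function is invented for illustration, and in practice an xUnit-style framework runs the tests.

```python
# TDD sketch: the expectation is written as a failing test first, then
# the code is improved until it passes.

def test_slugify_replaces_spaces_with_hyphens():
    # Written first: run it, watch it fail, then make it pass.
    assert slugify("Agile Testing") == "agile-testing"

def slugify(title):
    # Written second: just enough code to satisfy the test above.
    return title.lower().replace(" ", "-")

test_slugify_replaces_spaces_with_hyphens()
print("test passes")
```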

ATDD: Acceptance Test-Driven Development. A business-level implementation of TDD in which part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.
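As a sketch of the idea, customer-supplied examples for an invented discount rule can be executed directly against the business rule, beneath any GUI, which is the approach described above as tending to succeed.

```python
# ATDD sketch: examples agreed on with the customer become automated
# acceptance tests run directly against the business rule. The discount
# rule and the figures are invented for illustration.

def discount(order_total):
    """Business rule under test: 10 percent off orders of $100 or more."""
    return round(order_total * 0.10, 2) if order_total >= 100 else 0.0

# Customer-supplied examples, captured before the code was written.
examples = [
    (50.00, 0.0),    # below the threshold: no discount
    (100.00, 10.0),  # at the threshold: 10 percent off
    (250.00, 25.0),
]

for total, expected in examples:
    assert discount(total) == expected, f"failed for {total}"
print("all acceptance examples pass")
```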

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD: Behavior-Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain-Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
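A minimal sketch of the style, with an invented shopping-cart domain, shows checks phrased as behavioral expectations in business terms rather than as method-by-method tests.

```python
# BDD-flavored sketch: each function reads as an expectation about
# behavior. The cart domain is invented for illustration.

class Cart:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

def it_starts_empty():
    assert Cart().items == []

def it_holds_what_the_shopper_adds():
    cart = Cart()
    cart.add("book")
    assert "book" in cart.items

for expectation in (it_starts_empty, it_holds_what_the_shopper_adds):
    expectation()
print("all expectations met")
```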

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term "developer test," as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.
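A customer test expressed in table format and run underneath the GUI might be sketched like this; the shipping rule and the table values are invented for illustration.

```python
# Customer-readable decision table executed beneath the GUI. Each row
# is an example the customer can read and verify.

def shipping_cost(weight_kg, express):
    """Rule under test: $5 base plus $2 per kg, doubled for express."""
    cost = 5 + 2 * weight_kg
    return cost * 2 if express else cost

# | weight (kg) | express? | expected cost ($) |
table = [
    (1, False, 7),
    (3, False, 11),
    (1, True, 14),
]

for weight, express, expected in table:
    assert shipping_cost(weight, express) == expected
print("customer table passes")
```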

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

–Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia





Which Are Solved When Agile Is Performed Correctly

of traditional testing consultants. I don't believe they were gathered from the perspective of an actual agile tester, one with serious experience working in mature and high-performing agile teams.

Such views often come from the perspective of a scribe or observer of teams who, at least from my perspective, aren't practicing agility very well. Or they're anecdotal reactions to the Agile Manifesto or other writings, without trying to understand the underlying intentions. What's discouraging is that superficial reactions from well-respected sources can give the wrong impression.

VOLUME AND SPEED OF CHANGE
The point of embracing change is the realization that most software projects are volatile. Business conditions are incredibly dynamic. Customers may not actually know what they need until they see the application start to coalesce as working software. The developers may not have understood the challenges associated with the architecture without actually building parts of it, nor might testers understand the most risk-based areas to test without seeing the application run.

The point of trying to freeze requirements at a historical instant in time, where things are well understood without writing a lick of software, is a fool's errand. Agile methods try to embrace change, react to it,

I was asked by Software Test & Performance magazine to provide a rebuttal to Rex Black's perspectives on how "agile challenges testing." Rex's views and reactions to agility are typical

By Bob Galen

Bob Galen is principal consultant of the RGalen Consulting Group and a founding member of the STP Collaborative Strategic Advisory Board.


erly-done automation works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; e.g., Scrum uses sprints, two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element have also proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out, for example appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/WatiN are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs software in small, incremental steps.

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, functional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

[I expect testers to pair with developers to improve practices and developers to deliver solid unit tests before they can be considered 'done.']

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it or [they] deliver cursory attempts.]


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.
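As a deliberately simplified sketch of what a test oracle is in this sense, a written requirement can be reduced to a predicate that determines correct behavior for any test condition; the free-shipping rule and both functions below are invented for illustration.

```python
# A test oracle in miniature: a (hypothetical) spec clause, "orders of
# $50 or more ship free," reduced to a predicate. If the clause changes
# without the tester being told, the oracle drifts and the tester
# reports false positives, as described above.

def oracle_free_shipping(order_total):
    """Expected behavior, derived from the requirements document."""
    return order_total >= 50

def system_under_test(order_total):
    # Stand-in for the deployed behavior the tester is checking.
    return order_total >= 50

for total in (10, 49.99, 50, 120):
    assert system_under_test(total) == oracle_free_shipping(total), total
print("observed behavior matches the oracle")
```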

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean that a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail, they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name 'agile'; it should be called 'couch potato.' There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word "meeting" in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it, with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity not only to test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework.]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise,

We're All Part of the Story

"We are a new profession; most of our first programmers are still alive." –Scott Barber

"A skilled tester is the product of a variety of experiences." –Jon Bach


the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain.

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people still take latency for granted, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

At Intuit, "QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and

Virtualizationmakes things

easierand harder

ndash Michael Bolton

wwwstpcollaborativecom bull 25

of the Story

for applications such as our TaxCaster [tax estimation tool]generates a server-side request and the interface controlssuch as sliders need to respond quickly to give them a desk-top feel

Having more applications in the cloud increases the difficulty of forecasting overall performance, as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex, with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


"Thanks to the cloud, we are able to test in production." – Dan Bartow

"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks." – David Saff

Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here, between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product, too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical, and support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams, or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

"Get used to programming for a lot of CPUs." – Kent Beck

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees. "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should be just one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework, and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year." – Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing." – James Bach

Testing for Conformance With Web Services Interoperability Specifications
By Chris Ferris

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications; exchange information securely across and beyond the enterprise; integrate existing applications in loose couplings that are both scalable and easily extensible; and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources, and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:
• WSDLs (Web Services Description Language) define the services for client tools.
• SOAP (Simple Object Access Protocol) is a message format to invoke a service.
• XML (Extensible Markup Language) is the lingua franca of the Internet.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
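To make those nuts and bolts concrete, here is a minimal sketch in Python of the kind of SOAP 1.1 envelope such a contract describes. The `GetQuote` operation and its namespace are invented for illustration; in a real system the operation name and types would come from the service's WSDL.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Invented for illustration; a real operation namespace comes from the WSDL.
BODY_NS = "http://example.com/stockquote"

def build_envelope(symbol: str) -> str:
    """Build a minimal SOAP 1.1 request envelope as a string."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{BODY_NS}}}GetQuote")
    ET.SubElement(op, f"{{{BODY_NS}}}Symbol").text = symbol
    return ET.tostring(envelope, encoding="unicode")

def extract_symbol(xml_text: str) -> str:
    """Parse an envelope and pull the requested symbol back out."""
    root = ET.fromstring(xml_text)
    return root.find(f".//{{{BODY_NS}}}Symbol").text

if __name__ == "__main__":
    msg = build_envelope("INTU")
    print(extract_symbol(msg))  # INTU
```

The point of the machine-readable contract is exactly this symmetry: any platform that can produce and parse such an envelope can participate, regardless of language.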

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
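That capture-then-analyze split can be sketched in miniature. Everything below is a simplified stand-in, not the real WS-I tools: the `Monitor` just appends each message to a log, and the single invented "assertion" checks only that a message is a SOAP envelope with a Body (the real Analyzer runs hundreds of profile assertions).

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

class Monitor:
    """Stand-in for the WS-I Monitor: record each message, pass it through."""
    def __init__(self):
        self.log = []
    def intercept(self, message: str) -> str:
        self.log.append(message)   # log it, then forward unchanged
        return message

def assert_has_soap_body(message: str) -> str:
    """One invented test assertion: the message must be a soap:Envelope
    containing a soap:Body."""
    root = ET.fromstring(message)
    if root.tag != f"{{{SOAP_NS}}}Envelope":
        return "FAILED: root is not a soap:Envelope"
    if root.find(f"{{{SOAP_NS}}}Body") is None:
        return "FAILED: no soap:Body"
    return "passed"

def analyze(log, assertions):
    """Stand-in for the Analyzer: run every assertion over every logged
    message and return a flat conformance report."""
    return [check(msg) for msg in log for check in assertions]

if __name__ == "__main__":
    mon = Monitor()
    mon.intercept(f'<Envelope xmlns="{SOAP_NS}"><Body/></Envelope>')
    mon.intercept("<NotSoap/>")
    print(analyze(mon.log, [assert_has_soap_body]))
```

As in the real tools, a failure in the report doesn't mean the service is broken, only that a logged message didn't satisfy a given assertion.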

FIG. 1: WS-I TOOLS IN ACTION
[Diagram: In the normal message flow, a requestor exchanges SOAP messages with a Web service. The Monitor's Interceptor captures those messages and its Logger, driven by a monitor configuration file, writes them to a message log. The Analyzer, driven by an analyzer configuration file, evaluates the message log together with the WSDL document(s), XML schema and UDDI entries against a test assertion document, and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories, and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


FIG. 2: A NEW PHASE FOR TESTING
[Diagram: The Monitor's Interceptor captures SOAP/XML messages over HTTP between client code and the Web service; its Logger consolidates message artifacts and metadata (WSDL, XML schema) into a Test Log file. In a separate analysis phase, the Analyzer evaluates the Test Log file and produces a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, or after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
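The checkout-build-test loop can be sketched as a small driver. The three step functions below are placeholders; a real CI server would shell out to the version-control, build and test tools instead.

```python
def ci_cycle(checkout, build, run_tests):
    """One CI pass: checkout, then build, then run the automated suites.
    Returns a (step name, ok) trail, stopping at the first broken step so
    failures are easy to pinpoint."""
    trail = []
    for step in (checkout, build, run_tests):
        ok = step()
        trail.append((step.__name__, ok))
        if not ok:
            break
    return trail

# Placeholder steps; real ones would invoke external tools and report
# success or failure.
def checkout():  return True
def build():     return True
def run_tests(): return True

if __name__ == "__main__":
    # A server would repeat this cycle per check-in, on a timer, or
    # immediately after the previous pass finishes.
    print(ci_cycle(checkout, build, run_tests))
```

The trail makes the "which step broke the build?" question a one-line lookup, which is most of what a CI dashboard shows.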

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques, instead of one or two exclusively.

Exploratory Testing: Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we prefer the term "automated testing" to "test automation." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming, in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
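A minimal test-first illustration, using Python's unittest in place of JUnit; the `slugify` function and its expected behavior are invented for the example. The test is written first (and would fail while `slugify` doesn't exist), then just enough code is added to make it pass.

```python
import unittest

# Step 1 (red): the expectation is written first, as code.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_joins_with_dashes(self):
        self.assertEqual(slugify("Agile Testing"), "agile-testing")

# Step 2 (green): just enough implementation to make the test pass.
def slugify(title: str) -> str:
    return "-".join(title.lower().split())

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=0)
```

Each new expectation repeats the same rhythm: add a failing test, make it pass, move on.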

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.
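A sketch of the examples-as-tests idea, exercising an invented business rule directly, beneath any GUI. The tax brackets and expected values are made up for illustration, not real rates; a real team's examples table would come from the customer.

```python
# Customer-supplied examples, one row per scenario.
EXAMPLES = [
    # (taxable income, expected tax)
    (10_000, 1_000.0),   # 10% on everything up to 20,000
    (50_000, 8_000.0),   # 10% on the first 20,000, 20% on the remaining 30,000
]

def tax_due(income: float) -> float:
    """Business rule under test, called directly -- no GUI involved."""
    if income <= 20_000:
        return income * 0.10
    return 20_000 * 0.10 + (income - 20_000) * 0.20

def run_acceptance_table(examples):
    """Evaluate every customer example; True means the row passes."""
    return [tax_due(income) == expected for income, expected in examples]

if __name__ == "__main__":
    print(run_acceptance_table(EXAMPLES))  # [True, True]
```

Driving the rule directly, rather than through screens and widgets, is what makes tables like this fast and far less brittle.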

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations," and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
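A rough illustration of the expectation style: the hand-rolled `expect` helper and the shopping-cart domain below are invented, standing in for a real BDD tool's matchers, but the business-language test name is the point.

```python
# Hand-rolled matcher, standing in for a BDD framework's "expect" API.
class expect:
    def __init__(self, actual):
        self.actual = actual
    def to_equal(self, expected):
        assert self.actual == expected, f"{self.actual!r} != {expected!r}"
        return True

# Tiny invented domain.
def new_cart():
    return []

def add_item(cart, name, price):
    cart.append((name, price))

def total(cart):
    return sum(price for _, price in cart)

# The expectation reads as a plain-English behavior statement.
def a_cart_with_two_items_totals_the_sum_of_their_prices():
    cart = new_cart()
    add_item(cart, "keyboard", 30)
    add_item(cart, "mouse", 20)
    return expect(total(cart)).to_equal(50)

if __name__ == "__main__":
    print(a_cart_with_two_items_totals_the_sum_of_their_prices())  # True
```

Compare the function name with a typical xUnit `testTotal` method: the BDD style pushes the business behavior into the name itself.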

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term "Developer Test," as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.
–Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


(continued from page 10)

36 • Software Test & Performance  JULY 2009


erly-done automation works best.

What's more, change-related challenges often arise from changes in the definition of the product and its correct behavior. When the test team is not kept informed of these changes, or when the rate of change is very high, this can impose inefficiencies on the development, execution and maintenance of tests.

SHORT ITERATIONS SQUEEZE TESTING
In sequential life cycles, if the organization involves the test team early in the project, the testers have lots of time to develop and maintain their tests, and can do so in parallel with the development of the system and prior to the start of system test execution. Some more formal iterative life-cycle models, such as Rapid Application Development and the Rational Unified Process, often allow substantial periods of time between test execution periods for each iteration. These intervals allow test teams to develop and maintain their test systems.

Most teams using agile life cycles progress more quickly; Scrum, for example, uses sprints: two- to four-week periods of short, fast-paced development iterations. I have seen this pace squeeze the test team's ability to develop and maintain test systems. Testing strategies that include an automation element have also proven particularly sensitive to this challenge.

UNIT TESTING IS OFTEN INADEQUATE
Most agile proponents stress good automated unit testing, and frameworks such as JUnit have helped simplify and minimize

and make the changes visible across the team and to stakeholders, while guiding the software toward the customers' true needs. If the traditional testing approaches that Rex alludes to cannot adapt to change effectively, then I'd argue that they need to change.

SHORT ITERATIONS SQUEEZE TESTING
If your parochial goal is simply testing, then I agree that short iterations can be quite challenging. But if your goal is to deliver small chunks of working software, to gain customer and product quality feedback, and to ultimately deliver high-value, working products and applications, then small iterations rock.

To Rex's point regarding "squeezing test system maintenance": all of the agile methods adhere to the notion of completing a small set of features that meet "doneness" criteria at the end of each Iteration or Sprint.

There should be no work that is left out; for example, appropriate test automation development or maintenance of test "systems."

It's up to the testers to simply place this work on the Product and Sprint Backlogs and plan its completion as part of each Iteration or Sprint.

Sometimes it takes effort and courage to defend this work against feature pressure. However, all team members encounter the same challenge and need to ensure that they deliver "done" software from each of their functional perspectives.

UNIT TESTING IS OFTEN INADEQUATE
First, agile methods stress automated testing of all kinds, not simply unit tests. FitNesse, Selenium and Watir/Watin are examples of open source automation tools that are widely used in the agile community and that focus on non-unit testing.

We do stress unit tests as being an important part of the overall coverage, and as being a safety net for the teams' refactoring efforts. We want developers to be responsible for contributing to test automation, not just testers. And if a team is using TDD as a practice, we've learned that developing tests can be a powerful aid as one designs software in small incremental steps.
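As a small, hypothetical illustration of the TDD rhythm described here, where the expectations are written first and drive the design in small steps (the classify function is invented for this sketch):

```python
# In TDD, the test below would be drafted first and fail; the function
# is then written just until the expectations pass. This example is
# invented for illustration, not taken from the article.

def classify(n):
    if n % 15 == 0:
        return "fizzbuzz"
    if n % 3 == 0:
        return "fizz"
    if n % 5 == 0:
        return "buzz"
    return str(n)

def test_classify():
    # Each assertion was an incremental step that shaped the design.
    assert classify(3) == "fizz"
    assert classify(5) == "buzz"
    assert classify(15) == "fizzbuzz"
    assert classify(7) == "7"
```

The accumulated tests then double as the refactoring safety net mentioned above.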

No agile team, to my knowledge, views unit tests as the end-all of testing. In fact, as part of delivering "done" software increments, they must decide on the set of tests (unit, functional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

[I expect testers to pair with developers to improve practices, and developers to deliver solid unit tests, before they can be considered 'done']

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques, based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.
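One possible sketch of such risk-based selection (the scoring scheme, test names and modules below are all invented for illustration, not a prescribed method):

```python
# Hypothetical risk-based selection for a Sprint: score each regression
# test by whether it touches a changed module and by the business impact
# of the feature it covers, then run the riskiest tests first.

def prioritize(tests, changed_modules):
    """Return tests ordered by risk score, highest first."""
    def score(test):
        change_factor = 2 if test["module"] in changed_modules else 1
        return test["impact"] * change_factor
    return sorted(tests, key=score, reverse=True)

REGRESSION_SUITE = [
    {"name": "test_login", "module": "auth", "impact": 3},
    {"name": "test_report_export", "module": "reports", "impact": 1},
    {"name": "test_checkout_total", "module": "billing", "impact": 3},
]

# This Sprint changed only the billing module, so its tests run first.
ordered = prioritize(REGRESSION_SUITE, changed_modules={"billing"})
```

Real teams would weight by defect history and coverage as well; the point is only that the selection is driven by what changed.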

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test-Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as they deal with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing, in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


[While unit testing is often thought of as the developer's job, I have found that they just don't do it or [they] deliver cursory attempts]

work together, to cluster in small groups working on specific features, to work with the customer, gaining valuable feedback, and to collaborate across disciplines. From this, they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases, or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just-Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile. It should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those who can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing to realize is that while testing as an industry is maturing, it is still relatively new. The software tester job isn't something that shows up on the U.S. Census. "We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity and, instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise,

"We are a new profession; most of our first programmers are still alive." – Scott Barber

"A skilled tester is the product of a variety of experiences." – Jon Bach


the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available, and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain.

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain," so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

At Intuit, "QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and

"Virtualization makes things easier and harder" – Michael Bolton



for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex, with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


"Thanks to the cloud, we are able to test in production" – Dan Bartow

"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks" – David Saff

Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them, so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases "agile processes are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical" and support the business priorities.

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

"Get used to programming for a lot of CPUs" – Kent Beck

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like set-up and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should be just one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, Ore., predicts that we are seeing the days of cheap single-threaded performance improvements, where we can just buy our way out of problems, coming to an end. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year."
– Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing."
– James Bach


Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and even at its best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications; exchange information securely across and beyond the enterprise; integrate existing applications in loose couplings that are both scalable and easily extensible; and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools
• SOAP (Simple Object Access Protocol), a message format used to invoke a service


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
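To make the SOAP bullet concrete, here is a minimal sketch (in Python; the `GetQuote` operation, its namespace and its `symbol` parameter are hypothetical, not taken from any real service) of the kind of message a client sends to invoke a service:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation: str, ns: str, params: dict) -> bytes:
    """Build a minimal SOAP 1.1 request envelope invoking one operation."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{ns}}}{name}").text = str(value)
    # encoding="utf-8" makes ElementTree emit the XML declaration as well
    return ET.tostring(envelope, encoding="utf-8")

if __name__ == "__main__":
    msg = build_soap_request("GetQuote", "http://example.com/stock",
                             {"symbol": "IBM"})
    print(msg.decode("utf-8"))
```

A real client would POST these bytes over HTTP to the endpoint named in the service's WSDL.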

The complexities of developing and deploying systems potentially based on varying independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics thatmake Web services desirable such asits loose couplings also present chal-lenges to testing For example itrsquos possi-ble that no single entity controls all the

components involved in a given Webservices system yet all the points ofinteraction must be tested and validatedCommercial tools tend to address onlythe most typical designs and configura-tions WS-Irsquos testing tools offer addition-al validation of functions In 2004 WS-Ipublished its first of its suite of testingtools and has been updating along withthe profiles ever since

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of the test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that itself underlies a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to those underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIAL SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates the messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

FIG. 1: WS-I TOOLS IN ACTION. The Monitor, composed of an Interceptor and a Logger, sits in the normal message flow between requestor and Web service and captures the SOAP messages to a message log. The Analyzer reads that log, together with the WSDL, XML schema and UDDI artifacts and the test assertion document, and produces a conformance report. (Source: Web Services Interoperability Organization)

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
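To illustrate the idea, though not the actual WS-I tooling or its real test assertions, the sketch below applies two invented, profile-style test assertions to a captured message, much as the Analyzer applies its test assertion document to the message log. The assertion IDs and rules here are examples only:

```python
import xml.etree.ElementTree as ET

SOAP11 = "http://schemas.xmlsoap.org/soap/envelope/"

# Illustrative assertions in the spirit of a test assertion document;
# the IDs and rules are invented, not actual WS-I Basic Profile assertions.
ASSERTIONS = [
    ("A0001", "Root element is a SOAP 1.1 Envelope",
     lambda doc: doc.tag == f"{{{SOAP11}}}Envelope"),
    ("A0002", "Envelope contains exactly one Body",
     lambda doc: len(doc.findall(f"{{{SOAP11}}}Body")) == 1),
]

def analyze(message: str) -> dict:
    """Mini 'Analyzer': evaluate each assertion against a captured message."""
    doc = ET.fromstring(message)
    return {aid: check(doc) for aid, _desc, check in ASSERTIONS}

if __name__ == "__main__":
    captured = f'<e:Envelope xmlns:e="{SOAP11}"><e:Body/></e:Envelope>'
    print(analyze(captured))  # {'A0001': True, 'A0002': True}
```

A report like the Analyzer's would then list each failed assertion ID alongside the Profile requirement it represents.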

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into the use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary-school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


FIG. 2: A NEW PHASE FOR TESTING. The Monitor's Interceptor and Logger capture the SOAP/XML messages exchanged over HTTP between client code and Web service into a Test Log file, along with message artifacts and metadata such as the WSDL and XML schema; the Analyzer then evaluates the Test Log file to produce a Test Report. (Source: Web Services Interoperability Organization)

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in or after a specific period of time, such as every hour, or it can simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
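Stripped to its essence, the cycle described above is checkout, then build, then the automated test suites, with any failing step failing the build. A minimal sketch in Python; the step commands shown in the demo are placeholders, not a real project's checkout/build/test commands:

```python
import subprocess
import sys

def run_step(cmd: list) -> bool:
    """Run one pipeline step; a non-zero exit code fails the build."""
    return subprocess.run(cmd).returncode == 0

def ci_cycle(steps: list) -> bool:
    """One CI cycle: run checkout, build and test steps in order, stopping on failure."""
    return all(run_step(cmd) for cmd in steps)

if __name__ == "__main__":
    # Placeholder steps; a real server would loop, triggering a cycle on each
    # check-in or on a timer, and publish the result to the team.
    steps = [[sys.executable, "-c", "print('checkout')"],
             [sys.executable, "-c", "print('build and test')"]]
    print("BUILD", "PASSED" if ci_cycle(steps) else "FAILED")
```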

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, Exploratory Testing, when done well, can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
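A minimal sketch of the rhythm using Python's standard `unittest` module (a cousin of the JUnit framework mentioned earlier); the leap-year rule is an arbitrary example. In classic TDD, the test class would be written and run, and would fail, before `leap_year` exists:

```python
import unittest

# Green step: the minimal implementation, written only after the tests below failed.
def leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTest(unittest.TestCase):
    # Red step: the expectations for the code, written as code first.
    def test_ordinary_fourth_years_are_leap(self):
        self.assertTrue(leap_year(2004))

    def test_century_years_are_not_leap_unless_divisible_by_400(self):
        self.assertFalse(leap_year(1900))
        self.assertTrue(leap_year(2000))

if __name__ == "__main__":
    unittest.main(argv=["tdd_sketch"], exit=False, verbosity=2)
```

Once both tests pass, the developer refactors and moves on to the next small expectation.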

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations," and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
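The naming contrast is easiest to see side by side. In this sketch, the `BankAccount` domain and its behavior are invented for illustration; the BDD-style expectation reads like a business sentence, while the xUnit-style name merely points at the method under test:

```python
class BankAccount:
    """Toy domain object, invented for this example."""
    def __init__(self, balance: int = 0):
        self.balance = balance

    def withdraw(self, amount: int) -> None:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

# xUnit style: named after the method under test.
def test_withdraw():
    account = BankAccount(balance=10)
    account.withdraw(4)
    assert account.balance == 6

# BDD style: the expectation is described in business terms.
def an_overdrawn_withdrawal_should_be_refused():
    account = BankAccount(balance=10)
    try:
        account.withdraw(50)
        assert False, "expected the withdrawal to be refused"
    except ValueError:
        assert account.balance == 10  # the balance is untouched

if __name__ == "__main__":
    test_withdraw()
    an_overdrawn_withdrawal_should_be_refused()
    print("all expectations met")
```

BDD frameworks add sugar around this idea, but the shift is mostly in how the expectation is named and phrased.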

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life-cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

– Thanks to Markus Gaertner for his contributions to this article.


Index to Advertisers

Advertiser – URL – Page
Hewlett-Packard – www.hp.com/go/alm – 38
Seapine – www.seapine.com/stpswift – 11
Software Test & Performance Collaborative – www.stpcollaborative.com – 3-5
Software Test & Performance Collaborative Vendor Membership – www.stpcollaborative.com – 35
Software Test & Performance eSeminars – www.stpcollaborative.com – 22
STPCon Fall 2009 – www.stpcon.com – 29, 34
TechExcel – www.techexcel.com – 37

(Continued from page 10)



functional, acceptance, non-functional, regression, etc.) that should be run within the Sprint in order to meet a customer-ready deliverable. Every team member should be defining these testing tasks, with the testers taking the lead.

To Rex's last point, that programmers don't commit to doing effective unit testing: I expect the testers to pair with the developers to improve these practices, and I expect developers to deliver solid unit tests before they can be considered "done."

MANAGING THE INCREASED REGRESSION RISK
There is a continuing misperception that unit tests are the only tests run on a Sprint-by-Sprint basis. While it's true that the team might not be able to run a full regression test within each Sprint, it is their responsibility to apply risk-based techniques, based upon what's been added or changed, and to run appropriate functional, non-functional, exploratory and regression tests as part of delivering "done" software at the end of the Sprint.

Clearly, automation development at all levels helps with this, so the team should be investing time in improving their overall ability to automate all of the testing types I mention, as appropriate.

POOR OR MISSING TEST ORACLES
I hate to break the news to Rex, but the best test oracles are the people within the team, not the documentation. Documentation almost always loses its relevance and accuracy. Yes, it's helpful, but it should never be considered the single source for all knowledge.

What agility encourages is for the team to talk: to plan their

the costs. Such automated unit tests allow what agile proponents call refactoring. Refactoring is the redesign of major chunks of code, or even entire objects. The automated unit tests provide for quick regression testing of refactored code. Some agilists recommend substituting automated unit tests for design, as in Test Driven Development. While this sounds good in theory, there are two problems that I have observed with this approach in practice.

First, unit testing has limited bug-finding utility. In research published in June 2002 in CrossTalk, the Journal of Defense Software Engineering, author Capers Jones found unit testing to be only 25 to 30 percent effective at finding and removing defects. In RBCS assessments, we have found that good system testing by an independent test team averages around 85 percent effective at finding defects. So organizations that care about quality should have both good unit testing and good system testing.

Second, while unit testing is often thought of as the developer's job, I have found that many programmers just don't do it, or deliver cursory attempts. This can create complications and delays for the system test team in its role as "bug finder of last resort," as it deals with buggy code during short test execution periods in the sprints. This can also lead to code churn during testing, since so much code must change to fix the bugs.

The amount of change can ultimately outstrip even the ability of the best automated regression test system to keep up, which then leads to lower defect detection effectiveness for the test team.

MANAGING THE INCREASED REGRESSION RISK
Capers Jones also reported that regression accounts for about seven percent of bugs. In iterative life cycles such as Scrum, however, code that worked in previous sprints gets churned by new features in each subsequent sprint. This increases the risk of regression.

Agilists emphasize good automated unit testing in part to manage the regression risk inherent in such churn. But with even the best unit testing topping out at about 50 percent defect


"While unit testing is often thought of as the developer's job, I have found that they just don't do it, or [they] deliver cursory attempts."


work together, to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when and to what degree documentation is necessary.

We expect team members to be professionals and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirement document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases or use traditional requirement forms. Many even add a construct called acceptance tests at a finely grained story level that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones, automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it, with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE

Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view to testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try and gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.

JULY 2009 • www.stpcollaborative.com • 23


LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality; that is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise, particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. "Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING

The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain," so programmers just assume it is working unless their code has a problem.

Bartow agrees: "Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls such as sliders need to respond quickly to give them a desktop feel."

David Strom is a freelance reporter and author living in St. Louis.

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."
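Bartow's team relies on commercial cloud tooling, but the underlying idea of load generation, many concurrent workers hammering a service while latencies are recorded, can be sketched in a few lines. This is an illustrative Python sketch only; the function names and the stand-in request are invented, not SOASTA's or Intuit's actual setup:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def generate_load(request_fn, total_requests, concurrency):
    """Fire total_requests calls to request_fn from a pool of concurrent
    workers, returning the observed per-request latency in seconds."""
    def timed_call(i):
        start = time.perf_counter()
        request_fn(i)  # in a real test, one HTTP request to the system under test
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(timed_call, range(total_requests)))

# Stand-in for a real request; a load test would POST to the service here.
latencies = generate_load(lambda i: time.sleep(0.001),
                          total_requests=50, concurrency=10)
print(len(latencies))  # 50
```

A real harness would replace the lambda with an HTTP call and distribute the workers across many cloud machines, which is exactly the scaling problem services like CloudTest solve.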

TRAINING TRAIN

Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries, working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."



Black agrees: "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION

Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE

What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about


rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX

Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION

Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."
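The xUnit pattern that Beck and Gamma's JUnit popularized, small, automated, self-checking tests run on every build, carries to nearly every language. A minimal sketch using Python's built-in unittest module; the function under test is a made-up example, not anything from the article:

```python
import unittest

def tax_bracket(income):
    """Toy function under test (hypothetical example)."""
    if income < 0:
        raise ValueError("income must be non-negative")
    return "low" if income < 30000 else "high"

class TaxBracketTest(unittest.TestCase):
    # xUnit style: each method checks one behavior and fails loudly.
    def test_low_income(self):
        self.assertEqual(tax_bracket(10000), "low")

    def test_high_income(self):
        self.assertEqual(tax_bracket(90000), "high")

    def test_negative_income_rejected(self):
        with self.assertRaises(ValueError):
            tax_bracket(-1)

# Run the suite programmatically; a build server would do this on every commit.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TaxBracketTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

It is exactly this kind of cheap, repeatable check that automates away the "tedious parts" Beck describes, while leaving the harder system-level analysis to humans.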

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."
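The testing challenge Beck points to is easy to demonstrate: shared state touched by many concurrent workers behaves predictably only when access is synchronized. A hedged Python sketch (the names are invented for illustration):

```python
import threading

def increment_many(counter, n, lock):
    """Add 1 to the shared counter n times, guarding each update with a lock."""
    for _ in range(n):
        with lock:  # without this guard, the read-modify-write can lose updates
            counter["value"] += 1

def run_workers(workers, n):
    counter = {"value": 0}
    lock = threading.Lock()
    threads = [threading.Thread(target=increment_many, args=(counter, n, lock))
               for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter["value"]

print(run_workers(8, 10000))  # 80000 every time with the lock in place
```

Remove the lock and the total can intermittently come up short, and only under load, which is precisely why concurrency bugs resist conventional test cases and demand the new testing approaches Beck anticipates.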

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.


Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS

Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language) define the services for client tools.

• SOAP (Simple Object Access Protocol), a message format to invoke a service.


• XML (Extensible Markup Language), the lingua franca of the Internet.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
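To make the nuts and bolts concrete, here is a rough sketch of what "a message format to invoke a service" looks like in practice: building a minimal SOAP 1.1 envelope with Python's standard XML library. The operation name and target namespace are invented for illustration:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params, target_ns):
    """Serialize a minimal SOAP 1.1 envelope invoking `operation` with `params`."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{target_ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{target_ns}}}{name}").text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical stock-quote service, in the spirit of the classic WSDL examples.
xml_text = build_soap_request("GetLastTradePrice",
                              {"tickerSymbol": "IBM"},
                              target_ns="http://example.com/stockquote")
print(xml_text)
```

A real client would POST this document over HTTP to the endpoint named in the service's WSDL; the WSDL contract is what tells the client which operations and parameter types are valid.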

The complexities of developing and deploying systems potentially based on varying independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS

To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS SUITES

WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
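The Monitor/Analyzer split can be mimicked in miniature: a pass-through wrapper logs each request/response pair without altering it, and a separate step replays test assertions over the log. This is a toy sketch of the pattern only, not WS-I's actual tooling, and all names are invented:

```python
class MessageMonitor:
    """Pass-through interceptor: forwards each request to the real transport
    unchanged, while logging the request/response pair for later analysis."""
    def __init__(self, transport):
        self.transport = transport  # callable: request dict -> response dict
        self.log = []

    def __call__(self, request):
        response = self.transport(request)
        self.log.append({"request": request, "response": response})
        return response

def analyze(log, assertions):
    """Toy 'Analyzer': evaluate each named test assertion over every logged pair."""
    return {name: all(check(entry) for entry in log)
            for name, check in assertions.items()}

# Stand-in Web service that echoes its input.
service = MessageMonitor(lambda req: {"status": 200, "body": req["body"]})
service({"body": "<Envelope>ping</Envelope>"})

report = analyze(service.log, {
    "responses-are-200": lambda e: e["response"]["status"] == 200,
    "bodies-are-xml-ish": lambda e: e["request"]["body"].startswith("<"),
})
print(report)  # {'responses-are-200': True, 'bodies-are-xml-ish': True}
```

The key property, shared with the real tools, is that the service under test never knows it is being observed: capture and analysis are decoupled, so the same log can be re-checked against new assertions later.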

[Figure 1: WS-I tools in action. The Interceptor/Logger Monitor sits in the normal message flow between requestor and Web service, capturing SOAP messages to a message log; the Analyzer evaluates that log, along with WSDL, XML schema and UDDI artifacts, against a test assertion document to produce a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories, and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES

By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED

A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary-school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards – documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.
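The capture-first, analyze-separately flow the draft describes can be pictured in miniature. Everything in this sketch (the class names, the requirement ID, the check itself) is illustrative only, not the actual WS-I tooling:

```python
# Illustrative sketch of a two-phase test flow: a logger records message
# artifacts to a test log, and a separate analyzer evaluates the log
# against profile requirements. All names here are hypothetical.

class TestLog:
    """Consolidates all artifacts under test: messages plus metadata."""
    def __init__(self):
        self.entries = []

    def capture(self, artifact_type, content):
        self.entries.append({"type": artifact_type, "content": content})

def analyze(log, requirements):
    """Evaluate each logged artifact against profile requirements.

    A flagged requirement does not mean the service is 'broken', only
    that it does not conform to the profile being checked.
    """
    report = []
    for entry in log.entries:
        for name, check in requirements.items():
            if not check(entry):
                report.append((name, entry["type"]))
    return report

# Phase 1: logging (live message capture).
log = TestLog()
log.capture("message", "<soap:Envelope>...</soap:Envelope>")
log.capture("metadata", "<wsdl:definitions>...</wsdl:definitions>")

# Phase 2: analysis with a separate tool, using one toy requirement.
requirements = {
    "R0001-envelope-present":
        lambda e: e["type"] != "message" or "Envelope" in e["content"],
}
report = analyze(log, requirements)
print(report)  # an empty report: no profile requirements violated
```

The point of the separation is that the same test log can be re-analyzed against different profiles without recapturing live traffic.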


FIG. 2: A NEW PHASE FOR TESTING
[Diagram: a monitor composed of an interceptor and a logger captures SOAP/XML messages over HTTP between client code and the Web service; message artifacts and metadata (WSDL, XML Schema) are written to a Test Log file, which the Analyzer processes into a Test Report.]
Source: Web Services Interoperability Organization

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time (such as every hour), or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
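The checkout-build-test cycle can be sketched in a few lines of scripting; the CI tools mentioned above add scheduling, check-in triggers, and reporting on top of this skeleton. The three commands here are placeholders, not a real project:

```python
# Minimal sketch of one continuous-integration cycle: check out, build,
# run the automated test suites, then start over (or wait for the next
# check-in). Each command is a stand-in for whatever the project uses.
import subprocess

PIPELINE = [
    ["echo", "checkout"],   # e.g. pull the latest source from version control
    ["echo", "build"],      # e.g. invoke the build tool
    ["echo", "test"],       # e.g. run the automated suites
]

def run_ci_cycle(pipeline=PIPELINE):
    """Run one checkout-build-test cycle; stop at the first failing step."""
    for step in pipeline:
        result = subprocess.run(step, capture_output=True, text=True)
        if result.returncode != 0:
            return ("FAILED", step)
    return ("PASSED", None)

status, failed_step = run_ci_cycle()
print(status)
```

A real CI server would loop this, record each run's output, and notify the team on the first failing step rather than returning a tuple.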

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing
If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" to be more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD: Test-Driven Development. A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
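A minimal sketch of the cycle, using a made-up `add` function and Python's xUnit-style `unittest` module: the expectation is written as code, and the production code exists only to satisfy it.

```python
# Classic TDD in miniature: the test expresses the expectation; the
# function is the simplest implementation that satisfies it.
import unittest

def add(a, b):
    # Step 2 (green): the simplest code that makes the test pass.
    return a + b

class AddTest(unittest.TestCase):
    # Step 1 (red): written before `add` existed; running the test
    # then would have failed.
    def test_add_two_numbers(self):
        self.assertEqual(add(2, 3), 5)

suite = unittest.TestLoader().loadTestsFromTestCase(AddTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

The same run-fail-implement-pass rhythm scales up: each new method gets its own small expectation first.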

ATDD: Acceptance Test-Driven Development. A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.
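A rough illustration of executing business rules directly, underneath any GUI; the discount rule and its numbers are invented for the example:

```python
# ATDD sketch: customer-supplied examples become automated acceptance
# checks run directly against the business rule, not through a GUI.
# The rule and the example values are made up for illustration.

def order_discount(total):
    """Hypothetical business rule: 10% off orders of $100 or more."""
    return round(total * 0.10, 2) if total >= 100 else 0.0

# Examples provided by the customer as part of the requirements.
acceptance_examples = [
    {"total": 250.00, "expected_discount": 25.00},
    {"total": 100.00, "expected_discount": 10.00},
    {"total": 99.99,  "expected_discount": 0.00},
]

failures = [ex for ex in acceptance_examples
            if order_discount(ex["total"]) != ex["expected_discount"]]
print(len(failures))  # 0
```

Because the examples exercise the rule itself, they stay stable when the GUI changes, which is where much of ATDD's brittleness otherwise comes from.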

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then to improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD: Behavior-Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior, in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain-Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
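Sketched without any particular BDD framework, an expectation-style test reads more like a sentence than an xUnit assertion; the `expect` helper and the account example below are made up for illustration:

```python
# BDD sketch: the test is phrased as an expectation in business terms.
# `expect`/`to_equal` mimic the style of BDD tools; they are not a
# real framework's API.

class Expectation:
    def __init__(self, actual):
        self.actual = actual

    def to_equal(self, expected):
        assert self.actual == expected, \
            f"expected {expected}, got {self.actual}"
        return True

def expect(actual):
    return Expectation(actual)

class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        self.balance -= amount

# "When a customer withdraws 30 from an account holding 100,
#  the balance should be 70."
account = Account(balance=100)
account.withdraw(30)
ok = expect(account.balance).to_equal(70)
print(ok)  # True
```

The assertion line is the whole point: a domain expert can read `expect(account.balance).to_equal(70)` without knowing xUnit conventions.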

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term "developer test," as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.
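A table-driven customer test, run underneath the GUI, might look like the following sketch; the shipping rule and its numbers are invented:

```python
# Sketch of a customer test expressed as a table of examples the
# customer can read (and help write), executed against the business
# rule directly. The shipping rule is hypothetical.

def shipping_cost(weight_kg):
    """Made-up rule: flat $5 up to 2 kg, then $2 per additional kg."""
    return 5.0 if weight_kg <= 2 else 5.0 + 2.0 * (weight_kg - 2)

# weight_kg | expected_cost
table = [
    (1, 5.0),
    (2, 5.0),
    (5, 11.0),
]

results = [shipping_cost(weight) == expected for weight, expected in table]
print(all(results))  # True
```

The table is the customer-facing artifact; the code around it is plumbing a tester or developer maintains.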

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life-cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

–Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia

INDEX TO ADVERTISERS

Advertiser | URL | Page
Hewlett-Packard | www.hp.com/go/alm | 38
Seapine | www.seapine.com/stpswift | 11
Software Test & Performance Collaborative | www.stpcollaborative.com | 3–5
Software Test & Performance Collaborative Vendor Membership | www.stpcollaborative.com | 35
Software Test & Performance eSeminars | www.stpcollaborative.com | 22
STPCon Fall 2009 | www.stpcon.com | 29, 34
TechExcel | www.techexcel.com | 37

(Continued from page 10)




work together: to cluster in small groups working on specific features, to work with the customer gaining valuable feedback, and to collaborate across disciplines. From this they can decide where, when, and to what degree documentation is necessary.

We expect team members to be professionals, and to be responsible and accountable. We trust their judgment over blindly filling in a plan template or a requirements document, or blindly running the same tests over and over again to meet some perceived risk.

All oracles have a cost. We engage the customers in these decisions as well, ensuring that those oracles that are provided are actually used and kept current.

THE SHIFTING TEST BASIS
I'm lost here. Agile teams define requirements. Some use Post-it Notes and cards, or define them as User Stories. Others describe use cases, or use traditional requirement forms. Many even add a construct called acceptance tests, on a finely grained story level, that helps define the critical elements of each feature or requirement. These are things that the customer will "test" as part of their acceptance at the end of a Sprint or Iteration.

I'd guess that Rex has seen poor or missing requirements in his agile experience, and I'd say those were not mature or solidly performing agile teams. "Good" teams define requirements at a level that assures the team will build what the customer needs. They ensure collaboration surrounding each requirement, at the point of attack.

To capture a key point here: the agile methods try to adopt a Just-in-Time and Just Enough approach to their requirement

removal effectiveness, according to Jones; automated regression testing via unit tests will miss at least half of the regression bugs.

POOR OR MISSING TEST ORACLES
Agile favors working software over written documentation, and in my observations, special scorn is reserved for specifications. The suggestion by the Agile Manifesto (www.agilemanifesto.org) that people should value "working software over comprehensive documentation" can create real challenges for a test team when taken too far. Testers use requirements specifications and other documents as test oracles, i.e., the means to determine correct behavior under a given test condition. We have seen testers in agile situations given documents with insufficient detail, and in some cases given no documents at all.

Even when testers are given adequate documents, two other agile development principles keep the test oracle challenge alive. First, agile requires teams to embrace change, and espouses that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation."

From what I've seen, these two principles allow the project team to change the definition of correct behavior at any time, including after testers have created tests to confirm a particular behavior, and even after testers have reported bugs against a particular behavior.

Further, the definition of correct behavior can change in a meeting or a discussion where a tester is not present. If that change is not communicated to the tester, the project suffers false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying: "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word meeting in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their team because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity to not only test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.



LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

We're All Part of the Story

David Strom is a freelance reporter and author living in St. Louis.

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization, and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training, and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings in many organizations of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new. "The software tester job isn't something that shows up on the U.S. Census. We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation, and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning, and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise, particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of the year, in terms of their skill-sets and the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible, or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. Any team can't survive without high competence in each domain.

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has also made for some new challenges. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people still take latency for granted, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. Every value that you enter




for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel.

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex, and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAINTechnology is further trans-

forming the industry interms of tester train-

ing one ofthe manybenefitsof univer-

sal Internetconnectivity

and the ubiquity of theWeb We have certainly seen

our companys opportunities relat-ed to Web-based training justexplode over the last yearrdquo saysRex Black who runs RBCS a test-ing and quality consultancy in Bul-verde TX He says revenue frome-learning continues to grow Blackis a past president of theInternational Software TestingQualifications Board (ISTQB)which maintains the most wide-ly held certification standardsand RBCS offers its training

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats, and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."




Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here, between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science, or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about



rapid learning, critical thinking, human factors, and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like set-up and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

The tedious parts of the testers jobare definitely going away which changesthe structure of the profession says Beckwho with Erich Gamma created the JUnit

unit testingframework andhas authored numer-ous books on computerprogramming and test-driv-en development But themore pieces that you havethe greater the probability offailure of the system Thismakes it more challenging fortesting Ideally testers shouldbe able to look at these com-plex systems and analyzewhere the communicationserrors are to be able to head offcatastrophic errors that have big busi-ness impacts

We are seeing a lot of exciting evo-lution in testing automation especially inthe tools available for [automated] unittesting and the integration of that unittesting into automated build environ-ments says Black Were still strug-gling as an industry with effective andmaintainable automated testing at thegraphical user interface He says thatthe best opportunities are where youblend automation with reactivetechniques that watch for spe-cific behaviors and bugs

Beck, who lives in Merlin, Ore., predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."
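Beck's point can be made concrete with a small sketch (a hypothetical Python example, not code from Beck): when work is spread across many threads, a test only stays deterministic if the shared state is properly synchronized, which is exactly the new class of defect multi-CPU code introduces.

```python
import threading

def parallel_count(n_threads, increments):
    """Increment a shared counter from several threads, guarded by a lock.
    Remove the lock and updates can interleave and be lost - a failure
    that single-threaded tests never provoke."""
    counter = 0
    lock = threading.Lock()

    def work():
        nonlocal counter
        for _ in range(increments):
            with lock:  # the synchronization under test
                counter += 1

    threads = [threading.Thread(target=work) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter
```

With the lock in place, `parallel_count(4, 10000)` reliably returns 40000; without it, the result varies from run to run, which is why concurrent code demands tests that exercise the synchronization itself.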

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year."
– Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing."
– James Bach


Web services testing is complex, and it's vastly different from testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools
• SOAP (Simple Object Access Protocol), a message format used to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
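To make the moving parts above concrete, here is a minimal sketch (hypothetical Python using only the standard library; the operation name, namespace and parameters are invented, since a real client would take them from the service's WSDL) of building the SOAP 1.1 envelope that invokes a service:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, service_ns, params):
    """Build a minimal SOAP 1.1 envelope that invokes `operation`.
    `service_ns` and `params` are placeholders for what a real client
    would read from the service's WSDL contract."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{service_ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical stock-quote operation:
request = build_soap_request(
    "GetQuote", "http://example.com/stocks", {"symbol": "IBM"}
)
```

The envelope/body nesting is what the WS-I Monitor later captures on the wire; everything inside the body is specific to the service being invoked.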

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

FIG. 1: WS-I TOOLS IN ACTION
[Diagram: in the normal message flow, a requestor and a Web service exchange SOAP messages. The Monitor's interceptor and logger, driven by a monitor config file, capture those messages into a message log. The Analyzer, driven by an analyzer config file and the test assertion document, evaluates the message log together with the WSDL, XML schema and UDDI artifacts to produce a conformance report.]
Source: Web Services Interoperability Organization
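The Monitor/Analyzer split can be sketched in miniature (hypothetical Python, not the actual WS-I tool code): a pass-through interceptor logs traffic without altering it, and a separate analysis pass later evaluates the log against test assertions.

```python
class Monitor:
    """Pass-through interceptor/logger in the spirit of the WS-I Monitor:
    forward each message unchanged while recording it for later analysis."""
    def __init__(self):
        self.message_log = []

    def intercept(self, direction, message):
        self.message_log.append({"direction": direction, "body": message})
        return message  # non-intrusive: the exchange itself is not altered

def analyze(message_log, test_assertions):
    """Offline analysis pass: evaluate every logged message against every
    test assertion and collect the failures, as a conformance report would.
    `test_assertions` maps an assertion name to a predicate on the body."""
    report = []
    for entry in message_log:
        for name, check in test_assertions.items():
            if not check(entry["body"]):
                report.append((name, entry["direction"]))
    return report
```

A real assertion might, for instance, check that every captured body is well-formed XML; the key property shown here is that capture and analysis are separate phases, so the system under test never sees the analysis happen.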

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards, documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


FIG. 2: A NEW PHASE FOR TESTING
[Diagram: the Monitor's interceptor and logger capture SOAP/XML messages exchanged over HTTP between client code and the Web service. Message artifacts and metadata (WSDL, XML schema) are consolidated into a Test Log file, which the Analyzer evaluates to produce a Test Report.]
Source: Web Services Interoperability Organization

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
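The checkout-build-test cycle above can be sketched in a few lines (hypothetical Python; the placeholder shell commands stand in for a real tool's checkout, compile and test steps):

```python
import subprocess

def ci_cycle(steps):
    """One continuous-integration cycle: run each named step (checkout,
    build, test suites) in order and stop at the first failure, the way
    a build machine marks the build red."""
    for name, command in steps:
        result = subprocess.run(command, shell=True)
        if result.returncode != 0:
            return f"red: {name} failed"
    return "green"

# Placeholder commands stand in for real checkout/build/test steps:
status = ci_cycle([
    ("checkout", "true"),   # e.g. a version-control update
    ("build", "true"),      # e.g. a compile step
    ("tests", "true"),      # e.g. the automated test suites
])
```

A real CI server wraps this loop with a trigger (every check-in, every hour, or immediately on completion) and a notification when the cycle turns red.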

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, exploratory testing done well can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test-Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD (Acceptance Test-Driven Development): A business-level implementation of TDD where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
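The cycle can be illustrated with a small sketch (hypothetical Python using the standard unittest module; the leap-year example is invented for illustration):

```python
import unittest

# Red: the test is written first; it fails until is_leap_year exists
# and handles the century rule correctly.
class LeapYearTest(unittest.TestCase):
    def test_century_rule(self):
        self.assertTrue(is_leap_year(2000))
        self.assertFalse(is_leap_year(1900))
        self.assertTrue(is_leap_year(1996))

# Green: the simplest implementation that makes the test pass.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Refactor: with the suite green, the code can now be reshaped freely;
# rerunning the tests confirms the improvements didn't break anything.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LeapYearTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The discipline is in the order: the failing test comes first, the production code second, and the cleanup only under the protection of a passing suite.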

BDD (Behavior-Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain-Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
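A minimal sketch of the style (hypothetical Python; the shopping-cart domain and the expectation names are invented, and real BDD frameworks provide richer runners than the bare function calls here):

```python
def total_price(cart, prices):
    """Code under specification: price a shopping cart."""
    return sum(prices[item] * quantity for item, quantity in cart.items())

# BDD-style expectations: the names read as business statements rather
# than xUnit-style testSomething methods.
def it_charges_the_sum_of_the_line_items():
    assert total_price({"apple": 2, "bread": 1},
                       {"apple": 0.50, "bread": 2.00}) == 3.00

def it_charges_nothing_for_an_empty_cart():
    assert total_price({}, {}) == 0

it_charges_the_sum_of_the_line_items()
it_charges_nothing_for_an_empty_cart()
```

Read aloud, the expectation names form a plain-English specification of the cart's behavior, which is the point of the style.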

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term "developer test," as it separates concerns by role.

Customer Tests: Tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings: With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia


(continued from page 10)



false positives when testers report bugs against behavior now defined as correct.

THE SHIFTING TEST BASIS
Requirements-based testing strategies cannot handle vague or missing requirements specifications. Missing requirements specifications would mean a test team following a requirements-based testing strategy not only can't say what it means for a particular test to pass or fail; they wouldn't have a test basis upon which to base their tests.

The test basis also provides a means to measure the results. In a requirements-based test strategy, testers can't report test results accurately if the requirements are missing or poor. Testers can't report the percentage of the test basis covered by passed tests, because the requirements won't provide enough detail for meaningful coverage analysis.
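The coverage arithmetic being described is simple, which is what makes a missing test basis so damaging; a minimal sketch (hypothetical Python, with invented requirement and test IDs) shows that a requirement traced to no tests can never count as covered:

```python
def requirements_coverage(traceability, passed_tests):
    """Percentage of requirements covered by at least one passing test.
    `traceability` maps each requirement ID to the tests that exercise it.
    A requirement mapped to no tests can never be counted as covered,
    and with no requirements at all there is nothing to measure."""
    if not traceability:
        return 0.0
    covered = sum(
        1 for tests in traceability.values()
        if any(test in passed_tests for test in tests)
    )
    return 100.0 * covered / len(traceability)

coverage = requirements_coverage(
    {"REQ-1": ["t1"], "REQ-2": ["t2", "t3"], "REQ-3": []},
    passed_tests={"t1", "t3"},
)
```

With vague or missing requirements, the `traceability` map itself cannot be built, which is exactly why the reported percentage becomes meaningless.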

TOO MANY MEETINGS
A "face-to-face conversation" is just another way to describe a meeting. A client once jokingly described Scrum as a heavyweight process, saying, "I'm surprised at the name agile; it should be called couch potato. There are too many meetings. There's too much jawboning. I find it really ironic that there are all these books explaining how simple it is."

Perhaps it was intentional not to use the dreaded word "meeting" in the Agile Manifesto, but the result is often the same. In some organizations, everyone gets invited to every meeting, meetings balloon in number and duration, and managers and leads are not available to manage and lead their teams because they are in meetings much of the day. Effectiveness and efficiency drop. Another client once told me of a senior manager who said, in response to a complaint from a line manager about how attending meetings was negatively impacting his ability to lead his team, "We're going to continue to have these meetings until I find out why nothing is getting done around here." The senior manager was, I am told, the only one in the room who didn't get the irony.

THE BOTTOM LINE
In RBCS client engagements, we typically recommend blended testing strategies that align well with Scrum and other agile methodologies. What seems most likely to work in many cases is a blend of risk-based testing, maintainable automated testing (at both the unit and system level), and a percentage of effort spent on reactive testing (e.g., exploratory). In some cases, our methods will help mitigate the testing risks and reduce the testing challenges associated with these methodologies.

Clearly, there is no single way to perform effective software testing, and as the industry evolves, so too will the most effective methods that do the most good.

development. An option of "no requirements" does not exist.

TOO MANY MEETINGS
The point of face-to-face collaboration is that it's the most effective and richest way to communicate in technical teams. Has any developer ever implemented a requirement, had a tester test it, and a customer receive it with perfect alignment? That would truly be exceptional. More often, the tester will say the software performs differently than the tests, and/or the customer says, "I didn't ask for that," even though they were all working off of the exact same requirement.

The point is that technical teams need to rely on written as well as face-to-face collaboration to effectively reduce the misunderstandings and rework that written requirements alone often drive.

Beyond this, agile teams should be conducting incredibly effective meetings, something perhaps lacking in the examples Rex has seen. I suspect this problem isn't agile-centric, and that those teams simply need a good facilitator.

THE BOTTOM LINE
Effectively testing today's software is incredibly challenging, and we all need to stop thinking of agile testing as something opposed to traditional testing. Testers need every possible tool to perform testing well. Agile methods have emerged as a set of practices and a mindset that is context-based and powerful. They take a holistic view of testing and quality, reinforcing the two as a whole. Testers in this model have a powerful opportunity not only to test, but to champion and guide the overall team's focus on delivering quality software.

We need to stop our history of opposition to new testing approaches and begin embracing a broader set of testing practices that work in each project's unique context. I also believe that we need to stop quoting Capers Jones, as I suspect his data and studies do not reflect the dynamics of mature agile teams. I'm looking forward to encouraging Rex to try to gain a deeper understanding of agile testing practices.

Bob will have ample opportunity to do that in October, when he and Rex meet for the first time as fellow members of ST&P Collaborative's Strategic Advisory Board. –Ed.


[Teams need to rely on written and face-to-face collaboration to effectively reduce rework.]

LEARN AGILE TESTING TECHNIQUES directly from Bob Galen himself this fall at STPCon. The Scrum Master will be on hand in Cambridge, MA, Oct. 19-23 to teach "Agile and High-Speed Software Testing Techniques," an in-depth training course for bringing agile testing to your organization. Visit www.stpcon.com for more.

LEARN TEST MANAGEMENT from Rex Black at the Software Test & Performance Conference this fall in Cambridge, Mass. Learn about "Test Management: A Risk-Based Approach," the practice that guides test activities based on careful assessment and prioritization of the risks involved. Visit www.stpcon.com for more.

ciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts on the state of the testing industry and give us some insights on where they see things evolving. One thing for anyone to realize is that while testing as an industry is maturing, it is still relatively new; the software tester job isn't something that shows up on the U.S. Census. "We are still very much a new profession," says Scott Barber, the founder and chief technologist of PerfTestPlus in Palm Bay, Fla. "Most of our very first programmers are still alive; we aren't thousands of years old like other disciplines."

Perhaps the best rundown of the industry's evolution is summed up by Michael Bolton, who lives in Toronto. "Testing has historically been confirmation, validation and verification," he says. "I believe that testing must become far more about exploration, discovery, investigation, learning and communicating what you've learned." Bolton runs test consultancy DevelopSense and teaches rapid software testing methods. "I like to encourage people to recognize and manage uncertainty and ambiguity, and instead of trying to eliminate them, getting them down to levels that are sufficiently low for people to do useful work."

While any gathering of software testers will find disagreement on overall purpose, one thing is clear: testing isn't about how to improve code quality. That is the role of the engineering people who do the coding work, and of management, says Bolton. Testing is about providing information to those people so that better-informed decisions about quality are possible.

The future careers for the current crop of test engineers seem to be on the rise,

All this activity has made software testing a very big tent. The industry has matured and grown to encompass numerous approaches to methods, training and tools. Because of this diversity, there is a growing consensus that the best testers are those that can span multiple disciplines

The software testing industry is at a historic crossroads. Rich Internet applications, cloud computing, the rise of virtualization and more capable mobile devices have all made the job of producing reliable software more complex, while at the same time enabling more sophistication for automated test tools.

We're All Part of the Story

"We are a new profession; most of our first programmers are still alive."
– Scott Barber

"A skilled tester is the product of a variety of experiences."
– Jon Bach


the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable or more capable than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available, and serve as internal tool developers and implementation consultants. "Any team can't survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has made for some new challenges, too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that "too many people take latency for granted still, and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain," so programmers just assume it is working unless their code has a problem.

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and




for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."




Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, N.Y. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about



rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, Ore., predicts that we are seeing "the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."
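Beck's point about many CPUs can be made concrete with a small sketch. The counter below is an invented example, not from the article: its single-threaded test is deterministic, but under several threads the unsynchronized increment can lose updates, which is exactly the kind of hard-to-reproduce failure that multicore testing has to confront.

```java
// A counter with a deliberate data race: count++ is a read-modify-write
// that is not atomic, so concurrent increments can overwrite each other.
// Example invented for illustration.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class RacyCounter {
    private int count = 0;                // not thread-safe on purpose
    void increment() { count++; }
    int value() { return count; }

    // Run `threads` workers, each incrementing `perThread` times.
    static int runWithThreads(int threads, int perThread) {
        RacyCounter c = new RacyCounter();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.submit(() -> { for (int i = 0; i < perThread; i++) c.increment(); });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return c.value();
    }

    public static void main(String[] args) {
        // Single-threaded: deterministic, always exactly 100000.
        int sequential = runWithThreads(1, 100_000);
        // Multi-threaded: usually falls short of 800000 because updates collide.
        int parallel = runWithThreads(8, 100_000);
        System.out.println("sequential=" + sequential + " parallel=" + parallel);
    }
}
```

Running it a few times typically shows the parallel total drifting below 800,000 by a different amount each run; replacing the `int` with an `AtomicInteger` makes the result, and any test of it, deterministic again.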

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



Web services testing is complex. And it's vastly different from testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools

• SOAP (Simple Object Access Protocol), a message format to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
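To make those pieces concrete, here is a minimal SOAP 1.1 request envelope of the kind a service's WSDL might describe. The `GetQuote` operation and its namespace are invented for illustration; only the envelope structure and the SOAP namespace come from the specification:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical request invoking a GetQuote operation; the target
     namespace and message shape would be defined by the service's WSDL. -->
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <q:GetQuote xmlns:q="http://example.com/stockquote">
      <q:symbol>IBM</q:symbol>
    </q:GetQuote>
  </soap:Body>
</soap:Envelope>
```

A client posts this XML over HTTP to the service endpoint; the WSDL contract tells the client tooling what element names and types the Body must contain.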

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[FIG. 1: WS-I TOOLS IN ACTION. The diagram shows the Monitor (Interceptor and Logger) capturing the normal SOAP message flow between a requestor and the Web service into a message log, which the Analyzer, guided by its configuration file and the test assertion document, evaluates together with the WSDL, XML schema and UDDI artifacts to produce a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories, and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The diagram shows the Monitor's Interceptor and Logger capturing SOAP/XML messages over HTTP between client code and the Web service, consolidating message artifacts and metadata (WSDL, XML schema) into a Test Log file, which the Analyzer then evaluates to produce a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD: Test Driven Development. A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD: Acceptance Test Driven Development. A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
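The cycle reads almost like pseudocode. As a framework-free sketch (the `wordCount` function and its test are invented for illustration, not drawn from the glossary), one pass through red-green-refactor might leave behind code like this:

```java
// A minimal, framework-free sketch of the red-green-refactor cycle.
// The wordCount() function and its tests are invented for illustration.
public class WordCountTest {

    // RED: this test is written first, before wordCount() exists,
    // at which point it doesn't even compile -- the starkest failure.
    static void testCountsWhitespaceSeparatedWords() {
        assertEquals(3, wordCount("tests drive design"));
        assertEquals(0, wordCount("   "));  // added during REFACTOR: edge case
    }

    // GREEN: the simplest implementation that makes the test pass.
    // REFACTOR: trim() and the empty-string guard were added later,
    // with the suite confirming the cleanup broke nothing.
    static int wordCount(String s) {
        String trimmed = s.trim();
        return trimmed.isEmpty() ? 0 : trimmed.split("\\s+").length;
    }

    // Tiny hand-rolled assertion, standing in for a framework like JUnit.
    static void assertEquals(int expected, int actual) {
        if (expected != actual) {
            throw new AssertionError("expected " + expected + ", got " + actual);
        }
    }

    public static void main(String[] args) {
        testCountsWhitespaceSeparatedWords();
        System.out.println("all tests pass");
    }
}
```

In practice the hand-rolled assertion would be a JUnit test method; the rhythm of failing first, passing second and cleaning up third is the same either way.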

BDD: Behavior Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior, in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer, and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

(Thanks to Markus Gaertner for his contributions to this article.)

ST&Ppedia


(continued from page 10)



ciplines and draw on a variety of skills to do their jobs. And while there are many unsung testing heroes inside software development shops, there are the beginnings, in many organizations, of a growing recognition that test and performance engineers should be considered peers of their development counterparts.

We asked several testing gurus to share their thoughts onthe state of the testing industry and give us some insights on wherethey see things evolving One thing for anyone to realize is thatwhile testing as an industry is maturing it is still relatively new Thesoftware tester job isnt something that shows up on the US CensusWe are still very much a new profession says Scott Barber thefounder and chief technologist of PerfTestPlus in Palm Bay Fla Mostof our very first programmers are still alive we arent thousands ofyears old like other disciplinesrdquo

Perhaps the best rundown of the industryrsquos evolution is summed up by Michael Boltonwho lives in Toronto Testing has historically been confirmation validation and verificationhe says I believe that testing must become far more about exploration discovery investi-gation learning and communicating what youve learnedrdquo Bolton runs test consultancyDevelopSense and teaches rapid software testing methods ldquoI like to encourage people torecognize and manage uncertainty and ambiguity and instead of trying to eliminate them get-ting them down to levels that are suffi-ciently low for people to do use-ful workrdquo

While any gathering ofsoftware testers will find dis-agreement on overall pur-pose one thing is cleartesting isnt about how toimprove code qualityndashthat isthe role of the engineeringpeople who do the coding workand of management says BoltonTesting is about providing informationto those people so that better-informed decisionsabout quality are possible

The future careers for the current cropof test engineers seem to be on the rise

All this activity has made software testing a very big tent The industry has matured and grownto encompass numerous approaches to methods training and tools Because of this diver-sity there is a growing consensus that the best testers are those that can span multiple dis-

The software testing industry is at a historic crossroads Rich Internet applica-tions cloud computing the rise of virtualization and more capable mobiledevices have all made the job of producing reliable software more complexwhile at the same time enabling more sophistication for automated test tools

Wersquore All Part of the Story

We are a new profession most of our first programmers are

still alivendash Scott Barber

A skilledtester is theproduct of avariety of

experiencesndash Jon Bach

24 bull Software Test amp Performance

the business outcomes theyrsquove delivered he says He goes on to describe the best mix in his development teams A performance

engineering organization should be comprised of engineers that are just as capableor more capable than the core product engineers This allows for the performanceteam to be just as familiar with the product code base and for them to go as far asmaking performance enhancements to the product and testing the outcome of thosechanges

At Google employees involved in testing generally have one of three titles saysDavid Saff a Google software engineer in test whorsquos basedin Cambridge Mass Software engineers are expected to unit-test their own code and many take an active role in maintain-ing testing frameworks and system tests for their teams Testengineers look at the quality requirements of a project identifyholes and fill them by applying existing automation tools where pos-sible or managing manual testing teams where needed Softwareengineers in test look for holes in the tools available and serve as internal

tool developers and implementation consultants Any teamcant survive without high competence in each domain

CLOUDS ARE FORMINGThe move toward cloud computing has helped softwaretesting overall but also made for some new challengestoo Cloud computing adds layers of complexity to test-ing but it also makes things more interesting says BoltonThese layers include being able to predict latency and avail-ability of all components that reside in the cloud Barber saysthat Too many people take latency for granted still and many

application developers still take the entire network infrastruc-ture for granted and dont consider it part of their domain

so programmers just assume it is working unless theircode has a problem

Bartow agrees Network latency hasbecome even more important with the intro-duction of Rich Internet Application tech-nologies such as Adobe Flex Every value that you enter

David Strom is a freelance reporter and author living in St Louis

particularly within large software shops such as Intuit and Google There software testengineers are often more important than the development engineers who writethe original code and have their own career track with parallel compensationand benefits

At Intuit QA engineers are held to the exact same standards and expec-tations as product developersrdquo says Dan Bartow a manager of performanceengineering for the company which makes TurboTax and other financial softwareldquoIn fact my team of five full-time performance engineers is calibrated the sameway as the product engineers at the end of year in terms of their skill-sets and

Virtualizationmakes things

easierand harder

ndash Michael Bolton

wwwstpcollaborativecom bull 25

of the Story

for applications such as our TaxCaster [tax estimation tool]generates a server-side request and the interface controlssuch as sliders need to respond quickly to give them a desk-top feel

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry, too, by providing a nice counterpoint to the commercial test tools that are out there. "Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."



Black agrees: "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product, too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical and support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees: "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like set-up and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should be just one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.



CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.



Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources, and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language), which define the services for client tools;

• SOAP (Simple Object Access Protocol), a message format to invoke a service;

• XML (Extensible Markup Language), the lingua franca of the Internet.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
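To make the nuts and bolts concrete, here is a minimal SOAP 1.1 envelope built and parsed with Python's standard library. The stock-quote operation and its namespace are invented for illustration; a real service would describe this exchange in a WSDL contract.

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"
STOCKS_NS = "{http://example.com/stocks}"  # invented service namespace

def build_request(symbol):
    """Build a minimal SOAP 1.1 request invoking an invented GetQuote operation."""
    ET.register_namespace("soap", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    op = ET.SubElement(body, f"{STOCKS_NS}GetQuote")
    ET.SubElement(op, f"{STOCKS_NS}symbol").text = symbol
    return ET.tostring(envelope, encoding="unicode")

def extract_symbol(xml_text):
    """Pull the requested symbol back out of a message, e.g. from a log."""
    return ET.fromstring(xml_text).find(f".//{STOCKS_NS}symbol").text
```

A service-side handler, or a test tool monitoring the wire, would parse the same envelope the way `extract_symbol` does.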

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor, and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[Figure 1: WS-I tools in action. The Monitor's Interceptor sits in the normal message flow between requestor and Web service, and its Logger records the SOAP messages to a message log. The Analyzer reads the message log along with the service's WSDL, XML schema and UDDI metadata, applies the test assertion document, and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories, and is used by the Analyzer to identify interoperability issues.
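In spirit, a test assertion pairs an identifier with a testable rule that the Analyzer checks against captured artifacts. The sketch below is only illustrative: the element names and the single assertion are invented, and the real WS-I test assertion schema is far richer.

```python
import xml.etree.ElementTree as ET

# A toy assertion document in the spirit of WS-I's test assertion
# documents. The element names and the rule are invented, not taken
# from the actual WS-I schema.
ASSERTIONS = """
<testAssertions>
  <assertion id="BP-EXAMPLE-1" target="message">
    <description>Envelope must use the SOAP 1.1 namespace</description>
    <requiredNamespace>http://schemas.xmlsoap.org/soap/envelope/</requiredNamespace>
  </assertion>
</testAssertions>
"""

def analyze(message_xml):
    """Check one logged message; return (assertion id, passed) pairs."""
    results = []
    for assertion in ET.fromstring(ASSERTIONS).findall("assertion"):
        required = assertion.findtext("requiredNamespace")
        # ElementTree tags look like "{namespace}localname".
        root_ns = ET.fromstring(message_xml).tag.split("}")[0].lstrip("{")
        results.append((assertion.get("id"), root_ns == required))
    return results
```

A "failed" pair here, as in the real tools, flags nonconformance with the profile rule, not a broken service.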

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[Figure 2: A new phase for testing. The Monitor's Interceptor and Logger capture SOAP/XML messages over HTTP between client code and Web service into a Test Log file, which consolidates the message artifacts and metadata (WSDL, XML schema). The Analyzer then evaluates the log and produces a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in or after a specific period of time, such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
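For illustration, the checkout-build-test cycle this entry describes can be sketched in a few lines; the function and step names here are invented for the sketch, and real CI servers supply this loop, along with the check-in and timer triggers, as infrastructure.

```python
# Illustrative sketch of one CI cycle: checkout, then build, then the
# automated test suites, stopping at the first failing step.
# All names are hypothetical; this is not any particular CI tool's API.

def ci_cycle(checkout, build, test_suites):
    """Run one CI cycle; return (status, failed_step)."""
    if not checkout():
        return ("failed", "checkout")
    if not build():
        return ("failed", "build")
    for name, suite in test_suites:
        if not suite():
            return ("failed", name)
    return ("passed", None)

# One simulated cycle: checkout and build succeed, an integration suite fails.
result = ci_cycle(
    checkout=lambda: True,
    build=lambda: True,
    test_suites=[("unit", lambda: True), ("integration", lambda: False)],
)
print(result)  # -> ('failed', 'integration')
```

A real CI server would wrap this loop in a trigger (every check-in, a timer, or immediately after the previous run) and publish the result.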

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD: Test Driven Development. A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.

ATDD: Acceptance Test Driven Development. A business-level implementation of TDD in which part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.
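A rough sketch of the idea, with an invented discount rule standing in for the business logic: the customer's examples are kept as a table and executed directly against the rule, beneath any GUI.

```python
# ATDD beneath the GUI, as a sketch: customer-supplied examples, written as
# a table, run straight against a business rule. The 10%-off rule and the
# figures are invented for illustration.

def order_discount(total):
    """Business rule under test: 10% off orders of $100 or more."""
    return round(total * 0.10, 2) if total >= 100 else 0.0

# Acceptance examples as the customer wrote them: (order total, expected discount).
examples = [
    (50.00, 0.0),
    (99.99, 0.0),
    (100.00, 10.00),
    (250.00, 25.00),
]

# Execute every example and collect any deviations.
failures = [(t, e, order_discount(t)) for t, e in examples if order_discount(t) != e]
assert not failures, f"examples failed: {failures}"
print("all acceptance examples pass")
```

Because the examples exercise the rule directly, they stay fast and far less brittle than the same checks driven through a GUI.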

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
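A minimal sketch of the red-green portion of the cycle, using a hypothetical count_leading_zeros() function; the stub stands in for the "red" state before the real code exists.

```python
# Red-green in miniature. Step 1 (red): the expectation is written first and
# fails against a deliberately empty stub. Step 2 (green): just enough code
# is added to make it pass. The refactor step would then clean up the code
# with the passing test as a safety net.

def leading_zeros_stub(s):
    return -1  # deliberately wrong: the test must fail first

def count_leading_zeros(s):
    """Green: the simplest code that satisfies the expectation."""
    return len(s) - len(s.lstrip("0"))

# Red: the expectation fails against the stub...
assert leading_zeros_stub("0042") != 2
# ...Green: and passes once the real code exists.
assert count_leading_zeros("0042") == 2
assert count_leading_zeros("7") == 0
print("test passes")
```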

BDD: Behavior Driven Development. Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
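As a rough illustration of the style, the expectations below read as near-plain-English statements about an invented shopping-cart domain rather than as an xUnit assertion list; the class and names are ours, not any BDD framework's.

```python
# BDD-flavored expectations: each test name states the behavior in business
# terms, and the body follows a given/when/then shape.

class Cart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def expect_an_empty_cart_totals_zero():
    cart = Cart()                      # given an empty cart
    assert cart.total() == 0           # then its total is zero

def expect_adding_two_items_sums_their_prices():
    cart = Cart()                      # given an empty cart
    cart.add("book", 12)               # when two items are added
    cart.add("pen", 3)
    assert cart.total() == 15          # then the total is their sum

for expectation in (expect_an_empty_cart_totals_zero,
                    expect_adding_two_items_sums_their_prices):
    expectation()
print("all expectations met")
```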

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

–Thanks to Markus Gaertner for his contributions to this article.

STPedia

Index to Advertisers

Advertiser / URL / Page
Hewlett-Packard / www.hp.com/go/alm / 38
Seapine / www.seapine.com/stpswift / 11
Software Test & Performance Collaborative / www.stpcollaborative.com / 3-5
Software Test & Performance Collaborative Vendor Membership / www.stpcollaborative.com / 35
Software Test & Performance eSeminars / www.stpcollaborative.com / 22
STPCon Fall 2009 / www.stpcon.com / 29, 34
TechExcel / www.techexcel.com / 37

< continued from page 10



the business outcomes they've delivered," he says. He goes on to describe the best mix in his development teams: "A performance engineering organization should be comprised of engineers that are just as capable, or more capable, than the core product engineers. This allows for the performance team to be just as familiar with the product code base, and for them to go as far as making performance enhancements to the product and testing the outcome of those changes."

At Google, employees involved in testing generally have one of three titles, says David Saff, a Google software engineer in test who's based in Cambridge, Mass. "Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks and system tests for their teams. Test engineers look at the quality requirements of a project, identify holes, and fill them by applying existing automation tools where possible or managing manual testing teams where needed. Software engineers in test look for holes in the tools available and serve as internal tool developers and implementation consultants. No team can survive without high competence in each domain."

CLOUDS ARE FORMING
The move toward cloud computing has helped software testing overall, but has made for some new challenges too. "Cloud computing adds layers of complexity to testing, but it also makes things more interesting," says Bolton. These layers include being able to predict latency and availability of all components that reside in the cloud. Barber says that too many people still take latency for granted, "and many application developers still take the entire network infrastructure for granted and don't consider it part of their domain, so programmers just assume it is working unless their code has a problem."

Bartow agrees. Network latency has become even more important with the introduction of Rich Internet Application technologies such as Adobe Flex. "Every value that you enter

David Strom is a freelance reporter and author living in St. Louis.

particularly within large software shops such as Intuit and Google. There, software test engineers are often more important than the development engineers who write the original code, and have their own career track with parallel compensation and benefits.

"At Intuit, QA engineers are held to the exact same standards and expectations as product developers," says Dan Bartow, a manager of performance engineering for the company, which makes TurboTax and other financial software. "In fact, my team of five full-time performance engineers is calibrated the same way as the product engineers at the end of year in terms of their skill-sets and

"Virtualization makes things easier and harder." – Michael Bolton



for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls, such as sliders, need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified, and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe. "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing. "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


"Thanks to the cloud, we are able to test in production." – Dan Bartow

"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks." – David Saff

Black agrees. "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do a lot more coaching and training, because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber, of PerTestPlus. "By doing this, the testers can learn which functional parts of the application are critical [and] support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees. "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about

"Get used to programming for a lot of CPUs." – Kent Beck


rapid learning, critical thinking, human factors and user experience as we train our future programmers."

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. "People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year." – Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing." – James Bach


Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources, and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language), which define the services for client tools
• SOAP (Simple Object Access Protocol), a message format to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
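To make these nuts and bolts concrete, here is a minimal SOAP 1.1 request envelope built with Python's standard library. The GetQuote operation and the example.com namespace are invented for illustration; only the SOAP envelope namespace URI is the real one.

```python
# Sketch: build a SOAP 1.1 request envelope with the standard library.
# Operation name, target namespace and parameters are hypothetical.

import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(operation, ns, params):
    """Return a SOAP envelope invoking `operation` with the given parameters."""
    ET.register_namespace("soapenv", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{ns}}}{name}").text = value
    return ET.tostring(envelope, encoding="unicode")

xml_text = build_request("GetQuote", "http://example.com/stocks", {"symbol": "IBM"})
print(xml_text)
```

In a real deployment the WSDL, not hand-written code, would tell client tooling which operations and message shapes the service expects.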

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

FIG. 1: WS-I TOOLS IN ACTION. (Diagram: the Monitor, an Interceptor plus Logger driven by a Monitor Config file, sits in the normal message flow between requestor and Web service, capturing the SOAP messages to a Message Log. The Analyzer, driven by an Analyzer Config file and the Test Assertion Document, evaluates that log together with the service's WSDL, XML Schema and UDDI artifacts to produce a Conformance Report.)

Source: Web Services Interoperability Organization
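The Monitor's interceptor-and-logger role can be sketched as a wrapper that forwards each message through the normal flow while keeping a copy for later analysis. The in-process "service" below is a stand-in for illustration; the real WS-I Monitor captures SOAP messages on the wire, at the HTTP level.

```python
# Sketch of the non-intrusive man-in-the-middle idea: forward every request
# to the real service, log the request/response pair, return the response.

class Monitor:
    """Interceptor + Logger: forwards messages, keeping a copy of each."""

    def __init__(self, service):
        self.service = service
        self.log = []  # the "message log" later handed to the Analyzer

    def invoke(self, request):
        response = self.service(request)  # normal message flow, unchanged
        self.log.append({"request": request, "response": response})
        return response

def echo_service(request):
    # Stand-in for a real Web service endpoint.
    return "<response>" + request + "</response>"

monitor = Monitor(echo_service)
monitor.invoke("<getQuote symbol='IBM'/>")
monitor.invoke("<getQuote symbol='HPQ'/>")
print(len(monitor.log))  # -> 2
```

Because the wrapper never alters the traffic, the service under test behaves exactly as it would without the Monitor in place.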

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
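As a toy version of that analysis step, the sketch below evaluates simple predicate "assertions" (standing in for the XML test assertion document) against captured messages and reports deviations rather than hard failures, mirroring the report described above. The assertion name and sample messages are invented.

```python
# Sketch: evaluate test assertions against logged messages and report
# deviations, rather than failing outright.

def analyze(messages, assertions):
    """Return a report mapping assertion name -> list of offending messages."""
    report = {}
    for name, check in assertions:
        failed = [m for m in messages if not check(m)]
        if failed:
            report[name] = failed
    return report

captured = [
    "<soap:Envelope><soap:Body/></soap:Envelope>",
    "<Envelope><Body/></Envelope>",  # missing the expected prefix
]
assertions = [
    ("envelope uses the soap: prefix (illustrative)",
     lambda m: "<soap:Envelope" in m),
]

report = analyze(captured, assertions)
for name, bad in report.items():
    print(f"deviation: {name} ({len(bad)} message(s))")
```

The real Analyzer works the same way in spirit: each deviation is traced back to a named requirement of the Profile, so a flagged service is non-conforming, not necessarily broken.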

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile, a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems – the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I

WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

wwwstpcollaborativecom bull 33

service of formally documenting thebest use of standardsndashdocumentingwhere the bar is for Web services devel-opers so to speak It does this in astructured manner that ensures consen-sus along with the confidence that whatis in place today will be there tomorrowWS-Irsquos processes are deliberate and theresulting thoroughness of results is

invaluable Some WS-I Profiles havealso been adopted by the InternationalStandards Organization

By comparison the informal WSTFoffers a flexibility that allows vendors totest not only the technologies covered inWS-I profiles but also technologies thatmay not be finished (from a standardiza-tion perspective) as well as more creative

aspects of applying the technologies thatare included in the WS-I Profiles WS-Idoesnrsquot prescribe specific test scenar-iosndashthe WS-I test materials are intendedto test all aspects of a Profilendashbut WSTFcan develop new scenarios almost on thefly such as different means for handlingtransactions and work out new patternsof using Web services that individual ven-dors can build upon

Current users of the WS-Is BasicProfile 12 or 20 or the Reliable andSecure Profile 10 should take note ofdraft of the Basic Profile TestingMethodology posted on April 28 (atwwwws-iorg) The testing approachused for the latest WS-I profiles differsfrom the one used in the past althoughoverall the general testing processremains similar reads the draft in partA new ldquologgingrdquo phase includes livemessage capture and is followed by aseparate Analysis phase involving a sep-arate tool (see Figure 2) The Test Logfile consolidates all artifacts under testincluding messages and metadata(WSDL definitions etc) The test analy-sis phase includes evaluation of WS arti-facts in the test log against the require-ments of a particular profile yacute

34 • Software Test & Performance JULY 2009

[FIG. 2: A NEW PHASE FOR TESTING — Web service and client-code messages (SOAP/XML over HTTP) pass through a Monitor (Interceptor and Logger) into a Test Log file of message artifacts and metadata (WSDL, XML Schema), which a separate Analyzer evaluates to produce a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
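The loop those tools automate can be sketched in a few lines. The shell commands below are placeholders only; a real pipeline would substitute its own version-control, build and test-runner commands.

```python
import subprocess

def ci_cycle(checkout_cmd, build_cmd, test_cmd):
    """Run one checkout -> build -> test cycle; return True only if every step passes."""
    for step in (checkout_cmd, build_cmd, test_cmd):
        # Each step is an ordinary shell command; a non-zero exit code fails the cycle.
        if subprocess.run(step, shell=True).returncode != 0:
            return False
    return True

# Placeholder commands standing in for "svn checkout", "ant build", "run suites", etc.
ok = ci_cycle("echo checkout", "echo build", "echo running test suites")
```

A scheduler (after every check-in, every hour, or continuously) simply calls `ci_cycle` again and reports the result.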

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant number of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
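The rhythm can be sketched with any xUnit-style framework. This hypothetical example uses Python's unittest; the `leap_year` function stands in for the method under development.

```python
import unittest

# Step 1 (red): the test is written first, and fails until leap_year
# exists and behaves correctly.
class TestLeapYear(unittest.TestCase):
    def test_century_years_are_common_unless_divisible_by_400(self):
        self.assertTrue(leap_year(2000))
        self.assertFalse(leap_year(1900))

# Step 2 (green): write just enough code to make the test pass.
def leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Run the test; in classic TDD you would have run it once before writing
# leap_year and watched it fail.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestLeapYear)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```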

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.
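A minimal sketch of the idea, assuming a made-up shipping-cost rule and a customer-supplied example table, exercised directly against the business rule rather than through a GUI:

```python
# A hypothetical business rule: orders of $50 or more ship free, others pay $5.
def shipping_cost(order_total):
    return 0.0 if order_total >= 50 else 5.0

# Customer-supplied examples captured as a table of (order total, expected cost).
# In ATDD these examples exist before the rule is implemented, and the rule is
# written to make them pass.
EXAMPLES = [
    (75.00, 0.0),
    (50.00, 0.0),
    (49.99, 5.0),
    (0.00, 5.0),
]

# Any example the rule disagrees with is a failing acceptance test.
failures = [(total, expected, shipping_cost(total))
            for total, expected in EXAMPLES
            if shipping_cost(total) != expected]
```

Because the examples bypass the GUI, they stay fast and far less brittle than screen-driven scripts.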

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
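A toy illustration of the expectation style, with an invented `expect` helper (real BDD tools provide far richer vocabularies; this only shows the plain-English flavor):

```python
# A tiny expectation helper, invented for this sketch.
class Expectation:
    def __init__(self, actual):
        self.actual = actual

    def to_equal(self, expected):
        assert self.actual == expected, f"expected {expected}, got {self.actual}"
        return True

def expect(actual):
    return Expectation(actual)

# The behavior under test is described in business terms, not method names.
def test_a_new_account_starts_with_a_zero_balance():
    balance = 0  # stand-in for a hypothetical Account().balance
    return expect(balance).to_equal(0)
```

Read aloud, the test name and its expectation form a near-English sentence, which is the point of the style.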

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.






for applications such as our TaxCaster [tax estimation tool] generates a server-side request, and the interface controls such as sliders need to respond quickly to give them a desktop feel."

Having more applications in the cloud increases the difficulty of forecasting overall performance as demand varies throughout the day, and testing strategies need to understand these variations. "Systems are getting more complex and with more moving parts, and definitions of success and failure are getting fuzzier," says Kent Beck, creator of Extreme Programming. "For example, parts of Amazon's home page may not show up because some service was down. Because the rest of the page shows up quickly, is that a good or bad result? How do you design a test to handle all the permutations?"

According to Bartow, "Cloud computing technology is going to reshape production validation testing and challenge the way we design our performance engineering strategies. By harnessing the power of the cloud and working with our partner SOASTA.com's CloudTest, we were able to spin up hundreds of servers and generate an enormous amount of load against our infrastructure to simulate real tax users on TurboTax Online. Thanks to the cloud, we are now able to realistically test in production with true concurrency."

TRAINING TRAIN
Technology is further transforming the industry in terms of tester training, one of the many benefits of universal Internet connectivity and the ubiquity of the Web. "We have certainly seen our company's opportunities related to Web-based training just explode over the last year," says Rex Black, who runs RBCS, a testing and quality consultancy in Bulverde, TX. He says revenue from e-learning continues to grow. Black is a past president of the International Software Testing Qualifications Board (ISTQB), which maintains the most widely held certification standards, and RBCS offers its training.

But the ISTQB standards and other certifications aren't without controversy within the software test community. "It is only one form of credential," says Jon Bach, a test manager in Seattle. "A skilled tester is the product of a variety of experiences, the best of which tend to be experiences from learning. The best people whom I have hired weren't certified and were fantastic testers because they had a history of being great learners, not great test-takers." Bach goes on to say, by way of comparison: "Airline pilots have certifications, but they also have to have hours of simulator time and co-pilot time in the cockpit. A doctor has certification, but they also have years of residency and interning experience before they can open a practice."

Patricia McQuaid, president and co-founder of the American Software Testing Certification Board (the American component of the ISTQB), defends certification as a means of providing a standardized set of terms for testers around the globe: "Imagine the person in charge trying to ensure that the testers know and use the same terminology, [and that] they understand the basics of the testing life cycle. This goal is difficult enough, but is exacerbated by the team members being from different countries working on the same testing job."

Organizations from nearly 50 countries meet four times a year to build consensus on software testing terms on an international level. "Countries that are part of the scheme recognize the certification from any of the other countries, because all material for the certification exam is agreed upon, and the exam questions must come from this material."

James Bach, elder brother of Jon, is an advocate of experiential learning in testing: "If you want to learn how to test, you must test, and preferably under supervised conditions. This is exactly how I learned to fly aircraft, sail boats and scuba dive. It's how we all learned to drive, and testing is a lot harder than driving."

As a founding member of the Context-Driven School of software testing, Bach asserts there are no testing best practices. "Testing is problem-solving, not button-mashing. There are many schools of thought within the field, not a single community or philosophy of testing that can claim to represent all." For those who do seek online training, he recommends the Black Box Software Testing (BBST) series offered by the Association for Software Testing, for which Jon and James have served as board members.

Another boon to testing brought about by the Web has been the proliferation of open source applications. Barber believes this has helped the testing industry too, by providing "a nice counterpoint to the commercial test tools that are out there. Some of the open source products are excellent tools, and can be quite nimble and solve problems before the commercial vendors can. The best of the open source world will continue to get better, and the communities will get stronger."


"Thanks to the cloud, we are able to test in production." – Dan Bartow

"Software engineers are expected to unit-test their own code, and many take an active role in maintaining testing frameworks." – David Saff

Black agrees: "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director, development and operations, at Publishers Clearing House Online Network, based in Port Washington, NY. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training" because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better, and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerTestPlus. "By doing this, the testers can learn which functional parts of the application are critical and support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees: "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

"Get used to programming for a lot of CPUs." – Kent Beck

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should just be one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, "where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year." – Rex Black




Testing for Conformance With Web Services Interoperability Specifications
By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and at best, Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:
• WSDLs (Web Services Description Language), which define the services for client tools.
• SOAP (Simple Object Access Protocol), a message format to invoke a service.

• XML (Extensible Markup Language), the lingua franca of the Internet.

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
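A hand-built miniature shows the shape of a SOAP message. The `getQuote` operation and `example.com` namespace below are invented for illustration; only the envelope namespace is the standard SOAP 1.1 one.

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

# A minimal SOAP-style envelope invoking a hypothetical getQuote operation.
message = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getQuote xmlns="http://example.com/stocks"><symbol>IBM</symbol></getQuote>
  </soap:Body>
</soap:Envelope>"""

# Any SOAP stack ultimately does what these lines do: parse the XML and pull
# the operation payload out of the Body element.
root = ET.fromstring(message)
body = root.find(f"{{{SOAP_ENV}}}Body")
symbol = body.find(
    "{http://example.com/stocks}getQuote/{http://example.com/stocks}symbol"
).text
```

The WSDL's job, in turn, is to tell client tooling how to construct exactly this kind of message for a given service.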

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of test case material necessary to verify that the tools behave as expected, and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that is itself underlying to a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe(s) the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
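The division of labor can be sketched with toy stand-ins. These are not the WS-I tools themselves, and the profile-style checks below are invented; the sketch only shows the capture-then-evaluate split.

```python
# Toy stand-in for the Monitor: an interceptor/logger that records each
# message and passes it through unchanged.
captured_log = []

def monitor(message):
    captured_log.append(message)
    return message

# Toy stand-in for the Analyzer: evaluate the captured log against a couple
# of invented, profile-style assertions and report any deviations.
def analyzer(log):
    report = []
    for i, msg in enumerate(log):
        if "<soap:Envelope" not in msg:
            report.append((i, "message is not a SOAP envelope"))
        if "soap:encodingStyle" in msg:
            report.append((i, "encodingStyle attribute flagged by profile check"))
    return report

monitor('<soap:Envelope><soap:Body/></soap:Envelope>')
monitor('<html>oops, an error page</html>')
findings = analyzer(captured_log)
```

Keeping capture and analysis separate is what makes the approach non-intrusive: the service under test never knows the Monitor is there.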

[FIG. 1: WS-I TOOLS IN ACTION — A Requestor and Web service exchange SOAP messages in the normal message flow. The Monitor (Interceptor and Logger, driven by a Monitor Config file) captures the messages into a Message Log; the Analyzer (driven by an Analyzer Config file and a Test Assertion Document) evaluates the log along with the WSDL, XML Schema and UDDI artifacts to produce a Conformance Report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
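A made-up miniature suggests how such a document might be organized and indexed by artifact category. The element names, assertion ids and wording here are invented, not the actual WS-I schema.

```python
import xml.etree.ElementTree as ET

# A hypothetical, drastically simplified test assertion document.
TAD = """<testAssertions>
  <artifact type="message">
    <assertion id="EX-1001">Envelope must be the root element</assertion>
  </artifact>
  <artifact type="description">
    <assertion id="EX-2001">WSDL document must be well-formed XML</assertion>
  </artifact>
</testAssertions>"""

# Group assertion ids by artifact category, the way an analyzer might index
# them before walking the captured artifacts.
by_category = {}
for artifact in ET.fromstring(TAD).findall("artifact"):
    by_category[artifact.get("type")] = [
        a.get("id") for a in artifact.findall("assertion")
    ]
```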

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into use of features and function for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems. The tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios, as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison the informal WSTFoffers a flexibility that allows vendors totest not only the technologies covered inWS-I profiles but also technologies thatmay not be finished (from a standardiza-tion perspective) as well as more creative

aspects of applying the technologies thatare included in the WS-I Profiles WS-Idoesnrsquot prescribe specific test scenar-iosndashthe WS-I test materials are intendedto test all aspects of a Profilendashbut WSTFcan develop new scenarios almost on thefly such as different means for handlingtransactions and work out new patternsof using Web services that individual ven-dors can build upon

Current users of the WS-Is BasicProfile 12 or 20 or the Reliable andSecure Profile 10 should take note ofdraft of the Basic Profile TestingMethodology posted on April 28 (atwwwws-iorg) The testing approachused for the latest WS-I profiles differsfrom the one used in the past althoughoverall the general testing processremains similar reads the draft in partA new ldquologgingrdquo phase includes livemessage capture and is followed by aseparate Analysis phase involving a sep-arate tool (see Figure 2) The Test Logfile consolidates all artifacts under testincluding messages and metadata(WSDL definitions etc) The test analy-sis phase includes evaluation of WS arti-facts in the test log against the require-ments of a particular profile yacute

34 bull Software Test amp Performance JULY 2009

Client CodeWeb Service

WSDL XML Schema

Test LogFile

TestReport

Analyzer

MONITORSOAPXML

HTTP

MessageArtifacts

Meta data

Interceptor

Logger

FIG 2 A NEW PHASE FOR TESTING

Source Web Services Interoperability Organization

confusion over what exactly is meantby will handle errors appropriately

Continuous Integration Sometimesreferred to as build or build machineCI is the process of running a checkoutfollowed by a build followed by the run-ning of the automated test suites con-stantly CI can run after every check-in orafter a specific period of time such asevery hour or perhaps simply restart assoon as it has finished Several toolsexist to provide the infrastructure in orderto make CI straightforward and simple

Balanced Breakfast The idea that dif-ferent test techniques tend to capture dif-ferent categories of defects Thereforegood testing uses a balance of techniquesinstead of one or two exclusively

Exploratory Testing ExploratoryTesting is a just-in-time feedback-based approach Far from ad hoc orundisciplined when done well Explora-tory Testing can provide meaningfulcoverage and stand up to scrutiny

Automated TestingIf youre going to ship an increasinglylarge and complex code base every fewweeks to every few months youll wanta way to run a series of tests over andover again A computer wont find thetests define them or tweak them whenthe interface changes so we believe theterm automated testing to testautomation is more accurate Having asignificant amount of regression testsautomated is part of the balanced break-fast of agile testing

TDD Test Driven Development A rela-tive of Extreme Programming in whichthe expectations for the code are writtenin the form of code before the actualcode is created The developers can runthe test see it fail and improve the codeuntil the test passes In classic TDD thedeveloper adds a single test for eachmethod to be created

ATDD Acceptance Test Driven Dev-elopment A business-level implementa-tion of TDD where part of the require-ments is a set of examples of use of the

software The developers (or testers)take those examples and create testsgenerally automated which fail and thenimplement code to make those testspass Arguments against ATDD includea slower feedback cycle that the worktakes more effort to automate and pro-duces lower return and that tests devel-oped in such a fashion are often brittleOrganizations using ATDD that are ableto go underneath the GUI and executeand evaluate business rules directlyoften have more success with ATDD

Red-Green-Refactor A popular TDDcycle The idea is to write a failing test(red) make it pass (green) and then toimprove the design of the code (refactor)using the suite of tests to make sure theimprovements didnt break anything else

BDD Behavior Driven DevelopmentInitially named by Dave Astels in referenceto to a form of developer-level testing andcode design BDD has since been used byother circles to specify behavior in accept-ance and system-level work BDD focuseson behavior in that tests are referred to aslsquoexpectationsrsquo and the test steps aredescribed in real business termsndashan exten-sion of Domain Driven Design BDD teststyle results in test suites that generallylook and feel much more like plain Englishthan xUnit-style tests

Developer Tests Tests written by a

programmer to exercise some codetypically without any kind of user inter-face involved To a non-technical cus-tomer these tests look like code Theseare sometimes called unit tests whichimplies testing at the lowest possiblelevel or unit of functionality We preferthe term Developer Test as it sepa-rates concerns by role

Customer Tests These are tests thata customer cares about They may drivea GUI or in some cases be expressed intable format and run underneath theGUI This test must be meaningful to thecustomer and should be captured in away that the customer can understandSometimes the customer can evenwatch the test at it runs

Importance of Human BeingsWith all this terminology it is easy toforget that the Agile Manifesto is areaction against heavyweight processand overweight life cycle managementtools The Agile Manifesto advisescompanies to remember that humansare people not cogs in a greatmachine and that good ideas cancome from anywhere at any time If wekeep that in mind while frequentlydelivering working software in smallpieces over and over that is somethingto be proud of yacutemdashThanks to Markus Gaertner for his con-tributions to this article

stamp pedia

Advertiser URL Page

Hewlett-Packard wwwhpcomgoalm 38

Seapine wwwseapinecomstpswift 11

Software Test amp Performance Collaborative wwwstpcollaborativecom 3ndash5

Software Test amp Performance Collaborative Vendor Membership wwwstpcollaborativecom 35

Software Test amp Performance eSeminars wwwstpcollaborativecom 22

STPCon Fall 2009 wwwstpconcom 29 34

Tech Excel wwwtechexcelcom 37

IInnddeexx ttoo AAddvveerrttiisseerrss

lt continued from page 10

36 bull Software Test amp Performance JULY 2009

Page 23: THE GREAT AGILE DEBATE STATE OF THE TEST INDUSTRY WS-I ... · The Four Pillars of Agile What's most important when implementing agile testing in your organization? Learn from a self-made

Black agrees: "You want to ensure that the team really picks the right tool for the job, and sometimes you need to cut through the hype and sales tactics that surround some of the commercial tools."

One final challenge with Web-based applications is to design them for the growing use of mobile browsers. Bob Visovich is the senior director of development and operations at Publishers Clearing House Online Network, based in Port Washington, N.Y. He finds that "we have to set aside more time for ensuring browser capability as we get more into mobile devices, and we have to be careful to educate our QA staff to understand the differences between the typical Web and mobile browsing experiences." His biggest surprise with implementing Rich Internet Applications is "that we have had to put more functionality in the hands of the creative people that design the overall user experience, and our engineers have had to learn the right skills to be able to talk to them so we can support our newest applications."

AGILE ASCENSION
Then there's the surging popularity of agile programming methods, which have both helped and challenged testers, according to Black. "Some of our clients are using agile methodologies, and they are struggling to figure out what constitutes an adequate specification, given the agile preference for minimizing documentation. We're working with a number of clients to help them achieve a good balance here between ensuring that every requirement is tested for audit purposes and still being able to use the agile methods." Visovich is trying to combine his QA and development teams for more agile programming, but has had to do "a lot more coaching and training" because developers initially found it disconcerting that they didn't have a good amount of documentation to create their initial test cases. "But the good news is that the teams are understanding our business processes better and we can take care of defects more quickly."

As another benefit, Bolton notes, in some cases agile processes "are taking us in the direction where programmers are more responsible for the quality of their code, and that's great. It helps when many people are paying attention and using many sensory modes to evaluate a product too." (In case you missed it, turn to page 18 for full coverage of The Great Agile Debate. –Ed.)

SCHOOL DAZE
What are the academic underpinnings that will help create the most successful testers? As Visovich alluded to, the best testers don't always come out of engineering or computer science departments, but have more holistic worldviews that include understanding the underlying business, too. "The best situation is to have testers that are part of the development team and work jointly with them and collaborate on a shared vision," says Barber of PerfTestPlus. "By doing this, the testers can learn which functional parts of the application are critical and support the business priorities."

Some of our interviewees spoke of being able to understand philosophy, social science or logic as keys to becoming great testers. "QA engineers usually span multiple functional areas and are a cornerstone for our business," says Intuit's Bartow. "This gives them greater breadth and depth, making them more valuable to the organization than some product engineers."

"Our computer science schools teach kids how to write code when they are working by themselves in a dorm room in the middle of the night, but they aren't teaching them how to work on cross-functional teams or how to test their code," says Barber. "Most CS programs still don't [require] a single mandatory testing course to graduate, and many don't have any courses on networking, so most of these students graduate and all they know is how to write code in several languages."

DevelopSense's Bolton agrees: "If we testers want to be effective and great at what we do, it really helps to know a little about philosophy, and to think carefully about how we know what we know. When you add the human mind to what we can find using Google and Wikipedia, it's awesome. We all are our own best semantic search tools. I am not sure that we're teaching enough about rapid learning, critical thinking, human factors and user experience as we train our future programmers."

"Get used to programming for a lot of CPUs."
– Kent Beck

JULY 2009 • www.stpcollaborative.com • 27

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. "Virtualization makes certain things easier and cheaper, like setup and configuration. But it also makes certain things harder. There can be significant differences between a real machine and a virtualized machine," says Bolton. Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should be just one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you "blend automation with reactive techniques that watch for specific behaviors and bugs."

Beck, who lives in Merlin, Ore., predicts that "we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges and also gives us a new series of opportunities."

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.


"We have seen our opportunities related to Web-based training explode over the last year."
– Rex Black

"Testing is problem-solving, not button-mashing."
– James Bach

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23, to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.


Testing for Conformance With Web Services Interoperability Specifications
By Chris Ferris

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical, and the best Web services testing is complex. And it's vastly different than testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources and how to use them to test the Web services being developed in your organization.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools
• SOAP (Simple Object Access Protocol), a message format used to invoke a service


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
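Those nuts and bolts are easier to picture with a concrete message. Below is a minimal sketch, using only Python's standard library, of building and parsing the kind of SOAP envelope a WSDL-described service would accept. The GetQuote operation and its namespace are invented for the example.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/stockquote"  # hypothetical service namespace

def build_request(symbol):
    """Build a minimal SOAP 1.1 request envelope for a hypothetical
    GetQuote operation that a WSDL would describe."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetQuote")
    ET.SubElement(op, f"{{{SVC_NS}}}symbol").text = symbol
    return ET.tostring(envelope, encoding="unicode")

def extract_symbol(xml_text):
    """Pull the requested symbol back out, as a service endpoint would."""
    root = ET.fromstring(xml_text)
    node = root.find(f"{{{SOAP_NS}}}Body/{{{SVC_NS}}}GetQuote/{{{SVC_NS}}}symbol")
    return node.text

print(build_request("IBM"))  # serialized envelope, ready to send
```

A real client would POST this envelope over HTTP to the endpoint address given in the service's WSDL.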

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles, and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining or supervising the definition of the test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that itself underlies a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to such underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are outside the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Security, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
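The Monitor's non-intrusive, man-in-the-middle role can be sketched in a few lines: an interceptor passes traffic through unchanged while a logger records every message for later analysis. This is only an illustration of the capture pattern, not WS-I's actual tool code; `fake_service` stands in for a real endpoint.

```python
import time

message_log = []  # stands in for the WS-I Monitor's message log file

def logger(direction, payload):
    """Record each message with a timestamp, without altering it."""
    message_log.append({"time": time.time(),
                        "direction": direction,
                        "payload": payload})

def interceptor(service, request):
    """Non-intrusive capture: pass the request through to the service
    untouched, logging both request and response on the way."""
    logger("request", request)
    response = service(request)
    logger("response", response)
    return response

def fake_service(request):
    # Hypothetical endpoint standing in for a deployed Web service.
    return "<quote>42.17</quote>"

reply = interceptor(fake_service, "<GetQuote>IBM</GetQuote>")
# A WS-I-style analyzer would later walk message_log and apply
# profile test assertions to each captured artifact.
```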

[FIG. 1: WS-I TOOLS IN ACTION. Source: Web Services Interoperability Organization. The diagram traces the normal message flow between a requestor and a Web service. The Monitor, driven by a Monitor Config File, uses an Interceptor and Logger to capture the SOAP messages into a Message Log. The Analyzer, driven by an Analyzer Config File and the Test Assertion Document, evaluates the log together with the WSDL, XML Schema and UDDI artifacts, and produces a Conformance Report.]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
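As a rough illustration of how an analyzer applies a test assertion document to captured messages, each assertion pairs an identifier with a testable predicate. The real WS-I documents are XML and far more detailed; the assertion IDs and checks below are invented stand-ins.

```python
# Hypothetical, simplified stand-ins for profile test assertions.
assertions = [
    ("BP-sketch-1", "envelope must use the SOAP 1.1 namespace",
     lambda msg: "http://schemas.xmlsoap.org/soap/envelope/" in msg),
    ("BP-sketch-2", "message must not carry a SOAP encodingStyle attribute",
     lambda msg: "encodingStyle" not in msg),
]

def analyze(message):
    """Return (assertion id, description) for every failed assertion,
    the way an analyzer reports which Profile requirements were not met."""
    return [(aid, desc) for aid, desc, check in assertions
            if not check(message)]

good = '<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/"/>'
bad = '<Envelope encodingStyle="..."/>'
print(analyze(good))  # an empty report: no deviations
print(analyze(bad))   # lists each failed assertion with its description
```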

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into the use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web services are encouraged to join WS-I. For more information, go to www.ws-i.org.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall, the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. Source: Web Services Interoperability Organization. The Monitor's Interceptor and Logger capture SOAP/XML messages flowing over HTTP between client code and the Web service; message artifacts and metadata (WSDL, XML Schema) are consolidated into the Test Log File, which the Analyzer evaluates to produce the Test Report.]

… confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of running a checkout, followed by a build, followed by the running of the automated test suites, constantly. CI can run after every check-in, after a specific period of time such as every hour, or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.
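The checkout-build-test cycle can be sketched as a single function that a CI server would run in a loop. The three step functions are placeholders for a real version-control checkout, compiler and test runner; tools such as CruiseControl layer scheduling, reporting and notification on top of this core.

```python
def checkout():
    # Placeholder: pull the latest sources from version control.
    return "r42"

def build(revision):
    # Placeholder: compile the tree; return False on a broken build.
    return True

def run_tests(revision):
    # Placeholder: run the automated test suites.
    return {"passed": 128, "failed": 0}

def ci_cycle():
    """One continuous-integration cycle: checkout, then build, then test."""
    revision = checkout()
    if not build(revision):
        return (revision, "build broken")
    results = run_tests(revision)
    status = "green" if results["failed"] == 0 else "red"
    return (revision, status)

# A CI server repeats this indefinitely: after each check-in, on a
# timer, or immediately after the previous cycle finishes.
print(ci_cycle())
```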

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: Exploratory Testing is a just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well, Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
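A minimal illustration of the cycle using Python's unittest module; the `add` function is invented for the example. In practice the test is written first, fails against a stub, and the code is then filled in until it passes.

```python
import unittest

def add(a, b):
    """Production code, written only after the test below existed.
    In classic TDD this began as a stub (say, `return 0`) so the
    developer could watch the test fail first."""
    return a + b

class AddTest(unittest.TestCase):
    def test_add_two_numbers(self):
        # The expectation, expressed as code before the real implementation.
        self.assertEqual(add(2, 3), 5)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(AddTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```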

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
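To make the plain-English point concrete, here is a hypothetical sketch of the style in Python; BDD tools of the era (RSpec, for instance) provide richer expectation syntax, and the `Account` class is invented for illustration.

```python
class Account:
    """A toy domain object for the example."""
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

# The expectation's name reads as a behavior statement in business
# terms, rather than an xUnit-style test_deposit method name.
def it_should_increase_the_balance_when_money_is_deposited():
    account = Account()
    account.deposit(100)
    assert account.balance == 100

it_should_increase_the_balance_when_money_is_deposited()
```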

Developer Tests Tests written by a

programmer to exercise some codetypically without any kind of user inter-face involved To a non-technical cus-tomer these tests look like code Theseare sometimes called unit tests whichimplies testing at the lowest possiblelevel or unit of functionality We preferthe term Developer Test as it sepa-rates concerns by role

Customer Tests These are tests thata customer cares about They may drivea GUI or in some cases be expressed intable format and run underneath theGUI This test must be meaningful to thecustomer and should be captured in away that the customer can understandSometimes the customer can evenwatch the test at it runs

Importance of Human BeingsWith all this terminology it is easy toforget that the Agile Manifesto is areaction against heavyweight processand overweight life cycle managementtools The Agile Manifesto advisescompanies to remember that humansare people not cogs in a greatmachine and that good ideas cancome from anywhere at any time If wekeep that in mind while frequentlydelivering working software in smallpieces over and over that is somethingto be proud of yacutemdashThanks to Markus Gaertner for his con-tributions to this article

stamp pedia

Advertiser URL Page

Hewlett-Packard wwwhpcomgoalm 38

Seapine wwwseapinecomstpswift 11

Software Test amp Performance Collaborative wwwstpcollaborativecom 3ndash5

Software Test amp Performance Collaborative Vendor Membership wwwstpcollaborativecom 35

Software Test amp Performance eSeminars wwwstpcollaborativecom 22

STPCon Fall 2009 wwwstpconcom 29 34

Tech Excel wwwtechexcelcom 37

IInnddeexx ttoo AAddvveerrttiisseerrss

lt continued from page 10

36 bull Software Test amp Performance JULY 2009


…rapid learning, critical thinking, human factors and user experience as we train our future programmers.

VIRTUAL PARADOX
Virtualization has made testing both easier and harder, for different reasons. As more shops make use of virtualized servers and desktops to deploy their applications, they create more complex environments that require more sophisticated test plans. But the benefits are there. Virtualization makes certain things easier and cheaper, like set-up and configuration. But it also makes certain things harder. "There can be significant differences between a real machine and a virtualized machine," says Bolton.

Barber agrees: "The one place where virtualization can help us is in managing our test environment, because it's relatively easy to duplicate and start up a new virtual machine image with all applications pre-installed and tools configured for a particular environment."

SIMPLY AUTOMATION
Automation certainly has made creating more sophisticated test plans easier over the years, and clearly automation is only going to become more capable in the future. But many testing experts realize that automation isn't everything, and should be just one of many parts of a good testing strategy. "We can't trust our tools to make our decisions for us," says Bolton.

"I feel that the industry is starting to draw a very clear line between black-box testers and true QA engineers," said Bartow. "With automation becoming so prevalent, I believe that the black-box testers in the industry are facing a real challenge to grow their software engineering skills."

"The tedious parts of the tester's job are definitely going away, which changes the structure of the profession," says Beck, who with Erich Gamma created the JUnit unit testing framework and has authored numerous books on computer programming and test-driven development. "But the more pieces that you have, the greater the probability of failure of the system. This makes it more challenging for testing. Ideally, testers should be able to look at these complex systems and analyze where the communications errors are, to be able to head off catastrophic errors that have big business impacts."

"We are seeing a lot of exciting evolution in testing automation, especially in the tools available for [automated] unit testing and the integration of that unit testing into automated build environments," says Black. "We're still struggling as an industry with effective and maintainable automated testing at the graphical user interface." He says that the best opportunities are where you blend automation with reactive techniques that watch for specific behaviors and bugs.

Beck, who lives in Merlin, OR, predicts that we are seeing the days of cheap single-threaded performance improvements coming to an end, where we can just buy our way out of problems. People are going to have to get used to programming for a lot of CPUs, and that represents new testing and programming challenges, and also gives us a new series of opportunities.

It seems clear that testing as a profession can only grow, especially at places such as Google and others where Web applications are their bread and butter. "When each line of code you write today may very likely be executed by millions of people next week, it tends to focus the mind," says Saff. But the question remains whether software testers will ever achieve parity, in income and stature, with their counterparts over the wall.

28 • Software Test & Performance • JULY 2009

"We have seen our opportunities related to Web-based training explode over the last year."
– Rex Black

CATCH Jon and James Bach, Scott Barber, Dan Bartow, Rex Black, Michael Bolton and others at the Software Test & Performance Conference this fall. They'll all be on hand in Cambridge, Mass., Oct. 19-23 to meet, teach and interact with you and your testing team. Visit www.stpcon.com for more.

"Testing is problem-solving, not button-mashing."
– James Bach


At best, Web services testing is complex. And it's vastly different from testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications, exchange information securely across and beyond the enterprise, integrate existing applications in loose couplings that are both scalable and easily extensible, and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. The WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer of those resources, and how to use them to test Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDLs (Web Services Description Language), which define the services for client tools

• SOAP (Simple Object Access Protocol), a message format used to invoke a service

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical.

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

Testing for Conformance With Web Services Interoperability Specifications


By Chris Ferris


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
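To make these nuts and bolts concrete, here is a minimal SOAP 1.1 request built and parsed with Python's standard library. This is an illustrative sketch only: the "GetQuote" operation and the example.com namespace are invented, and in practice the WSDL dictates the message shape.

```python
# Build and parse a minimal SOAP 1.1 envelope with the standard library.
# The service operation (GetQuote) and its namespace are hypothetical;
# a real client derives these from the service's WSDL.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"   # SOAP 1.1 envelope namespace
STOCK_NS = "http://example.com/stockquote"              # invented service namespace

def build_request(symbol):
    """Build a minimal SOAP 1.1 request envelope as an XML string."""
    envelope = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
    operation = ET.SubElement(body, "{%s}GetQuote" % STOCK_NS)
    ET.SubElement(operation, "{%s}symbol" % STOCK_NS).text = symbol
    return ET.tostring(envelope, encoding="unicode")

def extract_symbol(xml_text):
    """Pull the requested symbol back out of a captured envelope."""
    root = ET.fromstring(xml_text)
    element = root.find(".//{%s}symbol" % STOCK_NS)
    return element.text if element is not None else None
```

The point of the exercise is the contract: both sides agree on the envelope structure and namespaces, so either side (or a monitor in the middle) can validate a captured message.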

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about the WS-I, the charter of which is to establish best practices for Web services interoperability for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles, and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test material and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions as a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, the test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that itself underlies a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to those underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are outside the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.
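The division of labor between Monitor and Analyzer amounts to "record now, judge later." The Python sketch below is a toy stand-in for that split, not WS-I's actual assertion logic: the two checks (a text/xml Content-Type and a SOAP 1.1 Envelope root) are simplified examples of the kind of rule a test assertion document expresses.

```python
# Toy version of the capture-then-analyze split used by the WS-I tools:
# the monitor side only records traffic, and a separate analyzer pass
# evaluates the log against assertion-style checks. The two checks here
# are simplified illustrations, not real WS-I Basic Profile assertions.
import xml.etree.ElementTree as ET

SOAP11 = "http://schemas.xmlsoap.org/soap/envelope/"

def intercept(log, headers, body):
    """Monitor role: record the message, pass no judgment."""
    log.append({"headers": headers, "body": body})

def analyze(log):
    """Analyzer role: evaluate each logged message, return (index, reason) failures."""
    failures = []
    for i, message in enumerate(log):
        if "text/xml" not in message["headers"].get("Content-Type", ""):
            failures.append((i, "Content-Type is not text/xml"))
        try:
            root = ET.fromstring(message["body"])
            if root.tag != "{%s}Envelope" % SOAP11:
                failures.append((i, "root element is not a SOAP 1.1 Envelope"))
        except ET.ParseError:
            failures.append((i, "body is not well-formed XML"))
    return failures
```

Because capture and analysis are decoupled, the same log can be re-evaluated against a newer assertion document without re-running the service, which is essentially what the WS-I tool chain does.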

[FIG. 1: WS-I TOOLS IN ACTION. The Monitor, composed of an Interceptor and a Logger driven by a monitor config file, sits in the normal message flow between requestor and Web service and captures SOAP messages to a message log. The Analyzer, driven by an analyzer config file and a test assertion document, evaluates the message log together with the WSDL, XML schema and UDDI artifacts and produces a conformance report. Source: Web Services Interoperability Organization]

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures, so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories, and is used by the Analyzer to identify interoperability issues.

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can in fact warn you when your implementation has ventured into the use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles, and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems. The tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary-school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of the WS-I, which organizes the

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.


service of formally documenting the best use of standards; documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness is invaluable. Some WS-I Profiles have also been adopted by the International Standards Organization.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios; the WS-I test materials are intended to test all aspects of a Profile. But WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of the WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of a draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture, and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. The Monitor (Interceptor and Logger) captures SOAP/XML messages carried over HTTP between client code and the Web service, along with message artifacts and metadata (WSDL, XML schema), into a Test Log file. The Analyzer then evaluates the Test Log and produces a Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time such as every hour, or it can simply restart as soon as it has finished. Several tools exist to provide the infrastructure that makes CI straightforward and simple.

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, Exploratory Testing done well can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming, in which the expectations for the code are written, in the form of code, before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
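In miniature, one turn of that loop might look like the following sketch in Python's unittest. The leap-year problem is invented for illustration; the point is that the test existed (and failed) before the function was written.

```python
# One turn of the TDD loop: the test class below was written first and
# failed ("red") until leap_year was implemented to make it pass
# ("green"). The leap-year example itself is invented for illustration.
import unittest

def leap_year(year):
    # Written only after the test below failed against an empty stub.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTest(unittest.TestCase):
    def test_rules_for_leap_years(self):
        self.assertTrue(leap_year(1996))
        self.assertFalse(leap_year(1999))
        self.assertFalse(leap_year(1900))  # centuries are not leap years...
        self.assertTrue(leap_year(2000))   # ...unless divisible by 400

if __name__ == "__main__":
    unittest.main()
```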

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.
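A hypothetical sketch of the "underneath the GUI" style: the customer's examples, written as a simple table in the requirements, are executed directly against the business rule. The discount rule and figures here are invented.

```python
# ATDD sketch: customer-supplied examples drive tests that exercise the
# business rule directly, with no GUI involved. The discount rule and
# the example figures are hypothetical.

def order_discount(total):
    """Business rule under test: 10% off orders of $100 or more."""
    return round(total * 0.10, 2) if total >= 100 else 0.0

# Examples as the customer might write them in a requirements table:
#   (order total, expected discount)
customer_examples = [
    (99.99, 0.0),
    (100.00, 10.00),
    (250.00, 25.00),
]

def run_examples(examples):
    """Return (total, expected, passed) for each customer example."""
    return [(total, expected, order_discount(total) == expected)
            for total, expected in examples]
```

Because the examples bypass the GUI and hit the rule directly, they stay fast and far less brittle than screen-driven acceptance tests.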

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior, in that tests are referred to as 'expectations' and the test steps are described in real business terms, an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
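To show the flavor, here is an invented sketch, not a real BDD framework: the tiny expect helper and the account example are illustrative, but notice how the expectation's name reads in business terms rather than as an xUnit method name.

```python
# BDD flavor sketch: expectations named in plain business language.
# The `expect` helper and the Account example are invented here to
# illustrate the style; real BDD tools (e.g. RSpec-style frameworks)
# provide richer vocabularies.

class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

def expect(actual):
    """Minimal expectation wrapper so tests read as sentences."""
    class _Expectation:
        def to_equal(self, expected):
            assert actual == expected, "expected %r, got %r" % (expected, actual)
    return _Expectation()

def an_account_with_100_dollars_allows_a_60_dollar_withdrawal():
    account = Account(balance=100)
    account.withdraw(60)
    expect(account.balance).to_equal(40)
```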

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere, at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.
– Thanks to Markus Gaertner for his contributions to this article.


Index to Advertisers

Advertiser                                                    URL                        Page
Hewlett-Packard                                               www.hp.com/go/alm          38
Seapine                                                       www.seapine.com/stpswift   11
Software Test & Performance Collaborative                     www.stpcollaborative.com   3–5
Software Test & Performance Collaborative Vendor Membership   www.stpcollaborative.com   35
Software Test & Performance eSeminars                         www.stpcollaborative.com   22
STPCon Fall 2009                                              www.stpcon.com             29, 34
TechExcel                                                     www.techexcel.com          37





WS-I test tools let developers makeinformed decisions about their systemsndash the tools can capture the messagesbetween provider and consumer forexample and raise a flag if any require-ments have not been met Again such aflag doesnrsquot necessarily mean the Webservice is ldquobrokenrdquo but rather that itdoes not conform to WS-I Profiles WS-I test tools donrsquot deliver value state-ments on commercial products

GETTING STARTEDA group called the Web Services TestForum (wwwwstforg) is a great way tobecome familiar with WS-I tools and mate-rials In elementary school terms youcould think of WS-I as the classroom forformal instruction and of WSTF as thesandbox for trying things out

According to its Web site the ldquoWSTest Forum Group is meant to providean environment in which members ofthe Web service community can devel-op interop scenarios as well as testthose scenarios against other WebService implementations It also pro-vides a common testbed of regressiontests that the community can use dur-ing the development of their WebService implementationsrdquo

Formed in 2008 WSTFrsquos goals andobjectives are complementary withthose of the WS-I which organizes the

ABOUT WS-I

WS-I comprises a diverse communi-

ty of Web services leaders from a

wide range of companies and stan-

dards development organizations

(SDOs) WS-I committees and work-

ing groups create Profiles and sup-

porting Testing Tools based on Best

Practices for selected sets of Web

services standards The Profiles and

Testing Tools are available for use by

the Web services community to aid

in developing and deploying inter-

operable Web services Companies

interested in helping to establish

Best Practices for Web Services are

encouraged to join WS-I For more

information go to wwwws-iorg

wwwstpcollaborativecom bull 33

service of formally documenting thebest use of standardsndashdocumentingwhere the bar is for Web services devel-opers so to speak It does this in astructured manner that ensures consen-sus along with the confidence that whatis in place today will be there tomorrowWS-Irsquos processes are deliberate and theresulting thoroughness of results is

invaluable Some WS-I Profiles havealso been adopted by the InternationalStandards Organization

By comparison the informal WSTFoffers a flexibility that allows vendors totest not only the technologies covered inWS-I profiles but also technologies thatmay not be finished (from a standardiza-tion perspective) as well as more creative

aspects of applying the technologies thatare included in the WS-I Profiles WS-Idoesnrsquot prescribe specific test scenar-iosndashthe WS-I test materials are intendedto test all aspects of a Profilendashbut WSTFcan develop new scenarios almost on thefly such as different means for handlingtransactions and work out new patternsof using Web services that individual ven-dors can build upon

Current users of the WS-Is BasicProfile 12 or 20 or the Reliable andSecure Profile 10 should take note ofdraft of the Basic Profile TestingMethodology posted on April 28 (atwwwws-iorg) The testing approachused for the latest WS-I profiles differsfrom the one used in the past althoughoverall the general testing processremains similar reads the draft in partA new ldquologgingrdquo phase includes livemessage capture and is followed by aseparate Analysis phase involving a sep-arate tool (see Figure 2) The Test Logfile consolidates all artifacts under testincluding messages and metadata(WSDL definitions etc) The test analy-sis phase includes evaluation of WS arti-facts in the test log against the require-ments of a particular profile yacute

34 bull Software Test amp Performance JULY 2009

Client CodeWeb Service

WSDL XML Schema

Test LogFile

TestReport

Analyzer

MONITORSOAPXML

HTTP

MessageArtifacts

Meta data

Interceptor

Logger

FIG 2 A NEW PHASE FOR TESTING

Source Web Services Interoperability Organization

confusion over what exactly is meantby will handle errors appropriately

Continuous Integration Sometimesreferred to as build or build machineCI is the process of running a checkoutfollowed by a build followed by the run-ning of the automated test suites con-stantly CI can run after every check-in orafter a specific period of time such asevery hour or perhaps simply restart assoon as it has finished Several toolsexist to provide the infrastructure in orderto make CI straightforward and simple

Balanced Breakfast The idea that dif-ferent test techniques tend to capture dif-ferent categories of defects Thereforegood testing uses a balance of techniquesinstead of one or two exclusively

Exploratory Testing ExploratoryTesting is a just-in-time feedback-based approach Far from ad hoc orundisciplined when done well Explora-tory Testing can provide meaningfulcoverage and stand up to scrutiny

Automated TestingIf youre going to ship an increasinglylarge and complex code base every fewweeks to every few months youll wanta way to run a series of tests over andover again A computer wont find thetests define them or tweak them whenthe interface changes so we believe theterm automated testing to testautomation is more accurate Having asignificant amount of regression testsautomated is part of the balanced break-fast of agile testing

TDD Test Driven Development A rela-tive of Extreme Programming in whichthe expectations for the code are writtenin the form of code before the actualcode is created The developers can runthe test see it fail and improve the codeuntil the test passes In classic TDD thedeveloper adds a single test for eachmethod to be created

ATDD Acceptance Test Driven Dev-elopment A business-level implementa-tion of TDD where part of the require-ments is a set of examples of use of the

software The developers (or testers)take those examples and create testsgenerally automated which fail and thenimplement code to make those testspass Arguments against ATDD includea slower feedback cycle that the worktakes more effort to automate and pro-duces lower return and that tests devel-oped in such a fashion are often brittleOrganizations using ATDD that are ableto go underneath the GUI and executeand evaluate business rules directlyoften have more success with ATDD

Red-Green-Refactor A popular TDDcycle The idea is to write a failing test(red) make it pass (green) and then toimprove the design of the code (refactor)using the suite of tests to make sure theimprovements didnt break anything else

BDD Behavior Driven DevelopmentInitially named by Dave Astels in referenceto to a form of developer-level testing andcode design BDD has since been used byother circles to specify behavior in accept-ance and system-level work BDD focuseson behavior in that tests are referred to aslsquoexpectationsrsquo and the test steps aredescribed in real business termsndashan exten-sion of Domain Driven Design BDD teststyle results in test suites that generallylook and feel much more like plain Englishthan xUnit-style tests

Developer Tests Tests written by a

programmer to exercise some codetypically without any kind of user inter-face involved To a non-technical cus-tomer these tests look like code Theseare sometimes called unit tests whichimplies testing at the lowest possiblelevel or unit of functionality We preferthe term Developer Test as it sepa-rates concerns by role

Customer Tests These are tests thata customer cares about They may drivea GUI or in some cases be expressed intable format and run underneath theGUI This test must be meaningful to thecustomer and should be captured in away that the customer can understandSometimes the customer can evenwatch the test at it runs

Importance of Human BeingsWith all this terminology it is easy toforget that the Agile Manifesto is areaction against heavyweight processand overweight life cycle managementtools The Agile Manifesto advisescompanies to remember that humansare people not cogs in a greatmachine and that good ideas cancome from anywhere at any time If wekeep that in mind while frequentlydelivering working software in smallpieces over and over that is somethingto be proud of yacutemdashThanks to Markus Gaertner for his con-tributions to this article

stamp pedia

Advertiser URL Page

Hewlett-Packard wwwhpcomgoalm 38

Seapine wwwseapinecomstpswift 11

Software Test amp Performance Collaborative wwwstpcollaborativecom 3ndash5

Software Test amp Performance Collaborative Vendor Membership wwwstpcollaborativecom 35

Software Test amp Performance eSeminars wwwstpcollaborativecom 22

STPCon Fall 2009 wwwstpconcom 29 34

Tech Excel wwwtechexcelcom 37

IInnddeexx ttoo AAddvveerrttiisseerrss

lt continued from page 10

36 bull Software Test amp Performance JULY 2009

Page 26: THE GREAT AGILE DEBATE STATE OF THE TEST INDUSTRY WS-I ... · The Four Pillars of Agile What's most important when implementing agile testing in your organization? Learn from a self-made

Testing for Conformance With Web Services Interoperability Specifications

By Chris Ferris

Chris Ferris is an IBM Distinguished Engineer, CTO of Industry Standards in IBM's Software Group Standards Strategy organization, and a former chair of the WS-I Basic Profile Working Group.

JULY 2009 • www.stpcollaborative.com • 31

Web services are ubiquitous. And thanks to mature standards and tools, it's easier than ever to set them up. But for distributed computing to continue to thrive, it's vital that services be set up correctly. Comprehensive pre-deployment testing is critical. And at best, Web services testing is complex, and vastly different from testing traditional software applications.

Properly designed and deployed, Web services can take advantage of existing assets and reuse applications; exchange information securely across and beyond the enterprise; integrate existing applications in loose couplings that are both scalable and easily extensible; and reduce the cost of developing and integrating computing assets. But how can something as ephemeral as interoperability be tested, when the services you're interoperating with might not even be there tomorrow?

Helping to solve this and other problems is the Web Services Interoperability organization. WS-I (www.ws-i.org) gives testers a world of resources to help increase confidence in the interoperability of their Web services, beyond the scope of commercial testing tools. What follows is a primer on those resources and how to use them to test the Web services being developed in your organization.

DEFINITIONS
Web services can be found not only on the public Web, but also in a company's internal networks and in semi-private connections between partners. For the purposes of this article, a Web service is defined as a loosely coupled, asynchronous interface that is exposed and invoked over a network using platform-independent technology.

A Web service also can be described as a means of passing information between applications or processes using open, standards-based protocols, regardless of programming language or platform, using a machine-readable contract to describe how to pass and receive information.

Some of the nuts and bolts of Web services include:

• WSDL (Web Services Description Language), which defines the services for client tools

• SOAP (Simple Object Access Protocol), a message format used to invoke a service


• XML (Extensible Markup Language), the lingua franca of the Internet

Additionally, some deployments make use of UDDI (Universal Description, Discovery and Integration), which can be thought of as the Yellow Pages for looking up services. The results are complex in concept and implementation, but simple and transparent for end users.
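To make those nuts and bolts concrete, here's a rough sketch of what a SOAP request envelope looks like when built and parsed with Python's standard library. The GetQuote operation and the urn:example:stock namespace are invented for illustration; a real service's WSDL would dictate the actual operation names and types.

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace, as defined by the SOAP specification.
SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def make_envelope(operation: str, ns: str, params: dict) -> str:
    """Build a minimal SOAP 1.1 request envelope as an XML string."""
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{ns}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical "GetQuote" operation in a made-up service namespace:
msg = make_envelope("GetQuote", "urn:example:stock", {"symbol": "IBM"})
```

In practice a client toolkit generates this plumbing from the WSDL; the point here is only that the message on the wire is ordinary, parseable XML.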

The complexities of developing and deploying systems potentially based on varying, independent standards (OASIS, W3C, etc.) are many. And it is such challenges that brought about WS-I, the charter of which is to establish best practices for Web services interoperability, for selected groups of Web services standards, across platforms, operating systems and programming languages. WS-I-published best practices are compiled in specifications called profiles and made available at no charge at www.ws-i.org.

Some of the characteristics that make Web services desirable, such as their loose couplings, also present challenges to testing. For example, it's possible that no single entity controls all the components involved in a given Web services system, yet all the points of interaction must be tested and validated. Commercial tools tend to address only the most typical designs and configurations; WS-I's testing tools offer additional validation of functions. In 2004, WS-I published the first of its suite of testing tools, and it has been updating them along with the profiles ever since.

WS-I TESTING GOALS
To complement its profiles and to enhance their value to Web services developers, WS-I set up an autonomous working group with the charter of developing WS-I test materials and tools that would facilitate the verification of conformance to WS-I profiles.

The items and artifacts monitored and analyzed by the tools created by the working group include message material, Web service description material and registry material. For each WS-I Profile it addresses, the working group defines test assertions, a testable interpretation of the profile definitions, which are gathered in test assertion documents.

The working group is also charged with defining, or supervising the definition of, the test case material necessary to verify that the tools behave as expected and that multiple WS-I tool implementations function in a consistent manner. The resulting tools attempt to implement testing of all assertions within test assertion documents, in support of the use of these tools in the evaluation of profile compliance.

When the testing target is a Web services platform or tool that implements a specification that itself underlies a profile (such as SOAP, WSDL or HTTP), the working group is not required to handle exhaustive testing of such tools or platforms for conformance to those underlying specifications. It is instead assumed that this level of conformance has already been provided by specific test suites and tools that are out of the scope of the working group. Web service platforms or development tools under test are expected to have passed conformance to the underlying specifications they claim to implement.

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends the use of the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black-box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[FIG. 1: WS-I TOOLS IN ACTION. Diagram: a requestor and a Web service exchange SOAP messages in the normal message flow. The Monitor (an Interceptor plus a Logger, driven by a Monitor Config File) captures the SOAP messages into a Message Log. The Analyzer (driven by an Analyzer Config File and the Test Assertion Document) evaluates the Message Log along with the WSDL, XML Schema and UDDI artifacts, and produces a Conformance Report. Source: Web Services Interoperability Organization]
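The capture-then-analyze flow in Figure 1 can be sketched in miniature. This is a toy illustration only, not WS-I code, and the names are invented: a non-intrusive interceptor is modeled as a wrapper that forwards each message untouched while logging both sides for later analysis.

```python
from typing import Callable, Dict, List

class MessageLog:
    """Stores intercepted request/response pairs for offline analysis."""
    def __init__(self) -> None:
        self.entries: List[Dict[str, str]] = []

    def record(self, request: str, response: str) -> None:
        self.entries.append({"request": request, "response": response})

def make_interceptor(service: Callable[[str], str],
                     log: MessageLog) -> Callable[[str], str]:
    """Wrap an endpoint: forward every message unchanged, logging both sides."""
    def intercept(request: str) -> str:
        response = service(request)  # non-intrusive: message passes through untouched
        log.record(request, response)
        return response
    return intercept

# Fake in-process "Web service" standing in for a real HTTP endpoint:
def echo_service(req: str) -> str:
    return "<ack>" + req + "</ack>"

log = MessageLog()
proxied = make_interceptor(echo_service, log)
reply = proxied("<ping/>")
```

The real Monitor, of course, sits on the network between two processes it does not own; the design point it shares with this sketch is that the system under test behaves identically whether or not the interceptor is present.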

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
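Purely as a sketch of how an analyzer turns a list of test assertions plus a message log into a conformance report: the assertion names and message fields below are invented and far simpler than WS-I's real XML test assertion documents, but the shape of the output (per-assertion, per-message results with deviations called out) is the same idea.

```python
from typing import Callable, Dict, List, Tuple

# Toy "test assertions": a name mapped to a predicate over a captured message.
ASSERTIONS: Dict[str, Callable[[dict], bool]] = {
    "media type is text/xml":
        lambda m: m["content_type"].startswith("text/xml"),
    "envelope uses SOAP 1.1 namespace":
        lambda m: "schemas.xmlsoap.org/soap/envelope" in m["body"],
}

def analyze(messages: List[dict]) -> List[Tuple[str, int, bool]]:
    """Check every assertion against every logged message;
    return (assertion name, message index, passed) rows, report-style."""
    report = []
    for i, msg in enumerate(messages):
        for name, check in ASSERTIONS.items():
            report.append((name, i, check(msg)))
    return report

# A pretend message log: one conforming message, one deviant one.
captured = [
    {"content_type": "text/xml; charset=utf-8",
     "body": "<Envelope xmlns='http://schemas.xmlsoap.org/soap/envelope/'/>"},
    {"content_type": "application/octet-stream", "body": "<Envelope/>"},
]
report = analyze(captured)
failures = [(name, i) for name, i, ok in report if not ok]
```

As with the real tools, a failure row is a deviation from the profile's requirements, not a verdict that the service is broken.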

COLORING OUTSIDE THE LINES
By now, you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability, as defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation, where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can, in fact, warn you when your implementation has ventured into the use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems; the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary-school terms, you could think of WS-I as the classroom for formal instruction, and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF's goals and objectives are complementary with those of WS-I, which organizes the service of formally documenting the best use of standards; documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness of results is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios; the WS-I test materials are intended to test all aspects of a Profile. But WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of WS-I's Basic Profile 1.2 or 2.0, or the Reliable Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall, the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.


[FIG. 2: A NEW PHASE FOR TESTING. Diagram: the Monitor (Interceptor and Logger) captures SOAP/XML messages over HTTP between client code and the Web service. Message artifacts and metadata (WSDL, XML Schema) are consolidated into a Test Log File, which a separate Analyzer evaluates to produce a Test Report. Source: Web Services Interoperability Organization]

< continued from page 10

…confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the running of the automated test suites. CI can run after every check-in, after a specific period of time (such as every hour), or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure to make CI straightforward and simple.

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, Exploratory Testing done well can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant amount of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
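A minimal illustration of the cycle, using Python's unittest (the leap_year function is an invented example, not from the article): each test is written first, run to failure, and then the production code is grown just far enough to make it pass.

```python
import unittest

def leap_year(year: int) -> bool:
    """Production code, grown only as far as the tests below demand."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class TestLeapYear(unittest.TestCase):
    # Each test was written first, watched fail (red), then made to pass (green).
    def test_divisible_by_four_is_leap(self):
        self.assertTrue(leap_year(2008))

    def test_century_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_fourth_century_is_leap(self):
        self.assertTrue(leap_year(2000))

# Run the suite programmatically to close the feedback loop.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestLeapYear)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The discipline, not the framework, is the point: the century rule only exists in the code because test_century_is_not_leap failed first.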

ATDD (Acceptance Test Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with it.
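A sketch of that "underneath the GUI" style in Python (the discount rule and figures are invented): the customer's examples form a table, and a small harness executes each row directly against the business logic, reporting any rows that fail.

```python
# Hypothetical business rule under test: orders of $100 or more get 10% off.
def discounted_total(subtotal: float) -> float:
    return round(subtotal * 0.9, 2) if subtotal >= 100 else subtotal

# Customer-supplied examples, as a table of (subtotal, expected total):
EXAMPLES = [
    (99.99, 99.99),   # just under the threshold: no discount
    (100.00, 90.00),  # at the threshold: discount applies
    (250.00, 225.00),
]

def run_acceptance_table(table):
    """Execute every example row against the business logic;
    return the (given, expected, actual) rows that fail."""
    return [(given, expected, discounted_total(given))
            for given, expected in table
            if discounted_total(given) != expected]

failing_rows = run_acceptance_table(EXAMPLES)
```

Because the harness never touches a GUI, the feedback stays fast and the tests stay far less brittle than screen-driven ones.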

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations," and the test steps are described in real business terms; an extension of Domain Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
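A hand-rolled Python sketch of that flavor (real BDD teams would reach for a framework such as RSpec or behave; the Cart domain object here is invented): the expectations read as business statements, and a tiny runner reports any that fail.

```python
class CartExpectations:
    """Expectations phrased in business terms rather than xUnit assertion names."""
    def an_empty_cart_totals_zero(self):
        assert Cart().total() == 0

    def adding_an_item_increases_the_total_by_its_price(self):
        cart = Cart()
        cart.add("tea", 3)
        assert cart.total() == 3

class Cart:
    """Minimal domain object the expectations above describe (invented)."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def verify(spec_class):
    """Run every public expectation method; return the names of any that fail."""
    failed = []
    spec = spec_class()
    for name in dir(spec):
        if not name.startswith("_"):
            try:
                getattr(spec, name)()
            except AssertionError:
                failed.append(name)
    return failed

failed = verify(CartExpectations)
```

Read the method names aloud and they are close to plain English, which is exactly the quality the BDD style is after.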

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. This test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life-cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere, at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.
–Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia

Index to Advertisers

Advertiser | URL | Page
Hewlett-Packard | www.hp.com/go/alm | 38
Seapine | www.seapine.com/stpswift | 11
Software Test & Performance Collaborative | www.stpcollaborative.com | 3–5
Software Test & Performance Collaborative Vendor Membership | www.stpcollaborative.com | 35
Software Test & Performance eSeminars | www.stpcollaborative.com | 22
STPCon Fall 2009 | www.stpcon.com | 29, 34
Tech Excel | www.techexcel.com | 37

lt continued from page 10

36 bull Software Test amp Performance JULY 2009

Page 27: THE GREAT AGILE DEBATE STATE OF THE TEST INDUSTRY WS-I ... · The Four Pillars of Agile What's most important when implementing agile testing in your organization? Learn from a self-made

32 bull Software Test amp Performance JULY 2009

bull XML (Extensible MarkupLanguage) the lingua franca ofthe Internet

Additionally some deploymentsmake use of UDDI (Universal DescriptionDiscovery and Integration) which can bethought of as Yellow Pages for looking upservices The results are complex in con-cept and implementation but simple andtransparent for end users

The complexities of developing anddeploying systems potentially based onvarying independent standards (OASISW3C etc) are many And it is such chal-

lenges that brought about the WS-I thecharter of which is to establish best prac-tices for Web services interoperability forselected groups of Web services stan-dards across platforms operating sys-tems and programming languages WS-I-published best practices are compiled inspecifications called profiles and madeavailable at no charge at wwwws-iorg

Some of the characteristics thatmake Web services desirable such asits loose couplings also present chal-lenges to testing For example itrsquos possi-ble that no single entity controls all the

components involved in a given Webservices system yet all the points ofinteraction must be tested and validatedCommercial tools tend to address onlythe most typical designs and configura-tions WS-Irsquos testing tools offer addition-al validation of functions In 2004 WS-Ipublished its first of its suite of testingtools and has been updating along withthe profiles ever since

WS-I TESTING GOALSTo complement its profiles and toenhance their value to Web services

developers WS-I set up an autonomousworking group with the charter of devel-oping WS-I test material and tools thatwould facilitate the verification of confor-mance to WS-I profiles

The items and artifacts monitoredand analyzed by the tools created by theworking group include message materialWeb service description material and reg-istry material For each WS-I Profile itaddresses the working group definestest assertions as a testable interpreta-tion of the profile definitions which aregathered in test assertion documents

The working group is also chargedwith defining or supervising the definitionof test case material necessary to verifythat the tools behave as expected andthat multiple WS-I tool implementationsfunction in a consistent manner Theresulting tools attempt to implementtesting of all assertions within testassertion documents in support of theuse of these tools in the evaluation ofprofile compliance

When the testing target is a Webservices platform or tool that implementsa specification that is itself underlying toa profile (such as SOAP WSDL HTTP)the working group is not required to han-dle exhaustive testing of such tools orplatforms for conformance to suchunderlying specifications It is insteadassumed that this level of conformancehas already been provided by specifictest suites and tools that are out of thescope of the working group Web serviceplatforms or development tools undertest are expected to have passed con-formance to the underlying specificationsthey claim to implement

Note that while WS-I Profile compliance is self-validated (that is, WS-I is not a certifying authority), WS-I recommends using the testing tools before making any claims of compliance with its profiles.

WS-I TEST MATERIALS/SUITES
WS-I has now published test materials covering the Basic, Basic Secure, Attachments and Simple SOAP Binding Profiles. The test suite for a Profile includes the Web Service Communication Monitor, the Web Service Profile Analyzer and a test assertion document. There are also supplementary materials, such as functional specifications for the WS-I tools, that commercial tool makers may use to design their own testing tools.

WS-I tools test Web service implementations using a non-intrusive, man-in-the-middle, black box approach. Figure 1 illustrates how they operate.

The Monitor captures messages exchanged with Web services and stores them for analysis by a second tool, the Web Service Profile Analyzer. The Analyzer evaluates messages captured by the Monitor and validates the description and registration artifacts of the Web service. These artifacts include the WSDL document(s) that describe the Web service, the XML schema files that describe the data types used in the WSDL service definition, and the UDDI registration entries.

[FIG. 1: WS-I TOOLS IN ACTION. The Requestor and Web Service exchange SOAP messages through the Monitor (an Interceptor plus a Logger, driven by a Monitor Config File), which writes a Message Log. The Analyzer, driven by an Analyzer Config File and the Test Assertion Document, evaluates the Message Log together with the WSDL, XML Schema and UDDI artifacts and produces a Conformance Report. Source: Web Services Interoperability Organization]
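To make the monitor-then-analyze division of labor concrete, here is a minimal, hypothetical sketch in Python. The class names and the single check (every logged message must be a SOAP 1.1 envelope) are invented for this illustration; the real WS-I tools implement hundreds of assertions drawn from the test assertion documents.

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

class Monitor:
    """Interceptor plus logger: records each message passing between
    requestor and Web service without altering it (non-intrusive)."""
    def __init__(self):
        self.log = []          # the message log later read by the Analyzer

    def intercept(self, message: str) -> str:
        self.log.append(message)
        return message         # pass the message through unchanged

class Analyzer:
    """Evaluates logged messages against one toy assertion:
    each message must be a SOAP 1.1 Envelope."""
    def check(self, log):
        report = []
        for i, msg in enumerate(log):
            root = ET.fromstring(msg)
            ok = root.tag == f"{{{SOAP_ENV}}}Envelope"
            report.append((i, "pass" if ok else "fail"))
        return report

monitor = Monitor()
monitor.intercept(f'<Envelope xmlns="{SOAP_ENV}"><Body/></Envelope>')
monitor.intercept('<NotSoap/>')
print(Analyzer().check(monitor.log))   # [(0, 'pass'), (1, 'fail')]
```

The key property mirrored here is that the interceptor returns the message unchanged, so the system under test behaves exactly as it would without the Monitor in place.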

More than 300 test cases have been written and automated for the Analyzer tool. Each test case exercises between 50 and 90 test procedures.

The output from the Analyzer is a report that indicates whether or not a Web service meets the interoperability guidelines of the WS-I Profile. The report provides details on the specific deviations and failures so that users know which requirements of the Profile were not met. Failure to meet Profile requirements does not mean that the specific Web service is "broken," just that it does not operate in compliance with that particular Profile.

The test assertion document is the formal specification of the tests performed by the WS-I tools, as specified in the interoperability profiles. This XML document is organized in artifact categories and is used by the Analyzer to identify interoperability issues.
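As an illustration of what "organized in artifact categories" means in practice, here is a sketch that consumes such a document. The element and attribute names below are invented and far simpler than the real WS-I test assertion documents; only the general shape (assertions grouped under artifact categories) follows the description above.

```python
import xml.etree.ElementTree as ET

# A hypothetical, heavily simplified assertion document.
TAD = """
<testAssertions>
  <artifact type="message">
    <testAssertion id="BP1004" enabled="true"/>
    <testAssertion id="BP1101" enabled="true"/>
  </artifact>
  <artifact type="description">
    <testAssertion id="BP2010" enabled="false"/>
  </artifact>
</testAssertions>
"""

def assertions_by_category(xml_text):
    """Group the ids of enabled assertions by artifact category."""
    root = ET.fromstring(xml_text)
    return {
        art.get("type"): [a.get("id") for a in art
                          if a.get("enabled") == "true"]
        for art in root.findall("artifact")
    }

print(assertions_by_category(TAD))
# {'message': ['BP1004', 'BP1101'], 'description': []}
```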

COLORING OUTSIDE THE LINES
By now you might be wondering whether WS-I compliance tools offer enough value to justify the effort and time of an additional round of testing. The answer depends on whether your components are "coloring outside the lines": whether they're going beyond what has been agreed upon by the Web services community as the foundation for interoperability defined by the WS-I Profiles.

For example, inside your firewall you may have a controlled and homogeneous situation where interoperability with other vendor solutions is less of a concern. However, when your Web services system is to be deployed externally, with your partners and customers, such may not be the case.

Running the WS-I test tools against a particular deployed instance can in fact warn you when your implementation has ventured into use of features and functions for which the WS-I Profiles have not prescribed interoperability guidance. What you do as an IT decision-maker with this information typically depends on the circumstances of the Web services deployment.

Although today most vendors build WS-I Profile compliance into their products, they also allow for extensions beyond what is in a Profile: a provision for additional features that are not yet covered by standards or profiled by WS-I. WS-I tools can assess a given implementation against the requirements specified by the WS-I Profiles and determine when the defined extensibility points have been utilized in ways that might affect interoperability.

Vendors include extra features to differentiate themselves, but often they do not differentiate between what is interoperable and what is proprietary in their development tooling. Developers still need to know whether their systems will interoperate securely and consistently.

WS-I test tools let developers make informed decisions about their systems: the tools can capture the messages between provider and consumer, for example, and raise a flag if any requirements have not been met. Again, such a flag doesn't necessarily mean the Web service is "broken," but rather that it does not conform to WS-I Profiles. WS-I test tools don't deliver value statements on commercial products.

GETTING STARTED
A group called the Web Services Test Forum (www.wstf.org) is a great way to become familiar with WS-I tools and materials. In elementary school terms, you could think of WS-I as the classroom for formal instruction and of WSTF as the sandbox for trying things out.

According to its Web site, the "WS Test Forum Group is meant to provide an environment in which members of the Web service community can develop interop scenarios as well as test those scenarios against other Web Service implementations. It also provides a common testbed of regression tests that the community can use during the development of their Web Service implementations."

Formed in 2008, WSTF has goals and objectives complementary to those of WS-I, which performs the

ABOUT WS-I
WS-I comprises a diverse community of Web services leaders from a wide range of companies and standards development organizations (SDOs). WS-I committees and working groups create Profiles and supporting Testing Tools based on Best Practices for selected sets of Web services standards. The Profiles and Testing Tools are available for use by the Web services community to aid in developing and deploying interoperable Web services. Companies interested in helping to establish Best Practices for Web Services are encouraged to join WS-I. For more information, go to www.ws-i.org.

www.stpcollaborative.com • 33

service of formally documenting the best use of standards: documenting where the bar is for Web services developers, so to speak. It does this in a structured manner that ensures consensus, along with the confidence that what is in place today will be there tomorrow. WS-I's processes are deliberate, and the resulting thoroughness is invaluable. Some WS-I Profiles have also been adopted by the International Organization for Standardization (ISO).

By comparison, the informal WSTF offers a flexibility that allows vendors to test not only the technologies covered in WS-I profiles, but also technologies that may not be finished (from a standardization perspective), as well as more creative aspects of applying the technologies that are included in the WS-I Profiles. WS-I doesn't prescribe specific test scenarios (the WS-I test materials are intended to test all aspects of a Profile), but WSTF can develop new scenarios almost on the fly, such as different means for handling transactions, and work out new patterns of using Web services that individual vendors can build upon.

Current users of WS-I's Basic Profile 1.2 or 2.0, or the Reliable and Secure Profile 1.0, should take note of the draft of the Basic Profile Testing Methodology posted on April 28 (at www.ws-i.org). "The testing approach used for the latest WS-I profiles differs from the one used in the past, although overall the general testing process remains similar," reads the draft in part. A new "logging" phase includes live message capture and is followed by a separate analysis phase involving a separate tool (see Figure 2). The Test Log file consolidates all artifacts under test, including messages and metadata (WSDL definitions, etc.). The test analysis phase includes evaluation of WS artifacts in the test log against the requirements of a particular profile.

34 • Software Test & Performance JULY 2009

[FIG. 2: A NEW PHASE FOR TESTING. Client Code and the Web Service exchange SOAP/XML messages over HTTP through the Monitor (Interceptor and Logger). Message artifacts and metadata, along with the WSDL and XML Schema, are consolidated into the Test Log File, which the Analyzer evaluates to produce the Test Report. Source: Web Services Interoperability Organization]

confusion over what exactly is meant by "will handle errors appropriately."

Continuous Integration: Sometimes referred to as "build" or "build machine," CI is the process of constantly running a checkout, followed by a build, followed by the automated test suites. CI can run after every check-in, after a specific period of time (such as every hour), or perhaps simply restart as soon as it has finished. Several tools exist to provide the infrastructure that makes CI straightforward and simple.
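The checkout-build-test loop can be sketched in a few lines. The helper below is illustrative only: real CI servers add scheduling, reporting and notification, and the two stand-in commands (`true` and `false`, which assume a POSIX system) merely demonstrate the stop-on-first-failure control flow.

```python
import subprocess

def ci_cycle(steps, cwd="."):
    """Run the checkout -> build -> test commands in order,
    stopping at the first failure, the way a CI server marks
    a build red."""
    for cmd in steps:
        if subprocess.run(cmd, cwd=cwd).returncode != 0:
            return "FAILED at: " + " ".join(cmd)
    return "PASSED"

# In real use the steps would be e.g. ["git", "pull"],
# ["make", "build"], ["make", "test"]; here two POSIX
# stand-ins show the control flow.
print(ci_cycle([["true"], ["false"]]))   # FAILED at: false
print(ci_cycle([["true"], ["true"]]))    # PASSED
```

Wrapping the call in `while True:` gives the "restart as soon as it has finished" mode described above.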

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, Exploratory Testing done well can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them or tweak them when the interface changes, so we believe the term "test automation" is more accurate than "automated testing." Having a significant number of regression tests automated is part of the balanced breakfast of agile testing.

TDD (Test-Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
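As a tiny illustration of the rhythm (the function and expectations below are invented for this sketch): the test existed first and failed against a stub, and the implementation was then filled in until it passed.

```python
def leading_zeroes(n, width):
    # Green step: this body was written only after the
    # expectations below had been seen to fail against a stub.
    return str(n).zfill(width)

# Red step: these expectations were written first.
assert leading_zeroes(7, 3) == "007"
assert leading_zeroes(42, 2) == "42"
print("tests pass")
```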

ATDD (Acceptance Test-Driven Development): A business-level implementation of TDD in which part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in such a fashion are often brittle. Organizations using ATDD that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with it.
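A sketch of the under-the-GUI style, using a wholly invented business rule (a 5% discount on orders of $500 or more): the business supplies the examples as a table, and the test exercises the rule directly rather than through a user interface.

```python
# Requirement examples supplied by the business, as a table:
# (order total, expected discount)
EXAMPLES = [
    (100.00, 0.00),     # no discount under $500
    (500.00, 25.00),    # 5% from $500 up
    (1000.00, 50.00),
]

def discount(total):
    """Hypothetical business rule, implemented after the
    examples above had been turned into failing tests."""
    return round(total * 0.05, 2) if total >= 500 else 0.0

for total, expected in EXAMPLES:
    assert discount(total) == expected, (total, expected)
print("all acceptance examples pass")
```

Because the table bypasses the GUI, the tests stay fast and are less brittle than screen-driven scripts, which is the success factor the entry above describes.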

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green) and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.

BDD (Behavior-Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used by other circles to specify behavior in acceptance and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain-Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
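A minimal sketch of the naming style, with an invented Cart class: instead of method-by-method unit tests, each expectation reads as a plain-English sentence about behavior.

```python
class Cart:
    """A trivial object under test (invented for this sketch)."""
    def __init__(self):
        self.items = []
    def add(self, item):
        self.items.append(item)
    def count(self):
        return len(self.items)

# BDD style: the class and method names together read as
# sentences ("describe shopping cart: it starts empty...").
class DescribeShoppingCart:
    def it_starts_empty(self):
        assert Cart().count() == 0
    def it_remembers_what_was_added(self):
        cart = Cart()
        cart.add("book")
        assert cart.count() == 1

spec = DescribeShoppingCart()
spec.it_starts_empty()
spec.it_remembers_what_was_added()
print("expectations met")
```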

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called unit tests, which implies testing at the lowest possible level, or unit, of functionality. We prefer the term Developer Test, as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. The test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.
Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia

Index to Advertisers
Advertiser | URL | Page
Hewlett-Packard | www.hp.com/go/alm | 38
Seapine | www.seapine.com/stpswift | 11
Software Test & Performance Collaborative | www.stpcollaborative.com | 3–5
Software Test & Performance Collaborative Vendor Membership | www.stpcollaborative.com | 35
Software Test & Performance eSeminars | www.stpcollaborative.com | 22
STPCon Fall 2009 | www.stpcon.com | 29, 34
TechExcel | www.techexcel.com | 37

< continued from page 10

36 • Software Test & Performance JULY 2009



Page 30: THE GREAT AGILE DEBATE STATE OF THE TEST INDUSTRY WS-I ... · The Four Pillars of Agile What's most important when implementing agile testing in your organization? Learn from a self-made

confusion over what exactly is meantby will handle errors appropriately

Continuous Integration Sometimesreferred to as build or build machineCI is the process of running a checkoutfollowed by a build followed by the run-ning of the automated test suites con-stantly CI can run after every check-in orafter a specific period of time such asevery hour or perhaps simply restart assoon as it has finished Several toolsexist to provide the infrastructure in orderto make CI straightforward and simple

Balanced Breakfast: The idea that different test techniques tend to capture different categories of defects. Therefore, good testing uses a balance of techniques instead of one or two exclusively.

Exploratory Testing: A just-in-time, feedback-based approach. Far from ad hoc or undisciplined, when done well Exploratory Testing can provide meaningful coverage and stand up to scrutiny.

Automated Testing: If you're going to ship an increasingly large and complex code base every few weeks to every few months, you'll want a way to run a series of tests over and over again. A computer won't find the tests, define them, or tweak them when the interface changes, so we prefer the term "test automation" to "automated testing" as more accurate. Having a significant number of regression tests automated is part of the "balanced breakfast" of agile testing.

TDD (Test-Driven Development): A relative of Extreme Programming in which the expectations for the code are written in the form of code before the actual code is created. The developers can run the test, see it fail, and improve the code until the test passes. In classic TDD, the developer adds a single test for each method to be created.
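The cycle in miniature, as a hedged sketch: the leap-year rule below is an invented example, not from the article. In practice the tests are written and run (and fail) before the function body exists.

```python
import unittest

# The tests below were conceptually written first; running them against a
# missing implementation fails, and this body is written to make them pass.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTest(unittest.TestCase):
    def test_ordinary_leap_year(self):
        self.assertTrue(is_leap_year(2004))

    def test_century_is_not_a_leap_year(self):
        self.assertFalse(is_leap_year(1900))

    def test_every_fourth_century_is_a_leap_year(self):
        self.assertTrue(is_leap_year(2000))
```

Run with `python -m unittest` and the suite becomes part of the regression safety net described above.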

ATDD (Acceptance Test-Driven Development): A business-level implementation of TDD, where part of the requirements is a set of examples of use of the software. The developers (or testers) take those examples and create tests, generally automated, which fail, and then implement code to make those tests pass. Arguments against ATDD include a slower feedback cycle, that the work takes more effort to automate and produces lower return, and that tests developed in this fashion are often brittle. Organizations that are able to go underneath the GUI and execute and evaluate business rules directly often have more success with ATDD.

Red-Green-Refactor: A popular TDD cycle. The idea is to write a failing test (red), make it pass (green), and then improve the design of the code (refactor), using the suite of tests to make sure the improvements didn't break anything else.
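The refactor step is the part most often skipped, so here is the cycle in miniature with an invented string utility; the clumsy "green" version is kept as a comment to show what the refactoring replaced.

```python
import unittest

# Red: this test failed before join_names existed.
# Green: a first, clumsy version made it pass, e.g.:
#     def join_names(names):
#         out = ""
#         for i in range(len(names)):
#             out = out + names[i]
#             if i < len(names) - 1:
#                 out = out + ", "
#         return out
# Refactor: the loop collapses to the idiomatic form below, and the
# same tests prove the cleanup didn't change behavior.
def join_names(names):
    return ", ".join(names)

class JoinNamesTest(unittest.TestCase):
    def test_joins_with_comma_and_space(self):
        self.assertEqual(join_names(["Rex", "Bob"]), "Rex, Bob")

    def test_single_name_has_no_separator(self):
        self.assertEqual(join_names(["Rex"]), "Rex")
```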

BDD (Behavior-Driven Development): Initially named by Dave Astels in reference to a form of developer-level testing and code design, BDD has since been used in other circles to specify behavior in acceptance- and system-level work. BDD focuses on behavior in that tests are referred to as "expectations" and the test steps are described in real business terms, an extension of Domain-Driven Design. The BDD test style results in test suites that generally look and feel much more like plain English than xUnit-style tests.
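No framework is required to get the flavor. In this sketch (the shopping-cart domain is invented, and dedicated BDD tools offer richer syntax than plain unittest), the class and method names read as behavior specifications, so a failure report reads almost like plain English.

```python
import unittest

class ShoppingCart:
    """Tiny invented domain object so the expectations have a subject."""
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    @property
    def total(self):
        return sum(self._prices)

# Expectation-style naming: "When a cart is empty, it should have a
# total of zero" is recoverable straight from the test report.
class WhenACartIsEmpty(unittest.TestCase):
    def test_it_should_have_a_total_of_zero(self):
        self.assertEqual(ShoppingCart().total, 0)

class WhenTwoItemsAreAdded(unittest.TestCase):
    def test_the_total_should_be_the_sum_of_their_prices(self):
        cart = ShoppingCart()
        cart.add(3)
        cart.add(4)
        self.assertEqual(cart.total, 7)
```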

Developer Tests: Tests written by a programmer to exercise some code, typically without any kind of user interface involved. To a non-technical customer, these tests look like code. These are sometimes called "unit tests," which implies testing at the lowest possible level, or unit, of functionality. We prefer the term "developer test," as it separates concerns by role.

Customer Tests: These are tests that a customer cares about. They may drive a GUI or, in some cases, be expressed in table format and run underneath the GUI. The test must be meaningful to the customer and should be captured in a way that the customer can understand. Sometimes the customer can even watch the test as it runs.
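One way a customer-readable table can drive checks underneath the GUI, sketched without any table-testing framework; the shipping rule and all the figures are invented for illustration.

```python
# The customer reads and maintains this table; the code below parses
# and executes it against the business rule, with no GUI involved.
TABLE = """
weight_kg | destination | expected_cost
1         | domestic    | 5.00
1         | overseas    | 15.00
10        | domestic    | 20.00
"""

def shipping_cost(weight_kg, destination):
    """Invented rule: flat base per destination, surcharge over 5 kg."""
    base = 5.00 if destination == "domestic" else 15.00
    surcharge = 0.0 if weight_kg <= 5 else 15.00
    return base + surcharge

# Skip the header row, then check every customer-supplied example.
rows = [line.split("|") for line in TABLE.strip().splitlines()[1:]]
for weight, dest, expected in rows:
    cost = shipping_cost(float(weight), dest.strip())
    assert abs(cost - float(expected)) < 0.01, (weight, dest, cost)
print("customer table passes")
```

Tools in this space let the customer edit such tables directly while the team supplies the small fixture code that connects each column to the system under test.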

Importance of Human Beings
With all this terminology, it is easy to forget that the Agile Manifesto is a reaction against heavyweight process and overweight life-cycle management tools. The Agile Manifesto advises companies to remember that humans are people, not cogs in a great machine, and that good ideas can come from anywhere, at any time. If we keep that in mind while frequently delivering working software in small pieces, over and over, that is something to be proud of.

Thanks to Markus Gaertner for his contributions to this article.

ST&Pedia

Index to Advertisers

Advertiser | URL | Page
Hewlett-Packard | www.hp.com/go/alm | 38
Seapine | www.seapine.com/stpswift | 11
Software Test & Performance Collaborative | www.stpcollaborative.com | 3–5
Software Test & Performance Collaborative Vendor Membership | www.stpcollaborative.com | 35
Software Test & Performance eSeminars | www.stpcollaborative.com | 22
STPCon Fall 2009 | www.stpcon.com | 29, 34
TechExcel | www.techexcel.com | 37

< Continued from page 10

36 • Software Test & Performance, JULY 2009
