
TEST Magazine - February-March 2011


Inside: Agile automation | Testing tools | Penetration testing

Raja Neravati on independent testing

Taking a strategic

approach

Visit TEST online at www.testmagazine.co.uk

Volume 3: Issue 1: February 2011

INNOVATION FOR SOFTWARE QUALITY

T.E.S.T: The European Software Tester. Volume 3: Issue 1: February 2011.


February 2011 | TEST | www.testmagazine.co.uk

Waking up in the morning in the dark, cold days of winter (for those of us in the Northern hemisphere) has never been easy, but as the clocks went back late last year a bug in Apple’s iPhone alarm clock app made it that bit more difficult for many bleary-eyed wage slaves around the world. And it was still acting up come January the third this year, when many were contemplating a return to work after a long Christmas break. Perhaps we should thank Apple for a few extra minutes, or hours, of lie-in. The problem, you’ll all be pleased to hear, has now been patched; at least I can say with total authority that my alarm clock app is now firing on all cylinders.

Apple products generally have a reputation for quality and stability (with a few notable exceptions, one of them a MkIII iMac with a very stripy screen sitting in this very office), and while a glitch in the code for any of the thousands of iPhone apps now available might be excusable, a bug in one of the core functions of the iPhone is very bad news for Apple. OK, they patched it as soon as was humanly possible, but they still garnered negative headlines around the world. And all this hoo-ha following last issue’s comment about the rapidly expanding apps development (and testing) market!

It’s not just Apple that has had a bad time in 2010 with software bugs. Our New Year’s Dishonours List on page four has the alarm clock glitch at number six, as well as a whole bunch of other costly and embarrassing incidents that could have been avoided with a little more emphasis on quality and testing.

Which brings me to something new for TEST magazine... Observant readers will notice, I’m sure, that the magazine has undergone something of a redesign. It was felt that after two years serving the testing industry it was time for an image overhaul, a refresh if you will. From this issue we will also be publishing TEST as a bimonthly journal: six issues a year, as opposed to the four issues we published for the first two years of our existence. And finally, as we’re now publishing every two months, we have decided to include a News section (see pages four and five). All our regular favourites are still right where they have always been in the magazine, and for more regular news, product and opinion updates you can always check the TEST website at www.testmagazine.co.uk.

Onward and upward! Until April!

Matt Bailey, Editor

Apple hits snooze

Editor: Matthew Bailey, [email protected]. Tel: +44 (0)203 056 4599

To advertise contact: Grant, [email protected]. Tel: +44 (0)203 056 4598

Production & Design: Toni Barrington, [email protected]; Cook, [email protected]

Editorial & Advertising Enquiries: 31 Media Ltd, Three Tuns House, 109 Borough High Street, London SE1 1NL. Tel: +44 (0) 870 863 6930. Fax: +44 (0) 870 085 8837. Email: [email protected]. Web: www.testmagazine.co.uk

Printed by Pensord, Tram Road, Pontllanfraith, Blackwood. NP12 2YA

© 2011 31 Media Limited. All rights reserved.

TEST Magazine is edited, designed, and published by 31 Media Limited. No part of TEST Magazine may be reproduced, transmitted, stored electronically, distributed, or copied, in whole or part without the prior written consent of the publisher. A reprint service is available.

Opinions expressed in this journal do not necessarily reflect those of the editor or TEST Magazine or its publisher, 31 Media Limited.

ISSN 2040-0160




Can you predict the future? Don’t leave anything to chance. Forecast tests the performance, reliability and scalability of your business critical IT systems, backed up by Facilita’s specialist professional services, training and expert support.

Powerful multi-protocol testing software.

Visit Facilita at the Queen Elizabeth II Conference Centre, 23rd March, Spring 2011.

Facilita Software Development Limited. Tel: +44 (0)1260 298 109 | email: [email protected] | www.facilita.com


Contents... FEB 2011

1 Leader column – Apple hits snooze. More software gaffes plus some news about TEST.

4 News

6 Cover story – Taking a strategic approach. As testing becomes one of the fastest growing areas of corporate IT expenditure, independent testing is becoming increasingly valuable to companies looking to drive innovation. Raja Neravati reports.

10 Banking on testing – Automatic banking is a software-based sector where there is no room for error. Matt Bailey spoke to president and co-founder of Paragon Application Systems, Gary Kirk.

14 Agile automation – Pratik Shah explains why it is important for management to have a realistic approach before it opts for Agile test automation.

18 Mature management – Just how mature are companies’ software quality management processes in today’s market? Geoff Thomson, chairman of the UK Testing Board, finds out.

22 Tools for modernisation – Matthew Morris, a director at Desynit, an IT consultancy with specialist skills in legacy applications, explains the role of testing tools in the context of application modernisation.

26 There’s a bounty on your applications – Are rewards, bounties, incentives and prizes the best way to make sure your applications are secure? Anthony Haywood finds out.

30 Combining the automated with the manual – Steve Miller outlines the best practices for uniting both automated and manual test approaches to improve your software releases.

36 Emulating the network – The slimming organisation Slimming World employed a network emulation solution to ensure that when its in-house developed XpressWeigh software went live, it performed exactly as expected.

40 TEST Directory

48 Last Word – Dave Whalen. Back working as a tester, Dave Whalen learns when to keep his mouth shut.




A list of the worst software failures of 2010 has been compiled by Software Quality Systems. Paul Nelis, director at SQS, comments: “These examples of software failures reinforce our drive to raise the profile of quality assurance and software testing within the IT industry. The list was voted for by our consultants based upon major software failures of the past 12 months.”

“Not investing sufficiently in quality assurance is short-sighted as the risks are high in terms of expensive emergency fixes and/or damage to a reputation. We believe that all of these examples could have been avoided through an effective quality management strategy,” he concluded.

TOP SOFTWARE FAILURES OF 2010

1. Car manufacturer brake recall. Toyota’s recall of car brands due to an anti-lock brake system defect.

2. Wrong organs removed from donors. Faulty software led to the removal of the wrong organs from 25 donors in the UK. The error originated in faulty data conversion software that was used to upload information on donation preferences.

3. Government department prevents completion of online tax returns. Hundreds of people were unable to complete their tax returns online due to a software bug that locked users out of their online accounts.

4. Stock exchange glitches. A stock exchange suffered technical glitches during the first phase of its high-profile migration to new technology; trading on its alternative trading platform started more than an hour late as a result of the problem.

5. Software glitch causes outage for thousands of GPS receivers. While installing software upgrades to ground control stations for a new fleet of GPS satellites, inspectors discovered a glitch in software compatibility that rendered up to 10,000 GPS receivers dark for at least two weeks.

6. iPhone alarm clock bug. The bug caused the Apple handset’s alarm function to stop working correctly. When the clocks changed for ‘daylight saving’, the time automatically updated but the alarm function did not.

7. Year 2010 bug strikes bank cards. A faulty microchip made bank cards unreadable, as they weren’t able to recognise the year 2010, causing chaos in one European country. The bug affected up to 30 million debit and credit cards.

8. Privacy lost on Facebook. Users could view what should have been private chats between their friends, as well as view their pending friend requests.

9. Unauthorised access to mobile phone handsets. The bug allowed anyone to bypass the 4-digit passcode lock in order to access data on the phone. This granted unauthorised access to contacts and voicemails.

10. Phones become remote bugging devices. A smartphone user’s every word could be recorded and transmitted back to a hacker. The attack (once executed) was trivial to perform.

Macs finally get an app store of their own

Following success on its iPhone, iPad and iPod Touch platforms, Apple Computer has launched an app store for its desktop and laptop computers. Downloads from the new store are reported to have topped a million on the first day of trading.

On opening the store offered more than 1,000 applications for desktops and laptops in 90 countries. The iPhone app store, which launched in 2008 with 800 applications, today offers more than 300,000. Apple says it wants to revolutionise the way people use software and applications on a desktop computer, in the same way that its iPhone app store has done for smartphones.

The company says it hopes the Mac app store will simplify the process of installing software and trigger a move away from downloads and physical CDs.

“The App Store revolutionised mobile apps. We hope to do the same for PC apps with the Mac App Store by making finding

and buying PC apps easy and fun,” commented Apple CEO Steve Jobs when the move was announced last year.

The company also hopes the Mac app store will be seen as a lucrative market by software developers, who retain a 70 percent share of sales made through the store. The company’s senior vice president of marketing, Phil Schiller, predicted that its latest invention will eventually be replicated by fellow computer manufacturers, including Microsoft. Rival technology companies, such as Google and Nokia, have opened mobile app stores akin to Apple’s.

New Year’s dishonours list – the highest profile software failures of 2010

Blueberry Software Ltd has released a major upgrade to its software testing tool for developers and testers – BB TestAssistant Version 3.0. The company says the new version is designed to make accurate reporting of complex defects easy. It is an innovative screen-recorder-based software testing tool that puts vital information at the fingertips of developers, according to the company.

David Francis, project manager for BB TestAssistant, commented: “Software testing involves using a newly developed system or application under controlled conditions in an attempt to make things go wrong – in other words to see if things happen when they shouldn't or don't happen when they should. We've worked hard to create an application that simplifies the process of reporting defects, and offers outstanding value with high-end features at a low price. BB TestAssistant is easy to use and flexible enough to be incorporated into existing test protocols. Testers need no special training or skills to use it. We've listened to feedback, and version 3.0 has an array of must-have and sought- after features that we are confident will appeal to professional developers and testers.”

TOOL FOR TESTERS AND DEVELOPERS


TestPlant has announced that its robotic test tool product, eggPlant, has been recognised by the US Patent and Trademark Office and secured a US patent. The patent, number 7,870,504, has been granted for the software’s method of controlling or testing a system from another computer using a communications link such as Virtual Network Computing (VNC). This allows eggPlant to ‘see’ the screen on the system under test, recognising icons, buttons, message boxes, prompts, text, backgrounds and colours.
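The screen-image matching that such tools perform can be illustrated with a minimal sketch. This is not eggPlant’s code; the toy ‘screen’ is a small grid standing in for real pixel data received over VNC:

```python
# Minimal sketch of image-based GUI test matching: locate a known
# "button" bitmap inside a captured screen. Real tools work on pixel
# data received over a link such as VNC; here the screen is a 2D grid.

def find_pattern(screen, pattern):
    """Return (row, col) of the top-left corner where pattern occurs, else None."""
    ph, pw = len(pattern), len(pattern[0])
    sh, sw = len(screen), len(screen[0])
    for r in range(sh - ph + 1):
        for c in range(sw - pw + 1):
            if all(screen[r + i][c + j] == pattern[i][j]
                   for i in range(ph) for j in range(pw)):
                return (r, c)
    return None

screen = [
    [0, 0, 0, 0, 0],
    [0, 1, 2, 0, 0],
    [0, 3, 4, 0, 0],
    [0, 0, 0, 0, 0],
]
ok_button = [[1, 2], [3, 4]]
print(find_pattern(screen, ok_button))  # (1, 1): the 'button' was found
```

A production tool would add tolerance for anti-aliasing, scaling and colour depth; exact matching is enough to show the idea.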

George Mackintosh, chief executive of TestPlant, said: “This patent recognizes the innovation in how eggPlant works and secures our position in the market as the leader in GUI testing. This is the first example for years that a complete and current product in the $2.4bn global test tool marketplace has been recognised by the US Patent and Trademark Office as being sufficiently novel and innovative to deserve this distinction. It’s a great demonstration of our leadership and technical expertise in a market that many consider dominated by giants such as HP, IBM, Micro Focus and Microsoft. This is a stamp of TestPlant’s authority to describe our approach to user interface testing as truly unique.”

TESTPLANT GRANTED US PATENT FOR EGGPLANT

A survey has found that the pressure to meet deadlines, incomplete testing requirements and budgets are the top sources of stress for today’s application testing organisations. Conducted at TestExpo Winter 2010 in December 2010, the survey of 102 testers and QA professionals examined the most widely used software testing methods and the pressures facing today’s software testing teams.

When asked about the biggest pressures and sources of stress faced by their QA or software testing teams, nearly three quarters (73 percent) of respondents said the pressure to meet deadlines was the leading cause of stress. This was followed by the completeness of the test requirements provided (52 percent), budget pressures (38 percent), and lack of standardised processes and methodologies (37 percent).

The survey also revealed that while cloud and Agile-based testing methods are gaining interest and use, manual and end-to-end testing are the most widely used methodologies. Almost all (96 percent) of respondents said their organisation used manual testing processes, 79 percent used end-to-end testing, 64 percent used automated regression testing and 63 percent used performance management testing methods. Half (50 percent) of respondents said their organisation currently used Agile testing methods, and 19 percent used cloud-based testing.

When asked about their plans for new, as yet unused, testing methodologies over the next twelve months, a fifth (19 percent) of respondents revealed plans to implement cloud-based testing and 14 percent said they had plans to implement Agile testing.

Respondents were also asked about their organisation’s approach to testing training, revealing that the majority were fully certified (55 percent), 43 percent said they had been self-taught, 34 percent said they’d received informal training and 28 percent said they’d undergone formal training. The majority of respondents had a combination of these.

When asked about examples of software testing failures catching the eye of the public and the UK press, respondents named several high-profile cases where the press had questioned software testing including HMRC, Heathrow Terminal Five and the iPhone 4.

DEADLINES, REQUIREMENTS ARE MAIN SOURCES OF TESTER STRESS

Application modernisation – the key to software success

In its latest research report, the Standish Group has found that application modernisation projects have the highest likelihood of being completed on time, on budget and with all critical features included when compared to alternative approaches.

The new report, Modernisation: Clearing a Pathway to Success, delivers an objective comparison of the cost, risk and ROI associated with rewriting, buying a package, and modernising a particular order processing application. The study found that, in this situation, application modernisation had the highest likelihood (53 percent) of being completed on time, on budget and with critical features included. This compares to a 30 percent success rate for replacing the order processing application with a commercial off-the-shelf (COTS) package and just four percent success when re-writing the application from scratch using the latest tools and techniques.

“Since modernisation projects are pre-optimised, requiring no process changes and little to no training, it is often the best route to take for business-critical applications,” commented Peter Gadd, VP modernisation at Micro Focus. “By definition, you get the same application you had before the project, but on a more flexible and less costly platform, making it easier to further improve the value it delivers to the organisation.”


As organisations seek to improve productivity and reduce application operating and maintenance costs, testing has become one of the fastest growing areas of corporate IT expenditure. Separating testing from development as part of the vendor selection process is becoming more and more popular and independent testing is becoming increasingly valuable to companies looking to drive innovation in their quality management process. Raja Neravati reports.

Taking a strategic approach


In a recent webinar poll by AppLabs, which was moderated by TEST editor Matt Bailey, over 80 percent of the participants indicated that they would prefer to use a Test Centre of Excellence (TCoE) to acquire testing services in the next 12 months. Broadly speaking, a TCoE has various goals, such as improving the overall quality of the organisation, reducing the spend on QA, having a standardised process to perform QA, evolving a common framework to measure the quality of a project, and integrating formal QA processes into the project management stream as vital components.

Factors influencing quality

Quality is seen as the critical element behind the success of any product or application. Meeting high quality standards can help organisations and teams to reduce the pernicious effects of rework on their budget and delivery schedule. Several factors have a significant impact on quality during testing, including functional correctness, test adequacy, test data, testing infrastructure and tools, and system performance. Defining test data adequacy based on the objectives of testing helps organisations execute the tests that will result in a product or application with no errors. It also provides the necessary input on when to stop testing, and guidelines for additional test cases. The latest testing tools help run tests much faster and cover each and every feature of the application.

Apart from this, who does the testing plays a major role in the quality of the end product or application. Today it is proven that an independent testing vendor that is solely focused on testing can do the task much better than organisations where the development team is itself involved in the validation of the application.

An independent testing vendor, with testing-focused experts and the latest tools, will be able to perform the task with the much-needed rigour and provide an unbiased view of the defects.

Decentralised and centralised testing models

In a traditional, decentralised model, testing is entirely at the discretion of the development teams. Some of the problems of decentralised test teams are lack of baselines and benchmarking, inconsistent quality across teams, redundancy in test infrastructure, and conflicts of interest. In a centralised model, testing is moved from the development teams to a centralised testing team that conducts all of the testing for a business unit, or even for the entire organisation. By shifting to a centralised testing model, organisations gain certain advantages, such as more consistent and objective testing and more flexible deployment of testing resources.

Implementing centralised testing within the organisation can turn out to be expensive, or perhaps the budgets to build the required test environments are insufficient, so outsourcing testing is the logical step for enterprises to adopt. Leveraging independent testing services helps organisations achieve higher quality at lower cost, while at the same time keeping abreast of the newest and latest technologies.

Test Center of Excellence

A TCoE is a centralised testing model that brings together people and infrastructure into a shared services function, standardising processes and optimising resource utilisation in order to generate supernormal benefits across the organisation. A TCoE helps organisations meet their goals while at the same time lowering QA and maintenance costs, improving quality, and speeding time to market.



Some of the strategic components of a TCoE are awareness, assimilation, deployment, and improvement. Building a roadmap and vision for the centre of excellence, creating a knowledge base through testing best practices, using the capabilities and components that the TCoE generates, and improving value through repeated use are the key processes of a TCoE model. It provides access to the latest tools and technologies, scripting languages, databases, and metrics to ensure enterprise-wide implementation, as well as regular checkpoints to evaluate process maturity and feedback.

A TCoE involves reusable test suites and frameworks, and an integrated knowledge management system to help leverage learning. A TCoE framework leveraging the latest testing infrastructure, advanced business and product based frameworks and engineering tools, dedicated resource management and proven process methodologies will ultimately result in successful project delivery.

Though the various benefits of the TCoE make it the right approach for enterprises, it should be implemented properly to avoid unsatisfied and frustrated customers. Outsourced managed TCoEs can bring the ability to establish and maintain a rigorous testing practice focused on defect detection and resolution, 24/7 testing support and industry-specific expertise.

In another webinar poll conducted by AppLabs, when asked how they would build a centralised testing model, a majority of the participants (85 percent) said that they would prefer an outsourced managed TCoE to building an in-house centralised testing model.

Testing skills and experience are the critical elements to a successful end-product and independent testing vendors are well equipped with the latest tools and experienced testers to meet that need.

Marc Nadeau, senior director of quality assurance at Blackboard, comments: “90 percent of our clients renew annually, and we have seen a significant increase in the number of clients that adopt our product immediately after release. We attribute much of this to the solidity of the testing and quality assurance programme. The success of our testing programme and our partnership with AppLabs has had a direct impact on our bottom line.”

What’s the score?

AppLabs has pioneered its proprietary SCORE (Standardise, Centralise, Optimise, Review, Expand) methodology to deliver a TCoE. It maps processes and metrics to an organisation’s business goals and focuses on building an organisation-wide test platform. It takes a practical approach to transforming QA organisation-wide, with phased implementation and checkpoints to validate progress.

Standardise: The main reason behind testing inefficiency is the multiple processes being used by various testing groups in an organisation, and standardisation of processes yields immediate benefits of predictability. We proceed to a gap analysis on several parameters, eg testing processes, software tools, templates, methodologies, and metrics for reporting purposes.

A defect stability metric is established to estimate the number of defects in the code. We look in particular at how developers close, over time, the defects that testers have identified.
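As a rough illustration (field names and dates are invented, not AppLabs’ actual schema), a defect-stability check of this kind can be sketched as:

```python
# Hedged sketch of a defect-stability metric: track how the count of
# open defects evolves as testers raise them and developers close them.

from datetime import date

defects = [
    {"opened": date(2011, 1, 3), "closed": date(2011, 1, 10)},
    {"opened": date(2011, 1, 5), "closed": None},            # still open
    {"opened": date(2011, 1, 8), "closed": date(2011, 1, 9)},
]

def open_on(day, defects):
    """Number of defects still open at the end of the given day."""
    return sum(1 for d in defects
               if d["opened"] <= day and (d["closed"] is None or d["closed"] > day))

for day in (date(2011, 1, 4), date(2011, 1, 8), date(2011, 1, 12)):
    print(day, open_on(day, defects))  # a falling trend suggests stabilisation
```

Plotting this count per day (or per build) shows whether the code base is stabilising before release.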



Centralise: The objective of the centralise phase is to understand the level of testing resources, eg personnel and tool utilisation. For large engagements, we favour a flex team approach, where 60 percent of the testing team is dedicated to the client and the remaining 40 percent moves from client to client. In the flex team model, the core team receives key training on the client’s applications and the tools the client uses.

Optimise: Under the optimise phase, we provide an inventory of each application based on its business criticality and the level of testing resources and effort to be allocated. The company uses internal benchmarks based on the application and its technology to estimate the ratios of personnel in testing.

We also work with the client to address how to put a maximum limit on software testing spend, and work with internal IT on how to lead change management with users.

Review: During the review phase, we create metrics to predict performance for applications that are maintained across multiple release cycles. The company estimates the level of defect density to be found in each application release. Variations between the actual results and the prediction are then analysed to understand causes and suggest corrective actions.

Expand: Once the TCoE concept has been proven with a few applications or across a few business groups, we can then add other applications and groups to the TCoE. A gradual expansion of the TCoE ensures that the benefits of the initial steps are being realised before additional steps are taken and more people are brought into the TCoE fold. Every few months (or whatever period is agreed in the initial TCoE implementation plan) there will be a demonstrable ROI that shows that all of the TCoE benefits are obtained at practically zero cost to the organisation.
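The review-phase comparison of predicted versus actual defect density might be sketched as follows; the figures and the 25 percent tolerance are illustrative only:

```python
# Sketch of a review-phase check: compare predicted defect density
# (defects per KLOC) against actuals for each release and flag
# variances beyond a tolerance for root-cause analysis.

releases = {              # release -> (predicted defects/KLOC, actual defects, KLOC)
    "3.1": (2.0, 41, 20),
    "3.2": (2.0, 66, 22),
}

TOLERANCE = 0.25          # flag if actual deviates more than 25% from prediction

for name, (predicted, found, kloc) in releases.items():
    actual = found / kloc
    variance = (actual - predicted) / predicted
    flag = "INVESTIGATE" if abs(variance) > TOLERANCE else "ok"
    print(f"{name}: predicted {predicted:.2f}, actual {actual:.2f} ({variance:+.0%}) {flag}")
```

Flagged releases would then be analysed to understand causes, as the article describes, and corrective actions suggested.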

“The work we are doing with AppLabs in terms of the creation of a Test Center of Excellence is of strategic importance to our business,” comments Peter Bates, manager of development support services at Friends. “Over the months we have been working together we have formed a very close partnership, and we are able to trust AppLabs to deliver the ideas and solutions we need.”

Know the score

Our proprietary Core + Flexi resource model (the SCORE methodology) is a pay-only-for-usage offering, so cost falls as the number of products grows, in contrast to silos, where no economies of scale are achieved and cost grows proportionally with products. Since the launch of SCORE and the TCoE, we have seen almost eight out of ten customers opting for this approach and realising its benefits.

Users have experienced an increased level of automation, a reduction in manual effort of 40 percent on average, a decrease in test cycle time of over 20 percent, and a lowering of test costs by 20-30 percent on average, as well as improved process maturity.

Increase in business complexity, usage of heterogeneous systems, budget constraints and increased quality expectations from customers are driving the need to do more with less. In an effort to gain efficiencies and run in the most optimised fashion, many enterprises are looking towards centralisation of quality services and establishing a Center of Excellence for Testing.

Raja Neravati, Senior Vice President, AppLabs. www.applabs.com



Gary Kirk has spent more than 25 years in the electronic payments industry. He is president and co-founder of Paragon Application Systems, where he provides leadership for the strategic direction of the company.

Since 1994, Paragon Application Systems has provided software testing solutions to simulate and test virtually any point in the transaction processing stream, providing peace of mind to financial institutions that their electronic payment (ePayment) systems are reliable and error-free. Today the company provides ePayment simulation, configuration and testing software and services to an impressive customer base, including top-ranking financial institutions, leading software providers, merchant acquirers, processors, interchanges and credit unions. The company has helped more than 525 customers in over 85 countries drive more than 100,000 ATMs, connect to over 90 interchanges and process thousands of transactions per second.

“At Paragon, we base our success on the integrity, talent, and experience of our staff,” says Kirk. “Our collective technical expertise includes an in-depth understanding of financial message processing standards, both at the host and machine level. Moreover, our team brings extensive experience with EFT systems, including over 90 financial messaging formats. Coupling this specialised knowledge with responsiveness to our customers results in a high rate of repeat business.”

Banking on testing

Automatic banking is a software-based sector where there is no room for error. With more than 25 years in the electronic payments industry, president and co-founder of Paragon Application Systems, Gary Kirk, certainly knows the ropes. Matt Bailey spoke to him for TEST.

Exploding transaction volumes

“The electronic payment environment is extremely dynamic, with transaction volumes continuing to grow rapidly,” explains Kirk. “New payment instruments, mergers, restructuring, mandates and new systems all put pressure on industry players. Add to that the high-stakes nature of electronic payments, as well as the need to satisfy the audit controls many financial institutions have put in place to ensure proper testing, and the need for constant, effective and efficient testing becomes readily apparent.

“Volume and stress testing become more critical as transaction volumes increase. We’re talking about handling people’s money here. People lose patience very quickly when they encounter a problem with the transaction processing system, such as when a transaction is declined because a network goes down under heavy traffic. News travels fast and, in the current climate, there are no small mistakes anymore. The reputation of banks has already taken a hit and they can’t afford further problems. They must test – the only real choice they have now is how efficiently they do it. Organisations are trying to squeeze out the last bit of processing power they can get, and stress testing helps ensure they can maximise their investments in hardware and software.”

Opportunities and threats

Despite the prevailing global economic gloom, Kirk notes that Paragon has weathered the storm well. “In some ways we’re even more valuable to our clients now than when the economy is booming. Many companies have cut back on staffing levels and have fewer people to do the same amount of work. The risk doesn’t diminish, however, so it makes sense to use testing tools and automation to compensate for the reduced headcount. That’s where we come in.”

That doesn’t mean, however, that Paragon has been completely untouched by the economic situation. “The sales cycle is sometimes longer these days,” says Kirk. “The decision-making process has been slower over the past couple of years, as is the case in any economic downturn. Approval from more people ‘up the chain’ is sometimes required to get final sign-off on a purchase. We’ve seen instances where a manager who formerly could approve a $1,000,000 purchase now is required to get upper management approval for purchases as low as $10,000.”

Gary Kirk is starting to see the effects of off-shore testing. “Over the past few years we’ve seen the ebb and flow of off-shore testing. The vast majority of our customers have not outsourced their testing off-shore and prefer to utilise our tools and leverage our expertise in-house. A very few others have outsourced some of their testing. This is an area where industry-specific testing tools demonstrate their advantages over ‘one-size-fits-all’ testing tools. In addition, testers with strong industry experience and knowledge are much more capable than ‘generic’ testers of adding value to the most time-consuming aspects of testing: designing and creating meaningful tests and analysing test results.”

Paragon offers a variety of products and services, including: automated function and regression testing of more than 90 financial message formats (including ISO 8583, ATM, POS, IFX and EMV); hardware/software migration; stress and load testing of ePayment systems to confirm capacity; automatic ATM configuration and testing; and accelerated testing of offline processing.
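The message formats named above are structured, bit-mapped protocols rather than free text, which is one reason simulators must model them precisely. As a rough illustration only (not Paragon's implementation; the field choices are hypothetical), an ISO 8583 message opens with a four-digit message type indicator followed by a bitmap advertising which data elements are present:

```python
def primary_bitmap(fields):
    """Return the 16-hex-digit primary bitmap for data elements 1-64."""
    bits = 0
    for f in fields:
        if not 1 <= f <= 64:
            raise ValueError("primary bitmap covers fields 1-64 only")
        bits |= 1 << (64 - f)  # field 1 maps to the most significant bit
    return format(bits, "016X")

# A 0200 (financial request) carrying the PAN (field 2), processing
# code (field 3) and transaction amount (field 4):
message_start = "0200" + primary_bitmap([2, 3, 4])
print(message_start)  # 02007000000000000000
```

A test tool that treats such messages as opaque strings cannot vary one field at a time, which is where industry-specific tooling earns its keep.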

Emerging technologies

It may come as a surprise that testing of financial systems hasn’t always been automated. Kirk cites an example from the early days of Paragon. “When we started Paragon, ATM testing typically required someone to stand in front of a physical ATM, insert a card and push all of the buttons required to run one transaction after another, to ensure each was processed correctly by the bank’s software. It was a rather mind-numbing experience for the tester and extremely inefficient. We automated much of the process with our FASTest for ATMs product, which freed the tester from many of these rote tasks.

“We’re also seeing automation extended beyond simply clicking a button and watching a test run to include unattended testing. More financial institutions are scheduling automated tests to run overnight. One of our largest customers runs over 7,000 tests using our software as part of their nightly QA run. Clearly, automation now plays a pivotal role in the testing of financial transactions and will continue to do so in the future.”

Kirk predicts that as business pressures mount, the need to collaborate will become more pressing. “Collaboration will play a larger role in ePayment testing, certainly,” he says. “Currently there is a lot of wasted effort resulting from misinterpreted specifications and the same mistakes being repeated across financial institutions. We have a very experienced staff and some of our customers have started leveraging that expertise by engaging us to help interpret specifications and to design test plans. I expect more financial institutions to see the value of collaboration as a cost-reduction endeavour.”

Looking ahead, Kirk also sees the potential of the cloud revolution. “We’ve seen considerable interest in our Web FASTest product, a web-based testing platform,” he says. “It addresses a lot of the challenges associated with large desktop implementations and offers easier deployment and administration as well as more predictable costs for our customers. One of these days we’ll all think about large desktop deployments in much the same way we think about early ATM testing: happy those days are behind us.”

Adding value

There is a problem with the perception of testers in many organisations, though. Kirk suggests that testers need to be able to show that they are really adding value. “They’re seen as perhaps a necessary evil,” he says. “They are often the first to go when companies start cutting costs – a short-sighted and risky action. In order to combat this perception, testers need to view their role not as simply finding bugs, but as advising the organisation on improvements in product development processes and procedures. In addition to finding and reporting bugs they should be offering suggestions for preventing similar bugs from occurring in the future.

“For us it’s personal,” concludes Kirk. “The founders of Paragon came from customer-facing positions in the electronic payments industry – some would call it the ‘sharp end’ – and continue that customer-centric focus today. We have a high volume of repeat business – and that’s the best validation you can get.”

Gary Kirk
President and co-founder
Paragon Application Systems
www.paragonedge.com


Agile automation

Pratik Shah is a test location lead at a UK-based multinational in India and has worked in quality assurance and software testing for nearly seven years. Here he explains why management needs a realistic view of Agile test automation before opting for it.

Management expectations have often been set by the media, vendor hype, conferences, and books extolling the virtues of Agile test automation. Some of this information is quite valid and applicable, but much of it under-emphasises the special circumstances and specific considerations that apply to the projects described, and over-emphasises the successes. It is vital that management has a clear, realistic view before opting for Agile test automation or expecting anything from it.

Agile test automation

Agile methodologies embrace iteration. Small teams work together with stakeholders to define quick prototypes, proofs of concept, or other visual means to describe the problem to be solved. The team defines the requirements for the iteration, develops the code, defines and runs integrated test scripts, and the users verify the results. Verification occurs much earlier in the development process, allowing stakeholders to fine-tune requirements while they’re still relatively easy to change.

These iterations can reduce risk by providing faster and clearer information on which to base project decisions. One facet of Agile development is the reduced compartmentalisation of testing within the overall software development lifecycle. Agile development gives management and customers more control over the development process while preserving the creativity and productivity of technical work.

Principles of Agile test automation

1: Test automation is more than just execution; it consists of tool support for many aspects of a test project. A common assumption is that test automation simply means ‘the computer runs the tests and prepares the results for you’. This is not entirely wrong, but there is much more to it: test data preparation, test data feeds, configuration of the supported systems, satisfying prerequisites, post-execution analysis, report generation and so on.
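As a sketch of how those aspects fit together (the stage names here are illustrative, not taken from any particular tool), an automated run can be thought of as a pipeline in which script execution is only one stage:

```python
class TestRun:
    """Hypothetical automated run: execution is one of five stages."""

    def __init__(self, name):
        self.name, self.log = name, []

    def prepare_data(self):       # test data preparation and feeds
        self.log.append("data prepared")

    def configure_systems(self):  # supported systems and prerequisites
        self.log.append("systems configured")

    def execute(self):            # the part usually equated with "automation"
        self.log.append("tests executed")

    def analyse(self):            # post-execution analysis
        self.log.append("results analysed")

    def report(self):             # report generation
        self.log.append("report generated")

    def run(self):
        for step in (self.prepare_data, self.configure_systems,
                     self.execute, self.analyse, self.report):
            step()
        return self.log

print(TestRun("smoke").run())
```

Budgeting only for the `execute` stage is exactly the misconception this principle warns against.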

2: Test automation progresses when supported by dedicated programmers (toolsmiths). Test automation is often assumed to be simply replaying a recorded script. Realistically there is much more to it: the script must be customised to the type of verification required, which means editing the recording and writing code in the tool’s scripting language. Whenever a recorded script needs updating, manual coding is required, so skilled programmers are essential.

3: Test automation depends on product testability. Unless the product is testable, you cannot achieve the desired result with any automation tool. This also covers the stability of the product: if the product is neither testable nor stable, any test automation activity could be a waste of time and may yield no cost benefit.

4: Test automation can distract you from good testing. Automation testers sometimes focus so much on the technical aspects of the automation tool itself that they lose hold of the actual testing mission.

5: Proof of concept (PoC) leads to better automation tool selection. There are many test management, functional testing, security testing and performance testing tools available on the market. Which tool suits you best depends mainly on your expectations and requirements. After short-listing the tools, it is a good idea to carry out a sample automation activity with each: a proof of concept.

6: Test automation might provide long-term cost and time benefits. The initial investment required for test automation may be large, and ROI will not be instant. With Agile, the payback period really depends on how many times you execute the script for each module: every time you run the automated script, you increase the benefit you are getting.
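A back-of-envelope way to see this payback effect, with purely hypothetical figures:

```python
import math

def breakeven_runs(automation_cost, manual_cost_per_run, automated_cost_per_run):
    """Runs needed before automation becomes cheaper than manual testing."""
    saving_per_run = manual_cost_per_run - automated_cost_per_run
    if saving_per_run <= 0:
        raise ValueError("no per-run saving: automation never pays back")
    return math.ceil(automation_cost / saving_per_run)

# If scripting a module costs 4,000, a manual pass 250 and an
# automated pass 10, the scripts pay for themselves after:
print(breakeven_runs(4000, 250, 10))  # 17
```

Below that number of runs the automation is a net cost, which is why short-lived or frequently rewritten tests rarely justify the investment.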

Management perspective: focus areas redefined

Agile test automation is not magic; it does not solve all testing problems with the wave of a wand, and it has to be carefully planned to be even marginally successful. Poorly set expectations can result in a beneficial automation effort being labelled a failure.

There are several areas in which we should set management expectations: intangible costs and benefits; falsely expected benefits; factors common to manual and automated testing; and organisational impacts.
• Intangible costs: These are difficult to assess realistically. Where they can be measured, there is great variation in the financial value we place on them.
• Hands-off testing: Although the cost of people is easily measurable, any additional value of computer control is difficult to quantify.
• Improved professionalism of the test organisation: This often increases motivation and productivity, and comes with the new discipline and tasks automation requires.
• Changes in the quality of tests: Quality may improve or worsen, but manual and automated tests are almost always different exercises.
• Number of product rolls (test cycles) before release: Agile automation often allows faster confirmation of product builds and may encourage more turns. The churn may improve productivity and quality, or may cause laziness, lack of attention and degradation of quality.
• Test coverage: This may improve or lessen, depending on the effectiveness of the manual testing, the automated tools and the automated tests.

Falsely expected benefits:
• All tests will be automated. This isn’t practical or desirable.
• There will be immediate payback from automation. An immediate payback may be seen for some automation (eg, build tests), but usually the payback comes much later, after the investment. It takes a lot of work to create most automated tests, and the savings usually come from running and rerunning them after they are created.
• Capture/playback will suffice for regression testing. This only works in the rare case where the product is so stable that there is very little expectation that any existing tests will need to change in the future.
• Automatic defect reporting (without human intervention). This is often disastrous for the testing and development organisations. Problems include duplicate reports, false detection of errors, error cascades (one error triggers many test failures) and irreproducible errors, among others. Even with human review of the reports, this feature of some automation tools can require far more effort than it saves.

Management work approach

Management has a strong interest in getting the best bang for its buck. Agile test automation work is one step removed from testing, which is itself already a step or two removed from revenue streams, so Agile test automation must justify itself in terms of how it improves testing, or otherwise improves the business, such that it is worth the price paid.

Management should work with Agile test automation experts – mainly by asking for, and promptly getting, progress reports that answer certain standard questions: What are we getting from Agile test automation? What does it cost us? Considering the cost, are we getting enough from it? What bad things would happen if we suspended further investment in Agile test automation?

Risks of Agile automation

There are significant risks attached to Agile test automation. For example, it may not significantly improve test productivity unless the testers know how to test. Productivity metrics such as the number of test cases created or executed per day can also be terribly misleading, and could lead to a large investment in running useless tests. Then there are the Agile automation experts, who may propose and deliver testing solutions that require too much ongoing maintenance relative to the value they provide. These experts may also lack the expertise to conceive and deliver effective solutions; or they may be so successful that they run out of important problems to solve, and turn to unimportant ones.

Pratik Shah
Test location lead – India
[email protected]


Mature management

Just how mature are companies’ software quality management processes in today’s market? Geoff Thomson, chairman of the UK Testing Board, set off to find out...


Today, maintaining a competitive edge while managing costs is the challenge most organisations face. High on the IT director’s to-do list is innovation in the way costs are managed and contained, while at the same time providing a dynamic and responsive IT service to the business.

Based upon the industry-standard Test Maturity Model integration (TMMi), in 2010 Experimentus undertook its second survey across the IT industry to understand the maturity of companies’ software quality management processes. Of the 100-plus companies, across twelve industry sectors, who responded:
• 68 percent were at TMMi Level 1 heading for Level 2, meaning they are working in a chaotic, hero-based way but starting to build project-based processes, compared to 72.5 percent in the previous survey.
• 37 percent were at TMMi Level 2 heading towards Level 3, meaning they have some established project-based processes and are moving towards implementing process at an organisational level, compared to 27.5 percent in the previous survey.
• None of the respondents had reached Level 3.

Notable results

This state of affairs represented a positive step forward from the initial survey, but showed there is still a long way to go.

Interestingly, the Level 2 results suggest that although software testers believe they are good at designing tests and planning testing, they are not so good at setting goals, monitoring and managing the plans. They are also not very consistent with how they estimate for testing either.

The big surprise was to see how well planned test environments were, and that for the later stages of testing (eg, User Acceptance Testing), ‘production-like’ test environments appear to exist fairly consistently.

Areas for concern

The most consistent weakness was around the collection and use of metrics, with a significant number of respondents working without metrics altogether and therefore unable to say with any degree of confidence where they are or what they have done.

With over 150,000 people qualified through recognised examination boards, it is hard not to conclude that, despite testers being armed with the tools and knowledge to improve the quality and cost of software delivery, no allowance is made to enable the skills learnt to be put into practice. With informed management, there is nothing stopping organisations benefitting from what students have learnt, through controlled management of change. After all, why invest in training and certification if people are then unable to put what they have learnt into practice?

A question of trust

The survey results reflect a view that we have held for a while: that “too many organisations are prepared to continue to fund poor, long-winded, unrepeatable, costly processes but will not seriously investigate making improvements that would bring about a significant increase in software quality together with considerable cost savings.”

At a time when software testing wants to be seen as a profession and software testers want to be seen as valued members of their delivery teams, the results of this survey have highlighted two areas for serious consideration by test teams and their management.

The first is around trust. Given that over 150,000 people have qualifications gained through ISEB (the Information Systems Examinations Board) and ISTQB (the International Software Testing Qualifications Board), with syllabi matching closely to the TMMi Foundation’s standards and methods, why is it that we appear unable to get some of the basics in place within testing? What is stopping these basic practices from being implemented?

Maybe focus is needed on ensuring the information learnt on courses is retained and practised in the workplace through controlled management of change. After all, why send people to be accredited if they cannot put best practice into practice?

The second is around demonstrating that testing is a profession. In most professional disciplines, people are judged by their results and achievements. The survey indicated that more than 70 percent of respondents do not have metrics in place to monitor or manage testing goals. Metrics provide a valuable demonstration of results and achievements, which will help raise testers’ standing amongst their peers. If testing is serious about becoming a profession, it needs to get to grips with metrics to demonstrate its worth.
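The metrics in question need not be elaborate. A minimal sketch of the kind of basic figure the survey found missing (test names and numbers here are invented for illustration):

```python
def summarise(results):
    """results: list of (test_name, passed) tuples -> basic run metrics."""
    total = len(results)
    passed = sum(1 for _, ok in results if ok)
    return {
        "total": total,
        "passed": passed,
        "pass_rate_pct": round(100.0 * passed / total, 1) if total else 0.0,
    }

nightly = [("login", True), ("transfer", True),
           ("statement", False), ("logout", True)]
print(summarise(nightly))  # {'total': 4, 'passed': 3, 'pass_rate_pct': 75.0}
```

Even a figure this simple, tracked over time, lets a test team state with confidence where it is and what it has done.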

Some interesting results were also revealed: it’s the higher-level tasks such as test policy, test strategy and the management and reporting of testing that software testers are not so good at. Maybe this is a direct result of software testers not getting the right training in management skills.

A lot of the test managers we have met have had little or no formal training in test or project management. The impacts of poorly managed projects are wide-ranging. One example that comes to mind is a project that had been reporting good progress for six months and then, the day before go-live, suddenly admitted it was only 50 percent of the way through testing. This actually happened on a project we were reviewing; the result was a three-month delay to the go-live date, major repercussions for the project and test management team, additional costs of over £200k, and significant delays to the projects that resources were due to move on to. This kind of situation is easy to fix with a little training and mentoring.

There are some bright lights shining in there around test design; however, the key issue this survey highlights is our lack of focus on goals, objectives and metrics. Resolving these areas will make a significant difference to the benefits software testers and software testing provide to their organisations.

The full survey is available at: www.experimentus.com


Geoff Thomson
Chairman, UKTB
www.uktb.org.uk


Tools for modernisation

Matthew Morris, a director at Desynit, an IT consultancy with specialist skills in legacy applications, explains the role of testing tools in the context of application modernisation.

At the beginning of 2009, an IT team at a global legal services provider was faced with a serious dilemma. As their business had expanded globally, their AS/400 database, which had served their needs for many years, was no longer delivering: critical information was not available in time for the business day in the overseas offices. Faced with an IT system that had begun to hold back their growth, should they embark on a radical replacement programme? Or could they modernise their existing legacy applications?

The situation was a familiar one. Their COBOL system, running on an IBM iSeries DB2 database, had evolved over many years, incorporating new requirements as time went by. Speed had never been a top priority, with data being left to transfer overnight. But now this idiosyncratic system needed to up its game and increase availability in other time zones, or else stand in the way of the global expansion of the business as a whole.

The problem was not with the technical abilities of the people managing the system, but their fear of making changes which could potentially result in some pretty serious and negative consequences system-wide. Looking for a solution to this technological headache, the IT team adopted a solution which not only helped them out on this occasion, but pretty much revolutionised the way they were able to approach IT change management. The key here was testing – robust, automated testing.

The team reviewed the code and database structure and got to the bottom of the processes slowing down their processing speeds. Once modified, performance was improved by 80 percent. It was the use of an industry leading automated testing tool which gave the team the confidence to make these improvements, completely safe in the knowledge that there would be no unwelcome repercussions.

In short, testing tools have given this business the confidence and power to make their own system changes in-house. Put that into the context of a market carefully considering the modernisation options for ageing IT platforms worldwide, and you can see why the subject of testing is coming under more scrutiny.

Application modernisation tops the agenda

“Modernising legacy applications is cool again. 59 percent of IT leaders place modernisation as the top software issue.” Application Modernisation and Migration Trends in 2009/2010, Forrester Consulting.

Testing tools really come into their own in the context of application modernisation. At a time when the industry blogs and forums are buzzing with news of emerging cloud technologies, who would have thought legacy systems would be back in fashion? From the recent economic downturn a trend is emerging which favours the application modernisation route – that is to say, taking existing systems and migrating them to, or integrating them with, newer platforms and technologies.

Virtually all businesses over 15 years old find themselves operating increasingly complex computing environments. Over the decades, technology has accumulated from all genres, resulting in slow, inflexible systems with no ability to deliver dynamic change when required. The IT director knows that in order to reduce costs they need to cut out the waste and, more importantly, open the doors to innovative new technologies. What’s more, the baby-boomer programmers are coming up for retirement, and their increasingly scarce and specialised legacy skills are about to walk out of the door, taking with them all of those undocumented workarounds and fixes that keep the show on the road daily.

So surely this is the time to start again from the ground up? Actually, no. Businesses are terrified of the replacement route, and rightly so. However attractive a clean break may appear, it’s a high-risk, high-failure option. The cost of the downtime, coupled with the possibility of losing 20-plus years of accumulated business rules, is inconceivable.

Unlocking the legacy

“IT decision-makers acknowledge the value locked inside legacy applications and reject the concept of ‘throwing the baby out with the bath water’ inherent in big-bang replacement scenarios.” Application Modernisation and Migration Trends in 2009/2010, Forrester Consulting.

Yes, a brand new system can be brought in, but it’s unlikely to be truly tailored to the business. As numerous stakeholders compete for finite resources, the big rewrite typically ends in an IT compromise, bending existing business processes to work around the new system. Or, as Original Software’s CEO Colin Armitage puts it: “Businesses face a key choice in the ongoing maintenance of their software systems: completely replace or incrementally modernise.”

Test automation

The fact is that there is no business system in existence which is perfectly in tune with the business. Maybe on day one, but as soon as a system goes live, requirements start to move on and you are back into a cycle of change management. This is where the test-as-you-go approach needs to become fundamental to each and every IT team. Automatic documentation and testing of each system change is accurate, saves time and mitigates risk.

The important thing here is the automation. No comparison can be drawn with manual testing: not only is it inaccurate, it is labour-intensive and therefore expensive. In light of these facts, it seems surprising indeed that over 80 percent of all testing carried out today is manual. Next-generation technology uses automated logic discovery and documentation in a code-free environment, so the old scripting languages can be left behind, which has particular relevance for the application modernisation approach.

A business will often find itself facing the need to update an application as the systems software or hardware platform that it is written for is no longer supported. The IT manager may believe that there is nothing wrong with the platform as it stands, but the pressure is on to modernise to make it viable for internet front ends. Suddenly the automated logic discovery in a code neutral environment offered by the automated testing tool seems like a great way to mitigate the pain and risk of leaving behind the old code base or platform.

In short, testing tools have given this business the confidence and power to make their own system changes in-house. Put that into the context of a market carefully considering the modernisation options for ageing IT platforms worldwide, and you can see why the subject of testing is coming under more scrutiny.

What you see is not always what you get
Another important consideration when planning adequate testing within a modernisation effort is whether you are testing deeply enough. That is to say, testing that looks only as far as the eye can see is not going to be good enough. Luckily, there are now testing tools on the market that not only give you results on the visual layer, but can also report back on what is going on underneath, and indeed in all other areas of the system.

TestBench is a product which addresses this need on prevalent legacy systems based on the iSeries (System i), SQL Server or Oracle. Graphical scripts are built based on the user's interactions with the system under test. The intelligent technology behind these solutions means that they automatically recognise when an application interface has changed, and scripts are updated to reflect these changes. This means that solutions are completely re-usable, no matter how often the application changes. As a result, the automation zone is no longer confined to those areas of the application that are stable and risk-free.

Agile testing and development
As testing tools come of age, so does the process of IT change itself, and it is becoming increasingly apparent that there is no longer any room for the dinosaur that is the waterfall project management technique. Characterised by long periods of non-delivery while working towards distant milestones, the waterfall model makes no financial sense. A business that has committed to the application modernisation route has recognised that there is no ‘finished article’ to be delivered on the final day of the project plan. Its approach is that IT must be dynamic, adaptable and reactive to the changing needs and requirements of the business, and this is entirely in keeping with Agile practices.

At first glance, then, it would seem an impossibility that testing tools could help in this environment. With no ‘finished product’, or at least a finished user interface, how can scripts be recorded and tests run? This has been described as a ‘test last’ scenario. Within the application modernisation scenario of continual and incremental development, it's time to move to a ‘test first, test continuously’ model, and again, modern testing tools can offer this option.

So key is testing to the Agile method that test-driven development (TDD) has become to some extent synonymous with it, defined as a ‘software development technique consisting of short iterations where new test cases covering the desired improvement or new functionality are written first, then the production code necessary to pass the tests is implemented, and finally the software is re-factored to accommodate changes’ (Wikipedia definition).
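The cycle in that definition (test first, then production code, then refactoring) can be sketched in a few lines of Python; the discount function and its rules here are invented purely for illustration:

```python
# Test first: this function pins the desired behaviour before any
# production code exists (it fails until bulk_discount is written).
def test_bulk_discount():
    assert bulk_discount(99, 1.00) == 99.00    # below threshold: no discount
    assert bulk_discount(100, 1.00) == 90.00   # at threshold: 10% off

# Production code second: the simplest implementation that passes.
def bulk_discount(quantity, unit_price):
    """Return the order total, with a 10% discount on 100+ units."""
    total = quantity * unit_price
    if quantity >= 100:
        total *= 0.9
    return round(total, 2)

test_bulk_discount()  # green: refactoring can now proceed safely
```

Each new requirement repeats the loop: add a failing test, make it pass, then tidy the code with the tests as a safety net.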

One size no longer fits all
Once upon a time, when application portfolios were set in stone, a modernisation effort would mean applying a single new technology to every application. Now, with the new generation of testing tools, developers can start by testing and assessing the metrics of each application in turn. Armed with this knowledge, IT decision-makers can make an informed decision on which modernisation technique offers the best fit for their portfolio.

And with the ability to test each change as it goes in, developers can continue to modernise their legacy systems, free of the fear of change.

The situation is neatly summarised by a friend at the legal services company mentioned at the beginning of this article: “Testing has been the key for us to modernise our applications. I don't think that we had really understood how deeply the fear of change had become part of our working culture. The adoption of automated testing tools was the single most important IT decision made by our organisation in 2009.”

Matthew Morris
Director, Desynit
www.desynit.com

At a time when the industry blogs and forums are buzzing with news of emerging cloud technologies, who would have thought legacy systems would be back in fashion? From the recent economic downturn a trend is emerging which favours the application modernisation route – that is to say taking existing systems and migrating them to, or integrating them with, newer platforms and technologies.

TEST | February 2011 www.testmagazine.co.uk

26 | Penetration testing

There's a bounty on your applications
Are rewards, bounties, incentives and prizes the best way to make sure your applications are secure? Anthony Haywood, CTO at Idappcom, reports.



In the last year there have been a number of organisations offering rewards, or ‘bounty’ programmes, for discovering and reporting bugs in applications. Mozilla currently offers up to $3,000 for critical or high-severity bug identification, Google pays out $1,337 for flaws in its software, and Deutsche Post is currently sifting through applications from ‘ethical’ hackers to approve teams who will go head to head and compete for its Security Cup in October. The winning team can hold aloft the trophy if they find vulnerabilities in its new online secure messaging service – that's comforting to current users. So, are these incentives the best way to make sure your applications are secure?

I'd argue that these sorts of schemes are nothing short of a publicity stunt and, in fact, can be potentially dangerous to an end user's security.

The problem with ‘white hat’ hackers
One concern is that inviting hackers to trawl all over a new application prior to its launch simply grants them more time to interrogate it and identify weaknesses which they may decide are more valuable if kept to themselves. Once the first big announcement is made detailing who has purchased the application, and where and when the product is to go live, the hacker can use this insight to breach the system and steal the corporate jewels.

A further worry is that, while on the surface it may seem that these companies are being open and honest, if a serious security flaw were identified would they raise the alarm and warn people? It's my belief that they'd fix it quietly, release a patch and hope no-one hears about it. The hacker would happily claim the reward, promise a vow of silence and then ‘sell’ the details on the black market, leaving any user, while the patch is being developed or if they fail to install the update, with a great big security void in their defences just waiting to be exploited.

Sometimes it's not even a flaw in the software that causes problems. If an attack is launched against the application, causing it to fail and reboot, then this denial of service (DoS) attack can be just as costly to your organisation as if the application were breached and data stolen.

A final word of warning: even if the application isn't hacked today, it doesn't mean it won't be breached tomorrow. Windows Vista is one such example. Microsoft originally hailed it as the most secure operating system it had ever made, and we all know what happened next.

A proactive approach to security
IT is never infallible, and for this reason penetration testing is often heralded as the hero of the hour. That said, technology has moved on and, while still valid in certain circumstances, historical penetration testing techniques are often limited in their effectiveness.

Let me explain – a traditional test is executed from outside the network perimeter, with the tester seeking applications to attack. However, as these assaults all come from a single IP address, intelligent security software will recognise this behaviour because the IP doesn't change. Within the first two or three attempts the source address is blacklisted or firewalled, and all subsequent traffic is immaterial, as all activities are seen and treated as malicious.
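To see why a static source address is so easy to shut out, here is a toy sketch of the blacklisting behaviour described above; the threshold and probe-detection logic are invented for illustration:

```python
# Toy intrusion monitor: any source IP is blacklisted after a handful
# of requests flagged as probes, so a pen test run from one address
# only ever exercises the first few attempts.
from collections import Counter

PROBE_THRESHOLD = 3  # suspicious attempts tolerated before blacklisting

class ToyMonitor:
    def __init__(self):
        self.suspicious = Counter()
        self.blacklist = set()

    def handle(self, src_ip, looks_like_probe):
        """Return True if the traffic is allowed through, False if dropped."""
        if src_ip in self.blacklist:
            return False  # everything from this source is now dropped
        if looks_like_probe:
            self.suspicious[src_ip] += 1
            if self.suspicious[src_ip] >= PROBE_THRESHOLD:
                self.blacklist.add(src_ip)
        return True

monitor = ToyMonitor()
# A traditional pen test: ten attack attempts, all from one address.
results = [monitor.handle("203.0.113.7", True) for _ in range(10)]
allowed = sum(results)  # only the first few probes ever get through
```

After the threshold is hit, the remaining attempts tell the tester nothing about the application, only about the firewall.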

There isn't one single piece of advice that is the answer to all your prayers. Instead you need two, and both need to be conducted simultaneously if your network's to perform in perfect harmony: application testing combined with intrusion detection.

The reason I advocate application testing is that, if you have an application that's public-facing and it were compromised, the financial impact to the organisation could potentially be fatal. There are technologies available that can test your device or application with a barrage of millions upon millions of iterations, using different broken or mutated protocols and techniques, in an effort to crash the system. If a hacker were to do this, and caused it to fall over or reboot, this denial of service could be at best embarrassing but at worst detrimental to your organisation.

Intrusion detection, capable of spotting zero-day exploits, must be deployed to audit and test the recognition and response capabilities of your corporate security defences. It will substantiate not only that the network security is deployed and configured correctly, but that it's capable of protecting the application that you're about to make live or have already launched, irrespective of the service it supports – be it email, a web service, anything. The device looks for characteristics in behaviour to determine if an incoming request to the product or service is likely to be good and valid, or if it's indicative of malicious behaviour. This provides not only reassurance, but all-important proof, that the network security is capable of identifying and mitigating the latest threats and security evasion techniques.

While we wait with bated breath to see who will lift Deutsche Post's Security Cup, we mustn't lose sight of our own challenges. My best advice would be that, instead of waiting for the outcome and relying on others to keep you informed of vulnerabilities in your applications, you must regularly inspect your defences to make sure they're standing strong with no chinks. If you don't, the bounty may as well be on your head.

Anthony Haywood
CTO, Idappcom
www.idappcom.com

Hackers continue to refine tactics
Cybercriminals are shifting the target of their attacks from traditional infrastructure to mobile users and endpoint devices, according to the 2011 Global Security Report from Trustwave.

The research revealed that malicious tools became more customised, automated and persistent in 2010. This trend, combined with the popularity of mobile devices and social media, is providing the perfect recipe for cybercriminals looking to compromise business, customer and user private and sensitive information. Key report findings:
• Third-party vendors continue to put companies at risk: 88 percent of breaches resulted from insecure software code or lax security practices in the management of third-party technology;
• Cybercriminals ‘got fresh’ in 2010: because in-transit credit card data is usually more recently created (more fresh) than stored data, two-thirds (66 percent) of investigations found the theft of data in transit;
• Food and beverage regained its title as the most breached industry, representing 57 percent of the investigations;
• A single organised crime syndicate may be responsible for more than 30 percent of all 2010 data breaches.

Evolving threats
Among the most interesting and surprising elements of the report is the rate and sophistication of attacks against mobile platforms and social networking sites. As the security of mobile networks has improved, mobile devices are increasingly the target of attacks, while social networking sites are quickly becoming cybercriminals' platform of choice to expand and propagate destructive botnets. Drive-by infections and mobile phishing attacks were among the most popular client-side attacks in 2010.
• Geolocation data is helping cybercriminals launch more sophisticated and targeted attacks against social networks;
• Mobile devices offer cybercriminals an open door to corporate authentication credentials, sensitive data and trade secrets;
• Anti-virus software is losing the battle against malware: the new breed of malware is virtually undetectable by current scanning software.

Top strategic security initiatives for 2011
A key take-away from the report is that attacks are often successful in organisations that believed a comprehensive data security strategy was in place. For executives and managers who are tasked with ensuring their company does not suffer a security event, the report offers specific guidance for 2011.
• Assess, reduce and monitor client-side attack surface: monitor and inventory applications to measure adherence to standards and evaluate risk;
• Embrace social networking but educate staff: an established policy and education can help protect against attacks originating from social networking tools;
• Develop a mobile security programme: gaining control over configurations of mobile devices will help reduce risk;
• Enforce security upon third-party relationships: choose a platform and vendor with a solid security history, and require vendors to undergo third-party security testing.

“This year, we expanded the analysis of our compromise investigations, and took a deeper look at the expanding and evolving landscape of data security vulnerabilities,” said Robert J McCullen, chairman and CEO of Trustwave. “In 2011 and beyond, organisations that approach their initiatives firmly committed to including security as an integrated requirement, and not just as a checkbox, will be most resilient to attack, reduce their risk of compromise, and be able to best protect both sensitive data and reputation.”

John Yeo, director of SpiderLabs EMEA, Trustwave's security team, added: “While the myriad of new devices and services around us continue to enable our personal and professional lives, it's imperative that there is a responsible focus on security at both the organisation and individual level. Criminals seek the path of least resistance and will take any opportunity to get hold of valuable information. We've seen that new or poorly managed systems are low-hanging fruit when the security implications have not been fully considered.”

The 2011 Global Security Report is available at: http://bit.ly/fjWsNZ
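The barrage-of-mutated-protocols testing described earlier is essentially fuzzing: take a valid message, corrupt it at random, and see whether the system survives. A minimal sketch, in which an invented, deliberately fragile parser stands in for the real system under test:

```python
# Minimal protocol-mutation ("fuzzing") sketch. Every name here is
# illustrative: fragile_parser stands in for the application, and the
# 'LEN:payload' protocol is invented for the example.
import random

def mutate(message, n_flips, rng):
    """Flip n_flips random bytes of a valid message."""
    data = bytearray(message)
    for _ in range(n_flips):
        pos = rng.randrange(len(data))
        data[pos] ^= rng.randrange(1, 256)  # xor guarantees the byte changes
    return bytes(data)

def fragile_parser(message):
    """Stand-in system under test: expects b'LEN:payload'."""
    header, _, payload = message.partition(b":")
    return len(payload) == int(header.decode("ascii"))

rng = random.Random(42)  # fixed seed keeps the run repeatable
valid = b"5:hello"
crashes = 0
for _ in range(1000):
    try:
        fragile_parser(mutate(valid, 1, rng))
    except Exception:
        crashes += 1  # each unhandled crash is a potential DoS bug
```

Every exception the loop counts is exactly the kind of crash-and-reboot failure the article warns about: the data was never stolen, but the service still went down.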


For exclusive news, features, opinion, comment, directory, digital archive and much more visit

www.testmagazine.co.uk

Subscribe to TEST free!

Published by 31 Media Ltd

www.31media.co.uk

Telephone: +44 (0) 870 863 6930

Facsimile: +44 (0) 870 085 8837

Email: [email protected]

Website: www.31media.co.uk

Previous issues of T.E.S.T – The European Software Tester:

Volume 2, Issue 2, June 2010 – DEALING WITH DEBT: Phil Kirkham tackles technical debt. Inside: Small-scale testing | Reporting | Enhanced application testing | 16-page T.E.S.T Digest.

Volume 2, Issue 3, September 2010 – INDIA: TESTING POWERHOUSE: Yogesh Singh and Ruchika Malhotra on optimising web performance testing. Inside: Offshore testing | Penetration testing | Business alignment.

Volume 2, Issue 4, December 2010 – THE RACE WITH COMPLEXITY: Ian Kennedy on the challenges of testing major IT projects. Inside: Performance testing | Agile methods | Stress testing. TESTEXPO preview on page 36.

Visit T.E.S.T online at www.testmagazineonline.com


30 | Test best practice

Steve Miller, vice president of ALM solutions for SmartBear Software, outlines best practices for uniting automated and manual test approaches to improve your software releases.

Combining the automated with the manual



Software development teams are always looking for an edge to produce features more quickly while retaining a high level of software quality. Most software quality teams realise that it takes both automated and manual test efforts to keep pace with quickening release cycles, but are not sure how to get the most out of their testing efforts.

Best practices for planning your automated test effort
Many companies run their regression test cases manually, so when does it make sense to begin automating them? In most cases, it's a good idea to consider automating your test cases when you can no longer run the regression suite on each build created. For example, if you are doing daily or weekly builds of your code for the quality assurance team, and you cannot quickly run your full regression test cases with each build, it is time to consider automating them. When investing in automation, spend your time wisely using best practice approaches.
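That rule of thumb reduces to simple arithmetic. A back-of-envelope sketch of checking whether a manual regression pass still fits between builds; all the figures are hypothetical:

```python
# If a full manual regression pass no longer fits in the gap between
# builds, it is time to consider automation. Figures are invented.
def manual_suite_fits(num_tests, minutes_per_test, testers, build_gap_hours):
    """True if a full manual regression pass fits before the next build."""
    total_hours = (num_tests * minutes_per_test) / 60 / testers
    return total_hours <= build_gap_hours

# 400 regression tests at 15 minutes each, shared by two testers, with
# daily builds: 50 hours of manual effort per 24-hour build gap.
fits = manual_suite_fits(400, 15, 2, 24)  # nowhere near fitting
```

Once the answer turns false, every build after the first is shipping with regression tests unrun, which is exactly the signal to start automating.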

Best Practice 1 – Hire a dedicated automation engineer
Many teams experiment with automation by trying to use an existing manual tester or programmer, then when the automation effort fails, they scratch their heads to figure out why. It's simple: an automation engineer brings years of experience that reduces re-work, and is dedicated so that other manual testing and programming tasks do not interfere with the role of automating tests.

If the cost of another head count is an issue, consider the statistics: standard industry results show that it's 30-100 times less expensive to find defects in QA than once your software is released to customers. Most companies find that head count concerns dissipate as they start testing their applications daily, reducing QA time while improving software quality.

Best Practice 2 – Start small by attacking your smoke tests first
Don't try to automate everything under the sun. Instead, start small – a good place to begin is by automating your smoke tests. Smoke tests are the basic tests you run on a new build to ensure nothing major was broken by it. This group may include only 20 or 25 tests, but by starting with just these, you quickly see an immediate impact from your efforts.

Best Practice 3 – Automate your regression tests
Once you have automated your smoke tests, move on to your regression tests. Regression tests ensure that the new build has not broken existing features. Automating your regression tests may involve automating a large number of tests, so take a methodical approach and focus on the areas of highest impact:
1. Frequently-performed tests – Start by automating the regression tests that are frequently performed. Automating a test that gets run once a release cycle isn't nearly as impactful as automating a test case that is run 100 times during a release cycle.

2. Time-consuming tests – Some tests take hours to run: they involve setting up database table entries, running user interface tests, then querying the database to ensure the data was handled correctly. When done manually, these tests can take hours, so automating them can free your day up for other testing.

3. High-precision tests – Look for tests that require a high degree of precision. By definition, these test cases have a low tolerance for error: if a mistake is made when running the test, you have to scrap everything and start over again. Such a test may involve complex mathematical validations, or a complex series of steps you have to follow to execute a test case that, when interrupted, forces you to start over again. Once these tests are automated, you will get more consistent results and reduce the stress of running them manually.

Best Practice 4 – Intelligently organise your automated tests based on project and team size
If you have a small team with one automation engineer, a few manual testers and a few programmers, the likelihood that you will need to split up the automation test effort between team members is small. Keep the structure simple by organising your automation tests with folders inside a single project, with each folder housing the tests for a functional area of your software. It is also good practice to have a ‘common’ folder that contains test cases that can be re-used by calling them from test cases that reside in your functional area folders. Examples of re-usable test cases are those for logging in and out of the software, sending emails, etc.

If you have multiple product lines and automation engineers, you will have issues if the automation engineers need to access test cases from within a single project because they will have concurrency and source code checkout issues. To prevent these problems, create a project suite that contains multiple projects (one for each product line, one for common tests, etc). Within each project, organise them with folders that are separated by functional area so that you can quickly find test cases that relate to areas of your software. By having multiple projects, automation engineers can check those out separately without the worry of overwriting someone else’s work.

Best Practice 5 – Keep your tests protected with source control
Ever lost your hard drive or overwritten something by mistake? We have all done this, and recovering from it can be simple or can be impossible. By using a source control system (like Subversion, Perforce, ClearCase, TFS, etc), you can prevent loss of data. As you make changes to your test cases, check them into your source control system, and if you ever need to roll back to the prior version, it is simple to do.

Best practices for planning your manual test effort
While automated tests are great for reducing time spent running regression tests, you still need manual tests for testing new features or enhancing existing features of your software. When planning out your manual test effort, best practices dictate that you take a methodical approach that produces great results and is repeatable.

Best Practice 1 – Great testing starts with great requirements
Ever worked on a software project that spent as much time in the QA phase as it did in development? One where the end result was lots of re-work, missed deadlines and frustrated team members? Much of this re-work can be eliminated by first producing great requirements. By great requirements, we're not talking about heavy requirements that fill up bound notebooks and elicit siestas during team reviews. A good requirement has three attributes:
• Succinct yet descriptive narrative;
• Explicit list of business rules;
• Prototype – a mock-up or wireframe of the functionality.

Best Practice 2 – Create positive and negative tests
When creating your test plan, ensure that you have positive test cases (those that ensure the functionality works as designed), negative test cases (those that ensure that any data entry and uncommon use issues are handled gracefully), performance test cases (to ensure that the new release performs as well as or better than the prior release), and relational tests (those that ensure referential integrity, etc).

Best Practice 3 – Ensure test cases have requirement traceability
When creating your test plan, the best way to ensure you have enough test coverage for each requirement is to create a traceability matrix that shows the number and types of test cases for each requirement. By doing this, you will quickly spot requirements that do not have adequate test coverage or that are missing test cases altogether.
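A traceability matrix need not be elaborate. Here is a minimal sketch, with invented requirement and test-case IDs, that counts test types per requirement and flags requirements with no coverage at all:

```python
# Minimal traceability matrix: requirements mapped to their test cases,
# with uncovered requirements flagged. All IDs and data are invented.
def build_matrix(requirements, test_cases):
    """test_cases: list of (test_id, requirement_id, test_type) tuples."""
    matrix = {req: {} for req in requirements}
    for test_id, req, test_type in test_cases:
        matrix[req][test_type] = matrix[req].get(test_type, 0) + 1
    return matrix

def uncovered(matrix):
    """Requirements with no test cases of any type."""
    return sorted(req for req, counts in matrix.items() if not counts)

requirements = ["REQ-1", "REQ-2", "REQ-3"]
test_cases = [
    ("TC-01", "REQ-1", "positive"),
    ("TC-02", "REQ-1", "negative"),
    ("TC-03", "REQ-2", "positive"),
]
matrix = build_matrix(requirements, test_cases)
gaps = uncovered(matrix)  # REQ-3 has no tests at all
```

The same matrix also exposes one-sided coverage, such as REQ-2 above having a positive test but no negative one.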

Best Practice 4 – Publish your test cases to developers early
Once your testers have completed their test cases, publish them to the programmers so that they can see the tests that will be run. Your programmers should review the test cases to ensure that their code will accommodate logic for each of the tests; this simple tactic will dramatically reduce re-work during the QA phase.

Uniting your automated and manual test efforts
Let's imagine that your automation engineers have automated your smoke and regression test cases, and your manual test engineers have created a complete set of test cases with great traceability and test coverage. The development team has just shipped the first version of the code to the QA team and plans to build daily throughout the testing cycle. Here are some best practices to keep the QA phase operating efficiently.




Best Practice 1 – Schedule your automation runs daily during the QA phase
Now that you have set up your automation test cases, it is important to run them each day so that you can quickly discover if the new build has broken any of the existing functionality.

When doing this, there are a couple of approaches you can take. If your builds are being done by a continuous integration tool (like Automated Build Studio (http://www.automatedqa.com/products/abs/), Hudson, Cruise Control, etc.), then you can launch your automated tests from the continuous integration tool.

If you are doing builds manually or if you prefer to have the automation launch at a specific time, you can schedule them to launch using a scheduling tool such as SmartBear’s Software Planner Application Lifecycle Management (ALM) tool. A good scheduling tool should be able to launch the automated tests on a specific machine at specific times each day of the week and then log the results of the test run on dashboards, so that you can easily see how many automated tests ran, how many passed and how many failed. You also want to be able to see which tests failed, so your programmers can check the code to fix any issues the new build caused.

Best Practice 2 – Create reproducible defects
Nothing drains time in QA like reporting defects that are not reproducible. Each time a tester reports a defect that is not reproducible, it takes time for the programmer to report that it is not reproducible, time for the tester to re-document how to reproduce it, and more time for the programmer to try again.

So how do we solve this? The best way is to publish a narrated movie that shows what you did to reproduce it. Do this with a free product called Jing (http://www.jingproject.com) that allows you to highlight your application, record the keystrokes, then produce a movie (with narration if you narrate it with a headset) that is accessible via a URL. Include the URL of the movie on the defect you send to the programmer and the programmer has everything they need to see your defect in action!

Optimising your test efforts during the QA cycle
During the QA phase, it is important to meet daily as a team for 15 minutes (referred to by Agile shops as a Daily Scrum Meeting) to assess your test progression and to prioritise defects so that the most important ones are addressed. If your test management and defect tracking tools have dashboards that show test progression, present them interactively during the meeting (use a projector or online conferencing to review them as a team).

Best Practice 1 – Review test case progression

The first indicator you should review is how much progress the QA team is making towards running all the test cases for the release.

Diagram 2 is an example of a dashboard that shows, day by day, how many test cases have been run, how many passed, how many failed and how many are still awaiting a run. A large number of failed test cases signals a quality problem. If you find that test cases are not being executed at a pace that allows you to finish all tests within your QA window, you can adjust by adding more help or extending working hours to get it done.
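That pacing check amounts to simple arithmetic; a minimal sketch (the function name and parameters are illustrative):

```python
def on_track(total_cases, run_so_far, days_elapsed, days_in_window):
    """Project the current daily pace across the whole QA window."""
    pace = run_so_far / days_elapsed
    return pace * days_in_window >= total_cases
```

If 30 of 100 cases are done after 3 days of a 10-day window, the projection is exactly 100 and the team is (just) on track; at 20 cases it is time to add help or hours.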

Best Practice 2 – Prioritise defects daily

The next indicator to review is the number of defects by priority and assignee. This information allows you to determine whether specific programmers are overloaded with defect work and helps you to distribute the load more evenly. When prioritising defects, we like to prioritise them based on severity and on how important they are to the software release. The key to this tactic is to define your severity levels objectively, so that it is clear how problematic they are. We use these severity levels:

1 – Crash (crashes the software);
2 – Major Bug (with no workaround);
3 – Workaround (major defect with a workaround);
4 – Trivial Bug.

Based on these severities, you can choose what priority they should be fixed in (1-Fix ASAP, 2-Fix Soon, 3-Fix If Time). Below are some dashboards you might consider when evaluating defect priorities.

In the graphs, you can see that most defects are major with no workaround, which implies a quality issue. You will also see that too many defects are categorised as high priority, which means that your team needs to make tougher decisions on how you prioritise them to ensure that the most important ones are fixed first.

Evaluating defects by assignee can indicate if a specific programmer is overloaded with work.
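The severity scheme above lends itself to a small default mapping from severity to fix priority. The mapping and names below are illustrative, not SmartBear functionality:

```python
# Severity levels and a default severity -> priority mapping, following
# the scheme described in the article (illustrative, not a product API).
DEFAULT_PRIORITY = {
    1: "1-Fix ASAP",     # Crash
    2: "1-Fix ASAP",     # Major bug, no workaround
    3: "2-Fix Soon",     # Major defect with a workaround
    4: "3-Fix If Time",  # Trivial bug
}

def triage(defects):
    """Group defect ids by the priority suggested by their severity."""
    by_priority = {}
    for defect_id, severity in defects:
        by_priority.setdefault(DEFAULT_PRIORITY[severity], []).append(defect_id)
    return by_priority
```

A team would override the suggested priority case by case, for example promoting a trivial but customer-visible defect.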

Using retrospectives to improve future testing efforts

Once your software release makes it to production, it is important to look back at the things you did right and the things you can improve upon, so that you can take these 'lessons learned' into the next development effort you embark on. This approach is sometimes referred to as a 'post mortem' or 'retrospective'.

Best Practice 1 – Analyse your project variances

If you used tools to plan out your work efforts (programming and testing hours for each requirement) and you recorded the time each person spent on each requirement, you have valuable information on how well your team is able to estimate tasks. Software Planner and many other project management tools can track estimated versus actual hours. Using these tools, you can capture the release analytics:

In Table 1, notice that the first sprint was under-estimated. In sprint 2 a correction was made (based on the retrospective): estimated hours were buffered, and the sprint came in under estimate. By collecting variance information release by release and sprint by sprint (if using Agile), you can adjust by buffering estimates in upcoming releases.
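The variance bookkeeping described here is simple arithmetic; a sketch with illustrative names (not Software Planner functionality):

```python
def variance_pct(estimated_hours, actual_hours):
    """Signed variance: positive means the work over-ran its estimate."""
    return (actual_hours - estimated_hours) / estimated_hours * 100

def buffered(estimate_hours, last_variance_pct):
    """Buffer a new estimate by the overrun seen in the previous sprint."""
    return estimate_hours * (1 + max(last_variance_pct, 0) / 100)
```

A sprint estimated at 100 hours that took 120 shows a +20% variance, so a 50-hour task in the next sprint would be planned at 60 hours.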

Best Practice 2 – Analyse quality assurance metrics

It is also important to track how many test cases were run and how many defects were discovered during the release. Table 2 is an example of how we tracked this information for a prior release:
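One simple metric to derive from a table like this is a defect find rate per 100 test cases, which makes releases of different sizes comparable; a hypothetical sketch:

```python
def defects_per_100_tests(defects_found, test_cases_run):
    """Normalise defect counts so releases of different sizes compare."""
    return defects_found / test_cases_run * 100
```

A release with 30 defects from 600 test cases (5 per 100) is holding quality better than one with 30 defects from 300 cases (10 per 100), even though the raw counts match.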


Best Practice 3 – Document your retrospective

Equipped with the analytics discussed in best practices 1 and 2, you are now ready to hold your retrospective meeting. Schedule the meeting and ask everyone who participated in the software release to bring a list of three things they think the team did well and three things the team can improve upon.

During the retrospective meeting, go around the room and have each person discuss their three things done well and their three candidates for improvement. As each person presents, you will start to see commonalities (people will agree on the things done right and wrong); tally these up and score them.
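Tallying those commonalities can be sketched with a counter (the feedback strings are invented):

```python
from collections import Counter

def tally_feedback(per_person):
    """Count how often each observation recurs across the team."""
    return Counter(item for person in per_person for item in person)
```

The highest counts are the themes the whole team agrees on, and the natural candidates for action items.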

Once done, analyse the things the team thought could be improved and create action items (assigned to specific people) to follow up with a plan to improve them in the next release. Then document the retrospective (using MS Word or something similar) and publish it in a central repository so that team members can review it in the future. If using Software Planner, you can store the retrospective in the Shared Documents area, where all team members can access it.

When you begin the planning for your next release, be sure to access the retrospective to refresh your memory on the things done well (so that you can continue to do them well) and to remind yourself of the action items taken to improve the things that were not done well.

Steve Miller
Vice President of ALM Solutions
SmartBear Software
http://smartbear.com

DIAGRAM 1

DIAGRAM 2

GRAPH 1

TABLE 1

GRAPH 2

TABLE 2


Emulating the network

The slimming organisation Slimming World employed a network emulation solution to ensure that when its in-house developed XpressWeigh software went live, it performed exactly as expected.

Slimming World is the largest and most advanced slimming organisation in the UK and was founded in 1969 by Margaret Miles-Bramwell OBE, FRSA. Today there are more than 6,700 groups held weekly nationwide via a network of 3,000 Slimming World-trained consultants who run their own franchised clubs in local communities.

For many years Slimming World used a paper-based system for keeping track of its groups. Members would weigh in and a consultant would fill out a paper sheet and manually send it in to the Slimming World head office. Once the data reached head office, a traditional round of manual data input took place: a painstaking task that was not only an inefficient use of staff time but also made data analysis slower.

Slimming World executives wanted to match their service to their members and their significant achievements with an advanced IT system that would make it easier for their consultants to administer their groups. So, with a high level of quality and care for its members and consultants, Slimming World developed XpressWeigh, a bespoke application specifically designed to optimise the way


consultants support members. As well as offering weight loss to ordinary members, Slimming World offers a service to a number of GPs and Primary Care Trusts, where patients who need to lose weight for their health are referred to groups. The company invests a great deal in ensuring that all its methods and systems are in line with the most up-to-date scientific research and methods.

XpressWeigh has enabled the company's consultants to update their members' details within minutes rather than weeks. This means it's faster for them, and they are able to send in the most up-to-date information, which can be logged and analysed in record time. This information, such as new weight statistics, is essential in tracking the improvements being delivered by Slimming World and ensuring the best service to members.

Network emulation

Once the decision had been made to develop XpressWeigh, it was passed to the IT team to make sure that every member in the UK, Northern Ireland, Eire and the USA would not only have full access to the new online system but also a positive experience. Sean Chapman, from Slimming World's IT department, decided to research how best to achieve a quality performance experience and, while at a software event, heard a talk given by Frank Puranik of iTrinegy on 'Will it work in the WAN?'. The talk seemed to ask the questions Chapman had been asking, but he says it also offered the answers.

“It was imperative that our new bespoke application would work in the WAN with many different types of network connections, from clients in large cities to small communities where the network wasn't very sophisticated,” says Chapman. “We needed to understand the areas that had poor internet connections and areas where latency would impact our application; some of our consultants were using old dial-up modems or had low bandwidth. After listening to Frank's presentation I knew that he would be able to help us understand how we could make it work in all network environments.”

Specialising in emulating different types of network (WAN, wireless, satellite etc), iTrinegy introduced its INE Compact, a small, book-sized inline network emulator appliance that was perfect for replicating all the network conditions that Slimming World's consultants would encounter.

WAN emulation – the safe, controllable alternative

WAN emulation/network simulation technology behaves like a real WAN or wireless environment but can be deployed in the same room as your normal test rig, or even on your desktop. It allows you to recreate a wide variety of WAN or wireless conditions, so that software can be tested during prototyping, development, quality assurance and pre-deployment testing.

By being able to manipulate the network characteristics such as bandwidth, latency, packet loss etc., it is possible to replicate a wide variety of network environments in which to conduct your tests and see how the application behaves.
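As a rough illustration of why these characteristics matter, here is a back-of-envelope model of transfer time over a constrained link. The formula ignores TCP slow start and protocol overhead, and all the link values are illustrative, not figures from the Slimming World deployment:

```python
def transfer_time(payload_bytes, bandwidth_bps, rtt_s, round_trips=1):
    """Rough lower bound: serialisation time plus round-trip delays."""
    return payload_bytes * 8 / bandwidth_bps + rtt_s * round_trips

# A 200 KB page over 56 kbit/s dial-up with 300 ms RTT and 5 round trips
dialup = transfer_time(200_000, 56_000, 0.3, round_trips=5)
# The same page over 8 Mbit/s ADSL with 30 ms RTT
adsl = transfer_time(200_000, 8_000_000, 0.03, round_trips=5)
```

The dial-up case comes out at roughly 30 seconds against well under a second for ADSL, which is exactly the kind of gap an emulator lets the test team experience before real users do.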

Typical network impairments that a network emulator should be able to produce include:

• Bandwidth restrictions;
• Delay (latency and jitter);



• Packet re-ordering, packet error and packet loss;

• Traffic shaping and traffic prioritisation (QoS).

A good network emulator will also be able to recreate the following types of network:

• High-latency WANs (national, international and satellite);
• Wireless networks (eg Wi-Fi, WiMAX and 3G);
• Jittery networks, such as those that cause VoIP deployments problems;
• Networks that lose and/or damage traffic;
• QoS-type networks, including MPLS, ATM and VLANs.

It should also be possible to apply different impairments to different traffic, as would happen in a real WAN. Network emulators can also work seamlessly with load generation and performance tools to further enhance the testing of applications.

The benefits

Utilising WAN emulation technology as part of the testing programme enables the tester to conduct much more realistic pre-deployment checks on application behaviour. Being able to spot and correct network-related problems earlier in the application lifecycle greatly improves the chances of a successful roll-out into the production network, as well as saving money and time, which in the current economic climate has to be a good thing. If, through the deployment of this technology, the value of the testing team is seen to be enhanced as well, then that can't be a bad thing either.

The INE Compact emulator gave Chapman's team the ability to simulate various inter-site WAN technologies, such as ISDN and ADSL, while offering the opportunity to simulate the running of the system under high loads on unpredictable networks.

The emulator offers many simulation parameters, including link speed, packet loss, error rates and delay, which allowed Chapman to test XpressWeigh fully prior to its release. “INE Compact has helped ensure that XpressWeigh works for every consultant throughout Slimming World,” he says, “and that we have been able to adapt the areas where the network was difficult. This has enabled us to guarantee a positive experience for all our consultants.”

Network emulators can also be used to test continually against the changing conditions an application may encounter. “We continue to test any new updates with the emulator, and also regression testing. This has been really useful when testing new code, as the INE Compact catches any possible problems prior to rolling out any new updates,” says Chapman. “One of the big surprises was how easy the emulator was to set up and use. We started to receive meaningful data within an hour, and we were able to test, tweak and work on the application so that when we went live there were no surprises. We have never had any issues with the emulator; we found it so straightforward, and the documentation is simple and easy to use.

“INE Compact has been a very useful addition and has helped us achieve, on time and within budget, exactly what we needed it to do. Our XpressWeigh system has been successful due, in part, to network emulation technology which helped us understand the issues that the network throws our way.”

www.itrinegy.com



40 | TEST company profile

Facilita has created the Forecast™ product suite which is used across multiple business sectors to performance test applications, websites and IT infrastructures of all sizes and complexity. With this class-leading testing software and unbeatable support and services Facilita will help you ensure that your IT systems are reliable, scalable and tuned for optimal performance.

Forecast, the thinking tester's power tool

A sound investment: A good load testing tool is one of the most important IT investments an organisation can make. The risks and costs associated with inadequate testing are enormous. Load testing is challenging and, without good tools and support, will consume expensive resources and waste a great deal of effort.

Forecast has been created to meet the challenges of load testing, now and in the future. The core of the product is tried and trusted and incorporates more than a decade of experience but is designed to evolve in step with advancing technology.

Realistic load testing: Forecast tests the reliability, performance and scalability of IT systems by realistically simulating from one to many thousands of users executing a mix of business processes using individually configurable data.

Comprehensive technology support: Forecast provides one of the widest ranges of protocol support of any load testing tool.

1. Forecast Web thoroughly tests web-based applications and web services, identifies system bottlenecks, improves application quality and optimises network and server infrastructures. Forecast Web supports a comprehensive and growing list of protocols, standards and data formats including HTTP/HTTPS, SOAP, XML, JSON and Ajax.

2. Forecast Java is a powerful and technically advanced solution for load testing Java applications. It targets any non-GUI client-side Java API with support for all Java remoting technologies including RMI, IIOP, CORBA and Web Services.

3. Forecast Citrix simulates multiple Citrix clients and validates the Citrix environment for scalability and reliability in addition to the performance of the hosted applications. This non-intrusive approach provides very accurate client performance measurements unlike server based solutions.

4. Forecast .NET simulates multiple concurrent users of applications with client-side .NET technology.

5. Forecast WinDriver is a unique solution for performance testing Windows applications that are impossible or uneconomic to test using other methods or where user experience timings are required. WinDriver automates the client user interface and can control from one to many hundreds of concurrent client instances or desktops.

6. Forecast can also target less mainstream technology such as proprietary messaging protocols and systems using the OSI protocol stack.

Powerful yet easy to use: Skilled testers love using Forecast because of the power and flexibility it provides. Creating working tests is made easy with Forecast's script recording and generation features and the ability to compose complex test scenarios rapidly with a few mouse clicks. The powerful functionality of Forecast ensures that even the most challenging applications can be fully tested.

Supports Waterfall and Agile (and everything in between): Forecast has the features demanded by QA teams like automatic test script creation, test data management, real-time monitoring and comprehensive charting and reporting.

Forecast is successfully deployed in Agile ‘Test Driven Development’ (TDD) environments and integrates with automated test (continuous build) infrastructures. The functionality of Forecast is fully programmable and test scripts are written in standard languages (Java, C#, C++ etc). Forecast provides the flexibility of open source alternatives along with comprehensive technical support and the features of a high-end enterprise commercial tool.

Flexible licensing: Geographical freedom allows licenses to be moved within an organisation without additional costs. Temporary high-concurrency licenses for 'spike' testing are available with a sensible pricing model. Licenses can be rented for short-term projects with a 'stop the clock' agreement or purchased for perpetual use.

Our philosophy is to provide value and to avoid hidden costs. For example, server monitoring and the analysis of server metrics are not separately chargeable items and a license for Web testing includes all supported Web protocols.

Services

In addition to comprehensive support and training, Facilita offers mentoring, where an experienced Facilita consultant works closely with the test team either to 'jump start' a project or to cultivate advanced testing techniques. Even with Forecast's outstanding script automation features, scripting is challenging for some applications. Facilita offers a direct scripting service to help clients overcome this problem.

We can advise on all aspects of performance testing and carry out testing either by providing expert consultants or fully managed testing services.

Facilita
Tel: +44 (0) 1260 298109
Email: [email protected]
Web: www.facilita.com



www.seapine.com
Phone: +44 (0) 208-899-6775
Email: [email protected]
United Kingdom, Ireland, and Benelux: Seapine Software Ltd, Building 3, Chiswick Park, 566 Chiswick High Road, Chiswick, London, W4 5YA, UK

Americas (Corporate Headquarters): Seapine Software, Inc, 5412 Courseview Drive, Suite 200, Mason, Ohio 45040, USA. Phone: 513-754-1655

With over 8,500 customers worldwide, Seapine Software Inc is a recognised, award-winning, leading provider of quality-centric application lifecycle management (ALM) solutions. With headquarters in Cincinnati, Ohio and offices in London, Melbourne, and Munich, Seapine is uniquely positioned to directly provide sales, support, and services around the world.

Built on flexible architectures using open standards, Seapine Software's cross-platform ALM tools support industry best practices, integrate into all popular development environments, and run on Microsoft Windows, Linux, Sun Solaris, and Apple Macintosh platforms.

Seapine Software's integrated software development and testing tools streamline your development and QA processes, improving quality and saving you significant time and money.

TestTrack RM

TestTrack RM centralises requirements management, enabling all stakeholders to stay informed of new requirements, participate in the review process, and understand the impact of changes on their deliverables. Easy to install, use, and maintain, TestTrack RM features comprehensive workflow and process automation, easy customisability, advanced filters and reports, and role-based security. Whether as a standalone tool or part of Seapine's integrated ALM solution, TestTrack RM helps teams keep development projects on track by facilitating collaboration, automating traceability, and satisfying compliance needs.

TestTrack Pro

TestTrack Pro is a powerful, configurable, and easy to use issue management solution that tracks and manages defects, feature requests, change requests, and other work items. Its timesaving communication and reporting features keep team members informed and on schedule. TestTrack Pro supports MS SQL Server, Oracle, and other ODBC databases, and its open interface is easy to integrate into your development and customer support processes.

TestTrack TCM

TestTrack TCM, a highly scalable, cross-platform test case management solution, manages all areas of the software testing process including test case creation, scheduling, execution, measurement, and reporting. Easy to install, use, and maintain, TestTrack TCM features comprehensive workflow and process automation, easy customisability, advanced filters and reports, and role-based security. Reporting and graphing tools, along with user-definable data filters, allow you to easily measure the progress and quality of your testing effort.

QA Wizard Pro

QA Wizard Pro completely automates the functional and regression testing of Web, Windows, and Java applications, helping quality assurance teams increase test coverage. Featuring a next-generation scripting language, QA Wizard Pro includes advanced object searching, smart matching, a global application repository, data-driven testing support, validation checkpoints, and built-in debugging. QA Wizard Pro can be used to test popular languages and technologies like C#, VB.NET, C++, Win32, Qt, AJAX, ActiveX, JavaScript, HTML, Delphi, Java, and Infragistics Windows Forms controls.

Surround SCM

Surround SCM, Seapine's cross-platform software configuration management solution, controls access to source files and other development assets, and tracks changes over time. All data is stored in industry-standard relational database management systems for greater security, scalability, data management, and reporting. Surround SCM's change automation, caching proxy server, labels, and virtual branching tools streamline parallel development and provide complete control over the software change process.

Seapine Software™


The Green Hat differenceIn one software suite, Green Hat automates the validation, visualisation and virtualisation of unit, functional, regression, system, simulation, performance and integration testing, as well as performance monitoring. Green Hat offers code-free and adaptable testing from the User Interface (UI) through to back-end services and databases. Reducing testing time from weeks to minutes, Green Hat customers enjoy rapid payback on their investment.

Green Hat’s testing suite supports quality assurance across the whole lifecycle, and different development methodologies including Agile and test-driven approaches. Industry vertical solutions using protocols like SWIFT, FIX, IATA or HL7 are all simply handled. Unique pre-built quality policies enable governance, and the re-use of test assets promotes high efficiency. Customers experience value quickly through the high usability of Green Hat’s software.

Focusing on minimising manual and repetitive activities, Green Hat works with other application lifecycle management (ALM) technologies to provide customers with value-add solutions that slot into their Agile testing, continuous testing, upgrade assurance, governance and policy compliance. Enterprises invested in HP and IBM Rational products can simply extend their test and change management processes to the complex test environments managed by Green Hat and get full integration.

Green Hat provides the broadest set of testing capabilities for enterprises with a strategic investment in legacy integration, SOA, BPM, cloud and other component-based environments, reducing the risk and cost associated with defects in processes and applications. The Green Hat difference includes:

• Purpose built end-to-end integration testing of complex events, business processes and composite applications. Organisations benefit by having UI testing combined with SOA, BPM and cloud testing in one integrated suite.

• Unrivalled insight into the side-effect impacts of changes made to composite applications and processes, enabling a comprehensive approach to testing that eliminates defects early in the lifecycle.

• Virtualisation for missing or incomplete components to enable system testing at all stages of development. Organisations benefit through being unhindered by unavailable systems or costly access to third party systems, licences or hardware. Green Hat pioneered ‘stubbing’, and organisations benefit by having virtualisation as an integrated function, rather than a separate product.

• Scaling out these environments, test automations and virtualisations into the cloud, with seamless integration between Green Hat’s products and leading cloud providers, freeing you from the constraints of real hardware without the administrative overhead.

• ‘Out-of-the-box’ deep integration with all major SOA, enterprise service bus (ESB) platforms, BPM runtime environments, governance products, and application lifecycle management (ALM) products.

• ‘Out-of-the-box’ support for over 70 technologies and platforms, as well as transport protocols for industry vertical solutions. Also provided is an application programming interface (API) for testing custom protocols, and integration with UDDI registries/repositories.

• Helping organisations at an early stage of project or integration deployment to build an appropriate testing methodology as part of a wider SOA project methodology.

Corporate overviewSince 1996, Green Hat has constantly delivered innovation in test automation. With offices that span North America, Europe and Asia/Pacific, Green Hat’s mission is to simplify the complexity associated with testing, and make processes more efficient. Green Hat delivers the market leading combined, integrated suite for automated, end-to-end testing of the legacy integration, Service Oriented Architecture (SOA), Business Process Management (BPM) and emerging cloud technologies that run Agile enterprises.

Green Hat partners with global technology companies including HP, IBM, Oracle, SAP, Software AG, and TIBCO to deliver unrivalled breadth and depth of platform support for highly integrated test automation. Green Hat also works closely with the horizontal and vertical practices of global system integrators including Accenture, Atos Origin, CapGemini, Cognizant, CSC, Fujitsu, Infosys, Logica, Sapient, Tata Consulting and Wipro, as well as a significant number of regional and country-specific specialists. Strong partner relationships help deliver on customer initiatives, including testing centres of excellence. Supporting the whole development lifecycle and enabling early and continuous testing, Green Hat’s unique test automation software increases organisational agility, improves process efficiency, assures quality, lowers costs and mitigates risk.

Helping enterprises globallyGreen Hat is proud to have hundreds of global enterprises as customers, and this number does not include the consulting organisations who are party to many of these installations with their own staff or outsourcing arrangements. Green Hat customers enjoy global support and cite outstanding responsiveness to their current and future requirements. Green Hat’s customers span industry sectors including financial services, telecommunications, retail, transportation, healthcare, government, and energy.

Green Hat

[email protected]
www.greenhat.com


For more information, please visit http://www.microfocus.com/cqa-uk/

Continuous Quality AssuranceMicro Focus Continuous Quality Assurance (CQA) ensures that quality assurance is embedded throughout the entire development lifecycle – from requirements definition to ‘go live’.

CQA puts the focus on identifying and eliminating defects at the beginning of the process, rather than removing them at the end of development. It provides capabilities across three key areas:

Requirements: Micro Focus uniquely combines requirements definition, visualisation, and management into a single ‘3-Dimensional’ solution. This gives managers, analysts and developers the right level of detail about how software should be engineered. Removing ambiguity means the direction of the development and QA teams is clear, dramatically reducing the risk of poor business outcomes.

Change: Development teams regain control in their constantly shifting world with a single ‘source of truth’ to prioritize and collaborate on defects, tasks, requirements, test plans, and other in-flux artefacts. Even when software is built by global teams with complex environments and methods, Micro Focus controls change and increases the quality of outputs.

Quality: Micro Focus automates the entire quality process from inception through to software delivery. Unlike solutions that emphasize ‘back end’ testing, Micro Focus ensures that tests are planned early and synchronised with business goals, even as requirements and realities change.

Bringing the business and end-users into the process early makes business requirements the priority from the outset as software under development and test is continually aligned with the needs of business users.

CQA provides an open framework which integrates diverse toolsets, teams and environments, giving managers continuous control and visibility over the development process to ensure that quality output is delivered on time.

By ensuring correct deliverables, automating test processes, and encouraging reuse and integration, Continuous Quality Assurance continually and efficiently validates enterprise critical software.

The cornerstones of Micro Focus Continuous Quality Assurance are:

• Requirements Definition and Management Solutions;

• Software Change and Configuration Management Solutions;

• Automated Software Quality and Load Testing Solutions.

Requirements

Caliber® is an enterprise software requirements definition and management suite that facilitates collaboration, impact analysis and communication, enabling software teams to deliver key project milestones with greater speed and accuracy.

• Streamlined requirements collaboration;

• End-to-end traceability of requirements;

• Fast and easy simulation to verify requirements;

• Secure, centralized requirements repository.

Change: StarTeam® is a fully integrated, cost-effective software change and configuration management tool. Designed for both centralized and geographically distributed software development environments, it delivers:

• A single source of key information for distributed teams;

• Streamlined collaboration through a unified view of code and change requests;

• Industry leading scalability combined with low total cost of ownership.

Quality: Silk is a comprehensive automated software quality management solution suite which:

• Ensures that developed applications are reliable and meet the needs of business users;

• Automates the testing process, providing higher quality applications at a lower cost;

• Prevents or discovers quality issues early in the development cycle, reducing rework and speeding delivery.

SilkTest enables users to rapidly create test automation, ensuring continuous validation of quality throughout the development lifecycle. Users can move away from manual-testing-dominated software lifecycles to ones where automated tests continually check software quality and improve time to market.

Take testing to the cloud
Users can test and diagnose Internet-facing applications under immense global peak loads in the cloud without having to manage complex infrastructures.

Among other benefits, SilkPerformer® CloudBurst gives development and quality teams:

• Simulation of peak demand loads through onsite and cloud-based resources for scalable, powerful and cost effective peak load testing;

• Web 2.0 client emulation to test even today’s rich internet applications effectively.

Micro Focus Continuous Quality Assurance transforms ‘quality’ into a predictable managed path; moving from reactively accepting extra cost at the end of the process, to confronting waste head on and focusing on innovation.

Micro Focus, a member of the FTSE 250, provides innovative software that enables companies to dramatically improve the business value of their enterprise applications. Micro Focus Enterprise Application Modernization and Management software enables customers’ business applications to respond rapidly to market changes and embrace modern architectures with reduced cost and risk.

Micro Focus

TEST | February 2011 www.testmagazine.co.uk

44 | TEST company profile

Original Software

With a world class record of innovation, Original Software offers a solution focused completely on the goal of effective quality management. By embracing the full spectrum of Application Quality Management across a wide range of applications and environments, the company partners with customers and helps make quality a business imperative. Solutions include a quality management platform, manual testing, full test automation and test data management, all delivered with the control of business risk, cost, time and resources in mind.

Setting new standards for application quality
Today's applications are becoming increasingly complex and are critical in providing competitive advantage to the business. Failures in these key applications result in loss of revenue, goodwill and user confidence, and create an unwelcome additional workload in an already stretched environment. Managers responsible for quality have to be able to implement processes and technology that will support these important business objectives in a pragmatic and achievable way, without negatively impacting current projects.

These core needs are what inspired Original Software to innovate and provide practical solutions for Application Quality Management (AQM) and Automated Software Quality (ASQ). The company has helped customers achieve real successes by implementing an effective ‘application quality eco-system’ that delivers greater business agility, faster time to market, reduced risk, decreased costs, increased productivity and an early return on investment.

These successes have been built on a solution that provides a dynamic approach to quality management and automation, empowering all stakeholders in the quality process, as well as uniquely addressing all layers of the application stack. Automation has been achieved without creating a dependency on specialised skills and by minimising ongoing maintenance burdens.

An innovative approach
Innovation is in the DNA at Original Software. Its intuitive solution suite directly tackles application quality issues and helps organisations achieve the ultimate goal of application excellence.

Empowering all stakeholders
The design of the solution helps customers build an ‘application quality eco-system’ that extends beyond just the QA team, reaching all the relevant stakeholders within the business. The technology enables everyone involved in the delivery of IT projects to participate in the quality process – from the business analyst to the business user and from the developer to the tester. Management executives are fully empowered by having instant visibility of projects underway.

Quality that is truly code-free
Original Software has observed the script maintenance and exclusivity problems caused by code-driven automation solutions and has built a solution suite that requires no programming skills. This empowers all users to define and execute their tests without the need to use any kind of code, freeing them from the automation specialist bottleneck. Not only is the technology easy to use, but quality processes are accelerated, allowing for faster delivery of business-critical projects.

Top to bottom quality
Quality needs to be addressed at all layers of the business application. Original Software gives organisations the ability to check every element of an application – from the visual layer, through to the underlying service processes and messages, as well as into the database.

Addressing test data issues
Data drives the quality process and as such cannot be ignored. Original Software enables the building and management of a compact test environment from production data quickly and in a data privacy compliant manner, avoiding legal and security risks. It also manages the state of that data so that it is synchronised with test scripts, enabling swift recovery and shortening test cycles.

A holistic approach to quality
Original Software’s integrated solution suite is uniquely positioned to address all the quality needs of an application, regardless of the development methodology used. Being methodology neutral, the company can help in Agile, Waterfall or any other project type. The company provides the ability to unite all aspects of the software quality lifecycle. It helps manage the requirements, design, build, test planning and control, test execution, test environment and deployment of business applications from one central point that gives everyone involved a unified view of project status and avoids the release of an application that is not ready for use.

Helping businesses around the world
Original Software’s innovative approach to solving real pain-points in the Application Quality Life Cycle has been recognised by leading multinational customers and industry analysts alike. In a 2010 report, Ovum stated: “While other companies have diversified, into other test types and sometimes outside testing completely, Original has stuck more firmly to a value proposition almost solely around unsolved challenges in functional test automation. It has filled out some yawning gaps and attempted to make test automation more accessible to non-technical testers.”

More than 400 organisations operating in over 30 countries use Original Software solutions. The company is proud of its partnerships with the likes of Coca-Cola, Unilever, HSBC, FedEx, Pfizer, DHL, HMV and many others.

www.origsoft.com
Email: [email protected]
Tel: +44 (0)1256 338 666
Fax: +44 (0)1256 338 678
Grove House, Chineham Court, Basingstoke, Hampshire, RG24 8AG

Delivering quality through innovation



Spirent Communications plc
Tel: +44 (0)7834 752083
Email: [email protected]
Web: www.spirent.com

For over 20 years Parasoft has been studying how to efficiently create quality computer code. Our solutions leverage this research to deliver automated quality assurance as a continuous process throughout the SDLC. This promotes strong code foundations, solid functional components, and robust business processes. Whether you are delivering Service-Oriented Architectures (SOA), evolving legacy systems, or improving quality processes – draw on our expertise and award-winning products to increase productivity and the quality of your business applications.

Parasoft's full-lifecycle quality platform ensures secure, reliable, compliant business processes. It was built from the ground up to prevent errors involving the integrated components – as well as reduce the complexity of testing in today's distributed, heterogeneous environments.

What we do
Parasoft's SOA solution allows you to discover and augment expectations around design/development policy and test case creation. These defined policies are automatically enforced, allowing your development team to prevent errors instead of finding and fixing them later in the cycle. This significantly increases team productivity and consistency.

End-to-end testing: Continuously validate all critical aspects of complex transactions which may extend through web interfaces, backend services, ESBs, databases, and everything in between.

Advanced web app testing: Guide the team in developing robust, noiseless regression tests for rich and highly-dynamic browser-based applications.

Application behavior virtualisation: Automatically emulate the behavior of services, then deploy them across multiple environments – streamlining collaborative development and testing activities. Services can be emulated from functional tests or actual runtime environment data.

Load/performance testing: Verify application performance and functionality under heavy load. Existing end-to-end functional tests are leveraged for load testing, removing the barrier to comprehensive and continuous performance monitoring.

Specialised platform support: Access and execute tests against a variety of platforms (AmberPoint, HP, IBM, Microsoft, Oracle/BEA, Progress Sonic, Software AG/webMethods, TIBCO).

Security testing: Prevent security vulnerabilities through penetration testing and execution of complex authentication, encryption, and access control test scenarios.

Trace code execution: Provide seamless integration between SOA layers by identifying, isolating, and replaying actions in a multi-layered system.

Continuous regression testing: Validate that business processes continuously meet expectations across multiple layers of heterogeneous systems. This reduces the risk of change and enables rapid and agile responses to business demands.

Multi-layer verification: Ensure that all aspects of the application meet uniform expectations around security, reliability, performance, and maintainability.

Policy enforcement: Provide governance and policy-validation for composite applications in BPM, SOA, and cloud environments to ensure interoperability and consistency across all SOA layers.

Please contact us to arrange either a one to one briefing session or a free evaluation.

Web: www.parasoft.com Email: [email protected] Tel: +44 (0) 208 263 6005

Parasoft
Improving productivity by delivering quality as a continuous process


The TEST Focus Groups is a complimentary event specially designed and targeted at senior software testers, testing managers, QA & project managers, who wish to discuss and debate some of their most pressing challenges in a well thought out yet informal setting.

TEST Magazine, the TEST Focus Groups' sister product, spends a lot of time speaking and listening to its customers and seeking out innovative ways to meet their needs. It has become apparent that senior decision makers wish to discuss their current challenges in a meaningful and structured manner with a view to finding pragmatic and workable solutions to what are invariably complex issues. Suppliers, who are naturally keen to meet these professionals, want to gain a clearer understanding of these challenges and identify how, through meaningful dialogue, they can assist.

This logic, coupled with TEST Magazine's consistent desire to drive the market forward, led us to launch the TEST Focus Groups for 2011!

Given the demands placed on modern managers, and their consequently limited opportunities to come together and voice opinions, the challenges consistently faced by today's army of testers and testing management tend not to get resolved as quickly as enterprise would like. As a market-leading publisher and events business, the organiser understands there should be a format that enables meaningful debate to help managers and directors overcome their issues. The TEST Focus Groups therefore provide ten specially designed syndicate rooms, each dedicated to a specialist subject, for delegates to discuss and debate the matter in hand with a view to finding pragmatic and workable solutions.

With some of the industry's leading minds on hand to facilitate and steer each session, the TEST Focus Groups will quickly become a 'must-attend' event for anyone serious about software testing and QA. Add to this plenty of networking opportunities and a small exhibition, and each delegate has a fabulous opportunity to interact with their peers, source the latest products and services, and develop meaningful relationships in an informal yet professional setting.

Subjects to be debated are:

• People or Technology – Who Gets the Cash?

• The Value of Testing Requirements

• Does The User Matter?

• Agile Testing

• Crowd Testing

• Outsourcing

• Qualifications, Accreditation, & Exams

• Event Sponsors' Subject

• Identifying Tester Related Risks

• Tester Training

If you are interested in being a delegate at the TEST Focus Groups please visit: www.testfocusgroups.com/delegates.html

or to register visit: www.testfocusgroups.com/register.html

If you are interested in sponsoring this event and hosting a session please visit: www.testfocusgroups.com/sponsor.html

or to discuss any aspect of the event please contact Grant Farrell on +44 (0) 203 056 4598 or email: [email protected]

www.testfocusgroups.com +44 (0) 870 863 6930 [email protected]

TEST Focus Groups




Our testing expertise
We provide testing experts across the following disciplines:

Functional Testing: including System Testing, Integration Testing, Regression Testing and User Acceptance Testing;

Automated Software Testing: including Test Tool selection, evaluation & implementation, creation of automated test frameworks;

Performance Testing: including Stress Testing, Load Testing, Soak Testing and Scalability Testing;

Operational Acceptance Testing: including disaster recovery and failover;

Web Testing: including cross browser compatibility and usability;

Migration Testing: including data conversion and application migration;

Agile Testing;

Test Environments Management.

The testing talent we provide
• Test analysts;

• Test leads;

• Test programme managers;

• Automated test specialists;

• Test environment managers;

• Heads of testing;

• Performance testers;

• Operational acceptance testers.

Our expert knowledge of the testing market means you recruit the best possible professionals for your business. When a more flexible approach is required, we have developed a range of creative fixed price solutions that will ensure you receive a testing service tailored to your individual requirements.

Our specialist network
Working across a nationwide network of offices, we offer employers and jobseekers a highly specialised recruitment service. Whether you are looking for a permanent or contract position across a diverse range of skill sets, business sectors and levels of seniority, we can help you.

Tailored technical solutions
With over 5,000 contractors on assignment and thousands of candidates placed into very specialised permanent roles every year, we have fast become the pre-eminent technology expert. Our track record extends to all areas of IT and technical recruitment, from small-scale contingency through to large-scale campaign and recruitment management solutions.

Unique database of high calibre jobseekers
As we believe our clients should deal with true industry experts, we also deliver recruitment and workforce related solutions through the following niche practices:

• Digital;

• Defence;

• Development;

• ERP;

• Finance Technology;

• Infrastructure;

• Leadership;

• Public, voluntary and not-for-profit;

• Projects, change and interim management;

• Security;

• Technology Sales;

• Telecoms.

We build networks and maintain relationships with candidates across these areas, giving our clients access to high calibre jobseekers with specific skill sets.

To speak to a specialist testing consultant, please contact:
Sarah Martin, senior consultant
Tel: +44 (0)1273 739272
Email: [email protected]
Web: hays.co.uk/it


Hays
Experts in the delivery of testing resource

Setting the UK standard in testing recruitment
We believe that our clients should deal with industry experts when engaging with a supplier. Our testing practice provides a direct route straight to the heart of the testing community. By engaging with our specialists, clients gain instant access to a network of testing professionals who rely on us to keep them informed of the best and most exciting new roles as they become available.


48 | The last word...

“My wife recently gave me some great advice – ‘Shut Up and Test!’ Thanks honey!” Dave Whalen ponders the perils of speaking out...

Shut up and test!

I recently decided to take a break from managerial test roles and return to life as a simple tester. I assumed the stress would be less, I'd attend fewer meetings, I could just concentrate on my test assignments and go home at the end of the day and leave work at work. Yeah right! The truth was – I really missed getting my hands dirty and doing the actual testing. My role as a test manager was much more ‘conceptual’: providing test estimates, writing test plans and strategies, giving my ‘expert opinion’. And attending seemingly endless meetings! When I wasn't in a meeting, I was in an office all by myself, rarely interacting with anyone. It killed me. I absolutely hated it! I know – I'll be a consultant!

As a consultant it's easy to shift gears a little. All I had to do was modify my resume, remove all the managerial stuff and replace it with hands-on testing stuff, and cast it into the water and see what bites. I got a lot of bites. But in spite of my best efforts, I was typically found to be over-qualified. A bit more tweaking was necessary. So I reworked my resume again and took a look at some of my responses to interview questions and revised them. It worked - I've been able to land a couple of pure testing roles. This is usually where the “and he lived happily ever after” line comes in. Not so much.

I found it really hard to take off my test manager's hat. I was pretty successful as a test manager. I've written about test management, spoken about test management and taught test management – I know everything there is to possibly know about test management. I thought to myself – these people don't know how lucky they are to get me. I could assume my ‘tester’ role and at the same time help them improve their testing skills, test planning, defect management etc. It didn't quite work out that way.

I learned a few things about myself though. First, that I can be a little over-bearing. OK – a lot. I've had to learn to just ‘shut up and test’. After all, that's what they hired me to do. Not improve their test management. But it's hard. After 20 years of doing this I've seen what works and what doesn't. Whenever I saw something about to be implemented that had failed for me in the past I spent more time raising the ‘this ain't gonna work’ alarm rather than helping the team move forward. I admit it; I was somewhat of a whiner. That's OK though – I'd just save up a big ‘I told you so’ to use at the first opportunity. I rarely got that far.

Second – my opinions were rarely wanted. Just test! But you don't understand – I am after all... me! People pay money to listen to me (sort of) and I'm giving it to you for free. What's wrong with you people?

So what I've learned is this. Just because they don't do things my way, it doesn't mean they are doing it wrong. They're just different. Give 'em a chance. If I don't like something or disagree – discuss it as a team member rather than crashing down on them with my ‘vast testing knowledge and experience’. People appreciate listening to ideas rather than having them shoved down their throats by ‘the consultant’. Consult – don't dictate! Tell them your experience; offer suggestions; be there to help. I now tell test managers that I may disagree with something (I probably will). Take it for what it is – an idea. It's me. It's what I do. You don't have to agree with me, but at least give me the opportunity to stick my two cents-worth in. Let me share my ideas and opinions – in private! You don't have to agree with me – just listen. At the end of the day, we may disagree behind closed doors, but once they make a decision, I'm their biggest cheerleader.

Is it easy? No. I still struggle with it every day. Much of it is just my personality. Most of my previous consulting gigs were aimed at process improvement: analyse, and make recommendations to improve. That's not my role anymore. I'll be honest, maybe it's time for me to return to a test management role or a problem-solving consultant role. But until then – with the exception of this article – I'll just "shut up and test."

Dave Whalen
President and senior software entomologist
Whalen Technologies
softwareentomologist.wordpress.com



31 Media will keep you up to date with our own products and offers including VitAL Magazine. If you do not wish to receive this information please write to the Circulation Manager at the address given.

Please tick here ■ if you do not wish to receive relevant business information from other carefully selected companies.

News, Views, Strategy, Management, Case Studies and Opinion Pieces

Subscribe FREE to the most VitAL source of information

www.vital-mag.net/subscribe

VitAL – Inspiration for the modern business

Visit VitAL online at: www.vital-mag.net

Recent issues of VitAL:

Volume 4 : Issue 1 : September/October 2010 – Punching above your weight: using the cloud for rapid growth; The need for speed: does faster broadband mean faster applications?; Rediscovering the value of IT: Dave Ramsden says it's time to add more value. Feature focus: Software as a choice – the rise of SaaS.

Volume 4 : Issue 2 : November/December 2010 – The consumerisation of IT: the changing face of IT in the workplace; IT governance: securing the IT estate; Closing the gap between business & IT: shifting the focus on to business benefits. 16-page itSMF conference preview.

Volume 4 : Issue 3 : January/February 2011 – The human factor: cyber-crime targets the individual; Striking a balance: getting the work/life equation right; Entrepreneurial IT: unleashing your business potential. Feature focus: Controlling the complexity.

