How a Leading Bank Automated Extensive ESB and API Testing


To reduce the risks associated with their business-critical transactions, a leading New Zealand bank and financial-services provider wanted to extend their existing GUI-based testing to more extensively exercise the application logic within internal systems. This logic resides within their ESB: a message broker component that features 80+ services and 750+ operations. The ESB represents a single point of failure within the infrastructure, but the company was unable to test it directly or automatically. With IntegrationQA's experts and Parasoft's API Testing solution, the company gained a service testing methodology supported by automated, reusable tests that performed a broad scope of testing directly from the service/API layer. The result: they saved over $2.1 million NZD in testing and development costs over an 18-month period.

The Challenge: Creating Reusable, Easily-Maintainable ESB Service Tests that Automate 3 Key Test Patterns

To achieve their quality objectives, the company needed reusable test assets that would enable exhaustive, direct testing of ESB services. These tests had to be able to robustly exercise any of the available ESB services via three main test patterns:

- Confirmation - Smoke test
- Evaluation - Transformation, data variance, and error handling
- Replication - Business workflow scenarios


All three test patterns had to be automated in a way that was reusable and easily maintained across different messaging protocols (e.g., messaging queues, FTP, SOAP, REST) and message payload formats (e.g., XML, fixed-length, JSON).

The Solution: IntegrationQA + Parasoft API Testing

To help the company establish the necessary testing methodology and processes, IntegrationQA proposed a test design that could be easily applied across the numerous services that needed to be tested. This involved establishing a framework that relied heavily on data constraints and parameterisation of principal fields within the messaging client. Testing would be automated using Parasoft's API Testing solution, Parasoft SOAtest.

Confirmation Test Patterns

Smoke Test

The confirmation (smoke) test established connectivity using Parasoft's MQ or SOAP messaging client. It also performed data verification by confirming that the response satisfied expectations. Since the message payload was literal XML, XPaths were used to automatically check the message's expected Response Code and determine whether the test passed or failed. A positive response code indicated not only a successful transmission through the ESB, but also that the HOST/mainframe was able to successfully process the data sent in the request. The successful implementation of this test was the starting point of the framework going forward.
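To make the pass/fail rule concrete, here is a minimal, framework-agnostic JavaScript sketch of that check. It is not Parasoft SOAtest's API: the sample payload, the element name, and the positive code "00" are assumptions, and a regex stands in for the XPath that SOAtest evaluates natively.

```javascript
// Minimal sketch, not SOAtest's API: the payload, element name, and
// positive code "00" are assumptions. A regex stands in for the XPath
// that SOAtest evaluates natively against the literal XML response.
const response = `
  <Envelope>
    <Body>
      <AccountLookupResponse>
        <ResponseCode>00</ResponseCode>
      </AccountLookupResponse>
    </Body>
  </Envelope>`;

// Stand-in for an XPath such as //ResponseCode/text().
const match = response.match(/<ResponseCode>(.*?)<\/ResponseCode>/);
const code = match ? match[1] : null;

// A positive code proved both ESB transit and HOST/mainframe processing.
const passed = code === "00";
console.log(passed ? "SMOKE TEST PASSED" : `SMOKE TEST FAILED (code=${code})`);
```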

Regression Suite

The overall ESB regression suite comprised a high-level sub-suite of smoke tests for each service, plus a separate low-level sub-suite for each service operation. This suite was applied as a regression test for any deployment to the ESB, across any environment.

Notes:

- The smoke test was intended to confirm successful deployment of a service to the ESB, so a positive or negative response from the provider was accepted as proof that the service was deployed and operational. However, IntegrationQA went a step further and also configured the test to confirm that the Provider response was positive (as expected). This increased the functional test team's confidence that the integrated systems were operational and ready.

- Although the test team's work was project-oriented, the implemented ESB regression suite was ESB-oriented. The ESB regression suite was the core repository of service smoke tests. Project-based test suites were first drawn from the in-scope ESB services in the repository; if a project's ESB service was new, it was created at the project level, then added to the repository after the project went live.


[Figure: The XML payload for a smoke test, displayed as literal XML for the service request]

Evaluation Test Patterns

The following evaluation tests (Transformation, Data Variance, and Error Handling) establish the robustness of the test design.

Transformation Tests

Building on the smoke test, an automated transformation test provided a powerful check of the input to (and output from) the ESB. A literal XML message that covered all potential service data fields was selected for the test. Parasoft SOAtest was used to parse and compare values of the same messages in different formats.

The two request messages were retrieved from the ESB message store database. The ESB Request was in XML and parsed via XPath, element by element. The HOST Request, a fixed-length message based on z/OS IBM copybook technology, was passed to a custom parsing script written in JavaScript; here, parsing was dictated by constraints defined in an external data pool. A final step compared the parsed data between the two message formats, again using custom JavaScript. The same steps were run for the Response component of the transaction.
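The following JavaScript sketch illustrates those mechanics under invented assumptions: the field names, offsets, and sample messages are not the bank's actual copybook layout, and it runs outside SOAtest. It shows constraint-driven parsing of a fixed-length message and a field-by-field comparison against values already extracted from the XML.

```javascript
// Illustrative sketch only: fields, offsets, and messages are invented.
const constraints = [                 // one row per field, as in the data pool
  { field: "accountId", start: 0,  length: 10 },
  { field: "amount",    start: 10, length: 9  },
  { field: "currency",  start: 19, length: 3  },
];

const hostRequest = "0012345678000125.50NZD";  // fixed-length HOST message

// Values already extracted from the ESB's XML request via XPath.
const esbRequest = { accountId: "0012345678", amount: "000125.50", currency: "NZD" };

// Parse the fixed-length message using the constraint table.
function parseFixedLength(message, rows) {
  const parsed = {};
  for (const { field, start, length } of rows) {
    parsed[field] = message.slice(start, start + length);
  }
  return parsed;
}

// Compare the two representations; any mismatch is a transformation defect
// (e.g. a reformatted date or a shifted decimal point).
const hostParsed = parseFixedLength(hostRequest, constraints);
for (const field of Object.keys(esbRequest)) {
  if (hostParsed[field] !== esbRequest[field]) {
    console.log(`MISMATCH in ${field}: ESB='${esbRequest[field]}' HOST='${hostParsed[field]}'`);
  }
}
console.log("Comparison complete.");
```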

Notes:

- Transformation tests were runtime tests, ideal for uncovering small differences between XML data and fixed-length data. For example, they could detect transformation defects such as changes to date formats, decimalised rate values, and dollar values.


- This evaluation tested the ESB's transformation logic and mapping control between messages. If the ESB's transformation and mapping logic was modified, whether intentionally or unintentionally, this test would uncover the change during regression test execution.

- JavaScript was leveraged here because SOAtest enabled quick implementation without compilation.

- A third-party Eclipse add-on (LegStar) was leveraged to load IBM copybooks and generate an XSD via an automatically-generated Java package; piping the appropriate fixed-length message into the corresponding class produces an XML instance of that message. With some application of the transformation steps to the data, this twist allows for more manageable test script set-up, since it removes much of the manual work of setting the constraints for parsing the fixed-length message via a spreadsheet data pool.

[Figure: Steps for sending, retrieving, parsing, and comparing the message data]


[Figure: The spreadsheet that constrained and drove the parsing for each element in the messages]

Data Variance Tests

An automated data variance test provided test coverage for all possible iterations of an operation within a service. Test data from an external data pool was fed into a parameterised message structure within SOAtest; this way, all possible message structures for a service's operation or method would be tested. The test pattern evaluated boundary rules and data-type rules of fields within a request, and it was leveraged to trigger various response iterations from the Provider.
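As an illustration of the pattern (not SOAtest itself), the sketch below shows how rows from an external data pool could drive a parameterised message. The CountryCode element echoes the figure further below, but the template, field names, and expected codes are invented.

```javascript
// A minimal sketch of data-pool-driven parameterisation; values are invented.
const dataPool = [                    // in practice, rows from a spreadsheet
  { countryCode: "NZ",  expectedCode: "00" },  // happy path
  { countryCode: "AU",  expectedCode: "00" },
  { countryCode: "",    expectedCode: "05" },  // mandatory-field violation
  { countryCode: "XXX", expectedCode: "05" },  // max-length boundary
];

// Parameterised request body; CountryCode is the constrained key element.
const buildRequest = (row) =>
  `<BranchListRequest><CountryCode>${row.countryCode}</CountryCode></BranchListRequest>`;

// Each row of the data pool yields one test iteration against the service.
for (const row of dataPool) {
  const request = buildRequest(row);
  // sendToEsb(request) would stand in for the MQ/SOAP/REST messaging client.
  console.log(`expect ${row.expectedCode} for: ${request}`);
}
```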

Notes:

- Data variance tests were structured to focus first on a test of all mandatory fields, then all optional fields, and finally boundary analysis with maximum-length tests on each field in a message. Once these tests were established, the focus turned to business-oriented scenarios, such as listing all possible data entries for a field (if finite) or running through various iterations of the logical branches of calculations for the Provider to process.

- Testing data returned from the Provider was trickier, since it typically depended upon existing data in the Provider's back-end database tables. In some cases, it was possible to create data and then delete it once the test was complete; however, the nature of the company's business rules did not often allow for direct deletion of records. In some cases, an additional step was added before a message client test could be called: this step would query for data that met certain criteria, and a message client would then return this data via the service layer.


[Figure: A data variance test; the XML is represented as a form and the key element (CountryCode) is parameterised]

[Figure: A sample spreadsheet that covers various data scenarios]


Error Handling Tests

Using the same data variance pattern, and altering only the external data pool, an error handling test pattern was created that covered all business error scenarios. The returned error code was verified against the corresponding expected result in the external data pool. The Provider always returned specific error descriptions to the ESB, but the ESB would often return a generic error code: the ESB development team tended to prepare generic error codes for services and only extended the error handling logic where the business or a Consumer required a more defined response from the ESB.
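A hedged sketch of that verification step follows; the scenarios, fields, and error codes are assumptions, not the bank's actual values. The point is the shape of the check: the data pool carries the expected ESB-level error code, and the test compares it with the code actually returned.

```javascript
// Hedged sketch: scenarios, fields, and error codes are assumptions.
const errorPool = [
  { scenario: "unknown account", accountId: "9999999999", expectedCode: "ERR-100" },
  { scenario: "closed account",  accountId: "0000000001", expectedCode: "ERR-100" }, // generic ESB code
  { scenario: "blocked account", accountId: "0000000002", expectedCode: "ERR-214" }, // extended handling
];

// Verify the ESB-level error code against the expected result in the pool.
function verifyErrorResponse(actualCode, row) {
  const pass = actualCode === row.expectedCode;
  console.log(
    `${row.scenario}: expected=${row.expectedCode} actual=${actualCode} -> ${pass ? "PASS" : "FAIL"}`
  );
  return pass;
}

// Example run with a stubbed response code in place of a live ESB call.
verifyErrorResponse("ERR-100", errorPool[0]);
```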

Notes:

- Before an error test scenario began, a positive run (usually the smoke test) was executed to establish a baseline and ensure that the ESB and Provider were operational.

- From a technical point of view, error scenarios that might only appear during a technological failure were also tested, to ensure that the ESB could identify and report when invalid data or badly-formed messages were received for a service.

[Figure: An error handling variance test in Parasoft SOAtest]


[Figure: A sample spreadsheet that covers various error scenarios]

Replication Test Patterns

The final stage was replication of the business workflow at the service layer. This was achieved by reusing the established test assets and linking the services together to execute a business workflow as the end user would experience it.

Arguably, this could be viewed as the most valuable test executed, since it called all services within the project scope and replicated the system behaviour during business workflows. It was also highly effective as a regression test: unhampered by a front-end user interface, it could be executed relatively quickly, even compared to an automated functional test.

Notes:

- Scenario testing, whether automated or manual, can prove complicated and cumbersome. It was essential that any scenario tests build on the existing framework, so that any required manipulations or changes would be easy to make and would apply globally across all other test patterns.

- Key data was fed in at key points along the message steps, and the results from each message were carried forward down the line to achieve the expected end result, as the sketch below illustrates.
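The following framework-agnostic JavaScript sketch illustrates that chaining. The service names, payload shapes, and the sendToEsb client are hypothetical; what matters is that each step's response feeds the next request.

```javascript
// Sketch under assumptions: service names, payloads, and the sendToEsb
// client are hypothetical. Each step's response feeds the next request.
async function runWorkflow(sendToEsb) {
  // Step 1: create the customer; capture the generated id from the response.
  const created = await sendToEsb("CustomerCreate", { name: "Test Customer" });

  // Step 2: open an account for that customer, carrying the id forward.
  const account = await sendToEsb("AccountOpen", { customerId: created.customerId });

  // Step 3: post a payment and verify the expected end state.
  const receipt = await sendToEsb("PaymentSubmit", {
    accountNumber: account.accountNumber,
    amount: "100.00",
  });
  return receipt.status === "COMPLETED";
}

// A stub client makes the sketch runnable without a live ESB.
const stubClient = async (service, payload) => ({
  customerId: "C1", accountNumber: "A1", status: "COMPLETED",
});
runWorkflow(stubClient).then((ok) =>
  console.log(ok ? "WORKFLOW PASSED" : "WORKFLOW FAILED")
);
```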


[Figure: Replicating the business workflow at the service layer in Parasoft SOAtest]

Outcome: Risk Reduction, Cost Savings, and Additional Time for Validating Business Requirements

This initial project was the proof of concept for the approach, and the results were extremely positive. It was easy to demonstrate value to stakeholders, since the test flows clearly showed the test pattern taking shape. The success of the proof of concept led to the opportunity to apply the framework to another project, and this second implementation was considered a success as well. By following the same design patterns for the test assets, test scripting was smooth and effortless, freeing up more time for analysing the technical specs and business requirements.

With the solution that IntegrationQA designed and implemented using the Parasoft API Testing solution, the company achieved:

- Maintainability - The framework decoupled the test logic from the test tools: tests were largely data-driven and parameterised. This facilitated test maintenance and updating as the application evolved.

- Reusability - The scripted tests were designed to be extracted from the project where they originated and dropped into other projects or into a service test repository.

- Scalability - The framework supported a range of testing, from targeted functional tests to larger performance tests.


- Cost Savings - The automation reduced the time and cost involved in regression testing the ESB implementation and back-end logic.

- Reduced Risk - "White-box testing" the inner components of the IT infrastructure minimised risk for traditional test phases going forward. This direct testing enabled the team to expose a high number of critical defects during the testing phase.

Over 18 months, the company has saved over $2.1 million NZD in testing and development costs (based on the company's internal defect cost valuations).

About IntegrationQA

IntegrationQA is an independent software design, development-QA, and test consultancy located in Wellington, New Zealand and Sydney, Australia. We established our own practice in 2009, after years of collective systems integration experience, to bring our accrued knowledge to clients throughout the Australia-New Zealand region. Our goal is to improve a client's IT infrastructure through better technical design, improved delivery practices, and better software testing. Already, we have been able to deliver significant cost reductions and improved efficiency to clients, who in turn have dramatically altered their design, development, and test approaches.

Contacting IntegrationQA

NEW ZEALAND
Level 7
12-22 Johnston Street
Wellington 6011
T: +64 4 473 8535

AUSTRALIA
Level 39
2 Park Street
Sydney 2000
T: +61 2 8035 3413

About Parasoft

Parasoft researches and develops software solutions that help organizations deliver defect-free software efficiently. We reduce the time, effort, and cost of delivering secure, reliable, and compliant software. Parasoft's enterprise and embedded development solutions are the industry's most comprehensive, including static analysis, unit testing, requirements traceability, coverage analysis, functional and load testing, dev/test environment management, and more. The majority of Fortune 500 companies rely on Parasoft to produce top-quality software consistently and efficiently as they pursue agile, lean, DevOps, compliance, and safety-critical development initiatives.

Contacting Parasoft

USA Phone: (888) 305-0041 Email: [email protected]


NORDICS Phone: +31-70-3922000 Email: [email protected]

GERMANY Phone: +49 731 880309-0 Email: [email protected]

POLAND Phone: +48 12 290 91 01 Email: [email protected]

UK Phone: +44 (0)208 263 6005 Email: [email protected]

FRANCE Phone: (33 1) 64 89 26 00 Email: [email protected]

ITALY Phone: (+39) 06 96 03 86 74 Email: [email protected]

OTHER See http://www.parasoft.com/contacts

Author Information

This paper was written by:

Chris Wellington ([email protected]), Managing Director at IntegrationQA

Andrew Saunders ([email protected]), Senior Consultant at IntegrationQA

Cynthia Dunlop ([email protected]), Lead Technical Writer at Parasoft

© 2015 IntegrationQA Limited and Parasoft Corporation

All rights reserved. Parasoft and all Parasoft products and services listed within are trademarks or registered trademarks of Parasoft Corporation. All other products, services, and companies

are trademarks, registered trademarks, or servicemarks of their respective holders in the US and/or other countries.