International Journal of Engineering Technology, Management and Applied Sciences
www.ijetmas.com April 2015, Volume 3 Issue 4, ISSN 2349-4476
Empirical Analysis of Web Service Testing Tools
Shikha Sharma, Research Scholar, Computer Science Department, Himachal Pradesh University, Shimla
Dr. Aman Kumar Sharma, Associate Professor, Computer Science Department, Himachal Pradesh University, Shimla
ABSTRACT
Web service testing is a challenging area for researchers. The rise in the popularity of web services stems from the fact that they allow existing software applications to be utilized and integrated to create new business services. Web services are essentially web-based software applications designed to be published, discovered and invoked for remote use. These applications can then be programmatically and loosely coupled through the web to form more complex ones. Several open source testing tools with multiple features and functionalities for testing web services are available in the market. A comparative study of these tools helps in evaluating response time, an important parameter for performance testing. In this study, four open source software testing tools are studied and evaluated on the basis of response time.
Keywords: Web services, Performance Testing, Functional Testing, Tools, Test Cases.
1. INTRODUCTION
Software testing is the process of executing a program with the intent of finding errors [5]. During the testing process only failures are observed, from which the presence of faults is deduced. It is one step in the software process that can be viewed as destructive rather than constructive [6]. However, software testing is not just error detection: it means operating the software under controlled conditions to verify that it behaves as specified, to detect errors, and to validate that what has been specified is what the user actually wanted.
Building successful software depends on two fundamental principles: functionality and performance. Functionality is the sum of what a product, such as a software application, can do for a user [9]. Performance is defined as the system's ability to complete transactions and to furnish information rapidly and accurately despite high multi-user interaction or constrained hardware resources [1].
Performance testing refers to testing performed to determine a system's responsiveness and stability under a particular workload. Here, it yields the response time of the same web service across the various test tools. In functional testing, input sets are passed through the system and the results are checked against the expected output; this type of testing is known as black box testing [3]. The approach followed in this study is based on the concept of equivalence class partitioning, a software testing technique that divides the input data of a software unit into partitions of equivalent data from which test cases can be derived. In principle, test cases are designed to cover each partition at least once. The technique tries to define test cases that uncover classes of errors, thereby reducing the total number of test cases that must be developed; an advantage of this approach is the reduced time required for testing, owing to the smaller number of test cases. Equivalence partitioning is used here to test a temperature conversion web service, with test cases for valid as well as invalid inputs and their corresponding outputs.
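To make the partitioning concrete, the following is a minimal sketch of the test cases this technique yields, assuming a hypothetical local wrapper celsius_to_fahrenheit around the conversion service (the wind-speed parameter used later in the paper is omitted here for brevity). The valid partition is any integer value; the invalid partitions are an alphabetic character and a blank input, so one representative per partition suffices:

```python
import unittest


def celsius_to_fahrenheit(value):
    # Hypothetical stand-in for a client call to the temperature
    # conversion web service; non-numeric or blank input raises
    # ValueError, mirroring the error the service returns.
    temperature = float(value)
    return temperature * 9 / 5 + 32


class EquivalencePartitionTests(unittest.TestCase):
    def test_valid_integer_partition(self):
        # Valid partition: any integer value.
        self.assertAlmostEqual(celsius_to_fahrenheit("99"), 210.2)

    def test_invalid_character_partition(self):
        # Invalid partition: an alphabetic character such as 'a'.
        with self.assertRaises(ValueError):
            celsius_to_fahrenheit("a")

    def test_blank_input_partition(self):
        # Invalid partition: a blank space.
        with self.assertRaises(ValueError):
            celsius_to_fahrenheit(" ")


if __name__ == "__main__":
    unittest.main()
```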
In this comparative study, the response time of various testing tools on a particular web service has been taken into account. A web service is essentially an interface that describes a collection of network-accessible operations through standardized XML messaging. A web service is described using a standard, formal XML notation, called its service description, which covers all the details necessary to interact with the service, including message formats, transport protocols and location. The interface hides the implementation details of the service, allowing it to be used independently of the hardware or software platform on which it is implemented and independently of the programming language in which it is written. This allows and encourages web-service-based applications to be loosely coupled, component-oriented, cross-technology implementations. Web services fulfill a specific task or a set of tasks, and they can be used alone or with other web services to carry out a complex aggregation or a business transaction [2].
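As an illustration of this standardized XML messaging, the sketch below posts a raw SOAP 1.1 request for a wind-chill conversion operation; the endpoint URL, namespace, operation and parameter names are hypothetical placeholders, not those of the actual service used in this study:

```python
import requests

# Hypothetical endpoint, namespace and operation names; a real
# service's WSDL defines the actual values.
ENDPOINT = "http://example.com/TempConvertService"
SOAP_ACTION = "http://example.com/tempconvert/WindChillInCelsius"

SOAP_REQUEST = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <WindChillInCelsius xmlns="http://example.com/tempconvert">
      <nCelsius>99</nCelsius>
      <nWindSpeed>123</nWindSpeed>
    </WindChillInCelsius>
  </soap:Body>
</soap:Envelope>"""

response = requests.post(
    ENDPOINT,
    data=SOAP_REQUEST.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": SOAP_ACTION,
    },
)
print(response.status_code)  # response code, e.g. 200
print(response.text)         # SOAP envelope carrying the converted value
```

Each of the tools compared below automates exactly this request/response cycle from the operations it discovers in the WSDL.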
This paper is organized as follows: Section 1 lays the basis of the study, Section 2 provides an overview of the testing tools considered, Section 3 presents a comparative study of the selected tools, Section 4 gives the results, and Section 5 concludes the study along with the scope for future work.
2. TESTING TOOLS
A testing tool is basically a program for carrying out various testing tasks [4]. Nowadays testing is done with the help of various testing tools [7]. Many web service testing tools are available; the most prominent and widely used among them are analyzed in this study, namely SoapUI, Storm, SoapSonar Personal and .Net Webservice Studio. All four are used for performance testing of web services.
2.1 SoapUI:
SoapUI [10] is a free and open source cross-platform functional testing solution developed by SmartBear. In a single test environment, SoapUI provides complete test coverage and supports all the standard protocols and technologies. It assists programmers in developing SOAP-based web services and allows the developer to generate stubs of SOAP calls for the operations declared in a WSDL file. Additionally, SoapUI can send SOAP messages to the web service and display the outputs, which is useful for preliminary testing. SoapUI has some outstanding features such as drag-and-drop test creation, multi-environment support, REST discovery, and click-and-run tests.
2.2 Storm:
Storm [11] is a free and open-source tool for testing web services, developed by Erik Araojo. Storm is written in F# and distributed under the New BSD license. It allows testing of web services written using any technology (.NET, Java, etc.). Storm supports dynamic invocation of web service methods, even those with input parameters of complex data types, and also facilitates editing and manipulation of raw SOAP requests. Its GUI is simple and user friendly, and multiple web services can be tested simultaneously, which saves time and speeds up the testing schedule.
2.3 SoapSonar Personal:
SoapSonar Personal [8] is an open source software testing tool for XML, REST and SOAP based web services. Its core focus is functional, performance, interoperability and security testing of service endpoints by performing client simulation and automated generation of client messages. It has a clear graphical user interface, and tests are created via drag-and-drop selection. It accepts WSDL 1.1 and WSDL 2.0 documents and can automatically change variables in message headers and the message body. It can parse WSDL documents and generate a list of the operations
available on the interface described by them. It can be used to send SOAP request messages to the target web service and capture the response.
2.4 .Net Webservice Studio:
.Net Webservice Studio [12] is meant for web service implementers to test their web services without having to write client code. It can also be used to access other web services whose WSDL endpoint is known. It adds support for WCF, nullable types and REST-style APIs, allowing complete composite-type testing from one tool. Web Service Studio is a tool for invoking web methods interactively; the user provides a WSDL endpoint.
3. COMPARATIVE STUDY OF VARIOUS TESTING TOOLS
3.1 Technical Overview
The four testing tools chosen for comparison are based on different platforms and technologies. They differ from one another in various respects, such as operating system support, the programming language they are written in and other system requirements, all of which must be satisfied to install the tools. A detailed technical overview is given in Table 1.
Table 1: Technical overview of selected tools.

| Tool | Technology support | First release | Latest version | Programming language | OS support | Requirements | Developed by | Website |
|------|--------------------|---------------|----------------|----------------------|------------|--------------|--------------|---------|
| SoapUI | HTTP, HTTPS, SOAP, REST | 2005 | 4.5.1 / June 27, 2012 | Java | Cross platform | JRE 1.6+ | SmartBear Software | http://www.soapui.org |
| Storm | SOAP | 2008 | 1.1 / Oct. 2008 | F# | Microsoft Windows | .NET Framework 2.0 | Erik Araojo | http://storm.codeplex.com |
| SoapSonar Personal | SOAP, REST, JSON | June 2010 | 5.0 / Nov. 2014 | Java | Cross platform | JRE | CrossCheck Networks | http://www.crosschecknet.com/products/soapsonar.php |
| .Net Webservice Studio | WCF, REST, SOAP | May 30, 2008 | 2.0.2 / May 2008 | .NET | Cross platform | .NET Framework | | http://webservicestudio.codeplex.com |
3.2 Configuration of System
The same computer system is used for testing the web service with all the tools. In order to compare the representative testing tools, a web service for temperature conversion is considered. Each test tool must be configured before testing can be performed; this includes installation, setting up the test environment, test parameters, test data collection and report analysis. Tests were run on an Intel Core 2 Duo 2.0 GHz machine with 2 GB of RAM, running Microsoft Windows 7 Ultimate, over a 100 Mbps Ethernet connection. The tests were conducted together to obtain fair and transparent results; this minimizes the effect of the Internet connection's performance on the test results and yields realistic measurements, since Internet performance varies with the time of day and other factors such as traffic and the number of subscribed users.
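Each tool reports response time directly. As a rough sketch of what such a measurement involves (a hypothetical helper, not taken from any of the four tools), the function below times repeated SOAP calls and reports the same four parameters recorded in Tables 2 to 4: request bytes, response bytes, response code and response time in milliseconds:

```python
import time

import requests


def measure_soap_call(endpoint, soap_request, soap_action, runs=5):
    # Times repeated SOAP calls and returns the four parameters the
    # comparison tables record: request bytes, response bytes,
    # response code and average response time in milliseconds.
    payload = soap_request.encode("utf-8")
    headers = {
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": soap_action,
    }
    elapsed_ms = []
    for _ in range(runs):
        start = time.perf_counter()
        response = requests.post(endpoint, data=payload, headers=headers)
        elapsed_ms.append((time.perf_counter() - start) * 1000)
    return {
        "request_bytes": len(payload),
        "response_bytes": len(response.content),
        "response_code": response.status_code,
        "response_time_ms": sum(elapsed_ms) / len(elapsed_ms),
    }
```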
3.3 Comparison and Test Evaluation
Test cases with valid as well as invalid inputs are considered for testing. The Wind-chill in Celsius and Wind-chill in Fahrenheit functions are tested. The valid input consists of
any integer value, whereas the invalid input consists of a character such as 'a' or a blank space. The results are provided in tabular form for all test cases.
For Valid Inputs:
Wind-chill in Celsius: Table 2 shows the temperature conversion from Celsius to Fahrenheit taking wind speed into consideration. The inputs are 99 °C as the temperature and 123 as the wind speed, giving an output of 368.37 °F.
It can be observed from Table 2 that SoapUI reports a different response size, 359 bytes, compared to the other testing tools, which report 361 bytes. Storm takes the maximum time to process the user's request while SoapUI takes the minimum.
Table 2: Temperature conversion from Celsius to Fahrenheit according to wind speed for valid input.

| S.No | Tool name | Input: nCelsius | Input: nWindSpeed | Output | Request bytes | Response bytes | Response code | Response time (ms) |
|------|-----------|-----------------|-------------------|--------|---------------|----------------|---------------|--------------------|
| 1 | SoapUI | 99 | 123 | 368.37 | – | 359 | – | 295 |
| 2 | Storm | 99 | 123 | 368.37 | – | 361 | 200 | 518.033 |
| 3 | SoapSonar Personal | 99 | 123 | 368.37 | 459 | 361 | 200 | 473.7 |
| 4 | .Net Webservice Studio | 99 | 123 | 368.37 | – | 361 | 200 | – |
Wind-chill in Fahrenheit:
Table 3 shows the temperature conversion from Fahrenheit to Celsius taking wind speed into consideration. The inputs are 20 °F as the temperature and 124 as the wind speed, giving an output of 79.083 °C.
Table 3: Temperature conversion from Fahrenheit to Celsius according to wind speed for valid input.

| S.No | Tool name | Input: nFahrenheit | Input: nWindSpeed | Output | Request bytes | Response bytes | Response code | Response time (ms) |
|------|-----------|--------------------|-------------------|--------|---------------|----------------|---------------|--------------------|
| 1 | SoapUI | 20 | 124 | 79.083 | – | 374 | – | 560 |
| 2 | Storm | 20 | 124 | 79.083 | – | 376 | 200 | 569.032 |
| 3 | SoapSonar Personal | 20 | 124 | 79.083 | 471 | 376 | 200 | 335.2 |
| 4 | .Net Webservice Studio | 20 | 124 | 79.083 | – | 376 | 200 | – |
As Table 3 clearly shows, SoapUI again reports a different response size, 374 bytes, compared to the other testing tools, which report 376 bytes. The response time is maximum for Storm and minimum for SoapSonar Personal.
For Invalid Input:
Here the invalid input provided is the character 'a', which generates an error in all tools while still yielding a response time and response bytes. The response size reported by all testing tools is the same, 411 bytes. Storm takes the maximum response time while SoapSonar Personal takes the minimum.
Table 4: Output for invalid input 'a'.

| S.No | Tool name | Input | Output | Request bytes | Response bytes | Response code | Response time (ms) | Status description |
|------|-----------|-------|--------|---------------|----------------|---------------|--------------------|--------------------|
| 1 | SoapUI | "a" | Error | – | 411 | – | 804 | OK |
| 2 | Storm | "a" | Error | – | 411 | 200 | 982 | OK |
| 3 | SoapSonar Personal | "a" | Error | 405 | 411 | 200 | 691.7 | OK |
| 4 | .Net Webservice Studio | "a" | Error | – | 411 | 200 | – | OK |
4. RESULTS
The preference for a particular testing tool is based on the type of web service tested and the response time associated with that tool. Tests were run and results calculated for the temperature conversion web service in SoapUI, Storm, SoapSonar Personal and .Net Webservice Studio.
Average Response Time: Response time refers to the amount of time an application takes to return the results of a request to the user. The average of all response times is taken from the values given in Table 2, Table 3 and Table 4.
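For example, the SoapUI figure in Table 5 follows directly from its three recorded times: (295 + 560 + 804) / 3 = 553 ms; the Storm and SoapSonar Personal averages are computed the same way.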
Table 5: Average response time calculated from valid as well as invalid inputs.

| Tool name | SoapUI | Storm | SoapSonar Personal | .Net Webservice Studio |
|-----------|--------|-------|--------------------|------------------------|
| Average response time (ms) | 553 | 689.688 | 500.2 | – |
Table 5 gives the average response time for the testing tools. As is clear from the table, SoapSonar Personal provides the minimum response time of all the tools. However, .Net Webservice Studio does not provide a response time at all; it performs only functional testing, in which it simply generates the output for a given input.
Hence SoapSonar Personal performs best for this particular web service. Results may vary depending on the type of web service used, load time and system configuration.
5. CONCLUSION AND FUTURE SCOPE
In the present scenario, web service technology has turned out to be the latest trend and provides a new model of the web. The rapid growth of the web service market has necessitated the development of testing methodologies; hence different methods and different tools have been proposed to test web services. In this
paper, a comparative study of open source web service testing tools, with a technical overview and their features, has been presented. The comparison is made on the basis of response time, and the tools are evaluated by analyzing the web service and collecting the test results.
As is clear from Tables 2 through 5, all the web service testing tools provide both the response time and the output except .Net Webservice Studio. Hence, further enhancement of .Net Webservice Studio is possible so that it too can provide the response time parameter.
REFERENCES:
1. H. Sarojadevi, 2011, "Performance Testing: Methodologies and Tools", Journal of Information Engineering and Applications, ISSN 2224-5758 (print), ISSN 2224-896X (online), Vol. 1, No. 5.
2. IBM, "Web Services Conceptual Architecture (WSCA 1.0)".
3. Aggrawal, K.K. and Singh, Yogesh, 2007, "Software Engineering", 3rd edition, New Age International Publishers.
4. Sharmila, S. and Ramadevi, E., "Analysis of Performance Testing on Web Applications".
5. Pankaj Jalote, 2008, "An Integrated Approach to Software Engineering", 3rd edition, Springer.
6. Roger Pressman, 2001, "Software Engineering: A Practitioner's Approach", 5th edition, McGraw-Hill.
7. Shruti N. Pardeshi, March 2013, "Study of Testing Strategies and Available Tools", International Journal of Scientific and Research Publications, ISSN 2250-3153, Volume 3, Issue 3.
8. www.crosschecknet.com/products/soapsonar.php, "SoapSonar", accessed on 4 Jan 2015 at 1400 hrs.
9. www.searchsoa.techtarget.com/definition/functionality, accessed on 2 March 2015 at 1100 hrs.
10. www.soapui.org, "SoapUI", accessed on 17 Dec 2014 at 1500 hrs.
11. www.storm.codeplex.com, "Storm", accessed on 1 Dec 2014 at 1100 hrs.
12. www.webservicestudio.codeplex.com, ".Net Webservice Studio", accessed on 31 Jan 2015 at 1500 hrs.