PID2347659
Dhiauddin Suffian
MIMOS Technology Solutions Sdn. Bhd.
All content following this page was uploaded by Dhiauddin Suffian on 01 January 2016.
Abstract— This research studies and evaluates the differences in response time reported by three tools used for performance testing. The motivation for this work is to understand the behavior of various performance testing tools with respect to the accuracy of their response time results. It is conducted with the aim of demonstrating and proving that differences in response time do exist between different tools when conducting a performance test on the same webpage, and of analyzing the reasons behind this situation. A static HTML webpage is put under load test for 1, 100, 200, 300, 400 and 500 concurrent users by each of the three tools. The findings clearly show that different performance testing tools give different response times when load testing the same webpage. The findings are also supported with justifications for these differences, which involve the architecture and simulation mechanism of the respective tools. A summary and future work are presented at the end of the paper.

Keywords- performance testing; response time; load test; open source tool; concurrent users

I. INTRODUCTION

Performance testing is one of the test strategies performed on the software under test, usually at the system testing level. By conducting this test, we can assess the readiness of a software system in handling user loads and thus ensure it responds within an acceptable time range as expected by the end users. This type of testing becomes more crucial if the system involves heavy integration with other external systems, since performance degradation might take place. Ignoring performance testing means that the system is not fully tested, especially from the risk and operational profile perspectives.

The trigger point to embark on this research is driven by real experiences as an independent testing team in dealing with various issues, difficulties, challenges, as well as successes in performance testing. These involve conducting performance tests for various types of software: standalone, web services, web applications, mobile applications, as well as grid and cloud applications. A suitable and correct approach or strategy for the performance test needs to be put in place so that performance defects can be fixed and the test results are accurate. Thus, the selection and use of the right testing tool for the performance test is very crucial to achieve this target. This is where the issues start to come into the picture.

Several tool-related issues have been observed when conducting performance testing, such as tool compatibility with the software under test, tool installation, tool setup, tool flexibility in testing both the client and the server side, and also the one that is the focus of this research: the response time generated by the tools.

The research problem falls into the area of demonstrating and proving that the response time obtained when conducting a performance test on the same website differs when different performance testing tools are used. In addition, the research also suggests potential reasons or root causes behind these differences. This work tries to answer the question: "Why do different performance testing tools produce response times that are not even close to each other?"

The research discussion is organized into several sections. Section II discusses prior related work on webpage response time and the calculation of response time. Section III outlines the common features of the testing tools used for the experiment. Section IV describes the environment setup, while Section V drills down further into discussing and elaborating the findings of the experiment. Section VI concludes the overall research together with recommendations for future work.

II. RELATED WORKS

Most previous work comparing performance testing tools ignored the different results reported by each tool [1][2][4][5][7][8][9][10][11][12]. VCAA [1] uses pricing and user friendliness as criteria to decide which tool to use, while JDS [2] mentions the ability to emulate a complex business process and to support an unlimited number of concurrent users.

Testingreflections.com [3] concludes that the accuracy of load and response time is something we need to evaluate against our particular application, and not something to compare when determining which tool to use or buy. However, there is no work so far that explains why the tools differ from each other in the first place. Shall we have a framework or memorandum of understanding (MOU) about the uniformity of
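The measurement-uniformity question raised above can be made concrete with a small sketch of our own (it is not taken from any of the cited works): the very same HTTP request yields two different "response times" depending on whether a tool stops its clock at the first byte of the reply or only after the full page has been downloaded. The local throwaway server below is an illustrative assumption used only to keep the example self-contained.

```python
import http.server
import socket
import threading
import time

# Minimal local web server standing in for the static HTML page
# (an assumption for the sketch, not the experiment's IIS server).
class _QuietHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>static page</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def measure(host, port, path="/"):
    """Return (time_to_first_byte, full_download_time) in seconds.

    Two plausible response-time definitions: a tool that stops the
    clock at the first byte reports a smaller number than one that
    waits for the whole body to arrive.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
        sock.recv(1)                        # first byte of the reply
        ttfb = time.perf_counter() - start
        while sock.recv(4096):              # drain the rest of the reply
            pass
        full = time.perf_counter() - start
    return ttfb, full

server = http.server.HTTPServer(("127.0.0.1", 0), _QuietHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb, full = measure("127.0.0.1", server.server_address[1])
print(f"first byte: {ttfb * 1000:.2f} ms, full page: {full * 1000:.2f} ms")
server.shutdown()
```

If two tools silently pick different definitions (or include connection setup, DNS lookup, or think time differently), their averages for the same page cannot match, which is one plausible framing of the uniformity question.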
A. Tool A

Tool A is an open source tool developed purely on the Java platform. It is a desktop-based tool for performance testing, designed to serve functional, load and stress testing, and it is extensible so that users can write their own tests to suit a scenario. Tool A can simulate heavy load on an application, a server and even the network. Besides giving instant visual feedback, this tool is also capable of performing load and stress testing in a distributed manner. It supports protocols such as HTTP, JMS, JDBC, FTP, SOAP as well as LDAP. Other features of this tool are that it can be used across platforms, it supports a full multithreading framework, and it allows caching and offline analysis with replay of test results.

B. Tool B

Tool B is an open source load testing tool developed in C++. It can perform heavy load tests with performance measurements using scripted HTTP and HTTPS. This feature-rich GUI-based web server benchmarking tool runs only on Windows-based platforms. Its performance scripts are recorded in its own proprietary language, which is simple and supports custom functions, variable scopes, and random or sequential lists.

C. Tool C

Tool C is a proprietary tool and one of the established performance testing tools in the market. It is built on Eclipse and Java. It offers automated performance testing for web and server-based applications so that the scalability of those applications can be validated. This tool can be used across platforms such as Windows, UNIX and Linux. System performance bottlenecks and their root cause can be identified by using this tool. Besides the capability to create code-free tests, this tool is also able to automate test data variation and enables insertion of custom Java code for flexible test customization.

Figure 1. Test setup for comparison of performance testing tools

As presented in the diagram, a strict rule for conducting the test is that both the client and the server machine must be rebooted after completing each test cycle for each tool used, so that machine resources such as RAM and CPU are in a fresh state. Another important element of this test is the use of a static webpage, in order to keep the page size constant for correct analysis of the results obtained.

While the test was running for a particular tool, the services of the other tools as well as other unnecessary services were stopped or disabled. As for the server, it only hosted static HTML pages with no cache and no cookies. Running processes on this machine were kept to a minimum, since unnecessary processes and server-side logging were disabled. Load testing was performed from 1 user access and gradually increased to 100, 200, 300, 400 and 500 concurrent users with zero ramp-up time.

The hardware specifications for both machines are outlined below:

CPU/processor : Intel Pentium D 3.4 GHz
RAM/memory : 2 GB
Hard disk storage : 80 GB
Network card : Integrated 10/100/1000 Ethernet

As for the software specifications, the details are as follows:

Client machine
Operating system : Windows XP SP2
Java JDK : JDK 1.6.0 update 21
Tool : Tool A (open source); Tool B (open source); Tool C (proprietary)

Server machine
Operating system : Windows Server 2003 Enterprise Edition SP1
Java JDK : JDK 1.6.0 update 21
Web server : Internet Information Services 6
HTML page size : 65.8 KB (Page: 7 KB; Image 1: 25.2 KB; Image 2: 33.6 KB)

B. Result of Tool B

The result of the load test for Tool B is presented below in Table II:

Figure 2. Performance (load) test result for round 1

C. Result of Tool C

The result of the load test for Tool C is presented below in Table III:

TABLE III. PERFORMANCE TEST RESULT OF TOOL C
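The load-test procedure described in the environment setup (N concurrent users released with zero ramp-up time against a static page, averaging the measured response times) can be sketched as below. This is a minimal illustration using Python's standard library, not any of the three tools under study; the local stand-in server, the approximated page size and the small user counts are assumptions made to keep the example self-contained and fast.

```python
import http.server
import statistics
import threading
import time
import urllib.request

# Local stand-in for the static page under test (the experiment's
# actual server hosted a 65.8 KB page on IIS 6; size approximated here).
class _PageHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"x" * 65_800
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep server-side logging disabled
        pass

def load_test(url, users):
    """Launch `users` virtual users at once (zero ramp-up) and
    return the per-request full-download response times in seconds."""
    barrier = threading.Barrier(users)   # releases all users together
    times = []
    lock = threading.Lock()

    def user():
        barrier.wait()                   # zero ramp-up time
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()                  # wait for the full page
        elapsed = time.perf_counter() - start
        with lock:
            times.append(elapsed)

    threads = [threading.Thread(target=user) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return times

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), _PageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

for users in (1, 10, 50):                # the experiment goes up to 500
    times = load_test(url, users)
    print(f"{users:>3} users: avg {statistics.mean(times) * 1000:.1f} ms")
server.shutdown()
```

Even this simple harness involves design decisions (thread scheduling, when the timer starts, whether the body is fully read) that a real tool makes internally, which is precisely why tool architecture and simulation mechanism can shift the reported averages.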