Response time difference analysis of performance testing tools
1. Performance Testing: Analyzing Differences of Response Time between Performance Testing Tools
By Spoorthi Sham (1PI14SSE12)
12-05-2015, CSPA Seminar
2. Performance Testing
• Performance testing is the process of determining the
speed or effectiveness of a computer, network, software
program or device.
• This process can involve quantitative tests done in a lab,
such as measuring the response time or the number of
MIPS (millions of instructions per second) at which a
system functions.
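The response-time measurement described above can be sketched in a few lines of Python. This is not from the original slides; the "request" here is a hypothetical stand-in (a short sleep), not a real network call:

```python
import time

def measure_response_time(request_fn):
    """Time a single request and return the elapsed seconds."""
    start = time.perf_counter()
    request_fn()
    return time.perf_counter() - start

# Hypothetical stand-in for a real HTTP request: sleep for 50 ms.
elapsed = measure_response_time(lambda: time.sleep(0.05))
print(f"response time: {elapsed * 1000:.1f} ms")
```

A real tool would wrap an actual HTTP call in `request_fn` and repeat the measurement many times to build a distribution rather than a single sample.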
3. How is it performed?
Tools used:
4. Issues with the Tools
• Several issues have been observed with tools when conducting performance testing:
tool compatibility with the software under test
tool installation
tool setup
tool flexibility in testing both the client and server side
response time reported by the tools
Research Focus
• To demonstrate that response times reported by different performance testing tools differ
• To suggest potential reasons or root causes behind the response time differences
• To answer the question: “Why do different performance testing tools produce different response times?”
5. Related Works
• Most previous work comparing performance testing tools ignored the differing results reported by each tool.
• There is no work so far that explains why the results differ across tools.
• Each tool claims to be better than the others, but none can justify its performance testing results against real-world behavior.
6. Overview of Performance Testing Tools
Tool A
• Open source tool developed purely on the Java platform
• Runs as a desktop-based tool
• Serves functional, load, and stress testing, and is extensible so users can write their own tests to suit a scenario
• Can simulate heavy load on the application, the server, and even the network
• Gives instant visual feedback and can run load and stress tests via a distributed approach
• Supports protocols such as HTTP, JMS, JDBC, FTP, SOAP, and LDAP
• Runs across platforms and supports a full multithreading framework
• Allows caching and offline analysis with replay of test results
Tool B
• Open source load testing tool
• Developed in C++
• Can perform heavy load tests using scripted HTTP and HTTPS
• Feature-rich GUI-based web server benchmarking tool
• Runs only on Windows-based platforms
• Performance scripts are recorded in its own proprietary language
• Supports custom functions, variable scopes, and random or sequential lists
Tool C
• Proprietary tool (one of the established performance tools in the market)
• Built on Eclipse and Java
• Offers automated performance testing for web and server-based applications
• Can be used across platforms (Windows, UNIX, and Linux)
• Capable of creating code-free tests, automating test data variation, and inserting custom Java code for flexible test customization
• Supports operating systems such as Windows, Linux, and z/OS
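Tools like Tool A generate load by running many virtual users concurrently. A minimal Python sketch of that multithreaded load-generation idea, with a simulated request (a sleep) standing in for real HTTP traffic, neither taken from the original slides:

```python
import threading
import time

def run_load_test(n_threads, requests_per_thread, request_fn):
    """Spawn virtual users on separate threads and collect per-request times."""
    samples = []
    lock = threading.Lock()

    def virtual_user():
        for _ in range(requests_per_thread):
            start = time.perf_counter()
            request_fn()
            elapsed = time.perf_counter() - start
            with lock:  # protect the shared sample list
                samples.append(elapsed)

    threads = [threading.Thread(target=virtual_user) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return samples

# Hypothetical workload: 5 virtual users, 4 "requests" of 10 ms each.
times = run_load_test(5, 4, lambda: time.sleep(0.01))
print(f"{len(times)} samples, mean {sum(times) / len(times) * 1000:.1f} ms")
```

How a given tool schedules these virtual users, and when it starts and stops each timer, is exactly the kind of architectural detail the findings later point to as a source of response-time differences.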
7. Test Environment Setup
Hardware Specification (Both Machines)
CPU/processor : Intel Pentium D 3.4 GHz
RAM/memory : 2 GB
HDD storage : 80 GB
Network Card : Integrated 10/100/1000 Ethernet
Server machine
Operating system : Windows Server 2003 Enterprise Edition SP1
Java JDK : JDK 1.6.0 update 21
Web server : Internet Information Services 6
HTML page size : 65.8 KB (Page: 7 KB; Image 1: 25.2 KB;
Image 2: 33.6 KB)
Client machine
Operating system : Windows XP SP2
Java JDK : JDK 1.6.0 update 21
Tool : Tool A (open source);
Tool B (open source);
Tool C (proprietary)
14. Findings: Potential Reasons for Response Time Differences
• Some fundamental reasons:
how each tool captures and simulates the load used for the performance test
the method of calculating the metrics gathered by each tool
the language used to develop the tools
the architecture of the respective tools
• The architectures differ greatly:
Tool A and Tool C are developed in Java and require a JVM to run, so the Java heap size setting plays a role in generating the highest user load without putting an extra burden on the client
Tool B's architecture relies on a web relay daemon facility that allows CORBA-based communication to be transmitted between machines while the performance test executes
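The "method of calculating metrics" root cause can be illustrated with a short Python snippet: the same raw samples produce different headline response times depending on which statistic a tool reports. The sample values below are made up for illustration, not measured data from the study:

```python
# Same raw samples, different headline "response time" per statistic.
samples_ms = [110, 120, 125, 130, 135, 140, 150, 160, 400]  # one slow outlier

ordered = sorted(samples_ms)
mean = sum(samples_ms) / len(samples_ms)      # inflated by the outlier
median = ordered[len(ordered) // 2]           # robust to the outlier
p90 = ordered[int(0.9 * (len(ordered) - 1))]  # simple 90th-percentile index

print(f"mean={mean:.1f} ms, median={median} ms, 90th percentile={p90} ms")
# → mean=163.3 ms, median=135 ms, 90th percentile=160 ms
```

A tool reporting the mean would look markedly "slower" than one reporting the median on identical traffic, even before any differences in load simulation or timer placement come into play.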
15. Conclusion
• Different performance testing tools do report different response times.
• Currently, no tool can tell us whether an application is fast enough in terms of real-world user experience.
• It is crucial for performance testers to understand that no tool can automatically show the full picture of how an application will perform in the real world.
• It is up to the human analyst to interpret the information given; performance testing tools are just one means of achieving that.