How to automate JMeter Performance Tests
What is performance testing?
Performance is a major concern in modern-day software engineering. Performance testing is the
process of examining the performance of a system using suitable methods. According to
Wikipedia, performance testing is, in general, a testing practice performed to determine how a
system performs in terms of responsiveness and stability under a particular workload.
In today's complex software systems, doing performance tests manually is a nightmare. Today's
software systems implement a large number of use cases, so manually testing each of them
under different configurations (for example, different heap sizes) is a cumbersome task. Hence we
need automated methods of performance testing.
Apache JMeter
JMeter is a popular, freely available tool that helps us perform performance testing with
minimal effort. It supports a wide range of testing scenarios and offers many features.
JMeter can be downloaded from this link.
(https://jmeter.apache.org/download_jmeter.cgi)
What are we going to do?
The content explained in this article can be applied to a wide range of performance testing
applications, though we will focus mainly on how to use JMeter to test the performance of
server applications.
We will investigate how to automate performance testing of a web application written with the
Spring Boot framework, and how to automate the process to the point where we get a report
file for different test scenarios.
Resource Requirements
In general, JMeter performance tests are done using two or more machines: one machine to run
the application and other machine(s) to run JMeter. As a best practice, we
close all other applications (except the application under test and JMeter) to make sure that
the test results are not affected by other programs. You can use either two machines or a single
machine for this tutorial.
Throughout this tutorial, I'll refer to the machine running JMeter as Machine1 and the machine
running the web application as Machine2.
Spring boot
Spring Boot makes it easy to create stand-alone, production-grade Spring-based applications
that you can "just run". It eliminates the burden of writing extra code and configuration to run a
web service.
In this study, we will use a simple echo web service written in Spring Boot, which echoes back
the message we provide as the query parameter. Building this service is beyond the scope of
this study; I will cover how to build a Spring web application in a separate tutorial. For this
tutorial, you can use the sample application I've built.
(https://github.com/PasinduTennage/springboot-test/tree/master/complete/src/main/java/controller)
Simply download it as a zip file or clone it to your machine, and cd into the complete folder.
Execute "mvn clean install". This will create two executable jars in the /target folder. To test
your application, run "java -jar target/gs-actuator-service-0.1.0.jar".
Open a browser and go to "http://localhost:9000/echo?message=test". If the deployment is
successful, the browser should display "test".
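Collected into one terminal session, the steps above look like the following (the repository URL comes from the link above; the rest is as described in the text):

```shell
# clone the sample application and build it
git clone https://github.com/PasinduTennage/springboot-test.git
cd springboot-test/complete
mvn clean install                                  # creates the executable jars in ./target

# start the echo service in the background
java -jar target/gs-actuator-service-0.1.0.jar &

# verify the deployment - should print "test"
curl "http://localhost:9000/echo?message=test"
```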
Building the JMX
First we will write the script that JMeter will run. Fortunately we don't have to write it
from scratch, thanks to the JMeter GUI. We can generate the JMX file using the following steps.
1. Open JMeter.
2. Right click on Test Plan > Add > Threads (Users) > Thread Group.
3. Fill the dialog box as shown in the following figure.
   Name: group1
   Number of threads: ${__P(group1.threads)}
   Duration: ${__P(group1.seconds)}
4. Right click on group1 > Add > Sampler > HTTP Request.
   Fill the parameters as follows.
   Name: HTTP Request
   Method: GET
   Server Name / IP: ${__P(group1.host)}
   Port: ${__P(group1.port)}
   Path: /echo
   Click Add at the very bottom of the window.
   Name: message
   Value: ${__P(group1.data)}
   Click Include Equals.
5. Right click on group1 > Add > Listener > Summary Report.
6. Save the file (springboot_echo.jmx).
Test Scripts
In this section we will write bash scripts to run the tests for different scenarios. We will run our
Spring application for different heap sizes, different numbers of concurrent users and different
message sizes.
First we will write the script to run on Machine2 (start.sh).
This script takes two arguments: heap_size and num_users. It first kills all running Java
applications to make sure that our web service is not already up and running, then runs the
executable jar file, which spawns the echo web service. /path should be replaced by the actual
path to the executable jar file.
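The original start.sh is not reproduced in this chunk; a minimal sketch of what it might contain, following the description above (the jar path and the exact JVM flags are assumptions), is:

```shell
#!/bin/bash
# start.sh - run on Machine2 (sketch; jar path and JVM flags assumed)
heap_size=$1   # e.g. 1g - heap size for this test run
num_users=$2   # recorded for bookkeeping; the load itself is generated by JMeter

# kill any running Java applications so the service starts fresh
pkill -f java || true

# launch the echo service with the requested heap size, detached from the terminal
nohup java -Xms"$heap_size" -Xmx"$heap_size" \
    -jar /path/gs-actuator-service-0.1.0.jar > server.log 2>&1 &
```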
Next, we will write the scripts to run on Machine1.
In this test we are going to generate messages of different sizes that will be passed as
arguments to the echo service. We will use a Python script (payloadGenerator.py) to generate
strings of the given sizes and write them to files.
Note that we have to provide the path
for the output files. We will see how this is
done in the next script
(run-performance-test.sh).
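A sketch of what payloadGenerator.py might look like follows; the concrete sizes and the output directory are placeholder assumptions, and they must match the message_sizes values in run-performance-test.sh:

```python
# payloadGenerator.py - sketch; sizes and output_dir are assumed values and
# must match the message_sizes configured in run-performance-test.sh
message_sizes = [50, 500, 1024]  # payload sizes in characters
output_dir = "."                 # path where the payload files are written

for size in message_sizes:
    payload = "a" * size  # simple repeated-character payload of the given size
    with open(f"{output_dir}/payload_{size}.txt", "w") as f:
        f.write(payload)
```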
Lines 1 - 3 define the configurations for which we will perform the tests: two
different numbers of concurrent users, two different heap sizes and two message
sizes.
NOTE: The numbers indicated in line 3, message_sizes, should be exactly the same as
the values of message_sizes in payloadGenerator.py.
Lines 5 - 28 define the paths of the different resources and the filenames that will be used in the
testing process. Since the names are self-explanatory, I will not go through each line; instead I
will focus only on the important ones.
Line 14 defines the JTL-Splitter path. JTL-Splitter is a tool developed by WSO2
(https://wso2.com/) which can be used to remove the first n minutes from the resulting JTL file
(the output of the JMeter test). You may wonder why we need to remove the first n minutes from
the JTL file.
The reason is Java just-in-time (JIT) compilation. While running a Java application, the JVM uses
JIT compilation to improve the performance of the application. Since we exercise the
application with the same workload (the same string) throughout the test, the JVM applies the
JIT optimizations during the first few minutes. Hence stable performance results are available
only after the first few minutes.
You can build the JTL-Splitter from this repository.
(https://github.com/wso2/performance-common/tree/master/components/jtl-splitter).
Download it as a zip file and build it using "mvn clean install". The executable jar file will be
created in the /target directory.
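Once built, the splitter can be run on each result file. The flag names below are assumptions based on common usage; check the tool's help output (`java -jar jtl-splitter-*.jar -h`) for the exact options in your version:

```shell
# remove the first 5 minutes of warm-up samples from a JTL file
# (-f: input file, -t: minutes to split off - assumed flags, verify with -h)
java -jar target/jtl-splitter-*.jar -f results/echo_test.jtl -t 5
```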
Line 16 defines the dashboard path. JMeter can generate performance results with useful
graphs in the form of web pages. In the following section you will see how to generate these
dashboards. The dashboard path is the root directory of the output files.
performance-report.py (line 22) is the file used to generate the final output file
(denoted by line 24) of this series of tests.
performance-report.py
NOTE: The numbers indicated in lines 31, 32 and 33 of performance-report.py should be
exactly the same as the values in lines 1, 2 and 3 of run-performance-test.sh.
The test duration at line 30 of run-performance-test.sh denotes the number of seconds each test
will run. The split time at line 32 is the number of minutes that will be removed
from the beginning of each JTL file.
Lines 34 - 40 make sure that outputs from previous tests are not still present; if past test
outputs exist, they are deleted.
Line 45 will generate a series of files which have strings of sizes mentioned in message_sizes.
Lines 47 - 91 run the JMX script we prepared, once for each scenario. Since the code is self-
explanatory, I will focus only on the important lines here.
The nohup command at line 64 is used together with the & at the end to make sure that the
command prompt doesn't hang after launching the target script on Machine2. sshpass -p is
used to start the SSH session without having to enter the Machine2 password manually each
time; "javawso2" is the password of Machine2.
Before running JMeter, we need to make sure that the server is up and running on Machine2.
Hence, as indicated at line 69, a curl command is used to check the service. If the service has
not yet started, Machine1 waits for 10 seconds, as denoted in line 74.
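Such a readiness check might look like this sketch, where the host variable and the probe message are placeholders (the port and /echo path come from the tutorial's service):

```shell
# keep probing the echo service until it answers; wait 10 s between attempts
until curl -sf "http://${machine2_host}:9000/echo?message=ping" > /dev/null; do
    echo "service not up yet, retrying in 10 seconds..."
    sleep 10
done
```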
Line 78 reads the message to echo from the respective file.
Line 83 starts JMeter (note that there is no newline after line 83; the whole command should be
on a single line). Here we provide the values for the parameters we defined in the JMX file.
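Such a non-GUI JMeter invocation might look like the following; the host, port, file names and concrete values are placeholders, while the property names match the ${__P(...)} references used when building the JMX file:

```shell
# run the test plan headless, passing the properties referenced in the JMX file
# -n: non-GUI mode, -t: test plan, -J: set a property, -l: results (JTL) file
jmeter -n -t springboot_echo.jmx \
  -Jgroup1.threads=50 -Jgroup1.seconds=900 \
  -Jgroup1.host=192.168.1.2 -Jgroup1.port=9000 \
  -Jgroup1.data="$(cat payload_50.txt)" \
  -l results/echo_50users_1g_50B.jtl
```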
After generating the JTL files, each file is split using the JTL-Splitter, as denoted by lines 97 - 115.
Lines 119 - 136 generate the dashboard files for each JTL file we generated. Then, at line 144, we
generate the final CSV file.
Place start.sh and the Spring Boot executable on Machine2, at the location specified by line 10 of
run-performance-test.sh. Place all the dependencies declared in lines 12, 14, 18, 20 and 22 of
run-performance-test.sh on Machine1.
Run /bin/bash run-performance-test.sh from Machine1.
So that’s it. You just did a performance test of a web service. Cheers!!!