2. SOME INTRODUCTION
Performance testing is testing performed to determine how
a system performs in terms of responsiveness and stability under a
particular workload. It can also serve to investigate, measure,
validate or verify other quality attributes of the system, such as
scalability, reliability and resource usage.
There are different types of performance testing:
Load testing :: A load test is usually conducted to understand the behaviour of the system
under a specific expected load.
Stress testing :: Stress testing is normally used to understand the upper limits of capacity
within the system.
Soak testing :: Soak testing is usually done to determine whether the system can sustain the
continuous expected load.
Spike testing :: Spike testing is done by suddenly increasing the load generated by users
by a very large amount.
Load testing tools:
JMeter, LoadRunner, etc.
Why JMeter:
It is open source and fulfills almost all requirements.
3. INSTALLING AND CONFIGURING JMETER
Install Java
Your PC might already have Java installed; if not,
install it from the official Java download site.
Check your Java version:
Open cmd and type java -version. You will get something like
this:
4. INSTALLING AND CONFIGURING JMETER
• ADD THE PATH OF THE JAVA INSTALLATION TO THE ENVIRONMENT PATH VARIABLE.
Go to Control Panel >> System >>
Advanced System Settings >> Environment
Variables
Go to cmd again and check whether the
correct Java version is shown.
5. DOWNLOAD JMETER
Download JMeter from
https://jmeter.apache.org/download_jmeter.cgi
This will download a zip file. Unzip it to a directory. It is better to
unzip it in C:\
Start JMeter from cmd.
• Traverse to the JMeter install path; in my case this is:
C:\apache-jmeter-2.9\bin
• And run jmeter.bat
This will open the JMeter GUI.
(If it does not, download the dependencies.)
6. START WORKING WITH JMETER
Add a thread group: Go to
Test Plan >> Add >> Thread Group
Meaning of the different parameters:
Number of Threads: This is
the number of users who will
simultaneously use the app.
7. ADD AN HTTP REQUEST PARAMETER
Go to Thread Group >> Add >> Sampler >> HTTP Request.
Give the URL address, port number, etc.; fields left empty take the default values.
8. ADD COOKIE MANAGER
The GB app depends very heavily on cookies; most importantly, you
need this to switch between tabs, etc.
Go to Thread Group >> Add >> Config Element >> HTTP Cookie
Manager
Select "Clear cookies each iteration".
9. ADD CSV DATA SET CONFIG -1
Pre-configuration::
This is the most important element to add, especially if you
want to test an app that requires login.
Our Glassbeam app also requires authentication, and
only one user can remain active at a time.
So you need to create the maximum number of users in
your database. E.g. if you want to test the app with 20
users, you should have 20 users created in the DB (in our
case a MySQL DB).
Add all the usernames and passwords to a plain-text file in
comma-separated format and save it as test.csv.
Save this file under the bin directory of JMeter (just to avoid
hurdles).
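The test.csv file above can be generated with a short script. This is only a sketch: the user1/pass1 names are placeholders, so substitute the accounts that actually exist in your DB.

```python
# Generate test.csv with 20 comma-separated username,password rows.
# NOTE: user1/pass1 etc. are hypothetical placeholders, not real accounts.
import csv

with open("test.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for i in range(1, 21):
        writer.writerow([f"user{i}", f"pass{i}"])
```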
10. ADD CSV DATA SET CONFIG -2
Adding in Jmeter
Go to Thread Group >> Add >> Config Element >> CSV Data Set Config.
The only things you need to change here are Filename, which is the complete path of the CSV file you just saved, and the
variable names, which are username and password. (Note down these variable names: they will be used later.)
11. ADD LISTENERS
There are many listeners available and you can add any
of them, but the best ones are (Thread Group >> Add >> Listener >> Component):
Summary Report
View Results Tree
Aggregate Report
Graph Results
12. ASSERTION OF RESPONSE
Assertions allow you to assert facts about the responses received from HTTP
requests.
Add >> Assertion >> Response Assertion
13. ADD RECORDING CONTROLLER
This is one of the most important elements to add.
Go to Thread Group >> Add >> Logic Controller >>
Recording Controller
You have to do nothing else here.
14. RECORDING THE PLAN
Now you need to record the steps you want to perform
while performance testing.
Go to WorkBench >> Add >> Non-Test Elements >> HTTP Proxy
Server
Keep everything at the default values.
From Target Controller select Thread Group >>
Recording Controller.
15. RECORDING THE PLAN -2
URL patterns to exclude/include:
You can add patterns to exclude the requests you don't care
about, or include only the ones you do. E.g. we usually don't care
about the response times of jpg, png, etc. images (these are anyway in ms).
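The exclude patterns are regular expressions. The pattern below is an illustrative example (not taken from this plan) of how static image and script requests would be filtered out of a recording:

```python
# Sketch: a JMeter-style exclude pattern for static resources (illustrative).
import re

exclude = re.compile(r".*\.(jpg|png|gif|css|js)$")
urls = ["/ticketlogin", "/img/logo.png", "/app.js", "/search?q=aruba"]
recorded = [u for u in urls if not exclude.match(u)]
print(recorded)  # only the dynamic requests remain
```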
16. RECORDING THE PLAN -3
Now you just need to start the JMeter proxy server:
click the Start button.
The Start button will then change to Stop.
18. START RECORDING
Now go to the browser and type the URL. Remember,
this URL should be the same as the one mentioned in
the HTTP Request.
This will start recording, under the Recording
Controller, all the steps you perform.
You can see various scripts under the
Recording Controller; these are the
scripts run to show the login page.
19. LOGIN WITH DIFFERENT USERNAME-1
Just go ahead with your recording and log in with
your username and password.
Identify the script which performs the login
operation. In our case, this is the ticketlogin script.
20. LOGIN WITH DIFFERENT USERNAME-2
Now is the time to recall the CSV Data Set
Config.
Replace the actual username and password with the
variables, e.g. ${username} and ${password} (see the
image).
These variables will take their values from the CSV file
you placed under bin.
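Conceptually, JMeter fills the ${...} variables from the current CSV row on each iteration. A rough sketch of this substitution follows; the j_username/j_password form fields are hypothetical, so use whatever your login form actually posts:

```python
# Sketch of ${...} variable substitution from one CSV row
# (not JMeter's actual implementation; field names are made up).
import re

def substitute(template, variables):
    return re.sub(r"\$\{(\w+)\}", lambda m: variables[m.group(1)], template)

row = {"username": "user1", "password": "pass1"}
body = "j_username=${username}&j_password=${password}"
print(substitute(body, row))  # j_username=user1&j_password=pass1
```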
21. ADDING THE TEST CASES
Now whatever you do will be recorded in
JMeter: for example, searching for aruba, selecting a
facet, etc. For every operation a lot of scripts will be
executed; you need to either filter them out or select only
the important ones. Keep recording all the major
steps that may take the most time.
22. PLAY BACK
Now that you are done with your recording, it is time to run it.
Stop recording and click the Play button to start playback.
Keep monitoring the results in the Summary Report, Graph
Results, etc.
Remember to set the desired number of threads, etc. in the
Thread Group.
You can add assertions between results. These
need to be added manually; they will not be part of the
recording.
Note the top right corner of JMeter: it tells you
whether JMeter is still running or done with the test.
More about listeners like View Results Tree, Summary Report, etc.
can be found on the official JMeter website.
24. SOME IMPORTANT TERMS
Label: In the Label column you will be able to see all the recorded HTTP requests, during or after the test run.
Samples: Samples denotes the number of HTTP requests run for the given threads. E.g. if we have one HTTP request and we run it with 5 users, then the number
of samples will be 5x1=5.
Similarly, if the sample runs two times for a single user, then the number of samples for 5 users will be 5x2=10.
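The samples arithmetic above can be written out as:

```python
# Samples = number of threads (users) x runs of the request per thread.
def total_samples(threads, runs_per_thread):
    return threads * runs_per_thread

print(total_samples(5, 1))  # 5
print(total_samples(5, 2))  # 10
```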
Average: Average is the average response time for that particular HTTP request, in milliseconds. E.g. in the image you can see that for the first
label the number of samples is 4, because that sample ran 2 times for a single user and the test ran with 2 users. For those 4 samples the average
response time is 401 ms.
Min: Min denotes the minimum response time taken by the HTTP request. E.g. if the minimum response time for the first four samples is 266 ms, it means
one of the four requests responded in 266 ms.
Max: Max denotes the maximum response time taken by the HTTP request. E.g. if the maximum response time for the first four samples is 552 ms, it means
one of the four requests responded in 552 ms.
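To make the Average/Min/Max figures concrete, here is a sketch with four assumed sample times chosen to reproduce the numbers quoted above; only 266 ms, 552 ms and the 401 ms average come from the text, the middle two values are invented to fit.

```python
# Four assumed sample response times (ms): 266 and 552 are from the text,
# 350 and 436 are made-up values that give the quoted 401 ms average.
times = [266, 350, 436, 552]
avg = sum(times) / len(times)
print(avg, min(times), max(times))  # 401.0 266 552
```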
90% Line: The 90% line tells you that 90% of the samples fell at or below that value. It is more meaningful than the average in terms of
SLA. We expect it to be within 2x of the average time; that is, if the average time is 500 ms, we expect the 90% line to be less than 1000 ms. Otherwise the system
fluctuates a lot.
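A simple nearest-rank sketch of the 90% line (JMeter's exact percentile method may differ slightly, and the sample times here are illustrative):

```python
# 90% line: the value at or below which 90% of the samples fall
# (nearest-rank method; the sample times are invented for illustration).
def pct90(times):
    s = sorted(times)
    idx = max(0, int(len(s) * 0.9 + 0.5) - 1)  # nearest-rank index
    return s[idx]

times = [266, 350, 390, 410, 436, 450, 480, 500, 552, 900]
print(pct90(times))  # 552: 9 of the 10 samples are at or below 552 ms
```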
Error %: This denotes the percentage of samples that errored during the run. An error can be a 404 (file not found), an exception, or any other kind of error during the
test run.
Throughput: The throughput is the number of requests per unit of time (seconds, minutes, hours) sent to your server during the test.
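As a sketch, with assumed numbers:

```python
# Throughput = requests sent / elapsed test time.
def throughput(num_requests, duration_seconds):
    return num_requests / duration_seconds

print(throughput(100, 20))  # 5.0 requests per second
```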