This topic focuses on effective reporting and its associated challenges when using JMeter. It covers the importance of metrics and KPIs for effective performance reporting, followed by a brief overview of JMeter's built-in listeners (reporting elements), such as the Aggregate Report and Graph Results listeners.
The third and final part covers the inadequacies of these listeners and the use of third-party/external reporting tools that provide enhanced reporting (ant + xslt).
The new BlazeMeter reporting plugin is introduced as a quick, ready-to-use solution for JMeter reporting.
Sub-topics:
* Importance of effective performance test reporting
* Typical performance testing metrics
* JMeter reporting entities (Listeners)
* Shortcomings of existing JMeter reporting elements
* Generating advanced JMeter reports using ant + xslt
* Building reporting tool frameworks
* How the BlazeMeter reporting plugin can alleviate the challenges in JMeter reports
* Details on the BlazeMeter reporting plugin
4. PERFORMANCE ATTRIBUTES
• Speed / Responsiveness
• How fast does the page load?
• How quickly can the system process a transaction?
• Scalability
• Can the application handle the expected end user load?
• Does the application throughput degrade as the user load increases?
5. PERFORMANCE ATTRIBUTES…
• Efficiency and Capacity Planning
• Are you using the right resources?
• Can your infrastructure carry the load?
• Reliability / Availability / Recoverability
• What is the mean time between failures (MTBF)?
• Does the application recover after a crash? Does it lose user data after a crash?
6. UNDERSTANDING PERFORMANCE KPIS
System Metrics
• CPU
• Memory
• Disk / IO
• Network
Server Platform Metrics
• DB
• App-server
• Application
Application Metrics
• Response Time
• Throughput
• Error Rate
Browser Rendering Metrics*
• Total Rendering Time
• Heavy Images/CSS/JS
• DNS Lookup
(Diagram: user load flows from the end user over the Internet to the system; response time and requests/sec are measured along this path.)
7. UNDERSTANDING PERFORMANCE KPIS…
(Diagram: response time and throughput measured across the Internet, web server, app server, and DB server.)
Total Response Time = Network latency + Application latency + Browser Rendering Time
• Measured from the end-user perspective
• Time taken to completely respond to a request
• TTFB / TTLB
Throughput = [Transactions] / second
• Transactions are specific to applications
• In its simplest form, it is requests / sec
Error
• Defined in terms of the success of the request
• Errors at the HTTP level (404, 501)
• Application-level errors
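The KPI definitions above can be sketched in Python; all sample values below are hypothetical, chosen only to illustrate the arithmetic.

```python
# Sketch of the KPI formulas above; every input value is hypothetical.

# Total response time = network latency + application latency + browser rendering time
network_latency_ms = 40
application_latency_ms = 260
browser_rendering_ms = 700
total_response_time_ms = network_latency_ms + application_latency_ms + browser_rendering_ms

# Throughput, in its simplest form: requests completed per second
requests_completed = 1800
duration_seconds = 60
throughput_rps = requests_completed / duration_seconds

# Error rate: share of requests that failed at the HTTP or application level
failed_requests = 36
error_rate_pct = 100.0 * failed_requests / requests_completed

print(total_response_time_ms)  # 1000
print(throughput_rps)          # 30.0
print(error_rate_pct)          # 2.0
```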
8. CREATING LOAD TEST REPORTS
1. Capture
• Application metrics: Response Time, Throughput, Errors
• Server metrics: CPU / Memory / Disk / IO, Network, Application, Platform
2. Correlate
• Application metrics: User Load vs. Response Time, Throughput, Errors
• System metrics: User Load vs. Server Metrics, Network, Platform
3. Plot / Tabulate
• Tables: Response Time (avg/min/max/%/stddev), Throughput (average), Errors (success % / types)
• Graphs / Charts: Scatter / Line, Overlay
4. Trends / Thresholds
• Response Time Trends
• Throughput Trends
• Threshold Violations
• Utilization (Server Metrics) Trends
5. Summarize
• Overall Performance
• Important Trends
• Threshold Violations
6. Compare / Customize
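The "capture" and "tabulate" steps above can be sketched as a small Python helper that tabulates response-time and error statistics from a JMeter CSV result file. The `elapsed` and `success` field names are JMeter's default CSV headers; adjust them if your jmeter.properties saves different columns. The nearest-rank percentile here is a deliberate simplification.

```python
import csv
import statistics

def summarize(jtl_path):
    """Tabulate response-time and error stats from a JMeter CSV result file.

    Assumes the default columns 'elapsed' (ms) and 'success' ('true'/'false').
    """
    elapsed, successes = [], 0
    with open(jtl_path, newline="") as f:
        for row in csv.DictReader(f):
            elapsed.append(int(row["elapsed"]))
            successes += row["success"] == "true"
    elapsed.sort()
    n = len(elapsed)
    return {
        "samples": n,
        "avg_ms": sum(elapsed) / n,
        "min_ms": elapsed[0],
        "max_ms": elapsed[-1],
        "p90_ms": elapsed[int(0.9 * (n - 1))],   # simple nearest-rank percentile
        "stddev_ms": statistics.pstdev(elapsed),
        "success_pct": 100.0 * successes / n,
    }
```

The resulting dictionary maps directly onto the avg/min/max/%/stddev and success-% table columns listed above.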
9. SAMPLE REPORT ELEMENTS (SNAPSHOTS)
Photo Credits:
• http://msdn.microsoft.com/en-us/library/bb924371.aspx
• Sanitized past projects
10. JMETER REPORTING ELEMENTS (LISTENERS)
• JMeter Listeners: JMeter elements that display performance test metrics / output
• Various types of listeners (Raw / Aggregated / Graphical)
• Don't have an inherent capability to measure system metrics*
• Useful for basic analysis
11. GENERATING ADVANCED JMETER REPORTS
JMeter Report using XSLT stylesheet
• Style-sheet under the 'extras' folder
• .jtl output must be in XML format (jmeter.save.saveservice.output_format=xml)
• Integrate using ant
Other Reporting Options
• JMeter CSV results + Excel
• Process results programmatically (perl / python etc.)
• BlazeMeter Reporting Plug-in
Photo Credits:
• http://www.programmerplanet.org/pages/projects/jmeter-ant-task.php
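As a minimal sketch of the "process results programmatically" option, the following assumes an XML-format .jtl where each &lt;httpSample&gt; or &lt;sample&gt; element carries `t` (elapsed time in ms) and `s` (success) attributes, as JMeter's XML result format does:

```python
# Minimal sketch of processing an XML-format .jtl programmatically.
# Assumes each <httpSample>/<sample> element carries t (elapsed ms)
# and s ("true"/"false" success) attributes.
import xml.etree.ElementTree as ET

def xml_jtl_summary(jtl_path):
    root = ET.parse(jtl_path).getroot()
    samples = [e for e in root.iter() if e.tag in ("httpSample", "sample")]
    times = [int(e.get("t")) for e in samples]
    errors = sum(e.get("s") != "true" for e in samples)
    return {"samples": len(times),
            "avg_ms": sum(times) / len(times),
            "errors": errors}
```

From here the numbers can be fed into any charting or spreadsheet tool, which is exactly the flexibility the programmatic option buys over the built-in listeners.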
13. BLAZEMETER REPORTING PLUGIN
BENEFITS
• Store a report per test run, including:
  • The script that was used to run the test
  • Logs & the JTL file
• Compare results of two test runs
• See an improvement trend
• Compare the current run with previous runs in real time
• Share with co-workers
14. KPIS AVAILABLE IN A JMETER TEST
RESPONSE TIME - THE TIME IT TAKES A REQUEST TO FULLY LOAD
• Indicates the performance level of the entire system under test (web server + DB).
• Represents the average response time during a specific minute of the test.
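A rough sketch of that per-minute average, assuming the samples have already been parsed into (timestamp in ms, elapsed in ms) pairs from the JTL's timeStamp/elapsed columns:

```python
# Sketch of the per-minute average response time described above,
# computed from (timestamp_ms, elapsed_ms) pairs; a real report would
# read these from the JTL file's timeStamp/elapsed columns.
from collections import defaultdict

def per_minute_averages(samples):
    buckets = defaultdict(list)
    for ts_ms, elapsed_ms in samples:
        buckets[ts_ms // 60000].append(elapsed_ms)   # group by minute of the test
    return {minute: sum(v) / len(v) for minute, v in sorted(buckets.items())}
```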