2. • The testing performed to evaluate the response time, throughput,
and resource utilization of a system as it executes its required
functions, in comparison with different versions of the same product
or with a competing product, is called performance testing.
• It is done to ensure that, for a product,
– the number of transactions processed in a given period of time is
measured (throughput)
– the delay between a request and the first response from the
product is measured (response time)
– the delay caused by the application, the OS, and the environment is
measured (latency)
– it is available and running under different load conditions (availability)
– the resources needed for different load conditions are decided
(capacity planning)
– it is comparable to, or better than, competitors for different
parameters (benchmarking)
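The first two metrics above can be derived from simple transaction timestamps. A minimal sketch in Python, using made-up log data (all timestamps and values are illustrative):

```python
from datetime import datetime, timedelta

# Hypothetical transaction log: (request sent, first response, completed)
t0 = datetime(2024, 1, 1, 10, 0, 0)
transactions = [
    (t0, t0 + timedelta(seconds=0.2), t0 + timedelta(seconds=1.0)),
    (t0 + timedelta(seconds=1), t0 + timedelta(seconds=1.3), t0 + timedelta(seconds=2.5)),
    (t0 + timedelta(seconds=2), t0 + timedelta(seconds=2.4), t0 + timedelta(seconds=3.5)),
]

# Response time: delay between the request and the first response.
response_times = [(first - sent).total_seconds() for sent, first, _ in transactions]

# Throughput: number of transactions completed per unit of time.
window = (transactions[-1][2] - transactions[0][0]).total_seconds()
throughput = len(transactions) / window  # transactions per second
```

A real harness would read these timestamps from the product's logs rather than hard-coding them.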
4. • Performance testing is complex and expensive because of the large
resource requirements and the time it takes.
• Hence this testing needs careful planning.
• A good number of the defects uncovered may require design and
architecture changes.
5. Collecting requirements
• This is the first step in planning performance testing.
• The testing needs elaborate documentation and environment setup,
and the expected results may not be well known in advance.
• The requirements should be testable
– a feature involving manual intervention cannot be tested, because
it depends on how fast the user responds with input
• Requirements need to clearly state what factors need to be
measured and improved.
• Requirements need to be associated with the numbers or
percentages that are desired.
6. • Sources for deriving performance testing requirements:
– performance compared to the previous release of the same product
• e.g., ATM withdrawal will be faster than the previous release by 10%
– performance compared to competing products
• e.g., faster than a competing bank's ATM
– performance compared to absolute numbers derived from actual need
• e.g., an ATM capable of 1000 transactions/day, with each transaction
taking no more than 1 minute
– performance numbers derived from the architecture and design
• there is an expectation that the source code is written in such a
way that those numbers are met
8. • Two types of requirements
– Generic requirements
• All the products in that area should meet these performance
expectations
– e.g., time taken to load a page, time taken to navigate from one
screen to another
– Specific requirements
• Depend on the implementation of the particular product
– e.g., time taken for an ATM withdrawal
9. Writing test cases
• A test case should have
– the list of operations or business transactions to be tested
– the steps for executing those operations or transactions
– the list of product and OS parameters that impact the
performance testing
– the resources and their configuration (network, hardware)
– the expected results (response time, throughput, latency)
– the product version / competing product to be compared with
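The checklist above can be captured as a simple record so that every performance test case carries the same fields. A minimal sketch; all field names and sample values are illustrative, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class PerfTestCase:
    """Illustrative performance test case record mirroring the checklist."""
    transactions: list   # business transactions to be tested
    steps: list          # steps for executing those transactions
    parameters: dict     # product/OS parameters that impact performance
    resources: dict      # hardware and network configuration
    expected: dict       # expected response time, throughput, latency
    baseline: str = ""   # product version or competing product to compare with

tc = PerfTestCase(
    transactions=["ATM withdrawal"],
    steps=["insert card", "enter PIN", "withdraw cash"],
    parameters={"cache_size_mb": 64},
    resources={"cpus": 4, "network": "1 Gbps"},
    expected={"response_time_s": 1.0, "throughput_tps": 10},
    baseline="release 2.0",
)
```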
10. • Test cases are executed repeatedly for different parameters and
configurations.
• This needs more effort and time, so not all transactions are
included in testing
– prioritize the test cases, so that the highest-priority test
cases execute before the others
11. Automating performance test cases
• Performance testing is automated because
– performance testing is repetitive
– performance test cases cannot be effective without automation
– the results need to be accurate
– performance testing takes into account several parameters and
combinations that need to be remembered
– analysis of performance takes into account resource utilization,
log files, and trace files that need to be collected at regular
intervals
12. • End-to-end automation is required for performance testing.
• Not only the steps of the test cases, but also the setup required
for the test cases, the creation of different load conditions, and
the execution of the transactions of the competing product have to
be scripted.
• So standard tools and practices need to be used.
13. Executing performance test cases
• Performance testing involves less effort for execution but more
effort on planning, data collection, and analysis.
• Executing performance test cases may mean invoking certain
automated scripts.
• Data needed while executing performance test cases:
– start and end time of execution
– log and trace files of the product
– utilization of resources on a periodic basis
– configuration of all environments
– response time, throughput, latency, and so on, as specified in the
test case documentation, at regular intervals
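The data points listed above have to be recorded while the transaction runs. A minimal sketch of such a collection loop, assuming a callable transaction; here the periodic samples are just wall-clock stamps, whereas a real harness would also capture CPU, memory, logs, and traces:

```python
import time

def run_with_sampling(transaction, sample_interval=0.05, samples=3):
    """Run a transaction while recording start/end time and
    periodic measurements (illustrative, timestamps only)."""
    record = {"start": time.time(), "samples": []}
    for _ in range(samples):
        time.sleep(sample_interval)   # stand-in for periodic polling
        record["samples"].append(time.time())
    transaction()                     # the operation under test
    record["end"] = time.time()
    record["response_time"] = record["end"] - record["start"]
    return record

rec = run_with_sampling(lambda: sum(range(10_000)))
```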
14. • What performance the product delivers for different configurations
of hardware and network setup needs to be included during execution;
this is called configuration performance testing.
• This ensures that the performance of the product is compatible
with different hardware.
• Once the performance tests are executed and the various data
points are collected, the next step is to plot them.
• Plotting the data helps in making a quick analysis.
15. Analyzing the performance test results
• Analyzing the performance test results requires multi-dimensional
thinking.
• This is the most complex part of performance testing, where
product knowledge, analytical thinking, and a statistical
background are essential.
• Some calculations are required before analysis:
– calculating the mean of the performance test result data
– calculating the standard deviation
– removing noise, re-plotting, and recalculating the mean and
standard deviation
– differentiating the performance data collected when the resources
were completely available from data collected while some background
activities were going on
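The first two calculations are available directly in Python's standard library. A small example on hypothetical response-time samples (in seconds; the values are made up):

```python
import statistics

# Hypothetical response-time samples from repeated runs; the last
# point looks like an outlier caused by background activity.
samples = [1.02, 0.98, 1.05, 1.01, 0.99, 4.80]

mean = statistics.mean(samples)
stdev = statistics.stdev(samples)  # sample standard deviation
```

Note how one outlier inflates both the mean and the standard deviation, which is why noise removal is done before re-plotting.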
16. • For publishing performance numbers, the performance tests are
repeated multiple times and the average of those values is taken.
• This increases the chance that the performance data can be
reproduced at a customer site.
• It also depends on how consistently the product delivers those
performance numbers.
• The standard deviation represents how much the data varies from
the mean.
• Some data points out of range may cause the graph to be cluttered
and prevent meaningful analysis; such values have to be removed.
This process is called noise removal.
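One common way to implement noise removal (an assumption here, not the only method) is to drop points that lie more than k standard deviations from the mean, then recompute the statistics on the cleaned data:

```python
import statistics

def remove_noise(data, k=2.0):
    """Drop points more than k standard deviations from the mean;
    the caller then re-plots and recomputes mean/stdev."""
    m = statistics.mean(data)
    s = statistics.stdev(data)
    return [x for x in data if abs(x - m) <= k * s]

samples = [1.02, 0.98, 1.05, 1.01, 0.99, 4.80]  # made-up response times
clean = remove_noise(samples)
```

With these sample values the 4.80-second outlier is removed and the remaining points give a much tighter mean and standard deviation.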
17. • Some activities, such as garbage collection or defragmentation in
the memory management of the operating system, are initiated in the
background, and a degradation in performance may be observed while
they run.
• Once all this is done, the analysis of the performance data is
carried out to conclude
– whether the performance of the product is consistent
– what performance is expected for what types of configuration
– what parameters impact performance, and how they can be used to
derive better performance
– what the effect of product technologies such as caching is on
performance improvement
– what the optimum throughput/response time of the product is for a
set of factors (resources and load)
– which performance requirements are met, compared to the old
version and to competing products
18. Performance tuning
• Analyzing the performance data helps in narrowing down the list
of parameters that really impact the performance results, and in
improving product performance.
• The performance test cases are repeated for those parameters for
further analysis.
• Combinations of those parameters can also cause changes in
performance.
• Steps to optimize performance by tuning:
– tuning the product parameters
– tuning the operating system and its parameters
19. • Important notes for tuning product parameters:
– repeat the performance test for different values of each parameter
– some parameter changes may also require changes to other
parameters
– repeat the performance test for the default values of the
parameters
– repeat the performance test for low and high values of each
parameter
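Repeating a test for low, default, and high parameter values amounts to a parameter sweep. A minimal sketch; the parameter name and the simulated workload are hypothetical stand-ins for a real product test:

```python
import time

def run_perf_test(cache_size_mb):
    """Hypothetical performance test: returns a response time that
    depends on the parameter (simulated workload, not a real product)."""
    start = time.perf_counter()
    # Simulated work whose cost shrinks as the cache grows.
    _ = sum(range(100_000 // cache_size_mb))
    return time.perf_counter() - start

# Repeat the test for low, default, and high values of the parameter.
results = {size: run_perf_test(size) for size in (8, 64, 256)}
```

In practice the sweep would also cover combinations of parameters, since the slide above notes that parameters can interact.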
20. • Various parameters provided by the operating system:
– file-system-related parameters (number of open files permitted)
– disk management parameters (simultaneous reads/writes)
– memory management parameters (page size, number of pages)
– processor management parameters (multiprocessing)
– network parameters (TCP/IP settings)
• The results of performance tuning are published in the form of a
guide, called the performance tuning guide.
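Some of these OS parameters can be inspected programmatically. For example, on Unix-like systems Python's `resource` module exposes the number of open files permitted (the first parameter listed above); this sketch only reads the limit, while actual tuning would be done by the administrator:

```python
import resource

# Number of open files permitted: the soft limit applies now and can
# be raised by a tuner up to the hard limit.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
```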
21. Performance benchmarking
• Performance benchmarking is about comparing the performance of
the product with competing products.
• No two products have the same architecture, design, and
functionality; hence it is difficult to compare them directly.
• End-user transactions/scenarios could be one approach to the
comparison.
• Steps involved in performance benchmarking:
– identifying the transactions/scenarios and the test configuration
– comparing the performance of the different products
– tuning the parameters of the products being compared fairly, so
that each delivers its best performance
– publishing the results of the performance benchmarking
22. • Step 1:
– comparable transactions are selected for performance benchmarking
– test cases for all the products are executed on the same test bed
• Step 2:
– once the tests are executed, compare the results
– it is important that in performance benchmarking all products be
tuned to the same degree
– there are three possible outcomes
• positive: the product outperforms the competition
• neutral: the set of transactions is comparable with the competition
• negative: the product under-performs compared to the competition
• Step 3:
– the third outcome is detrimental to the success of the product,
hence performance tuning needs to be performed for that set of
transactions using the same configuration
– repeat this tuning for all situations (positive, neutral, negative)
• Step 4:
– the internal publication covers all three outcomes and a
recommended set of actions
– positive outcomes are published as a marketing guarantee
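The three-way classification in Step 2 can be sketched as a small function. The 5% tolerance band for "neutral" is an illustrative assumption, not a standard threshold; lower response time is treated as better:

```python
def benchmark_outcome(our_time, competitor_time, tolerance=0.05):
    """Classify a transaction's benchmarking result; results within
    the tolerance band (assumed 5%) count as neutral."""
    if our_time < competitor_time * (1 - tolerance):
        return "positive"    # outperform the competition
    if our_time > competitor_time * (1 + tolerance):
        return "negative"    # under-perform; needs tuning (Step 3)
    return "neutral"         # comparable with the competition

# Hypothetical response times (seconds) for three ATM transactions.
outcomes = {
    "withdrawal": benchmark_outcome(0.8, 1.0),
    "balance":    benchmark_outcome(1.0, 1.0),
    "transfer":   benchmark_outcome(1.2, 1.0),
}
```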
23. Capacity planning
• In capacity planning, the performance results and requirements are
taken as input, and the configuration needed to satisfy that set of
requirements is derived.
• Capacity planning necessitates a clear understanding of the
resource requirements of the transactions.
• Some transactions are CPU intensive, some are network intensive,
and so on.
• Some transactions require a combination of resources to perform
better.
• This understanding is a prerequisite for capacity planning.
• The load pattern can be
– minimum required: the immediate need (short term)
– typical configuration: under this configuration the product works
fine and meets the performance requirements (medium term)
– special configuration: planning for future requirements (long term)
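At its simplest, deriving a configuration from a load requirement is an arithmetic exercise: divide the peak load by per-machine capacity and round up. A sketch with entirely hypothetical numbers; the 30% headroom models the "planning for future requirements" case above:

```python
import math

def servers_needed(peak_tps, per_server_tps, headroom=0.3):
    """Machines needed to satisfy a load requirement, reserving
    some capacity headroom for growth (all numbers illustrative)."""
    usable = per_server_tps * (1 - headroom)
    return math.ceil(peak_tps / usable)

# Hypothetical: a 50 transactions/second peak against servers that
# each sustain 20 transactions/second.
n = servers_needed(peak_tps=50, per_server_tps=20)
```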
24. • Load balancing
– ensures that the multiple machines available are used equally to
service the transactions
• Availability
– machine clusters ensure availability
– in a cluster there are multiple machines with shared data, so that
if one machine goes down, its transactions can be handled by
another machine
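Both ideas can be illustrated together: round-robin routing spreads transactions equally, and skipping failed machines gives failover. A minimal sketch, not a production load balancer:

```python
import itertools

class Cluster:
    """Sketch: round-robin load balancing across machines, with
    failover to the next machine when one goes down."""
    def __init__(self, machines):
        self.machines = machines
        self.down = set()
        self._rr = itertools.cycle(machines)

    def route(self):
        # Skip unavailable machines so that their transactions are
        # handled by the others (availability via clustering).
        for _ in range(len(self.machines)):
            m = next(self._rr)
            if m not in self.down:
                return m
        raise RuntimeError("no machine available")

cluster = Cluster(["m1", "m2", "m3"])
first_three = [cluster.route() for _ in range(3)]   # even spread
cluster.down.add("m2")                              # one machine fails
after_failure = [cluster.route() for _ in range(2)]  # m2 is skipped
```

Real clusters also need the shared data mentioned above so that a surviving machine can actually complete an in-flight transaction.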
25. Tools for performance testing
• Two types:
– functional performance tools
• help in recording and playing back the transactions and obtaining
performance numbers
• e.g.:
– WinRunner from Mercury
– QA Partner from Compuware
– SilkTest from Segue
– load tools
• simulate load conditions for performance testing without having
to keep that many users or machines
• e.g.:
– LoadRunner from Mercury
– QA Load from Compuware
– SilkPerformer from Segue
27. Process for performance testing:
Evaluate entry criteria → Obtain measurable, testable requirements →
Create a performance test plan → Design test cases → Automate test
cases → Perform and analyze performance test cases → Evaluate exit
criteria
28. • The test plan needs the following details:
– resource requirements
• the resources needed to perform performance testing need to be
planned and obtained
– test bed / test lab setup
• the test configuration, in a simulated or real-life environment
– responsibilities
• multiple teams are involved in performing this testing
• prepare a matrix of responsibilities
– setting up product traces and audits
• what traces and audits have to be collected is planned in advance
– entry and exit criteria
• performance testing requires a stable product, due to its
complexity
• performance testing starts after the product has met certain
criteria
• similarly, a set of criteria is defined to conclude the results of
the performance tests