This document provides checklists to help ensure performance testing is properly planned and executed at each stage of a project. The pre-project checklist covers items that must be completed up front, such as meetings, environment details, scope, and metrics. The pre-test checklist verifies that the environment is ready. The post-test checklist focuses on reporting results. Finally, the post-project checklist includes review meetings to sign off on the performance testing effort. The checklists are meant to improve test quality and reduce wasted test cycles by establishing criteria for each testing stage.
Table of Contents
1 Checklist Purpose and Goals
2 Pre-Project
2.1 Meetings
2.2 Environment Details
2.3 Project and Build Schedules
2.4 Contact Lists
2.5 Scope
2.6 Test Types
2.7 Workload Characterization
2.8 Data Model
2.9 Performance Metrics
2.10 Monitoring Metrics
2.11 Test Results
3 Pre-Test
3.1 Available Environment
3.2 Stable Environment
3.3 Test Support
3.4 Performance Team
4 Post-Test
4.1 Test Analysis Findings
4.2 Review Meeting
5 Post-Project
5.1 Review Meeting
5.2 Readiness Release Meeting
5.3 Executive Summary
6 Performance Testing Checklists
6.1 Pre-Project
6.2 Pre-Test
6.3 Post Test
6.4 Post Project
1 Checklist Purpose and Goals
Performance testing is an activity that spans beyond the Performance Testing Team. We help the project
team to understand the role performance testing can play in the software development life cycle and we rely on
other subject matter experts to provide guidance, insight, and support so that we may conduct a thorough and
valid set of performance tests. Performance Test Checklists help to set expectations for the individuals involved
in the performance testing areas of support, test preparation, test execution, and analysis.
The main purpose of performance testing checklists is to establish criteria for each stage of testing to
improve test quality. While some of these items will be required in order to move forward, others are important
suggestions that should be completed, but will not block testing if they are not done. This organization will reduce
the number of tests repeated due to issues such as misconfigured servers or a mismatch of software across a server
farm. The reduced test cycles will allow for a more detailed analysis of the test results, which will provide more
insight into the performance of the application under test. Checklists also help ensure that the environment is set
up properly, that proper traffic and data models exist, and that support resources are available during testing and
analysis. Finally, fewer test cycles per project mean more projects can be tested in the same amount of time.
It is important to note that the Performance Engineer cannot commit to timelines until the full scope of
work is known. That typically means all the pre-project information must be provided BEFORE the final schedule
can be set. Without it, the engineer cannot create the Performance Test Plan. Even when the information is
provided in a timely fashion, the schedule may need adjustment if unforeseen complexity is discovered. Examples
of such situations include changing UIs based on the state of the page, data that is not reusable, complicated
looping constructs, excessive parameterization, large user flows, or returning the application under test to its pre-
test condition. Extra care and time must be taken in order to account for these situations appropriately. Many
times, these conditions remain hidden until test case scripting begins. It is therefore imperative that a sample
environment be available as soon as possible so that the engineer can see how the performance testing tool works
with the application under test.
2 Pre-Project
These are the items that must be completed as part of the project setup and are done once per project.
They serve the main goal of providing the necessary information to draft the Performance Test Plan, the
Performance Test Schedule and other testing documents.
2.1 Meetings
The Performance Engineer should be invited to all team meetings starting from conception and MUST be
invited to all meetings concerning performance of the application under test. The engineer can speak to
performance relative to design and start gathering information for the Performance Test Plan. The engineer is also
available to execute proof-of-concept tests to investigate proposed designs and configurations before the
application design is finalized.
2.2 Environment Details
Performance testing should not be a "black box" testing effort restricted to analysis of end-to-end response
times and throughput. Many performance testing tools have the ability to monitor servers individually as well as
many of the components that run on those servers. This additional information provides valuable insight into the
actual operation of the application while under load. The Performance Engineer needs a high-level architecture
diagram of the application under test along with any design documents and UI or technical specifications in order
to fully understand the application and properly test it. A separate listing of the server specifications should also
be provided so that server and application monitoring can be included in the performance test results.
2.3 Project and Build Schedules
Proper performance testing requires enough time to set up the environment, create the test data, develop
the test scripts, create the test scenarios, actually execute the tests, analyze the results, and report those results.
New builds sometimes require the re-writing of scripts to accommodate the changes. Builds for performance
testing should be restricted to those that actually affect performance. However, the final version must undergo a
full suite of performance tests regardless of whether it includes any performance-related changes.
The Performance Engineer must have the overall project schedule and the proposed build schedule. The
engineer will use the information to determine test timelines and inform the project team of any slack in the
schedule or any unreachable dates. The engineer can also provide input into the creation of the schedule so that
full performance testing is possible. Finally, extra time should be provided for unforeseen issues that typically
arise during script and data creation.
2.4 Contact Lists
The Performance Engineer must have access to other subject matter experts to support the performance
testing effort. The most crucial contact is the Project Manager, who is the liaison between the engineer and the
rest of the project team. The following list details the contact list by role. However, some individuals may serve
more than one role.
1. Product Manager/Business Analyst – an SME regarding the business and customer needs. This person will
determine the use cases that must be turned into performance test cases and help decide the transaction
targets for the performance tests.
2. Engineer/Developer – an SME who can speak to the way the application works at the code level. Multi-
tiered or component applications may require a separate contact for each tier or component.
3. Systems Architect – the architect who designed the application.
4. Systems Administrator – an SME who understands how the hardware is connected and can provide support
during the actual performance tests.
5. Database Administrator (DBA) – an SME who understands the database of the application under test and
provides support during test execution and results analysis.
6. Functional QA Analyst – the person who will execute the manual test cases.
7. Performance Engineer – the person who will execute the performance tests.
2.5 Scope
Not every piece of functionality requires performance testing. The project team must provide the
Performance Engineer with a subset of the functional test cases that must be included in any performance test.
Generally speaking, performance test cases are the test cases that receive the bulk of use (e.g. the 80/20 rule).
Functionality with a small user population and infrequent use does not require performance testing. For existing
applications, production logs will reveal what areas receive the most traffic and changes to the test scenarios can
be made as needed. For new applications, the Performance Engineer can work with the project team to define
scope and coverage.
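For existing applications, mining production access logs for the highest-traffic areas can be sketched as below. This is an illustrative example, not part of the original process: the simplified log format, path names, and `top_traffic` helper are all hypothetical.

```python
from collections import Counter

def top_traffic(log_lines, coverage=0.80):
    """Return the most-visited URL paths that together account for
    `coverage` of all requests (an 80/20-style cut for test scope)."""
    # Assume a simplified access-log format: "METHOD /path STATUS"
    counts = Counter(line.split()[1] for line in log_lines if line.strip())
    total = sum(counts.values())
    selected, covered = [], 0
    for path, hits in counts.most_common():
        selected.append(path)
        covered += hits
        if covered / total >= coverage:
            break
    return selected

# Hypothetical log sample: /search dominates traffic.
logs = (["GET /search 200"] * 80
        + ["GET /profile 200"] * 15
        + ["GET /help 200"] * 5)
print(top_traffic(logs, coverage=0.90))  # → ['/search', '/profile']
```

The paths that survive the cut become candidate performance test cases; the long tail of rarely used functionality is left to functional testing.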
2.6 Test Types
Performance testing is an umbrella term that describes several types of tests that are used to measure
performance. A common standard is to execute 1X (anticipated peak load plus a 10 to 20% buffer), 2X (double
anticipated peak load), and longevity (8, 12, or 24 hours at 1X) tests at a minimum, and to execute fail-over and
capacity testing if time permits. Additional testing can be accommodated as needed. The following list details the
different testing types.
1. Component – tests an individual component without full integration with the rest of the application. This
type of test allows testing to start before the full application is available and provides
development/engineering with some insight into expected performance.
2. Load – tests the application at a given user level (e.g. 100 users, 1X). This type of test is used to ensure a
given platform meets requirements and provides some insight into the ability to handle future growth and
helps with hardware planning.
3. Scale/Capacity – tests that determine the limit of a given set of hardware and/or configuration. This type
of test helps determine the number of servers to support a required user population.
4. Longevity/Duration – tests run for extended periods that typically involve at least 1 work shift and serve
to evaluate the stability of the application.
5. Failover/Disaster Recovery – tests to determine the system response to a denial of service.
6. Tuning – tests to configure settings and hardware to optimize performance. This type of test is conducted
after the code is optimized for performance.
2.7 Workload Characterization
A proper performance test requires that the application under test receive traffic similar to what will be
seen in production. Workload Characterization (i.e. the traffic model) defines how the test cases are run. It is
usually defined as some type of rate (e.g. transactions/hour) with or without a certain level of concurrency among
the user population. Using an inaccurate traffic model may skew results or provide a false sense of security. These
numbers are done at the test case level, meaning each test case will have its own transaction targets and set of
virtual users, which should match the expected concurrency.
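The relationship between a test case's transaction target and its virtual-user population can be sketched as a pacing calculation. This is a minimal illustration; the numbers and the `pacing_seconds` helper are assumptions, not values from the document.

```python
def pacing_seconds(virtual_users, transactions_per_hour):
    """Seconds each virtual user should wait between iterations so the
    whole population collectively hits the hourly transaction target."""
    return virtual_users * 3600 / transactions_per_hour

# Example: 50 concurrent users driving a 1,800 transactions/hour target
# must each complete one iteration every 100 seconds.
print(pacing_seconds(50, 1800))  # → 100.0
```

Running the same target with too few virtual users forces unrealistically short pacing, which is one way an inaccurate traffic model skews results.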
2.8 Data Model
The data model describes the types of users that will be used to execute the performance test scripts. Each
test case will have its own user type and these users must have the same characteristics as the production users
they will emulate, such as number of classes, number of tests taken, etc. in the case of an eLearning application.
The Performance Engineer generally creates the testing data based on the characteristics of the users.
The data model also describes the initial condition of the database. The testing database must be the same
order of magnitude as the production database and have a similar user population. Relevant test information
should mimic production as much as possible to simulate a real world experience.
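Generating test users whose attributes follow assumed production distributions can be sketched as below. The attribute names echo the eLearning example above, but the ranges, naming scheme, and `make_test_users` helper are purely illustrative assumptions.

```python
import random

def make_test_users(n, seed=42):
    """Generate n test users with production-like attributes.
    Ranges are illustrative stand-ins for real production statistics."""
    rng = random.Random(seed)  # fixed seed so data creation is repeatable
    return [
        {
            "username": f"perfuser{i:05d}",
            "classes_enrolled": rng.randint(1, 8),
            "tests_taken": rng.randint(0, 40),
        }
        for i in range(n)
    ]

users = make_test_users(3)
print(users[0]["username"])  # → perfuser00000
```

In practice the ranges would be derived from production queries so the test database matches production in both size and shape.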
2.9 Performance Metrics
The purpose of testing is to verify the application works as expected and meets the Service Level
Agreements (SLAs). Performance metrics are the means that we will use to measure performance. Test results
will be compared to performance targets and the relevant information will be communicated to the project team.
Performance metrics include response times, server utilization, and uptime.
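Comparing measured response times against an SLA target can be sketched with a nearest-rank percentile check. The percentile choice, threshold, and `meets_sla` helper are illustrative assumptions rather than values from this document.

```python
import math

def meets_sla(response_times_ms, sla_ms, percentile=0.90):
    """True when the nearest-rank percentile of response times is
    within the SLA threshold."""
    ordered = sorted(response_times_ms)
    # Nearest-rank: smallest sample at or above the percentile position.
    rank = max(0, math.ceil(percentile * len(ordered)) - 1)
    return ordered[rank] <= sla_ms

samples = [120, 180, 150, 900, 200, 170, 160, 140, 190, 210]
print(meets_sla(samples, sla_ms=250))  # 90th percentile is 210 ms → True
```

Using a percentile rather than the mean keeps a single 900 ms outlier from masking (or falsely failing) the experience of typical users.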
2.10 Monitoring Metrics
Monitoring metrics provide the information used to compare test results to test goals (e.g. performance
metrics). A common standard is to measure response time, throughput, and some server metrics (Memory, Disk,
CPU, and Network). However, any PERFMON metric for Windows systems and any rstatd metric for UNIX-
based systems can be monitored along with many application servers (e.g. Apache, IIS) and databases (e.g.
Oracle, SQL Server) in tools like LoadRunner. These metrics are integrated with the test tool so that several
metrics can be placed on the same graph without worrying about timestamp correlation. For tools without integrated
monitoring, the same metrics are available from the operating system. The Performance Engineer just needs to
spend more time integrating the disparate information into the reports. Please keep in mind the project team must
specify any additional monitoring beyond the standard. The developers and architect know what monitors are best
for their applications.
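For tools without integrated monitoring, the timestamp-correlation work mentioned above can be sketched as bucketing samples from disparate sources onto a shared timeline. The sample format, metric names, and `align_by_bucket` helper are hypothetical.

```python
from collections import defaultdict

def align_by_bucket(*metric_streams, bucket_seconds=5):
    """Merge (timestamp, name, value) samples from several sources into
    time buckets so different metrics can be graphed together."""
    timeline = defaultdict(dict)
    for stream in metric_streams:
        for ts, name, value in stream:
            bucket = ts - (ts % bucket_seconds)  # round down to bucket start
            timeline[bucket][name] = value
    return dict(sorted(timeline.items()))

cpu = [(100, "cpu_pct", 35), (105, "cpu_pct", 62)]
resp = [(101, "resp_ms", 180), (106, "resp_ms", 420)]
print(align_by_bucket(cpu, resp))
# Buckets 100 and 105 each pair one CPU sample with one response-time sample.
```

Once aligned, a CPU spike and a response-time spike in the same bucket can be examined together, which is what the integrated tooling does automatically.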
2.11 Test Results
Test results should be maintained in some type of dashboard with a trend analysis. While the performance
test report from the Performance Engineer will provide the bulk of the results, the other SMEs (e.g. systems,
engineering/development, database, testing) must also report on their respective pieces so that a full evaluation
of the application under test can be made.
3 Pre-Test
These are the items that must be completed before a test can start; they are done once per test. Completing them ensures
that everything is in the proper state before testing begins so that the test cycle is not wasted.
3.1 Available Environment
The environment must be available as per the scheduling calendar. Two performance tests cannot run
concurrently in the same environment without putting the accuracy of the test results in question. The Performance
Engineer will coordinate with other engineers and the environment administrator to ensure the environment is
dedicated to only one test at a time. Concurrent tests are allowed if they do not share components, but be aware
that they may still share the same group of load generation equipment and network. The Performance Engineer
will coordinate with the Performance Testing Team with respect to test equipment used to generate load.
3.2 Stable Environment
A proper performance test requires a fully functional environment (except for proof-of-concept and
component tests) with production-like data. Servers must be configured as they will be in production and the
application settings must match what will be done in production. The Release Manager and Project Manager must
verify that the application has been installed correctly with the correct settings. A functional smoke test should be
conducted to verify major functionality is working as expected.
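The functional smoke test gate can be sketched as a simple pass/fail check over a handful of key paths. The paths, expected status codes, and `smoke_test` helper are hypothetical, and the HTTP call is injected so the sketch stays self-contained.

```python
def smoke_test(paths, fetch):
    """Run a minimal functional smoke test: every path must return HTTP 200.
    `fetch` is injected (e.g. a wrapper around a real HTTP client) so the
    check itself is independent of any particular environment."""
    failures = [p for p in paths if fetch(p) != 200]
    return failures  # an empty list means the smoke test passed

# Stubbed fetch standing in for real HTTP calls; paths are illustrative.
responses = {"/login": 200, "/search": 200, "/report": 500}
failures = smoke_test(responses, lambda p: responses.get(p, 404))
print(failures)  # → ['/report']
```

A non-empty failure list would stop the performance test before a cycle is wasted on a broken build.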
3.3 Test Support
Every performance test requires systems and database support both for monitoring and resolution of
issues. Testing becomes delayed when time is wasted tracking down a person to look at an issue observed during
the test or explaining things to someone who has just been assigned to the project. Under tight situations, the test
window may be given to another project because the test cannot continue.
3.4 Performance Team
The Performance Engineer will verify the test assets and equipment once the environment has been
verified. The engineer will run a small performance smoke test and will send out a notification to stakeholders
once that smoke test passes.
4 Post-Test
These are the items that must be completed before the next test can start. They are done once per test and help
determine the best next steps.
4.1 Test Analysis Findings
The Performance Engineer will provide the test start and end times along with a report of the findings.
The other SMEs (e.g. systems, engineering/development, and database) will report on their individual areas.
4.2 Review Meeting
The Project Manager schedules a meeting to discuss the test results and determine next steps. Testing
continues when the test is deemed successful or any issues have been resolved satisfactorily.
5 Post-Project
These are the items that must be completed before the project can be closed and are done once per project.
5.1 Review Meeting
The project team will meet to discuss the overall test results. Each baseline will be evaluated and a decision
will be made regarding accepting it or running it again.
5.2 Readiness Release Meeting
The Performance Engineer will provide information for the readiness release meeting and be available to
explain findings. The group must sign off on the test results.
5.3 Executive Summary
The Performance Engineer will provide a report that summarizes the test results and findings.
6 Performance Testing Checklists
The following pages are the actual checklists used for performance testing and may be distributed as
needed. They can also be printed and attached to a computer monitor or corkboard for easy access, or included in
the overall project plan for the project. The items in BOLD are the responsibility of the Performance Engineer.
The project team will determine who is responsible for the other items.
Performance Testing Pre-Project Checklist
Project: _______________________________________________________________________
Release Date: __________________
6.1 Pre-Project
__ Performance Engineer included on meeting invites
__ Performance Engineer included on project distribution lists
__ Project schedule provided
__ Build schedule provided
__ Vacation schedule of project team members provided
__ Contact list of support personnel and subject matter experts provided
__ Performance test cases provided
__ High level architecture diagram provided
__ Individual server specifications provided
__ Design documents provided
__ Technical specifications provided
__ Required test types provided
__ Workload characterization/traffic model provided
__ Data model provided
__ Reporting requirements provided
__ Non-standard performance monitoring metrics provided
__ Performance requirements provided
__ Performance Test Plan completed
__ Test results spreadsheet created
Performance Testing Pre-Test Checklist
Project: _______________________________________________________________________
Release Date: __________________
6.2 Pre-Test
__ Environment is available and testing scheduled in calendar
__ Server and application components have been installed and verified
__ Functional smoke test passed
__ Test support resources available
__ Performance test equipment validated
__ Performance smoke test passed
Post-Test Checklist
Project: _______________________________________________________________________
Release Date: __________________
6.3 Post Test
__ Test start and end times provided to project team
__ Test report provided
__ Non-Tool reports provided
__ Results spreadsheet updated
__ Test report review meeting scheduled
__ Next steps determined