Performance Testing Checklists
Revision: Final
James Venetsanakos
Last Updated: April 14, 2017
Table of Contents
1 Checklist Purpose and Goals
2 Pre-Project
 2.1 Meetings
 2.2 Environment Details
 2.3 Project and Build Schedules
 2.4 Contact Lists
 2.5 Scope
 2.6 Test Types
 2.7 Workload Characterization
 2.8 Data Model
 2.9 Performance Metrics
 2.10 Monitoring Metrics
 2.11 Test Results
3 Pre-Test
 3.1 Available Environment
 3.2 Stable Environment
 3.3 Test Support
 3.4 Performance Team
4 Post-Test
 4.1 Test Analysis Findings
 4.2 Review Meeting
5 Post-Project
 5.1 Review Meeting
 5.2 Readiness Release Meeting
 5.3 Executive Summary
6 Performance Testing Checklists
 6.1 Pre-Project
 6.2 Pre-Test
 6.3 Post Test
 6.4 Post Project
1 Checklist Purpose and Goals
Performance testing is an activity that spans beyond the Performance Testing Team. We help the project
team to understand the role performance testing can play in the software development life cycle and we rely on
other subject matter experts to provide guidance, insight, and support so that we may conduct a thorough and
valid set of performance tests. Performance Test Checklists help to set expectations for the individuals involved
in the performance testing areas of support, test preparation, test execution, and analysis.
The main purpose of performance testing checklists is to establish criteria for each stage of testing to
improve test quality. While some of these items will be required in order to move forward, others are important
suggestions that should be completed, but will not block testing if they are not done. This organization will reduce
the number of tests repeated due to issues such as misconfigured servers or a mismatch of software across a server
farm. The reduced test cycles will allow for a more detailed analysis of the test results, which will provide more
insight into the performance of the application under test. Checklists also help ensure that the environment is set
up properly, that proper traffic and data models exist, and that support resources are available during testing and
analysis. Finally, fewer test cycles per project mean more projects can be tested in the same amount of time.
It is important to note that the Performance Engineer cannot commit to timelines until the full scope of
work is known. That typically means all the pre-project information must be provided BEFORE the final schedule
can be set. Without it, the engineer cannot create the Performance Test Plan. Even when the information is
provided in a timely fashion, the schedule may need adjustment if unforeseen complexity is discovered. Examples
of such situations include changing UIs based on the state of the page, data that is not reusable, complicated
looping constructs, excessive parameterization, large user flows, or returning the application under test to its pre-
test condition. Extra care and time must be taken in order to account for these situations appropriately. Many
times, these conditions remain hidden until test case scripting begins. It is therefore imperative that a sample
environment be available as soon as possible so that the engineer can see how the performance testing tool works
with the application under test.
2 Pre-Project
These are the items that must be completed as part of the project setup and are done once per project.
They serve the main goal of providing the necessary information to draft the Performance Test Plan, the
Performance Test Schedule and other testing documents.
2.1 Meetings
The Performance Engineer should be invited to all team meetings starting from conception and MUST be
invited to all meetings concerning performance of the application under test. The engineer can speak to
performance relative to design and start gathering information for the Performance Test Plan. The engineer is also
available to execute proof-of-concept tests to investigate proposed designs and configurations before the
application design is finalized.
2.2 Environment Details
Performance testing should not be a "black box" testing effort restricted to analysis of end-to-end response
times and throughput. Many performance testing tools have the ability to monitor servers individually as well as
many of the components that run on those servers. This additional information provides valuable insight into the
actual operation of the application while under load. The Performance Engineer needs a high-level architecture
diagram of the application under test along with any design documents and UI or technical specifications in order
to fully understand the application and properly test it. A separate listing of the server specifications should also
be provided so that server and application monitoring can be included in the performance test results.
2.3 Project and Build Schedules
Proper performance testing requires enough time to set up the environment, create the test data, develop
the test scripts, create the test scenarios, actually execute the tests, analyze the results, and report those results.
New builds sometimes require the re-writing of scripts to accommodate the changes. Builds for performance
testing should be restricted to those that actually affect performance. However, the final build must undergo a
full suite of performance tests regardless of the level of performance change it contains.
The Performance Engineer must have the overall project schedule and the proposed build schedule. The
engineer will use the information to determine test timelines and inform the project team of any slack in the
schedule or any unreachable dates. The engineer can also provide input into the creation of the schedule so that
full performance testing is possible. Finally, extra time should be provided for unforeseen issues that typically
arise during script and data creation.
2.4 Contact Lists
The Performance Engineer must have access to other subject matter experts to support the performance
testing effort. The most crucial contact is the Project Manager, who is the liaison between the engineer and the
rest of the project team. The following list details the contact list by role. However, some individuals may serve
more than one role.
1. Product Manager/Business Analyst – an SME regarding the business and customer needs. This person will
determine the use cases that must be turned into performance test cases and help decide the transaction
targets for the performance tests.
2. Engineer/Developer – an SME who can speak to the way the application works at the code level. Multi-
tiered or component applications may require a separate contact for each tier or component.
3. Systems Architect – the architect who designed the application.
4. Systems Administrator – an SME who understands how the hardware is connected and can provide support
during the actual performance tests.
5. Database Administrator (DBA) – an SME who understands the database of the application under test and
provides support during test execution and results analysis.
6. Functional QA Analyst – the person who will execute the manual test cases.
7. Performance Engineer – the person who will execute the performance tests.
2.5 Scope
Not every piece of functionality requires performance testing. The project team must provide the
Performance Engineer with a subset of the functional test cases that must be included in any performance test.
Generally speaking, performance test cases are the test cases that receive the bulk of use (e.g. the 80/20 rule).
Functionality with a small user population and infrequent use does not require performance testing. For existing
applications, production logs will reveal what areas receive the most traffic and changes to the test scenarios can
be made as needed. For new applications, the Performance Engineer can work with the project team to define
scope and coverage.
2.6 Test Types
Performance testing is an umbrella term that describes several types of tests that are used to measure
performance. A common standard is to execute 1X (anticipated peak load plus a 10 to 20% buffer), 2X (double
anticipated peak load), and longevity (8, 12, or 24 hours at 1X) tests at a minimum, and to execute fail-over and
capacity testing if time permits. Additional testing can be accommodated as needed. The following list details the
different testing types.
1. Component – tests an individual component without full integration with the rest of the application. This
type of test allows testing to start before the full application is available and provides
development/engineering with some insight into expected performance.
2. Load – tests the application at a given user level (e.g. 100 users, 1X). This type of test is used to ensure a
given platform meets requirements and provides some insight into the ability to handle future growth and
helps with hardware planning.
3. Scale/Capacity – tests that determine the limit of a given set of hardware and/or configuration. This type
of test helps determine the number of servers to support a required user population.
4. Longevity/Duration – tests run for extended periods that typically involve at least 1 work shift and serve
to evaluate the stability of the application.
5. Failover/Disaster Recovery – tests to determine the system response to a denial of service.
6. Tuning – tests to configure settings and hardware to optimize performance. This type of test is conducted
after the code is optimized for performance.
2.7 Workload Characterization
A proper performance test requires that the application under test receive traffic similar to what will be
seen in production. Workload Characterization (i.e. the traffic model) defines how the test cases are run. It is
usually defined as some type of rate (e.g. transactions/hour) with or without a certain level of concurrency among
the user population. Using an inaccurate traffic model may skew results or provide a false sense of security. These
targets are set at the test case level: each test case has its own transaction targets and its own set of
virtual users, which should match the expected concurrency.
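As a sketch of how such a rate translates into tool settings, the hourly transaction target and the concurrent user count together determine each virtual user's pacing (the interval between iteration starts). The function below assumes transactions are spread evenly across the user group:

```python
def pacing_seconds(transactions_per_hour, virtual_users):
    """Seconds between iteration starts for each virtual user so that
    the group as a whole hits the hourly transaction target."""
    per_user_per_hour = transactions_per_hour / virtual_users
    return 3600.0 / per_user_per_hour

# e.g. a target of 1,200 transactions/hour spread over 40 concurrent users:
print(pacing_seconds(1200, 40))  # 120.0 -> one iteration every 2 minutes
```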
2.8 Data Model
The data model describes the types of users that will be used to execute the performance test scripts. Each
test case will have its own user type and these users must have the same characteristics as the production users
they will emulate, such as number of classes, number of tests taken, etc. in the case of an eLearning application.
The Performance Engineer generally creates the testing data based on the characteristics of the users.
The data model also describes the initial condition of the database. The testing database must be the same
order of magnitude in size as the production database and have a similar user population. Relevant test information
should mimic production as much as possible to simulate a real world experience.
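A minimal sketch of generating such testing data, using the eLearning example above; the attribute ranges and the naming scheme are hypothetical, standing in for characteristics observed in production:

```python
import random

def make_test_users(count, profile, seed=42):
    """Generate synthetic users whose attributes follow the ranges
    observed in production (ranges supplied via `profile`)."""
    rng = random.Random(seed)  # fixed seed -> reproducible test data
    return [
        {
            "username": f"perfuser{i:05d}",  # hypothetical naming scheme
            "classes": rng.randint(*profile["classes"]),
            "tests_taken": rng.randint(*profile["tests_taken"]),
        }
        for i in range(count)
    ]

profile = {"classes": (1, 6), "tests_taken": (0, 20)}  # assumed ranges
users = make_test_users(500, profile)
```

Seeding the generator means a failed test can be re-run against an identical user population, which keeps results comparable across cycles.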
2.9 Performance Metrics
The purpose of testing is to verify the application works as expected and meets the Service Level
Agreements (SLAs). Performance metrics are the means that we will use to measure performance. Test results
will be compared to performance targets and the relevant information will be communicated to the project team.
Performance metrics include response times, server utilization, and uptime.
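As an illustration of comparing results to targets, one common approach (assumed here, not prescribed by this document) is to check a response-time percentile against the SLA:

```python
def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times."""
    ordered = sorted(samples)
    k = max(0, round(pct / 100.0 * len(ordered)) - 1)
    return ordered[k]

def meets_sla(samples, sla_seconds, pct=95):
    """True if the pct-th percentile response time is within the SLA."""
    return percentile(samples, pct) <= sla_seconds

times = [0.8, 1.1, 0.9, 1.4, 2.3, 1.0, 0.7, 1.2, 1.1, 0.9]
print(meets_sla(times, sla_seconds=2.0))          # False: 95th pct is 2.3 s
print(meets_sla(times, sla_seconds=2.0, pct=90))  # True: 90th pct is 1.4 s
```

A percentile target is usually preferred over an average because a handful of very slow transactions can hide behind an acceptable mean.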
2.10 Monitoring Metrics
Monitoring metrics provide the information used to compare test results to test goals (i.e. the performance
metrics). A common standard is to measure response time, throughput, and basic server metrics (memory, disk,
CPU, and network). However, any PERFMON metric on Windows systems and any rstatd metric on UNIX-based
systems can be monitored, along with many web and application servers (e.g. Apache, IIS) and databases (e.g.
Oracle, SQL Server), in tools like LoadRunner. These metrics are integrated with the test tool so that several
metrics can be placed on the same graph without worrying about timestamp correlation. For tools without integrated
monitoring, the same metrics are available from the operating system; the Performance Engineer just needs to
spend more time integrating the disparate information into the reports. Please keep in mind the project team must
specify any additional monitoring beyond the standard. The developers and architect know what monitors are best
for their applications.
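For tools without integrated monitoring, one small part of that integration work is trimming externally collected OS samples to the test window so they line up with the tool's timestamps. A stdlib-only sketch, with hypothetical CPU samples:

```python
from datetime import datetime

def samples_in_window(samples, start, end):
    """Keep only monitor samples whose timestamp falls inside the
    test window [start, end]."""
    return [s for s in samples if start <= s["ts"] <= end]

# Hypothetical CPU samples collected by an external OS monitor:
cpu = [
    {"ts": datetime(2017, 4, 14, 9, 55), "cpu_pct": 12},   # before the test
    {"ts": datetime(2017, 4, 14, 10, 15), "cpu_pct": 78},  # during the test
    {"ts": datetime(2017, 4, 14, 11, 45), "cpu_pct": 9},   # after the test
]
window = samples_in_window(cpu,
                           datetime(2017, 4, 14, 10, 0),
                           datetime(2017, 4, 14, 11, 0))
print(len(window))  # 1
```

This is also why recording the exact test start and end times (see the Post-Test section) matters: without them, external metrics cannot be correlated to the run.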
2.11 Test Results
Test results should be maintained in some type of dashboard with a trend analysis. While the performance
test report from the Performance Engineer will provide the bulk of the results, the other SMEs (e.g. systems,
engineering/development, database, testing) must also report on their respective pieces so that a full evaluation
of the application under test can be made.
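A dashboard trend can be as simple as the percent change in a key metric between consecutive builds. A sketch with hypothetical average response times:

```python
def trend(history):
    """Percent change in a metric across consecutive builds
    (positive means the application got slower)."""
    return {
        f"{b1} -> {b2}": round((v2 - v1) / v1 * 100, 1)
        for (b1, v1), (b2, v2) in zip(history, history[1:])
    }

# Hypothetical average response times (seconds) per build:
history = [("build 12", 1.20), ("build 13", 1.32), ("build 14", 1.25)]
print(trend(history))
```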
3 Pre-Test
These are the items that must be completed before a test can start; they are done once per test. They ensure
that everything is in the proper state before testing begins so that the test cycle is not wasted.
3.1 Available Environment
The environment must be available as per the scheduling calendar. Two performance tests cannot run
concurrently in the same environment without putting the accuracy of the test results in question. The Performance
Engineer will coordinate with other engineers and the environment administrator to ensure the environment is
dedicated to only one test at a time. Concurrent tests are allowed if they do not share components, but please be
aware they could all share the same group of load generation equipment and network. The Performance Engineer
will coordinate with the Performance Testing Team with respect to test equipment used to generate load.
3.2 Stable Environment
A proper performance test requires a fully functional environment (except for proof-of-concept and
component tests) with production-like data. Servers must be configured as they will be in production and the
application settings must match what will be done in production. The Release Manager and Project Manager must
verify that the application has been installed correctly with the correct settings. A functional smoke test should be
conducted to verify major functionality is working as expected.
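The smoke test itself is owned by the functional QA team, but its gating role can be sketched as a tiny harness that runs named checks and blocks performance testing on any failure. The checks shown are placeholders, not real application calls:

```python
def run_smoke(checks):
    """Run named check callables; return (passed, failed_names).
    Performance testing should not begin until failed_names is empty."""
    failed = []
    for name, check in checks:
        try:
            ok = bool(check())
        except Exception:
            ok = False  # a crashing check counts as a failure
        if not ok:
            failed.append(name)
    return not failed, failed

# Placeholder checks -- real ones would exercise the application under test.
checks = [
    ("login page responds", lambda: True),
    ("search returns results", lambda: True),
]
passed, failed = run_smoke(checks)
print(passed, failed)  # True []
```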
3.3 Test Support
Every performance test requires systems and database support both for monitoring and resolution of
issues. Testing becomes delayed when time is wasted tracking down a person to look at an issue observed during
the test or bringing a newly assigned team member up to speed. When schedules are tight, the test
window may be given to another project because the test cannot continue.
3.4 Performance Team
The Performance Engineer will verify the test assets and equipment once the environment has been
verified. The engineer will run a small performance smoke test and will send out a notification to stakeholders
once that smoke test passes.
4 Post-Test
These are the items that must be completed before the next test can start. They are done once per test and help
determine the best next steps.
4.1 Test Analysis Findings
The Performance Engineer will provide the test start and end times along with a report of the findings.
The other SMEs (e.g. systems, engineering/development, and database) will report on their individual areas.
4.2 Review Meeting
The Project Manager schedules a meeting to discuss the test results and determine next steps. Testing
continues when the test is deemed successful or any issues have been resolved satisfactorily.
5 Post-Project
These are the items that must be completed before the project can be closed and are done once per project.
5.1 Review Meeting
The project team will meet to discuss the overall test results. Each baseline will be evaluated and a decision
made to accept it or run it again.
5.2 Readiness Release Meeting
The Performance Engineer will provide information for the readiness release meeting and be available to
explain findings. The group must sign off on the test results.
5.3 Executive Summary
The Performance Engineer will provide a report that summarizes the test results and findings.
6 Performance Testing Checklists
The following pages are the actual checklists used for performance testing and may be distributed as
needed. They can also be printed and attached to a computer monitor or corkboard for easy access, or included in
the overall project plan. The items in BOLD are the responsibility of the Performance Engineer. The project team
will determine who is responsible for the other items.
Performance Testing Pre-Project Checklist
Project: _______________________________________________________________________
Release Date: __________________
6.1 Pre-Project
__ Performance Engineer included on meeting invites
__ Performance Engineer included on project distribution lists
__ Project schedule provided
__ Build schedule provided
__ Vacation schedule of project team members provided
__ Contact list of support personnel and subject matter experts provided
__ Performance test cases provided
__ High level architecture diagram provided
__ Individual server specifications provided
__ Design documents provided
__ Technical specifications provided
__ Required test types provided
__ Workload characterization/traffic model provided
__ Data model provided
__ Reporting requirements provided
__ Non-standard performance monitoring metrics provided
__ Performance requirements provided
__ Performance Test Plan completed
__ Test results spreadsheet created
Performance Testing Pre-Test Checklist
Project: _______________________________________________________________________
Release Date: __________________
6.2 Pre-Test
__ Environment is available and testing scheduled in calendar
__ Server and application components have been installed and verified
__ Functional smoke test passed
__ Test support resources available
__ Performance test equipment validated
__ Performance smoke test passed
Post-Test Checklist
Project: _______________________________________________________________________
Release Date: __________________
6.3 Post Test
__ Test start and end times provided to project team
__ Test report provided
__ Non-Tool reports provided
__ Results spreadsheet updated
__ Test report review meeting scheduled
__ Next steps determined
Post-Project Checklist
Project: _______________________________________________________________________
Release Date: __________________
6.4 Post Project
__ Test results review meeting scheduled
__ Readiness release meeting scheduled
__ Executive report for performance testing prepared

Optimizing AI for immediate response in Smart CCTVshikhaohhpro
 
W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...
W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...
W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...panagenda
 
Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...
Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...
Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...SelfMade bd
 
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdfThe Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdfkalichargn70th171
 
Unlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language ModelsUnlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language Modelsaagamshah0812
 
Pharm-D Biostatistics and Research methodology
Pharm-D Biostatistics and Research methodologyPharm-D Biostatistics and Research methodology
Pharm-D Biostatistics and Research methodologyAnusha Are
 
10 Trends Likely to Shape Enterprise Technology in 2024
10 Trends Likely to Shape Enterprise Technology in 202410 Trends Likely to Shape Enterprise Technology in 2024
10 Trends Likely to Shape Enterprise Technology in 2024Mind IT Systems
 
8257 interfacing 2 in microprocessor for btech students
8257 interfacing 2 in microprocessor for btech students8257 interfacing 2 in microprocessor for btech students
8257 interfacing 2 in microprocessor for btech studentsHimanshiGarg82
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfkalichargn70th171
 

Último (20)

Microsoft AI Transformation Partner Playbook.pdf
Microsoft AI Transformation Partner Playbook.pdfMicrosoft AI Transformation Partner Playbook.pdf
Microsoft AI Transformation Partner Playbook.pdf
 
The title is not connected to what is inside
The title is not connected to what is insideThe title is not connected to what is inside
The title is not connected to what is inside
 
Exploring the Best Video Editing App.pdf
Exploring the Best Video Editing App.pdfExploring the Best Video Editing App.pdf
Exploring the Best Video Editing App.pdf
 
Software Quality Assurance Interview Questions
Software Quality Assurance Interview QuestionsSoftware Quality Assurance Interview Questions
Software Quality Assurance Interview Questions
 
Azure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdf
Azure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdfAzure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdf
Azure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdf
 
Sector 18, Noida Call girls :8448380779 Model Escorts | 100% verified
Sector 18, Noida Call girls :8448380779 Model Escorts | 100% verifiedSector 18, Noida Call girls :8448380779 Model Escorts | 100% verified
Sector 18, Noida Call girls :8448380779 Model Escorts | 100% verified
 
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected WorkerHow To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
 
The Top App Development Trends Shaping the Industry in 2024-25 .pdf
The Top App Development Trends Shaping the Industry in 2024-25 .pdfThe Top App Development Trends Shaping the Industry in 2024-25 .pdf
The Top App Development Trends Shaping the Industry in 2024-25 .pdf
 
call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️
call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️
call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️
 
MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...
MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...
MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...
 
%in Midrand+277-882-255-28 abortion pills for sale in midrand
%in Midrand+277-882-255-28 abortion pills for sale in midrand%in Midrand+277-882-255-28 abortion pills for sale in midrand
%in Midrand+277-882-255-28 abortion pills for sale in midrand
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTV
 
W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...
W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...
W01_panagenda_Navigating-the-Future-with-The-Hitchhikers-Guide-to-Notes-and-D...
 
Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...
Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...
Crypto Cloud Review - How To Earn Up To $500 Per DAY Of Bitcoin 100% On AutoP...
 
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdfThe Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
 
Unlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language ModelsUnlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language Models
 
Pharm-D Biostatistics and Research methodology
Pharm-D Biostatistics and Research methodologyPharm-D Biostatistics and Research methodology
Pharm-D Biostatistics and Research methodology
 
10 Trends Likely to Shape Enterprise Technology in 2024
10 Trends Likely to Shape Enterprise Technology in 202410 Trends Likely to Shape Enterprise Technology in 2024
10 Trends Likely to Shape Enterprise Technology in 2024
 
8257 interfacing 2 in microprocessor for btech students
8257 interfacing 2 in microprocessor for btech students8257 interfacing 2 in microprocessor for btech students
8257 interfacing 2 in microprocessor for btech students
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
 

Performance Testing Checklists

Revision: Final
James Venetsanakos
Last Updated: April 14, 2017

Table of Contents

1 Checklist Purpose and Goals
2 Pre-Project
  2.1 Meetings
  2.2 Environment Details
  2.3 Project and Build Schedules
  2.4 Contact Lists
  2.5 Scope
  2.6 Test Types
  2.7 Workload Characterization
  2.8 Data Model
  2.9 Performance Metrics
  2.10 Monitoring Metrics
  2.11 Test Results
3 Pre-Test
  3.1 Available Environment
  3.2 Stable Environment
  3.3 Test Support
  3.4 Performance Team
4 Post-Test
  4.1 Test Analysis Findings
  4.2 Review Meeting
5 Post-Project
  5.1 Review Meeting
  5.2 Readiness Release Meeting
  5.3 Executive Summary
6 Performance Testing Checklists
  6.1 Pre-Project
  6.2 Pre-Test
  6.3 Post Test
  6.4 Post Project
1 Checklist Purpose and Goals

Performance testing is an activity that spans beyond the Performance Testing Team. We help the project team understand the role performance testing can play in the software development life cycle, and we rely on other subject matter experts to provide guidance, insight, and support so that we may conduct a thorough and valid set of performance tests. Performance test checklists help set expectations for the individuals involved in the performance testing areas of support, test preparation, test execution, and analysis.

The main purpose of performance testing checklists is to establish criteria for each stage of testing to improve test quality. While some of these items are required in order to move forward, others are important suggestions that should be completed but will not block testing if they are not done. This organization reduces the number of tests repeated due to issues such as misconfigured servers or a mismatch of software across a server farm. Fewer repeated test cycles allow for a more detailed analysis of the test results, which provides more insight into the performance of the application under test. Checklists also help ensure that the environment is set up properly, that proper traffic and data models exist, and that support resources are available during testing and analysis. Finally, fewer test cycles per project mean more projects can be tested in the same amount of time.

It is important to note that the Performance Engineer cannot commit to timelines until the full scope of work is known. That typically means all the pre-project information must be provided BEFORE the final schedule can be set. Without it, the engineer cannot create the Performance Test Plan. Even when the information is provided in a timely fashion, the schedule may need adjustment if unforeseen complexity is discovered. Examples of such situations include changing UIs based on the state of the page, data that is not reusable, complicated looping constructs, excessive parameterization, large user flows, or returning the application under test to its pre-test condition. Extra care and time must be taken to account for these situations appropriately. Many times, these conditions remain hidden until test case scripting begins. It is therefore imperative that a sample environment be available as soon as possible so that the engineer can see how the performance testing tool works with the application under test.

2 Pre-Project

These are the items that must be completed as part of the project setup; they are done once per project. They serve the main goal of providing the information necessary to draft the Performance Test Plan, the Performance Test Schedule, and other testing documents.

2.1 Meetings

The Performance Engineer should be invited to all team meetings starting from conception and MUST be invited to all meetings concerning performance of the application under test. The engineer can speak to performance relative to design and start gathering information for the Performance Test Plan. The engineer is also available to execute proof-of-concept tests to investigate proposed designs and configurations before the application design is finalized.

2.2 Environment Details

Performance testing should not be a "black box" testing effort restricted to analysis of end-to-end response times and throughput. Many performance testing tools can monitor servers individually, as well as many of the components that run on those servers. This additional information provides valuable insight into the actual operation of the application while under load. The Performance Engineer needs a high-level architecture diagram of the application under test, along with any design documents and UI or technical specifications, in order
to fully understand the application and properly test it. A separate listing of the server specifications should also be provided so that server and application monitoring can be included in the performance test results.

2.3 Project and Build Schedules

Proper performance testing requires enough time to set up the environment, create the test data, develop the test scripts, create the test scenarios, execute the tests, analyze the results, and report those results. New builds sometimes require re-writing scripts to accommodate the changes. Builds for performance testing should be restricted to those that actually affect performance. However, the final version must undergo a full suite of performance tests regardless of the level of performance improvements in it. The Performance Engineer must have the overall project schedule and the proposed build schedule. The engineer will use this information to determine test timelines and inform the project team of any slack in the schedule or any unreachable dates. The engineer can also provide input into the creation of the schedule so that full performance testing is possible. Finally, extra time should be allowed for the unforeseen issues that typically arise during script and data creation.

2.4 Contact Lists

The Performance Engineer must have access to other subject matter experts to support the performance testing effort. The most crucial contact is the Project Manager, who is the liaison between the engineer and the rest of the project team. The following list details the contact list by role; however, some individuals may serve more than one role.

1. Product Manager/Business Analyst – an SME regarding the business and customer needs. This person will determine the use cases that must be turned into performance test cases and help decide the transaction targets for the performance tests.
2. Engineer/Developer – an SME who can speak to the way the application works at the code level. Multi-tiered or component applications may require a separate contact for each tier or component.
3. Systems Architect – the architect who designed the application.
4. Systems Administrator – an SME who understands how the hardware is connected and can provide support during the actual performance tests.
5. Database Administrator (DBA) – an SME who understands the database of the application under test and provides support during test execution and results analysis.
6. Functional QA Analyst – the person who will execute the manual test cases.
7. Performance Engineer – the person who will execute the performance tests.

2.5 Scope

Not every piece of functionality requires performance testing. The project team must provide the Performance Engineer with the subset of the functional test cases that must be included in any performance test. Generally speaking, performance test cases are the test cases that receive the bulk of use (e.g. the 80/20 rule). Functionality with a small user population and infrequent use does not require performance testing. For existing applications, production logs will reveal which areas receive the most traffic, and changes to the test scenarios can
be made as needed. For new applications, the Performance Engineer can work with the project team to define scope and coverage.

2.6 Test Types

Performance testing is an umbrella term that describes several types of tests used to measure performance. A common standard is to execute 1X (anticipated peak load plus a 10 to 20% buffer), 2X (double the anticipated peak load), and longevity (8, 12, or 24 hours at 1X) tests at a minimum, and to execute fail-over and capacity testing if time permits. Additional testing can be accommodated as needed. The following list details the different testing types.

1. Component – tests an individual component without full integration with the rest of the application. This type of test allows testing to start before the full application is available and provides development/engineering with some insight into expected performance.
2. Load – tests the application at a given user level (e.g. 100 users, 1X). This type of test is used to ensure a given platform meets requirements, provides some insight into the ability to handle future growth, and helps with hardware planning.
3. Scale/Capacity – tests that determine the limit of a given set of hardware and/or configuration. This type of test helps determine the number of servers needed to support a required user population.
4. Longevity/Duration – tests run for extended periods, typically at least one work shift, that serve to evaluate the stability of the application.
5. Failover/Disaster Recovery – tests that determine the system response to a denial of service.
6. Tuning – tests to configure settings and hardware to optimize performance. This type of test is conducted after the code is optimized for performance.

2.7 Workload Characterization

A proper performance test requires that the application under test receive traffic similar to what will be seen in production. Workload characterization (i.e. the traffic model) defines how the test cases are run. It is usually defined as some type of rate (e.g. transactions/hour), with or without a certain level of concurrency among the user population. Using an inaccurate traffic model may skew results or provide a false sense of security. These numbers are set at the test case level, meaning each test case will have its own transaction targets and set of virtual users, which should match the expected concurrency.

2.8 Data Model

The data model describes the types of users that will be used to execute the performance test scripts. Each test case will have its own user type, and these users must have the same characteristics as the production users they will emulate, such as number of classes or number of tests taken in the case of an eLearning application. The Performance Engineer generally creates the testing data based on the characteristics of the users. The data model also describes the initial condition of the database. The testing database must be of the same order of magnitude as the production database and have a similar user population. Relevant test information should mimic production as much as possible to simulate a real-world experience.
2.9 Performance Metrics

The purpose of testing is to verify that the application works as expected and meets the Service Level Agreements (SLAs). Performance metrics are the means we will use to measure performance. Test results will be compared to performance targets, and the relevant information will be communicated to the project team. Performance metrics include response times, server utilization, and uptime.

2.10 Monitoring Metrics

Monitoring metrics provide the information used to compare test results to test goals (i.e. the performance metrics). A common standard is to measure response time, throughput, and some server metrics (memory, disk, CPU, and network). However, any PERFMON metric for Windows systems and any rstatd metric for UNIX-based systems can be monitored, along with many application servers (e.g. Apache, IIS) and databases (e.g. Oracle, SQL Server), in tools like LoadRunner. These metrics are integrated with the test tool so that several metrics can be placed on the same graph without worrying about timestamp correlation. For tools without integrated monitoring, the same metrics are available from the operating system; the Performance Engineer just needs to spend more time integrating the disparate information into the reports. Please keep in mind that the project team must specify any additional monitoring beyond the standard. The developers and architect know which monitors are best for their applications.

2.11 Test Results

Test results should be maintained in some type of dashboard with a trend analysis. While the performance test report from the Performance Engineer will provide the bulk of the results, the other SMEs (e.g. systems, engineering/development, database, testing) must also report on their respective pieces so that a full evaluation of the application under test can be made.

3 Pre-Test

These are the items that must be completed before a test can start; they are done once per test. They ensure that everything is in the proper state before testing begins so that the test cycle is not wasted.

3.1 Available Environment

The environment must be available as per the scheduling calendar. Two performance tests cannot run concurrently in the same environment without putting the accuracy of the test results in question. The Performance Engineer will coordinate with other engineers and the environment administrator to ensure the environment is dedicated to only one test at a time. Concurrent tests are allowed if they do not share components, but be aware that they may still share the same group of load generation equipment and network. The Performance Engineer will coordinate with the Performance Testing Team with respect to the test equipment used to generate load.

3.2 Stable Environment

A proper performance test requires a fully functional environment (except for proof-of-concept and component tests) with production-like data. Servers must be configured as they will be in production, and the application settings must match what will be used in production. The Release Manager and Project Manager must verify that the application has been installed correctly with the correct settings. A functional smoke test should be conducted to verify that major functionality is working as expected.
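The functional smoke test described in section 3.2 lends itself to a small gating harness: run a set of named checks, record pass/fail, and only proceed to load testing when everything passes. The sketch below is a minimal, hypothetical illustration; the check names are placeholders, and real checks would probe login, search, and other key flows against the test environment.

```python
# Sketch of a pre-test smoke-test harness. The checks shown are placeholder
# lambdas standing in for real HTTP or database probes of the environment.

from typing import Callable, Dict, List, Tuple

def run_smoke_tests(checks: List[Tuple[str, Callable[[], bool]]]) -> Dict[str, bool]:
    """Execute each (name, check) pair, recording failures instead of raising."""
    results: Dict[str, bool] = {}
    for name, check in checks:
        try:
            results[name] = bool(check())
        except Exception:
            results[name] = False  # a crashing check counts as a failure
    return results

def environment_ready(results: Dict[str, bool]) -> bool:
    """Gate: the load test may start only if every smoke check passed."""
    return all(results.values())

checks = [
    ("login page reachable", lambda: True),      # placeholder probe
    ("search returns results", lambda: True),    # placeholder probe
    ("checkout flow available", lambda: True),   # placeholder probe
]
print(environment_ready(run_smoke_tests(checks)))  # True
```

A harness like this makes the "functional smoke test passed" checklist item auditable: the notification to stakeholders can simply attach the per-check results.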
3.3 Test Support

Every performance test requires systems and database support, both for monitoring and for resolution of issues. Testing becomes delayed when time is wasted tracking down a person to look at an issue observed during the test or explaining things to someone who has just been assigned to the project. Under tight schedules, the test window may be given to another project because the test cannot continue.

3.4 Performance Team

The Performance Engineer will verify the test assets and equipment once the environment has been verified. The engineer will run a small performance smoke test and will send out a notification to stakeholders once that smoke test passes.

4 Post-Test

These are the items that must be completed before the next test can start. They are done once per test and help determine the best next steps.

4.1 Test Analysis Findings

The Performance Engineer will provide the test start and end times along with a report of the findings. The other SMEs (e.g. systems, engineering/development, and database) will report on their individual areas.

4.2 Review Meeting

The Project Manager schedules a meeting to discuss the test results and determine next steps. Testing continues when the test is deemed successful or any issues have been resolved satisfactorily.

5 Post-Project

These are the items that must be completed before the project can be closed; they are done once per project.

5.1 Review Meeting

The project team will meet to discuss the overall test results. Each baseline will be evaluated, and a decision will be made to accept it or run it again.

5.2 Readiness Release Meeting

The Performance Engineer will provide information for the readiness release meeting and be available to explain findings. The group must sign off on the test results.

5.3 Executive Summary

The Performance Engineer will provide a report that summarizes the test results and findings.
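The "dashboard with a trend analysis" called for in section 2.11, and reviewed in the post-test and post-project meetings, can be approximated with something as simple as tracking a response-time percentile across builds. The sketch below uses a nearest-rank percentile and invented figures purely for illustration.

```python
# Sketch: minimal trend analysis across builds, as described in section 2.11.
# Response times (in seconds) below are invented example data.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# 90th-percentile login response time per build (hypothetical data).
builds = {
    "build 41": [0.8, 0.9, 1.1, 1.0, 0.7, 1.2, 0.9, 1.0, 1.1, 0.8],
    "build 42": [0.9, 1.0, 1.3, 1.1, 0.8, 1.4, 1.0, 1.2, 1.3, 0.9],
}
for build, times in builds.items():
    print(build, percentile(times, 90))
```

Comparing the same percentile build over build makes regressions visible at a glance, which is exactly the kind of finding the review meetings in sections 4.2 and 5.1 are meant to act on.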
6 Performance Testing Checklists

The following pages are the actual checklists used for performance testing and may be distributed as needed. They can also be printed and attached to a computer monitor or corkboard for easy access, or included in the overall project plan. The items in BOLD are the responsibility of the Performance Engineer. The project team will determine who is responsible for the other items.
Performance Testing Pre-Project Checklist

Project: _______________________________________________________________________
Release Date: __________________

6.1 Pre-Project

__ Performance Engineer included on meeting invites
__ Performance Engineer included on project distribution lists
__ Project schedule provided
__ Build schedule provided
__ Vacation schedule of project team members provided
__ Contact list of support personnel and subject matter experts provided
__ Performance test cases provided
__ High level architecture diagram provided
__ Individual server specifications provided
__ Design documents provided
__ Technical specifications provided
__ Required test types provided
__ Workload characterization/traffic model provided
__ Data model provided
__ Reporting requirements provided
__ Non-standard performance monitoring metrics provided
__ Performance requirements provided
__ Performance Test Plan completed
__ Test results spreadsheet created
Performance Testing Pre-Test Checklist

Project: _______________________________________________________________________
Release Date: __________________

6.2 Pre-Test

__ Environment is available and testing scheduled in calendar
__ Server and application components have been installed and verified
__ Functional smoke test passed
__ Test support resources available
__ Performance test equipment validated
__ Performance smoke test passed
Post-Test Checklist

Project: _______________________________________________________________________
Release Date: __________________

6.3 Post Test

__ Test start and end times provided to project team
__ Test report provided
__ Non-Tool reports provided
__ Results spreadsheet updated
__ Test report review meeting scheduled
__ Next steps determined
Post-Project Checklist

Project: _______________________________________________________________________
Release Date: __________________

6.4 Post Project

__ Test results review meeting scheduled
__ Readiness release meeting scheduled
__ Executive report for performance testing prepared