An exploratory study of the state of practice of performance testing in Java-based open source projects
1. An exploratory study of the state of practice of performance testing in Java-based open source projects
Cor-Paul Bezemer, Philipp Leitner
2. Industrial applications depend more and more on open source software
3. Is the dependency on open source software justified (or wise)?
4. We know (more or less) how to assess the quality of functionality...
5. But there is no standardized way of assessing performance!
6. What is the state of practice of performance testing in Java-based open source software?
7. We conducted an exploratory study on 111 Java-based projects
– with ‘bench’ or ‘perf’ in the src/test directory
10. We manually identified (performance) tests
● We followed this process:
– Search the test files for performance-related terms
● ‘perf’, ‘bench’, ‘fast’, ‘speed’, etc.
– Manually identify the performance tests among the matches
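The keyword-search step can be sketched in plain Java. This is an illustrative reconstruction, not the study's actual tooling; the keyword list and directory layout are assumptions.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Stream;

public class PerfTestFinder {
    // Performance-related terms to search for (illustrative subset)
    static final List<String> KEYWORDS = List.of("perf", "bench", "fast", "speed");

    // True if the file name or its contents mention any keyword.
    static boolean looksPerformanceRelated(Path file) {
        String name = file.getFileName().toString().toLowerCase();
        if (KEYWORDS.stream().anyMatch(name::contains)) return true;
        try {
            String body = Files.readString(file).toLowerCase();
            return KEYWORDS.stream().anyMatch(body::contains);
        } catch (IOException e) {
            return false; // unreadable or non-text file: skip it
        }
    }

    public static void main(String[] args) throws IOException {
        Path testDir = Paths.get(args.length > 0 ? args[0] : "src/test");
        try (Stream<Path> files = Files.walk(testDir)) {
            files.filter(p -> p.toString().endsWith(".java"))
                 .filter(PerfTestFinder::looksPerformanceRelated)
                 // These are only candidates; the manual inspection step
                 // still decides which ones are real performance tests.
                 .forEach(System.out::println);
        }
    }
}
```

Keyword matching alone over-approximates (e.g., ‘fast’ matches unrelated identifiers), which is exactly why the second, manual step is needed.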
11. We studied performance testing from five perspectives
1. The developers who are involved
2. The extent of performance testing
3. The organization of performance tests
4. The types of performance tests
5. The tools used
12. Perspective 1: Developers
Performance tests are usually written by a small group of developers (median 2 vs. median 9 developers)
Performance testers are usually core developers of the project
14. Perspective 2: The extent of testing
The performance test suite is usually small
– A few hundred LOC, compared to a few thousand LOC for the functional test suite
– Exceptions are Hadoop and Deuce STM (Software Transactional Memory)
Projects that claim to be the fastest/most efficient do not seem to take extra measures to support their claims
16. Perspective 3: Organization
There is no standardized way of organizing and conducting performance tests
– Performance tests are scattered throughout the (test) code
– Previous results and instructions for executing a performance test are included in code comments
17. Perspective 4: Types of performance tests
Type 1: Performance smoke tests (50% of the projects)
― Test a particular use case
Type 2: Microbenchmarks (32%)
― Test smaller units of code
Type 3: One-shot performance tests (15%)
― Test a known buggy case
Type 4: Performance assertions (5%)
― ‘4x as fast as …’
Type 5: Implicit performance tests (5%)
― ‘Test’ using data that is generated as a side-effect (e.g., the execution time of a test)
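To make Types 1 and 4 concrete: a performance smoke test with an assertion often boils down to timing one use case against a fixed threshold. The sketch below is a hypothetical plain-Java example, not code from any of the studied projects; the workload (sorting) and the 5-second threshold are invented for illustration.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class SortSmokeTest {
    // Hypothetical "use case": sort a large list, standing in for a
    // project-specific operation exercised by a smoke test.
    static long timeSortMillis(int size) {
        List<Integer> data = new ArrayList<>(size);
        Random rnd = new Random(42);          // fixed seed for repeatability
        for (int i = 0; i < size; i++) data.add(rnd.nextInt());
        long start = System.nanoTime();
        Collections.sort(data);
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        timeSortMillis(100_000);              // warm-up run for the JIT compiler
        long elapsed = timeSortMillis(100_000);
        // The threshold is the fragile part: it encodes a machine-dependent
        // expectation, which is why such assertions are often flaky.
        if (elapsed > 5_000) {
            throw new AssertionError("sort took " + elapsed + " ms");
        }
        System.out.println("smoke test passed in " + elapsed + " ms");
    }
}
```

A Type 4 performance assertion would replace the fixed threshold with a relative one, e.g. timing two implementations and asserting one is at most k times slower than the other.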
22. Perspective 5: Tools
Approach 1: Unit testing (51% of the projects)
– Usually using JUnit rather than a dedicated performance unit testing framework
Approach 2: Stand-alone performance tests (50%)
– Custom-written for the project
Approach 3: Dedicated performance testing framework (16%)
– Usually Caliper or JMH
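The "custom-written" Approach 2 typically amounts to a hand-rolled timing loop like the hypothetical sketch below. Frameworks such as Caliper and JMH exist precisely because getting warm-up, dead-code elimination, and statistics right by hand is hard; this sketch handles only the first of those, naively.

```java
public class TinyBenchmark {
    // Average time (ns) per call of a task, with a warm-up phase so the
    // JIT compiler has a chance to optimize the code before measurement.
    static double averageNanos(Runnable task, int warmup, int iterations) {
        for (int i = 0; i < warmup; i++) task.run();
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) task.run();
        return (System.nanoTime() - start) / (double) iterations;
    }

    public static void main(String[] args) {
        // Hypothetical workload: string building in a loop.
        Runnable task = () -> {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 1_000; i++) sb.append(i);
            // Use the result so the JIT cannot eliminate the work entirely.
            if (sb.length() == 0) throw new IllegalStateException();
        };
        System.out.printf("avg: %.1f ns/op%n", averageNanos(task, 1_000, 10_000));
    }
}
```

A JMH benchmark would express the same workload as an @Benchmark method and let the harness handle forking, warm-up iterations, and reporting.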
25. Implications
– There is a lack of a ‘killer app’ for performance testing
– Writing performance tests is not a popular task in open source projects
– Developers want support for quick-and-dirty performance testing
– Performance testing is multi-faceted
– Integration into standard CI frameworks is key
30. Conclusion
● Open source developers do not seem very enthusiastic about writing performance tests
● There is a lack of a ‘killer app’ for performance testing
What we are doing as researchers does not seem to reach practice!
Cor-Paul Bezemer, bezemer@cs.queensu.ca
http://sailhome.cs.queensu.ca/~corpaul/