By Viktor Clerc, XebiaLabs
If you are taking the quality of your software seriously, you'll have numerous automated tests across many different Jenkins jobs. But getting a grip on all of your automated tests -- and then figuring out whether your software is good enough to go live -- becomes harder and harder as you speed up software delivery. Viktor will share tips on how naming conventions, partitioning of testware and mirroring the application's structure in the test code help you best handle automated testing with Jenkins. Viktor will also provide insight into how to keep this setup manageable and will share practical experiences of managing a large portfolio of automated tests. Finally, he will showcase several practices that help you manage all your results, plus add aggregation, trend analysis and qualification capabilities to your Jenkins setup. These practices will help you draw the right conclusions from your tests and deliver code faster, with the confidence that your systems won't fail in production.
2. #jenkinsconf
Agenda
• The World of Testing is Changing
• Testing = Automation
• Test Automation and CD: Execution and Analysis
• Focus on the Basics
• Best Practices for Test Execution using Jenkins
• Supporting Test Analysis
3. #jenkinsconf
But first… a bit about me
• Product Manager, XL TestView, at XebiaLabs
• Traversed all phases of the software development lifecycle
• Supported major organizations in setting up test strategies and test automation strategies
• Eager to flip the way (most) organizations do testing
6. #jenkinsconf
Introducing Test Automation For Real
[Diagram: delivery phases SPECIFY → DESIGN → BUILD → TEST → INTEGRATE → REGRESSION → USER ACCEPTANCE → RELEASE, annotated “Acceptance Driven Testing”, “Development = Test, Test = Development”, and “Automate ALL User Acceptance Test effort”]
8. #jenkinsconf
Testing = Automation: Implications
• Developers are becoming testers
– Maintain test code as source code
• Need to set up on-demand pipelines and environments
• Infrastructure as code
– Cross-browser tests, Selenium grids, dedicated performance environments, mobile, etc.
• Hosted services
9. #jenkinsconf
Testing = Automation: Challenges
• Many test tools for each of the test levels, but no single place to answer “Good enough to go live?”
• Requirements coverage is not available
– “Did we test enough?”
– Minimize the mean time to repair
– Support for failure analysis
(Tools in play: JUnit, FitNesse, JMeter, YSlow, Vanity Check, WireShark, SoapUI, Jasmine, Karma, Speedtrace, Selenium, WebScarab, TTA, DynaTrace, HP Diagnostics, ALM stack, AppDynamics, Code Tester for Oracle, Arachnid, Fortify, Sonar, …)
10. #jenkinsconf
Testing = Automation: Challenges
• Thousands of tests make test sets hard to manage:
– “Where is my subset?”
– “Which tests add the most value, and which are superfluous?”
– “When to run which tests?”
• Running all tests all the time takes too long; feedback comes too late
• Quality control of the tests themselves and maintenance of testware
14. #jenkinsconf
The Two Faces of CD
• A lot of focus right now is on pipeline execution
• …but there’s no point delivering at light speed if everything starts breaking
• Testing (= quality/risk) needs to be a first-class citizen of your CD initiative!
19. #jenkinsconf
Quick Review
1. Cohn’s pyramid
– Unit tests
– Service tests (under the GUI)
– (graphical) User Interface tests
2. And even further downstream
– Integration Tests
– Performance Tests
24. #jenkinsconf
“Modern Testing” 101
1. Testers are developers
2. Test code equals production code
– Conway’s Law
– Measure quality
3. Linking tests to use cases
4. Slice and dice
– Labeling
5. Radical parallelization
Fail FASTer!
“Kill the nightlies”
25. #jenkinsconf
Dealing With Growing Tests
• Conway’s Law for test code
– Let the test code mimic the production code
– Organize tests under the project/system under test
• Suite.App.UseCase.TestCase
• Cut the suite at UseCase: now you have independent chunks which you can run massively in parallel
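The chunking idea above can be sketched in shell. This is a minimal illustration, assuming invented suite names and using `echo` as a stand-in runner; in a real Jenkins setup each chunk would typically be its own downstream job rather than a background process.

```shell
#!/bin/sh
# Sketch: cut a suite at the UseCase level and run the resulting
# independent chunks in parallel. Names and the "runner" are stand-ins.
set -e
: > chunk_results.txt

# Independent chunks, cut at UseCase (example names)
CHUNKS="Suite.App.UseCase1 Suite.App.UseCase2 Suite.App.UseCase3"

for chunk in $CHUNKS; do
  # One background process per chunk; each appends a whole line
  ( echo "ran $chunk" >> chunk_results.txt ) &
done
wait
echo "all chunks finished"
```

Because the chunks share nothing, the same loop scales to as many parallel executors (or containers) as you have available.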
26. #jenkinsconf
Dealing With Growing Tests
• Tests should not depend on other tests
– Setup and tear-down of test data done within each test
– Share test components (as you would do with ‘real’ production code)
– Trade-off between:
• No code duplication, yet somewhat more complex fixtures
• Easy-to-grab simple fixtures, but a lot of them (and duplication)
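Test independence can be illustrated with a tiny shell sketch. The helper names (`setup_account`, `teardown_account`) are hypothetical shared fixture code; the point is that one test owns its data from setup through teardown, leaving nothing behind for another test to depend on.

```shell
#!/bin/sh
# Sketch: each test sets up and tears down its own data via shared
# helpers, so tests never rely on another test's leftovers.
set -e

setup_account()    { echo "acct-$1" > "acct_$1.txt"; }   # shared fixture code
teardown_account() { rm -f "acct_$1.txt"; }

# A single test owns its data end to end
setup_account 42
grep -q 'acct-42' acct_42.txt && RESULT=pass || RESULT=fail
teardown_account 42
echo "$RESULT"
```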
27. #jenkinsconf
Keep It Manageable
• Focus on functional coverage, not technical coverage
• Say 40 user stories and 400 tests:
– Do I have relatively more tests for the more important user stories?
– How do I link tests to user stories/features/fixes?
• Metrics
– Number of tests
– Number of tests that have not passed in <time>
– Flaky tests
– Duration
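One cheap way to check functional coverage is to count tests per user story. This sketch assumes an invented naming convention (`Suite.App.US<nr>.TestName`); the counts let you eyeball whether the important stories get proportionally more tests.

```shell
#!/bin/sh
# Sketch: tally tests per user story from a (hypothetical) dotted
# naming convention Suite.App.US<nr>.TestName.
set -e
cat > test_list.txt <<'EOF'
Suite.App.US1500.HappyPath
Suite.App.US1500.EdgeCase
Suite.App.US1501.HappyPath
EOF

# The third dot-separated field identifies the user story
COUNTS=$(cut -d. -f3 test_list.txt | sort | uniq -c)
echo "$COUNTS"
```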
28. #jenkinsconf
Slice and Dice
• Use appropriate labels in your test code
– Responsible team
– Topic
– Functional area
– Flaky
– Known issue
– etc.
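Once tests carry labels, slicing becomes a simple query. A minimal sketch, assuming a made-up manifest format (test name, then comma-separated labels) rather than any particular tool's metadata:

```shell
#!/bin/sh
# Sketch: slice a labelled test set. The manifest format is an
# assumption for illustration only.
set -e
cat > tests.manifest <<'EOF'
CheckoutSmokeTest team-a,smoke
PaymentEdgeTest team-b,flaky
AccountRegressionTest team-a,regression,known-issue
EOF

# Run team-a's tests, but skip anything labelled known-issue
SELECTED=$(awk '$2 ~ /team-a/ && $2 !~ /known-issue/ {print $1}' tests.manifest)
echo "$SELECTED"
```

The same filter expression could just as well feed a FitNesse suite filter or Cucumber tag expression.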
31. #jenkinsconf
Organizing Test Jobs in Jenkins
1. Create unique artifacts and fingerprints to monitor what you are pushing across your pipeline
2. Treat different platforms (e.g. browsers) as different tests, handled by different jobs
3. Well-known plugins:
– Multi-job
– Copy Artifact
– Workflow
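Fingerprinting (point 1) boils down to hashing the artifact so every test job can verify it exercises the same build. Jenkins' fingerprint feature is MD5-based, mirrored here with `md5sum`; `app.jar` is a stand-in artifact.

```shell
#!/bin/sh
# Sketch: fingerprint the artifact pushed across the pipeline so test
# jobs can confirm they all test the same build.
set -e
printf 'demo artifact contents' > app.jar   # stand-in artifact

FINGERPRINT=$(md5sum app.jar | cut -d' ' -f1)
echo "fingerprint: $FINGERPRINT"
```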
32. #jenkinsconf
Organizing Test Jobs in Jenkins
4. Keep Jenkins jobs sane and simple
– Ergo: execute shell scripts from your Jenkins jobs
5. Shell scripts are parameterized
6. Parameters are fed to individual test tools
– FitNesse labels, Cucumber tags, etc.
7. Shell scripts are placed under version control
– Managed by the team like any other source code
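Points 4-7 combine into one version-controlled script that Jenkins invokes with parameters. A minimal sketch; the parameter names are assumptions, and the final `echo` stands in for a real FitNesse or Cucumber invocation:

```shell
#!/bin/sh
# Sketch of a version-controlled run-tests script: Jenkins passes
# parameters, the script forwards them to the test tool.
set -e

SUITE="${1:-Suite.App}"       # which suite chunk to run
LABEL="${2:-nightly}"         # label/tag filter for the tool
RESULTS_DIR="${3:-results}"   # where the tool writes its reports

mkdir -p "$RESULTS_DIR"
MSG="would run suite=$SUITE label=$LABEL into $RESULTS_DIR"
echo "$MSG"
```

Because the script lives in version control, a change to how tests are invoked is reviewed like any other code change, and the Jenkins job itself stays a one-liner.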
34. #jenkinsconf
Distributing Tests Across Jobs
• Radical parallelization using cheap and cheerful throwaway environments
– Especially when environments (e.g. containers) lie at your fingertips
• Jobs should not depend on other jobs
• Test jobs are your “eyes and ears” – optimize for them!
38. #jenkinsconf
Making Sense of Test Results
• Real go/no-go decisions are non-trivial:
– No failing tests
– Fewer than 5 % of tests failing
– No regression (tests that currently fail but passed previously)
– A list of tests-that-should-not-fail
• Need historical context
• One integrated view
• Data to guide improvement
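The criteria above can be combined into a simple gate. This is a sketch only: the thresholds, counts, and known-failures list are made up for illustration, and a real gate would parse actual result reports rather than hard-coded variables.

```shell
#!/bin/sh
# Sketch of a go/no-go gate: failure rate under 5 % AND no failures
# outside the known-failures list. All data here is illustrative.
set -e
TOTAL=400
FAILED=12
KNOWN_FAILURES="PaymentEdgeTest LegacyExportTest"
THIS_RUN_FAILURES="PaymentEdgeTest"

# Rule 1: overall failure rate stays under 5 %
RATE_OK=$(( FAILED * 100 < TOTAL * 5 ))

# Rule 2: no regressions; every failure must already be known
UNKNOWN=0
for t in $THIS_RUN_FAILURES; do
  case " $KNOWN_FAILURES " in
    *" $t "*) ;;        # tolerated known failure
    *) UNKNOWN=1 ;;     # new failure: block the release
  esac
done

if [ "$RATE_OK" -eq 1 ] && [ "$UNKNOWN" -eq 0 ]; then
  DECISION=go
else
  DECISION=no-go
fi
echo "decision: $DECISION"
```

The known-failures list is what gives you “Passed, but with known failures” instead of Jenkins' coarse Pass/Unstable/Fail.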
39. #jenkinsconf
Making Sense of Test Results
Executing tests from Jenkins is great, but…
• Different testing jobs each have their own share of Jenkins plugins
• A historical view is only available per job, not across jobs
• Pass/Unstable/Fail is too coarse
– How do you express “Passed, but with known failures”?
40. #jenkinsconf
Making Sense of Test Results
• The ultimate analysis question (“are we good to go live?”) is difficult to answer
• No obvious solution for now, unless all your tests run through one service
42. #jenkinsconf
FitNesse Implementation
• Started with one project containing all tests
– Sharing knowledge
• Structured the same as our use cases, i.e. WebshopSuite.BusinessAccountSuite.UseCase1500
• Nightly runs from the beginning
– Indicated by labels (“nightly”)
• First sequential per application (WebshopSuite)
• Later parallel, split by functional area (WebshopSuite.BusinessAccountSuite.*)
44. #jenkinsconf
[Diagram: end-to-end testing pipeline across dedicated team servers and chain environments {1-5}. Steps include Source Code Quality (environment: Jenkins server and Sonar server), System Test, Chain Test, Security Test, Smoke Test, End-to-End Testing, Acceptance Test and Deploy to Chain, through to Production, with each step annotated with its tools, environment and remarks]
48. #jenkinsconf
Summary
• Testing = Automation
– Testers are developers
• Structure and annotate tests
– Conway’s Law for Tests
– Link to functions/features/use cases
• Radical parallelization
– Throwaway environments
49. #jenkinsconf
Summary
• Keep Jenkins jobs simple
• Keep Jenkins jobs independent
• Track SUT with fingerprints
• Invoke test tools via plugins or version-controlled
scripts
• Parameterization!
• Parallelize & optimize
50. #jenkinsconf
Summary
• CD = Speed + Quality = Execution + Analysis
• Making sense of scattered test results is still a
challenge
• Need to figure out how to address real world go/no
go decisions
51. #jenkinsconf
What’s Next?
• Visit http://tiny.cc/webinar-xebialabs for a webinar by CloudBees and XebiaLabs demonstrating the key value of CD and go-live decisions
• Read more on the testing challenges in CD: http://tiny.cc/ta-and-cd
• Try XebiaLabs’ XL TestView solution to bring quality into the heart of your CD initiative: http://tiny.cc/xl-testview
52. #jenkinsconf
Please Share Your Feedback
• Did you find this session valuable?
• Please share your thoughts in the Jenkins User Conference Mobile App.
• Find the session in the app and click on the feedback area.