Paul Miles, Software Development Manager at NPR, discusses QA strategies and tools his team uses to address the challenge of maintaining legacy products at NPR.
In this presentation, he covers:
- How to strategize effectively about which types of tests to add to legacy software
- Which cost-effective tools and testing strategies you can adopt in your organization
- Approaches for incorporating testing into your organization’s build pipelines
- How to foster a testing-centric culture in your organization
Tests at the Top
● Identify the Most Important Tests
○ Key workflows only
○ The goal is not to cover all features
○ Don’t let perfect be the enemy of good
● Choose Your Tools
○ Consider the real cost of in-house test automation frameworks
■ High test maintenance and infrastructure costs
○ Online tools are changing the game
■ They let developers focus on lower-level tests
Unit Tests
● Phase 0
○ Put failing and brittle tests in a corner (if you have any)
● Phase 1
○ Run unit tests with every build
○ Make the results easy to see
● Phase 2
○ Require all tests to pass as a part of the QA process
● Then...
○ Add additional tests
○ Consider fixing the brittle ones
○ Adopt more advanced build and test practices
Service Tests
● Phase 0
○ Designate the master environment you are running tests against
○ Put failing and brittle tests in a corner (if you have any)
● Phase 1
○ Require all tests to pass as a part of the QA process
● Phase 2
○ Run tests as a part of the build / deployment process if possible
○ This may require additional automation
● Then...
○ Add additional tests
○ Consider fixing the brittle ones
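One way to implement "designate the master environment" is a single lookup that every service test goes through, so the target can later be switched per pipeline stage. This is a minimal sketch; the `TEST_ENV` variable name and hostnames are invented for illustration.

```python
import os

def resolve_base_url(env=None):
    """Return the base URL service tests should hit, defaulting to the
    designated master environment. Hostnames here are hypothetical."""
    env = env or os.environ.get("TEST_ENV", "master")
    urls = {
        "master": "https://master.example.org",
        "stage": "https://stage.example.org",
    }
    if env not in urls:
        # Fail loudly rather than test against an unknown environment.
        raise ValueError(f"no designated test environment named {env!r}")
    return urls[env]

print(resolve_base_url("master"))
```

Centralizing the choice is what makes Phase 2 possible: the build server only has to set one variable to point the same suite at a different environment.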
Service Tests
● “Convert” tests to unit-type tests if possible
○ Utilize data manipulation scripts, mocks, fakes, etc.
● Make tests able to run in all environments
○ This is a prerequisite for incorporating them into the build process
○ May need to address differences between environments
○ May need to bolster other processes (automation, data import, etc.)
● Make them repeatable
○ Bolster setup / teardown
● Incorporate tests in the automated build loop
○ Don’t make the build too long, though
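The "convert to unit-type tests with mocks and fakes" idea can be sketched with Python's stdlib `unittest.mock`: a test that would normally hit a live endpoint patches the network call with a fake, so it runs in any environment with no network. The function and URL below are hypothetical stand-ins, not anything from the talk.

```python
from unittest import mock
import urllib.request

def fetch_story_title(story_id):
    # Hypothetical code under test: normally calls a live HTTP service.
    with urllib.request.urlopen(f"https://api.example.org/stories/{story_id}") as resp:
        return resp.read().decode().strip()

# "Converted" unit-type test: patch the network call with a fake response,
# so the test is fast, deterministic, and runnable on any environment.
with mock.patch("urllib.request.urlopen") as fake:
    fake.return_value.__enter__.return_value.read.return_value = b"Legacy Code, New Tests\n"
    assert fetch_story_title(7) == "Legacy Code, New Tests"
    print("unit-type test passed without touching the network")
```

Because nothing leaves the process, this test also satisfies the repeatability and "runs in all environments" prerequisites for joining the automated build loop.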
Static Quality Analysis
● Often overlooked, but very effective in improving code quality
● Tools
○ Your IDE
○ Integrate with your build server: SonarQube, FindBugs, CheckStyle
○ Integrate with GitHub / Bitbucket: Code Climate
● Adoption Tips
○ Get shared settings for IDEs in place
○ Tune the settings
○ Focus on new code only
○ Apply “periodic paydown” on issues
○ Address issues before incorporating into the build process
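To make the static-analysis idea concrete without depending on any of the tools named above, here is a toy checker built on Python's stdlib `ast` module. It flags bare `except:` clauses, a typical finding in legacy code, and runs on an inline snippet so it is self-contained; real adoption would use one of the listed tools instead.

```python
import ast

# Inline stand-in for "new code" being checked; line 4 has a bare except.
NEW_CODE = """
try:
    risky()
except:
    pass
"""

def bare_excepts(source):
    """Return line numbers of bare 'except:' clauses in the source."""
    tree = ast.parse(source)
    return [node.lineno for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]

print(bare_excepts(NEW_CODE))
```

Note the "focus on new code only" tip in action: the checker is fed just the new snippet, not the whole legacy codebase, which keeps the finding list short enough to act on.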
Incorporating Testing in Build Pipelines
● Run tests (of all kinds) as soon as possible after a commit
○ 10-minute “magic mark”
○ Slow tests may not fail the build, but failures still indicate a quality problem
● Add in coverage-measuring tools
● Feature branches / feature toggles
○ Run tests on a designated environment
○ Require tests to pass before merge
○ Complete code reviews flagged by other tools (and incorporate changes before merge)
○ Add tests as code is written
● Separate releases from coding “sprints”
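The feature-toggle point above can be sketched in a few lines (the flag name and UI strings are invented): unfinished code merges to trunk behind a flag that ships dark, so the full test suite can run on every commit without exposing the feature.

```python
# Minimal feature-toggle sketch: flags are flipped per environment or
# per release, not per branch, so trunk always builds and tests.
FEATURES = {"new_audio_player": False}  # hypothetical flag, off by default

def is_enabled(name):
    return FEATURES.get(name, False)

def render_player():
    # Both code paths live on trunk and are exercised by tests.
    return "new player UI" if is_enabled("new_audio_player") else "legacy player UI"

print(render_player())                    # flag off: legacy path
FEATURES["new_audio_player"] = True
print(render_player())                    # flag on: new path
```

In a real system the flag store would be external configuration rather than a module-level dict, which is also what lets releases be separated from coding sprints: deploy the code dark, flip the flag later.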
Testing Infrastructure
● Requires substantial investment
○ Environments
○ Build servers / tooling
○ Test beds / devices
● Automate everything
● Designate an expert, but get the team to buy in and share responsibility