Mobile apps have brought a whole new set of challenges when it comes to testing: fast-paced development cycles with multiple releases per week, multiple app technologies and development platforms to support, tons of devices and form factors, and additional pressure from enterprises and consumers who have little patience for low-quality apps. With these new challenges comes a new set of mistakes testers can make!
Fred has worked with dozens of mobile test teams to help them avoid common traps when building test automation for mobile apps and would like to share some best practices that could be useful to a lot of developers and testers starting with mobile test automation.
Best practices such as:
• When, what and where to automate?
• Picking the right wait to handle unreliable back-end calls or device performance
• Building testability in a mobile app
• Automating the automation (!)
• Mixing and matching performance testing and functional testing
In this talk, Fred will share some real stories (struggles!) and show how small changes in process made these mobile apps 10x more reliable!
5. EXPECTATION
• It is as difficult as developing a good mobile app
• It is a software project by itself
• It is not optional in mobile
• You will probably fail before you succeed
8. #2 – TRACK YOUR OBJECTIVES
IF IT IS NOT MEASURED, IT DOESN’T EXIST
9. SET YOUR OWN METRICS
• Turnaround time for fixes
• Customer satisfaction on the app store
• Lower number of bugs in production
• EMET (Equivalent Manual Execution Test) – a minimal tracking sketch follows below
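Below is a minimal sketch, not from the deck, of logging these numbers per run somewhere the whole team can see. The reading of EMET as the manual minutes an automated run replaces, and all file names and figures, are assumptions to adapt to your own definitions.

# Minimal sketch of tracking run-level automation metrics in one shared place.
import csv
import datetime

def record_run_metrics(path, tests_run, failures, manual_minutes_per_test):
    """Append one row of run-level metrics to a CSV the whole team can see."""
    emet_minutes = tests_run * manual_minutes_per_test  # equivalent manual effort
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.date.today().isoformat(),
            tests_run,
            failures,
            emet_minutes,
        ])

# Example: a nightly run of 120 tests, 3 failures, ~5 manual minutes per test.
record_run_metrics("run_metrics.csv", tests_run=120, failures=3,
                   manual_minutes_per_test=5)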
10. ACTIONABLE TEST INTELLIGENCE
• Part of the automation framework
• Available and highly visible to everyone
• Establish a baseline and track ROI
• Take costs into account (Value = Benefit - Cost) – see the sketch below
• Too many metrics can hurt
• Rinse and repeat until you get it right
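As a purely illustrative reading of the Value = Benefit - Cost line above, with hypothetical numbers:

# "Benefit" is the manual-testing cost the suite replaces per month;
# "cost" is what it takes to maintain the suite. All figures are made up.
manual_hours_replaced_per_month = 80   # assumption: from your EMET tracking
hourly_rate = 50                       # assumption: blended team rate
maintenance_hours_per_month = 15

benefit = manual_hours_replaced_per_month * hourly_rate
cost = maintenance_hours_per_month * hourly_rate
value = benefit - cost

print(f"Monthly automation value: ${value}")   # $3250 with the numbers above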
12. MISTER JENKINS IS YOUR FRIEND
• Automate your build
• Automate your app deployment
• Automate your environment deployment
• Automate your tear down
• Automate your test execution
• Automate your reporting
• Automate your metrics tracking
• Automate your communication (a minimal reporting/notification sketch follows below)
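The pipeline itself normally lives in the CI tool (a Jenkinsfile, for example). As a hedged sketch of the reporting and communication steps, here is a small script a CI job could call after the build and deployment stages; the choice of pytest as the runner and the JSON chat webhook are assumptions.

# Run the suite, then push a one-line summary to a chat webhook.
import json
import subprocess
import urllib.request

def run_suite_and_notify(webhook_url):
    # pytest returns 0 on success, non-zero otherwise.
    result = subprocess.run(["pytest", "tests/", "-q"],
                            capture_output=True, text=True)
    lines = result.stdout.strip().splitlines()
    summary = lines[-1] if lines else "no output"

    payload = json.dumps({"text": f"Mobile suite finished: {summary}"}).encode()
    req = urllib.request.Request(webhook_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)
    return result.returncode

if __name__ == "__main__":
    # Hypothetical webhook URL; a CI job would inject the real one.
    raise SystemExit(run_suite_and_notify("https://chat.example.com/webhook"))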
14. CAREFUL PLANNING
• Plan your automation infrastructure
  • Cloud, on-premise, device connectivity, app deployment, data aggregation, reporting, etc. (an endpoint-selection sketch follows below)
• Pick your automation tool wisely
• Start small and grow
• Don't hesitate to pause test-building activity
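One hedged example of keeping infrastructure decisions out of the tests themselves: resolving the execution endpoint (an on-premise Appium server vs. a cloud device farm) from configuration. The environment-variable names and URLs are hypothetical.

# Resolve where tests run from configuration, not from test code.
import os

def appium_server_url():
    """Pick the Appium endpoint based on an environment variable."""
    target = os.environ.get("TEST_TARGET", "local")
    if target == "cloud":
        # e.g. a vendor device cloud; credentials also come from the environment
        return os.environ["CLOUD_APPIUM_URL"]
    return "http://127.0.0.1:4723"   # default local Appium server

print(appium_server_url())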
15. #6 – PICK THE RIGHT TESTS TO AUTOMATE
AND I CLEARLY DON’T HAVE A GOOD PICTURE FOR THIS SLIDE
16. ANALYZE THE RISK OF FAILURE
R(c) = P(c) * I(c)
Probability P(c) is driven by:
• Code Complexity
• Changed Areas
• Affected Interfaces
• New Technology
• Component Maturity
Impact I(c) is driven by:
• Financial
• Reputation
• Legal
• Security
• Loss of Customers
(A worked scoring sketch follows below.)
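A worked example of the formula, with hypothetical components and 1-5 scores, to show how the scoring drives what gets automated first:

# Score each component's probability of failure and impact, then sort by risk.
components = {
    # name: (probability 1-5, impact 1-5) -- illustrative scores only
    "checkout":        (4, 5),   # new payment SDK, direct revenue impact
    "login":           (2, 5),   # stable code, but locks users out if broken
    "settings screen": (3, 2),   # changes often, low business impact
}

for name, (p, i) in sorted(components.items(),
                           key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
    print(f"{name}: R = {p} * {i} = {p * i}")
# -> checkout: R = 4 * 5 = 20, login: R = 2 * 5 = 10, settings screen: R = 3 * 2 = 6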
17. #7 – WHERE TO RUN TESTS
HINT: 0% OF USERS RUN YOUR APP ON A SIMULATOR
18. SIMULATOR VS REAL DEVICES
Simulator
• Pros
  • Cheap
  • Integrated with the IDE
• Cons
  • Not testing on the actual platform – what if the test passes? What's next?
  • Network is different
  • OS is different (stock)
  • Can't simulate real hardware (CPU, memory, etc.) – not fit for mobile performance
Real devices
• Pros
  • Reproduce real gestures
  • Real results, no false negatives
  • Can test under OEM customization
  • Fit for mobile performance
• Cons
  • Need to be managed (a device cloud helps!) – see the capabilities sketch below
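In practice the simulator run and the real-device run should differ only in session configuration. A hedged sketch using standard Appium capability names; the device names, UDID and app paths are placeholders:

# The only place a simulator run and a real-device run should differ.
SIMULATOR_CAPS = {
    "platformName": "iOS",
    "appium:automationName": "XCUITest",
    "appium:deviceName": "iPhone 15 Simulator",   # hypothetical simulator
    "appium:app": "/builds/MyApp.app",            # simulator build
}

REAL_DEVICE_CAPS = {
    "platformName": "iOS",
    "appium:automationName": "XCUITest",
    "appium:udid": "<device-udid>",               # the physical device to target
    "appium:app": "/builds/MyApp.ipa",            # signed device build
}

# Pass one of these dictionaries to your Appium client when opening the session;
# the tests themselves should not know (or care) which one was used.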
22. S.F.I.R.S.T.R
• Small – easier to understand & fix
• Fast – parallel execution for faster feedback
• Independent – can run any subset in any order
• Repeatable – tests get the same result every time
• Self-checking – no human checking
• Timely – should be written in parallel with dev
• Reusable – to avoid maintenance nightmares
(A minimal test sketch illustrating several of these follows below.)
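A minimal pytest sketch illustrating a few of these principles: small, independent (each test creates and tears down its own state) and self-checking. The in-memory backend below is a stand-in that exists only to keep the sketch runnable:

import pytest

class FakeBackend:
    """Stand-in for whatever your app exposes; replace with the real thing."""
    def __init__(self):
        self.users, self.next_id = {}, 1
    def create_user(self, email):
        uid, self.next_id = self.next_id, self.next_id + 1
        self.users[uid] = {"email": email, "cart": []}
        return uid
    def delete_user(self, uid):
        self.users.pop(uid, None)
    def cart(self, uid):
        return self.users[uid]["cart"]

@pytest.fixture
def backend():
    return FakeBackend()

@pytest.fixture
def fresh_user(backend):
    """Each test gets its own user, so any subset can run in any order."""
    uid = backend.create_user("auto-user@example.com")
    yield uid
    backend.delete_user(uid)                 # tear down what the test created

def test_new_user_has_empty_cart(backend, fresh_user):
    assert backend.cart(fresh_user) == []    # self-checking: no human needed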
26. BE DATA-DRIVEN
• Increase test coverage FAST
• Easy to add, remove and configure tests
• Reduce the number of tests to maintain
• Separation of tests and data (see the parameterized sketch below)
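A minimal data-driven sketch with pytest: one test body, many data rows, so growing coverage means adding a row rather than another test to maintain. The validate_login() function is a hypothetical stand-in for whatever flow (page object, API, Appium screen) you exercise:

import pytest

def validate_login(username, password):
    """Hypothetical stand-in for the real login flow under test."""
    if not username:
        return "Username is required"
    if password != "correct-horse":
        return "Invalid credentials"
    return None

LOGIN_CASES = [
    # (username, password, expected_error) -- data lives apart from the test
    ("valid@example.com", "correct-horse", None),
    ("valid@example.com", "wrong-password", "Invalid credentials"),
    ("", "correct-horse", "Username is required"),
]

@pytest.mark.parametrize("username,password,expected_error", LOGIN_CASES)
def test_login(username, password, expected_error):
    assert validate_login(username, password) == expected_error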
27. #10 – USE RELIABLE WAITS
FORGET ABOUT TIME DELAYS!
28. USE RELIABLE AND INTELLIGENT WAITS
• Use object visibility to manage the flow of the test
• Never use time delays to manage back-end variability
• Never use time delays to account for device performance
• Spend time finding the right locator to wait on
• Run your test 100 times before claiming success (an explicit-wait sketch follows below)
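A minimal sketch of a condition-based wait with the standard Selenium/Appium Python bindings instead of time.sleep(); the locators and the 20-second timeout are placeholders:

from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

def open_order_history(driver):
    # Bad: time.sleep(5) -- too short on a slow device, wasted time on a fast one.
    # Good: wait on the element that proves the screen is actually ready.
    wait = WebDriverWait(driver, timeout=20)
    orders_button = wait.until(
        EC.visibility_of_element_located((By.ID, "com.example.app:id/orders"))
    )
    orders_button.click()
    # Wait again for the next screen's anchor element before asserting anything.
    wait.until(
        EC.visibility_of_element_located((By.ID, "com.example.app:id/order_list"))
    )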
29. #11 – LEVERAGE FUNCTIONAL TESTS FOR PERFORMANCE
YOU GET THE PICTURE
30. COVER ALL YOUR BASES
[Diagram: native app, mobile-browser and web-browser users drive load, stress and endurance tests against a shared web & mobile infrastructure (load balancer, web servers, app servers, cache, database) over HTTP(S), UDP and WebSocket traffic, while the client side is measured for CPU, battery, memory, transaction timing and response time for native apps and mobile web apps, including the no-backend-connection case.]
A transaction-timing sketch follows below.
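A hedged sketch of harvesting a performance signal from an existing functional test: time the user-visible transaction while the functional assertions still run, then trend or gate on it per build. The app fixture, names and the 3-second budget are hypothetical:

import time

TRANSACTION_BUDGET_SECONDS = 3.0

def test_search_results_load_fast_enough(app):
    app.open_search()
    start = time.monotonic()
    app.search("running shoes")
    app.wait_for_results()                 # reliable wait, as in mistake #10
    elapsed = time.monotonic() - start

    assert app.result_count() > 0          # the functional check stays
    # The same run now also yields a transaction timing to trend per build.
    assert elapsed < TRANSACTION_BUDGET_SECONDS, f"search took {elapsed:.1f}s"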