As web applications evolve and offer more and more features, there is a growing need to accurately measure performance as perceived by users. While measuring performance during development can help build faster applications, load and response times vary from user to user depending on their device and network conditions.
This talk covers the available user-centric performance metrics and how we can collect and analyse this data on real users’ devices by leveraging Web APIs and data analysis tools.
4. Is it worth it?
o 53% of mobile site visits were abandoned if a page took longer than 3 s to load [1]
o Sites loading within 5 s had 70% longer sessions, 35% lower bounce rates, and 25% higher ad viewability than sites taking nearly four times longer (19 s) [1]
o Pinterest increased search-engine traffic and sign-ups by 15% when they reduced perceived wait times by 40% [2]
Impact calculator tool: https://www.thinkwithgoogle.com/feature/testmysite
7. … but there are many metrics!
o Site speed is not as simple as having a single score
o We need to look at the entire picture: what are the metrics that make up your site’s performance?
o It’s a distribution
* https://developers.google.com/web/fundamentals/performance/user-centric-performance-metrics
8. Performance and user experience can be captured with a single user!
Performance myth #2
9. Different conditions!
User conditions vary depending on many factors:
o Network conditions
o Connection speed
o Device / hardware
o Browser
o Cache
* https://testsigma.com/blog/cloud-based-cross-browser-testing-tools-advantages
11. It’s an entire experience!
o Users associate performance with their overall experience
o Bad user experiences can happen at any time
* https://www.sparksinteractive.co.nz/services/user-interface-design
o Clicks
o Toggling form controls
o Tabs
o Swipes
o Scrolls
o Animations
12. RAIL Model
o Response - Process events in under 50 ms
o Animation - Produce a frame in 10 ms
o Idle - Maximize idle time
o Load - Become interactive in under 5 seconds
13. Developer testing vs real world
Developer:
o Debugging / development
o Same environment
o Benchmarking

Real world:
o Real-world traffic
o Real-world user experience
o Correlation to business KPIs
o Analysis
15. User experience
What do users think?
How do they perceive performance?
o Is it happening?
o Is it useful?
o Is it usable?
o Is it delightful?
- Google Developers
16. Performance metrics
o First Paint (FP) - The first pixels of the page are rendered
o First Contentful Paint (FCP) - The first content (text or image) is rendered
o First Meaningful Paint (FMP) - The most important elements of the page are visible
o Time To Interactive (TTI) - When the user can reliably interact with the page
o Long Tasks (LT) - Tasks that block the main thread for 50 ms or more
… More!
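As a sketch of how the last metric can be captured in the field: the Long Tasks API reports main-thread tasks of 50 ms or more through a `PerformanceObserver`. This is a hedged, browser-oriented example; the function returns whether observation actually started, since not every engine supports the `longtask` entry type.

```javascript
// Observe long tasks (main-thread work of 50 ms or more).
// Returns true if observation started, false where unsupported.
function observeLongTasks(onTask) {
  if (typeof PerformanceObserver === 'undefined') return false;
  const supported = PerformanceObserver.supportedEntryTypes || [];
  if (!supported.includes('longtask')) return false; // e.g. Node, older browsers
  const po = new PerformanceObserver((list) => {
    for (const task of list.getEntries()) {
      onTask(task.duration); // how long the main thread was blocked, in ms
    }
  });
  po.observe({ type: 'longtask', buffered: true });
  return true;
}

observeLongTasks((ms) => console.log(`long task: ${Math.round(ms)} ms`));
```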
20. How to measure: Web APIs
o Performance Timeline
o User Timing API
o Navigation Timing API
o Resource Timing API
…browser support?
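Browser support varies per entry type, so one reasonable approach is to feature-detect before relying on any of these APIs. A minimal sketch using the standard `PerformanceObserver.supportedEntryTypes` property:

```javascript
// Feature-detect which Performance Timeline entry types this engine supports.
function supportedPerfEntryTypes() {
  if (typeof PerformanceObserver === 'undefined') return [];
  return Array.from(PerformanceObserver.supportedEntryTypes || []);
}

// In a modern browser this typically includes 'navigation', 'resource',
// 'paint', 'mark', 'measure', and often 'longtask'.
console.log(supportedPerfEntryTypes());
```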
21. Performance entry
Single performance metric that is part of the performance timeline
o Navigation
o Resource (images, scripts, fonts, videos, iframes,…)
o Paint (“render”)
o Long task
o Application entry (mark / measure)
Properties: name, entryType, startTime, duration
22. Performance timeline
Retrieves performance entry metrics:
o getEntries() - Gets all entries in the timeline
o getEntriesByType(type) - Gets all entries of the specified type (e.g. resource, mark)
o getEntriesByName(name, type) - Gets all entries with the specified name, optionally filtered by type
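The three retrieval methods above can be sketched as follows, using an application-defined `mark` entry (the name `app-init` is illustrative) so the snippet works even before any page resources exist. Every entry exposes the same four core properties:

```javascript
// Record an application-defined entry, then query the timeline three ways.
performance.mark('app-init');

const all = performance.getEntries();                    // everything in the timeline
const marks = performance.getEntriesByType('mark');      // only 'mark' entries
const [init] = performance.getEntriesByName('app-init'); // entries matching a name

// The four core properties shared by all entry types:
console.log(init.name, init.entryType, init.startTime, init.duration);
```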
23. Navigation entries
o Provides data that can be used to measure the performance of a web site
o Breaks down the events required to retrieve and display webpages and provides timestamps for each
https://www.w3.org/TR/navigation-timing-2
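A hedged sketch of deriving common milestones from the single `navigation` entry defined by Navigation Timing Level 2 (which milestones you derive is a choice; these three are common). The function returns `null` where the entry is unavailable, e.g. outside a browser:

```javascript
// Derive load milestones from the Navigation Timing Level 2 entry.
function navigationMetrics() {
  const [nav] = performance.getEntriesByType('navigation');
  if (!nav) return null; // not in a browser, or entry not yet available
  return {
    ttfb: nav.responseStart - nav.requestStart,     // time to first byte
    domContentLoaded: nav.domContentLoadedEventEnd, // relative to navigation start
    load: nav.loadEventEnd,                         // 0 until the load event fires
  };
}

console.log(navigationMetrics());
```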
25. Resource entries
o Performance metrics about all the resources
o Reuses concepts from navigation timing
o Includes transfer size, encoded body size, decoded body size
o Waterfall shows all resources fetched from the network in a timeline
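A sketch of putting resource entries to work: list resources that exceeded a duration budget, along with the size fields Resource Timing exposes. The 500 ms budget is illustrative:

```javascript
// Flag resources that took longer than a duration budget to fetch.
function slowResources(budgetMs = 500) {
  return performance.getEntriesByType('resource')
    .filter((res) => res.duration > budgetMs)
    .map((res) => ({
      url: res.name,
      duration: Math.round(res.duration),
      transferSize: res.transferSize,       // bytes over the wire
      decodedBodySize: res.decodedBodySize, // bytes after decompression
    }));
}

console.log(slowResources());
```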
26. Paint entries
When browser converts the render tree to pixels on the screen:
o First paint
o First contentful paint
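A hedged, browser-only sketch of capturing both paint entries with a `PerformanceObserver`; `buffered: true` replays entries recorded before the observer was created. The function returns whether observation started:

```javascript
// Log first-paint and first-contentful-paint as they are recorded.
function observePaints() {
  if (typeof PerformanceObserver === 'undefined') return false;
  if (!(PerformanceObserver.supportedEntryTypes || []).includes('paint')) {
    return false; // e.g. Node, or browsers without paint timing
  }
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // entry.name is 'first-paint' or 'first-contentful-paint'
      console.log(entry.name, Math.round(entry.startTime));
    }
  }).observe({ type: 'paint', buffered: true });
  return true;
}

observePaints();
```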
27. Use cases
o Measuring page load
o Sending all timestamps to analytics
o Raising an alert if any resource takes longer than expected to download
o Tracking specific resources (e.g. third-party ads or analytics)
o Event listeners - how long did it take?
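For the "how long did it take?" use case, one sketch is to wrap an event handler with User Timing marks and measures; the wrapper and the `click-handler` name here are illustrative:

```javascript
// Wrap any handler so each invocation produces a 'measure' entry.
function timed(name, handler) {
  return (...args) => {
    performance.mark(`${name}-start`);
    const result = handler(...args);
    performance.mark(`${name}-end`);
    performance.measure(name, `${name}-start`, `${name}-end`);
    return result;
  };
}

// Usage: button.addEventListener('click', timed('click-handler', onClick));
const onClick = timed('click-handler', () => 42);
onClick();
const [m] = performance.getEntriesByName('click-handler');
console.log(`${m.name}: ${m.duration.toFixed(1)} ms`);
```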
28. Sending metrics to server
o Gather all user data from pages
o Send data to the server before unloading the document
o Beacon API
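A hedged sketch of the pattern above: queue metrics during the page's life and flush them with `navigator.sendBeacon` when the page becomes hidden. The `/analytics` endpoint is hypothetical, and the function reports `false` where Beacon is unavailable:

```javascript
// Queue metrics, flush them as a beacon when the page is being hidden.
const metricQueue = [];

function queueMetric(name, value) {
  metricQueue.push({ name, value, ts: Date.now() });
}

function flushMetrics() {
  if (metricQueue.length === 0) return false;
  const body = JSON.stringify(metricQueue.splice(0)); // drain the queue
  if (typeof navigator !== 'undefined' && typeof navigator.sendBeacon === 'function') {
    // sendBeacon queues the request so it survives page unload
    return navigator.sendBeacon('/analytics', body);
  }
  return false; // no Beacon support (e.g. Node)
}

if (typeof document !== 'undefined') {
  document.addEventListener('visibilitychange', () => {
    if (document.visibilityState === 'hidden') flushMetrics();
  });
}
```

Listening for `visibilitychange` rather than `unload` is the commonly recommended trigger, since `unload` does not fire reliably on mobile.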
29. Analysing results
o Benchmarking
o Correlate with business metrics
o Histograms
o Distributions
o Web / mobile
o Browser
o Geographic locations
o Percentiles
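Since load times form a long-tailed distribution, percentiles (commonly p75 and p95) describe it better than averages. A minimal nearest-rank sketch with made-up sample data:

```javascript
// Nearest-rank percentile over a set of measurements.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const idx = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[idx];
}

const loadTimesMs = [900, 1200, 1300, 1500, 2100, 2400, 3200, 8000];
console.log('p50:', percentile(loadTimesMs, 50)); // 1500
console.log('p75:', percentile(loadTimesMs, 75)); // 2400
console.log('p95:', percentile(loadTimesMs, 95)); // 8000
```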
30. Tools for performance testing
o Waterfall: https://github.com/andydavies/waterfall
o Perfmap: https://github.com/zeman/perfmap
o Performance bookmarklet: https://github.com/nurun/performance-bookmarklet
o Elastic APM RUM agent: https://github.com/elastic/apm-agent-rum-js
o Boomerang: https://github.com/akamai/boomerang
31. What’s next? Prevent regression!
The goal is to be faster!
o Testing both in lab and real world
o Get notifications if performance regresses
o Integrate performance tests in the CI
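One way to sketch the CI integration: compare collected metrics against explicit budgets and fail the build on any regression. Metric names and thresholds here are illustrative, not a prescribed standard:

```javascript
// Minimal performance-budget check suitable for a CI step.
const budgets = { fcp: 2000, tti: 5000 }; // milliseconds

function checkBudgets(measured) {
  const failures = [];
  for (const [metric, limit] of Object.entries(budgets)) {
    if (typeof measured[metric] === 'number' && measured[metric] > limit) {
      failures.push(`${metric}: ${measured[metric]} ms exceeds budget of ${limit} ms`);
    }
  }
  return failures; // non-empty array => fail the build
}

console.log(checkBudgets({ fcp: 1800, tti: 6200 })); // tti over budget
```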
Performance model that breaks down the user’s experience into key questions:
Happening: Did the navigation start? Is there any indication?
Useful: Has the most important content rendered yet?
Usable: Can the user interact with the content?
Delightful: Is it consistent? Is the overall experience good?
DOMContentLoaded? load? requestAnimationFrame, etc.?
Web APIs: used by speed tools (Chrome DevTools, Lighthouse, WebPageTest)
A single timestamp or a collection of timestamps
Recommendation: send plain metrics
Storage could be a NoSQL database (e.g. MongoDB)