4. Common mistakes
- Measuring vanity metrics
- A utility app benchmarking against in-app time instead of DAU/MAU
- Thinking it’s just the development team’s work
- Asking what is being tracked after implementation instead of before
- Focusing on the dashboard rather than defining metrics well
- Investing time in designing the dashboard layout instead of defining the important questions that need answers
- “Do more”: treating doing as more important than understanding why
- Lack of understanding of how metrics relate to each other
5. Making information actionable
Making information actionable isn’t about reporting the number of people who do something; it’s about separating what successful users do from what unsuccessful users do in our products, so that we can take concrete steps towards improvement.
6. So what if I don’t do data right?
- Difficult to quantify the impact and value of the team’s effort
- High development and communication costs due to constantly switching priorities
- Missing crucial opportunities because of lack of insights
- Lack of clear alignment between functional teams
- Low morale caused by a missing sense of value
- Lack of guidance on what to improve
- Failure to improve because of unknown blocking dependencies
7. Reasons why “data” projects die
- Lack of support from top management
- Lack of a well-defined strategy
- Lack of understanding of dependencies between metrics
- Poorly defined metrics
- Deprioritised because it does not “feel” like delivering features
9. How to do data right?
1. Identify and measure status quo
1. Performance metrics (Clicks, in-app time, revenue)
2. Contextual metrics (Satisfaction rate, job done time)
2. Frame the problem
3. Visualize your findings
4. Divide and conquer
11. Identify and Measure Status Quo
Are you doing any tracking?
- What tools is your team using? (Map them out with draw.io)
- Who are the stakeholders and owners of those tools?
- What are they tracking? (Use an event-tracking dictionary)
- What are the metrics and what are their definitions?
- E.g. DAU and MAU, where an “active” session is defined as a single app open
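The DAU/MAU definition above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical event log of `(user_id, event_date)` pairs where each row represents one app open; all names and data are illustrative.

```python
from datetime import date

# Hypothetical event log: (user_id, event_date) pairs, one per app open.
events = [
    ("u1", date(2024, 1, 1)), ("u2", date(2024, 1, 1)),
    ("u1", date(2024, 1, 2)), ("u3", date(2024, 1, 15)),
]

def dau(events, day):
    """Distinct users with at least one qualifying event on `day`."""
    return len({uid for uid, d in events if d == day})

def mau(events, year, month):
    """Distinct users active at any point in the given month."""
    return len({uid for uid, d in events if d.year == year and d.month == month})

print(dau(events, date(2024, 1, 1)))  # 2 (u1 and u2)
print(mau(events, 2024, 1))           # 3 (u1, u2, u3)

# A common derived metric: stickiness = DAU / MAU.
stickiness = dau(events, date(2024, 1, 1)) / mau(events, 2024, 1)
```

Note how the result depends entirely on which event counts as “active”; changing that definition changes DAU/MAU, which is why writing the definition down matters.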
12. Identify and Measure Status Quo
Why are you doing this project?
- How frustrated is the team?
- How much time is spent searching for answers?
- How much time is spent generating reports?
- What are some of the decisions that were made without data?
14. How to frame the problem?
- How effective is your business? (Do you have the right focus?)
- Who is your customer? What do they look like?
- If you didn’t exist, what would your customers use?
- How efficient is your business? (How well do you convert traffic to money)
- Traffic - How many have stopped for your offerings?
- Usage - How many are committing their time to your offerings?
- Revenue - How many are willing to invest to solve their problems?
15. How to bring company-wide improvement with data and methodologies that actually work.
I’ve had enough of jargon: product-market fit, agile, data-driven. Talk is cheap.
21. Look two ways, and understand why
Macro (Org-wide)
- Identify strategy effectiveness
- Untangle dependencies
- Shape effective team formation
Micro (Process-focused)
- Identify operational efficiency breakdowns
- Best focused on a single team
24. Tips
- Don’t get lost in picking the right framework
- Focus on understanding thoroughly why your customer comes to you
- Higher isn’t always better: it’s about ROI and long-term growth
- Identify the most impactful common lagging indicator first
- Only delegate when the indicator is stable (no or minimal cross-team dependencies, and a direct correlation between invested effort and the indicator’s performance)
- DO NOT use the improvement benchmarks as KPIs
- Use OKR for improvement benchmarks
- MUST involve all teams before the start of the project
25. How to frame the problem (Micro)?
Funnel analysis, a.k.a. the marketing funnel, conversion funnel, or sales funnel
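A funnel analysis reduces to computing step-by-step and overall conversion rates between ordered stages. A minimal sketch, assuming hypothetical stage names and counts (all illustrative):

```python
# Hypothetical funnel: ordered (stage_name, user_count) pairs.
funnel = [
    ("visited_landing_page", 10_000),
    ("signed_up",             2_500),
    ("activated",             1_000),
    ("paid",                    200),
]

def conversion_rates(funnel):
    """Return (step_label, step_rate, overall_rate) for each funnel transition."""
    first_count = funnel[0][1]
    rates = []
    for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:]):
        rates.append((f"{prev_name} -> {name}", n / prev_n, n / first_count))
    return rates

for step, step_rate, overall in conversion_rates(funnel):
    print(f"{step}: {step_rate:.1%} of previous step, {overall:.1%} overall")
```

The step rate tells you which transition leaks the most users (the one to focus a single team on), while the overall rate ties each stage back to total traffic.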
26. Tips
- Macro and micro go hand in hand: like pipework and a water source
- Higher isn’t always better: it’s about ROI and long-term growth
- Identify the most impactful common lagging indicator first
- Only delegate when the indicator is stable (no or minimal cross-team dependencies, and a direct correlation between invested effort and the indicator’s performance)
- DO NOT use the improvement benchmarks as KPIs
- Use OKR for improvement benchmarks
- MUST involve all teams as early as possible
28. How to visualise data?
- Measure the journey, not statistics
- Give context to the numbers by offering a bigger picture
- List out questions before designing the dashboard
- A dashboard just tells the story with the help of visuals
- The content of the story is the key
- Invest in tools, but even free tools can be powerful
- Google Data Studio
- Power BI
- Tableau
- Always maintain a tracking dictionary
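A tracking dictionary can start as something as simple as a versioned data structure (or spreadsheet) mapping each event to its definition, schema, and owner. A minimal sketch; every event name, field, and owner below is illustrative:

```python
# Hypothetical tracking dictionary: one entry per tracked event.
tracking_dictionary = {
    "app_open": {
        "description": "User opens the app (counts toward DAU/MAU)",
        "properties": {"user_id": "string", "platform": "string"},
        "owner": "mobile-team",
        "added": "2024-01-01",
    },
    "checkout_completed": {
        "description": "User completes a purchase",
        "properties": {"user_id": "string", "revenue": "float"},
        "owner": "payments-team",
        "added": "2024-02-10",
    },
}

# The dictionary doubles as lightweight documentation for dashboards:
for name, spec in tracking_dictionary.items():
    print(f"{name} (owner: {spec['owner']}): {spec['description']}")
```

The exact fields matter less than having one agreed-upon place where definitions and owners live, so dashboards and reports all read from the same vocabulary.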
30. Measure and learn
- Get top-level management support
- Measure how many decisions were not data driven
- Define a North Star Metric for the product
- Map major customer journey with customer intentions
- Create a data library / tracking dictionary
- Measure the impact before / after new feature implementation
- Create accessible dashboards for teams
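Measuring the impact before and after a feature implementation can start with a simple comparison of the metric’s mean across the two windows. A minimal sketch with hypothetical daily values (e.g. a job-done time in seconds); a real analysis should also account for seasonality and confounders, ideally via an A/B test or a significance check.

```python
from statistics import mean

# Hypothetical daily metric values for 7 days before and 7 days after
# a feature launch. All numbers are illustrative.
before = [312, 298, 305, 320, 290, 310, 301]
after  = [275, 268, 280, 260, 272, 266, 270]

def relative_change(before, after):
    """Relative change of the metric's mean after the launch."""
    return (mean(after) - mean(before)) / mean(before)

change = relative_change(before, after)
print(f"mean before: {mean(before):.1f}, mean after: {mean(after):.1f}, "
      f"change: {change:+.1%}")
```

Whether a negative change is good depends on the metric: for a job-done time, lower is better; for revenue, it is not, which is why the metric definition must precede the measurement.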
32. How to deliver results
- A no-BS, growth- and deliverable-focused Scrum framework
- Use KPI and OKR together: they are not the same nor mutually exclusive
- Don’t push for a KPI if the metric is not “isolated” and “stable”
- Define priority across organization, not just individuals or groups of people
- Invest time to learn and iterate, not just delivering features
- Document and share findings and learnings across teams
33. Key takeaways
- Everything is a means to an end: Focus on objectives, not tasks
- Let data and customers do the speaking
- Measure, and get support and alignment
- Do one thing and do it well
- Learn from iterations
34. - Want to learn more about my working history? LinkedIn
- Want to read my other case studies and findings? Medium
- Want a glance at my recent activity? Personal homepage
Let’s connect!