The document discusses metrics and measurements for tracking business performance. It defines a minimum viable product as the simplest version that allows a team to collect validated customer learning with minimal effort. Metrics are defined as a way to quantify trends, characteristics, or dynamics objectively and facilitate comparisons. Measurements track progress towards strategic objectives and can indicate past or future performance. Common metrics include revenue, profits, margins, and adoption rates.
13. “The minimum viable product is that version of a new product which allows a team to collect the maximum amount of validated learning about customers with the least effort.”
— Eric Ries, The Lean Startup
14. METRIC: A measuring system that quantifies a trend, dynamic, or characteristic. Metrics encourage objectivity; they make it possible to compare, and they facilitate understanding. Think benchmarks, statistics, and predictive indicators.
MEASUREMENT: A way of monitoring and tracking the progress of strategic objectives. Measurements can be leading indicators of performance or lagging indicators. Common measurements such as product revenue, profits, product margin, and product adoption rate are often referred to as key performance indicators (KPIs).
29. [Charts: example metric trends plotted by month and by day]
There are so many opportunities and so much support for learning from your customers and products; if you aren’t looking at this, it really is willful blindness. It’s illumination of reality versus your own perception. Search for insights, context, and validation of the little bets you’re constantly making.
Peter Sims wrote a book about Little Bets and presented at the RSA conference in SF recently. The concept in a nutshell: “rather than start with a big idea or plan a whole project in advance, they make a methodical series of little bets, learning critical information from lots of little failures and from small but significant wins.”
This is very much in line with lean startup principles, where an MVP isn’t the smallest set of features but rather the smallest amount of work needed to start learning what actually needs to be built. Ultimately, the point is to improve (and to know whether or not you are improving).
There are also operational monitors, and all of these tend to get mixed together, which can be OK as long as you understand why you’re measuring each of them the way you are.
These are the common ones around a business. There is plenty of information on how to derive these and why you should track them; follow the startup community blogs and search for these terms. A wealth of information is available from a number of sources, so read the articles and figure out how to apply them:
· ARPU (average revenue per user)
· Average customer lifetime, n (the inverse of churn: n = 1/[annual churn])
· WACC (weighted average cost of capital)
· Costs (annual costs to support the user in a given period)
· SAC (subscriber acquisition costs, sometimes referred to as CAC, customer acquisition costs)
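To make the relationships between these metrics concrete, here is a minimal back-of-the-envelope sketch in Python. The formulas are common simplifications (lifetime as 1/churn, and a lifetime value that treats churn + WACC as a combined effective discount rate); all input values are hypothetical.

```python
# Illustrative unit-economics calculation using the metrics above.
# All numbers are made up; the formulas are simplified approximations.

def customer_lifetime(annual_churn: float) -> float:
    """Average customer lifetime in years: n = 1 / annual churn."""
    return 1.0 / annual_churn

def simple_ltv(arpu: float, annual_costs: float,
               annual_churn: float, wacc: float) -> float:
    """Lifetime value: annual contribution (ARPU - costs), discounted by
    treating churn + WACC as one effective rate."""
    return (arpu - annual_costs) / (annual_churn + wacc)

if __name__ == "__main__":
    arpu, costs, churn, wacc, sac = 120.0, 40.0, 0.25, 0.10, 150.0
    lifetime = customer_lifetime(churn)
    ltv = simple_ltv(arpu, costs, churn, wacc)
    print(f"lifetime = {lifetime:.1f} years")          # 4.0 years
    print(f"LTV = {ltv:.2f}, LTV/SAC = {ltv / sac:.2f}")
```

A ratio of LTV to SAC well above 1 is the usual sanity check that each acquired customer is worth more than it cost to acquire them.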
What if you’re not responsible for that much? While we might wish we controlled enough of the business to change those metrics, if we don’t, how can we apply the same principles to products, projects, and features? That’s what I’m going to focus on today: a few things we’ve learned along the way that have helped us begin to do a better job of this. We still have a long way to go, which is why the other half of this session is an open discussion about whether you agree or disagree with what I present, and what your own experiences have been in trying to manage more effectively by metrics and become more data-driven.
This can be a slippery slope down the rabbit hole. It’s a huge time sink and can become WASTE (as agile coaches call it: work that isn’t delivering any value to customers). There’s a temptation to collect lots of data; think of the SIEM analogy of pumping too much data into your SIEM and creating thousands of useless alerts and more confusion instead of clarity. You can become a slave to your data. Always remember that data is used to help make decisions, but it doesn’t dictate the decision. If you’re measuring the right things, this shouldn’t be a big issue.
#1: Know what (you think) your business goals are. LINK THEM TO THE CUSTOMER PROBLEMS YOU’RE SOLVING.
#1a: What if we don’t know? GO FIND OUT (through customer interviews, surveys, and any and all of the techniques suggested by Pragmatic, Lean Startup, Tuned In, etc.). STATE THEM SOMEWHERE, SOMEHOW, AS CLEARLY AND VISIBLY AS POSSIBLE, then go from there. This doesn’t need to be a suffocating process: if you’re starting from scratch, start light and accept that your argument might be thin, but make a little bet and start moving forward.
One example: our product is at its best when it just works, so we need to be careful about calling it a win if we increase how often our customers engage with the Web Dashboard. The nature of our product is such that when customers need to log in, it’s most likely due to a failure on our part. So our goals in developing many UI features should focus on optimizing workflows so that tasks requiring logins can be completed faster, and on making the product “sticky” by pushing customers information and reports on the value we’re providing, so that logging in becomes an opportunity to engage rather than a burden on IT.
#2: Ingrain it into your process. Whether through actual goalposts or hypothesis -> proof, what we’ve done to force this is require an “internal dashboard” for every project, associated with falsifiable hypotheses and proposed success metrics. This can take various forms (show examples). It’s an organic process, and failure is good. Also: there isn’t a tool that will do this for you, only tools that will let you collect or present the data. None of this happens for free; you need to dedicate resources to it, whether PM, engineering, analysts, or all of the above. Amazon is an example: they’ve operationalized this and made it part of how they deliver product and manage their business, from high-level decisions down to the smallest details.
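One way to think about what such an “internal dashboard” captures: each project carries falsifiable hypotheses, each tied to a success metric and a target that would confirm or refute it. The sketch below is illustrative only; the class, field names, and numbers are all hypothetical, not a description of any particular tool.

```python
# Minimal sketch (all names and values hypothetical) of recording a
# project's falsifiable hypothesis alongside its success metric.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Hypothesis:
    statement: str                     # falsifiable claim the project bets on
    metric: str                        # what we measure to test it
    target: float                      # threshold counting as confirmation
    observed: Optional[float] = None   # filled in as data arrives

    def evaluate(self) -> str:
        """Return 'pending' until data exists, then confirm or refute."""
        if self.observed is None:
            return "pending"
        return "confirmed" if self.observed >= self.target else "refuted"

trial_conversion = Hypothesis(
    statement="Shortening the trial flow will raise trial-to-paid conversion",
    metric="trial_to_paid_rate",
    target=0.15,
)
trial_conversion.observed = 0.18
print(trial_conversion.evaluate())  # confirmed
```

Stating the target up front is the point: it keeps the team from retroactively declaring whatever number they got a success.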
The extreme view of this: you are building two products, one for your customers and one to monitor how your customers use the product.
#3: Have the discipline to stick with it. The end of a project isn’t the end of tracking its success criteria: IT’S A PROCESS, NOT A DELIVERABLE. Just as your product doesn’t end the way projects do, your measurement needs to carry on beyond projects. Over time you should continue to re-evaluate what’s useful and what isn’t, and if you do that correctly you will end up with a fantastic view of your customer base: behaviors, feature usage, engagement, and so on.
#4: Take an agile approach and LEARN. Don’t worry about standardizing if you’re just getting started (or perhaps ever; again, unless you’re Amazon). Just start doing it, and let your teams use whatever systems or products they think will get the job done. Examples: Birst, Marketo, SalesForce, MixPanel. We’ll talk more about some of these later, but we have all of them and are still feeling out what works (hint: we’ve ended up designing our own frontends to multiple data sources). Standardizing too early is the same mistake as over-architecting when what you really need is to get something in front of your customers (you!).
Just learn how and when to use these and start doing so.
When we began…
We evolved what we were looking at
We had long conversations about what we really cared about, going way beyond the previous versions you saw, and we had to make decisions about where the funnel started. Only NOW can we really start experimenting against baselines. It takes time to get this right, so you have to stick with it. Also, keep in mind that you have to give your prospects and customers a chance to use the new and improved version before you change it again, or else you won’t know why something might be changing. It’s a process, not a deliverable.
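Once you’ve decided where the funnel starts, the baseline you experiment against is just the stage-to-stage conversion rates. A minimal sketch, with made-up stage names and counts:

```python
# Hypothetical funnel: stage names and counts are invented for illustration.
funnel = [
    ("visited_site", 5000),
    ("started_trial", 400),
    ("activated", 220),
    ("purchased", 60),
]

# Baseline = conversion rate between each pair of adjacent stages.
baseline = {}
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    rate = next_count / count
    baseline[f"{stage}->{next_stage}"] = rate
    print(f"{stage} -> {next_stage}: {rate:.1%}")
```

After a change ships, you compare the new rates against these numbers; without the baseline, you have no way to say whether the change helped.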
When we started this project, we identified that it was really all about driving new business, so we wanted to track use of the feature and how it was closing deals. We also wanted to track our efficacy against theirs, and whether that was changing in relative terms.