4. @OptimiseOrDie
Timeline:
• Tested stupid ideas, lots
• Most AB or MVT tests are bullshit
• Discovered AB testing
• Triage, Triangulation, Prioritisation, Maths
• Zen Plumbing
• AB Test Hype Cycle
8. Craig’s Cynical Quadrant (Improves revenue? × Improves UX?)
• Improves revenue, improves UX: Client fucking delighted
• Improves revenue, doesn’t improve UX: Client delighted (and fires you for another UX agency)
• Doesn’t improve revenue, improves UX: Client fires you (then wins an award for your work)
• Improves neither: Client absolutely fucking furious
9. Top Tools & Tips
#1 Get out of the office
#2 Immerse yourself
#3 Session Replay
#4 Voice of Customer
#5 Get the right inputs
#6 Act like a P.I.
#7 Experience testing
#8 Split testing tools
#9 Get performance
#10 Analytics health check
#11 Going agile
Examples
@OptimiseOrDie
13. Remote UX tools (P = Panel, S = Site recruited, B = Both)
Usertesting (B) www.usertesting.com
Userlytics (B) www.userlytics.com
Userzoom (S) www.userzoom.com
Intuition HQ (S) www.intuitionhq.com
Mechanical turk (S) www.mechanicalturk.com
Loop11 (S) www.loop11.com
Open Hallway (S) www.openhallway.com
What Users Do (P) www.whatusersdo.com
Feedback army (P) www.feedbackarmy.com
User feel (P) www.userfeel.com
Ethnio (For Recruiting) www.ethnio.com
Feedback on Prototypes / Mockups
Pidoco www.pidoco.com
Verify from Zurb www.verifyapp.com
Five second test www.fivesecondtest.com
Conceptshare www.conceptshare.com
Usabilla www.usabilla.com
@OptimiseOrDie
1c : Crowdsourced Testing
16. @OptimiseOrDie
1f : The Secret Millionaire
• Tesco placed IT staff in front-line roles with the product
• You have to create this kind of feedback loop
• If it isn’t there, you need to push and encourage it
• Connect the team with pain points AND the outcomes of their work, split tests and changes
• Hugely motivational strategy
• One last tip – learn how to interview like a pro
• Read these:
“Don’t Make Me Think” amzn.to/1gIZEJn
“Rocket Surgery Made Easy” amzn.to/1e0hnUL
“Talking to Customers” bit.ly/1e0hT58
“Talking with Participants” bit.ly/1kKL3LE
“Don’t listen to Users” bit.ly/1cQpiIE
“Interviewing Tips” bit.ly/1fKqu03
“More interviewing Tips” bit.ly/1bmvGT
17. #2 : IMMERSE YOURSELF
@OptimiseOrDie
• Test ALL key campaigns
• Use Real Devices
• Get your own emails
• Order your products
• Call the phone numbers
• Send an email
• Send 11 shoes back
• Be difficult
• Break things
• Experience the end-to-end journey
• Do the same for competitors
• Team are ALL mystery shoppers
• Wear the magical slippers
• Be careful about dogfood though!
18. • Vital for optimisers & fills in a ‘missing link’ for insight
• Rich source of data on visitor experiences
• Segment by browser, visitor type, behaviour, errors
• Forms Analytics (when instrumented) are awesome
• Can be used to optimise in real time!
Session replay tools
• Clicktale (Client) www.clicktale.com
• SessionCam (Client) www.sessioncam.com
• Mouseflow (Client) www.mouseflow.com
• Ghostrec (Client) www.ghostrec.com
• Usabilla (Client) www.usabilla.com
• Tealeaf (Hybrid) www.tealeaf.com
• UserReplay (Server) www.userreplay.com
@OptimiseOrDie
#3 : GET SESSION REPLAY
19. • Sitewide Omnipresent Feedback
• Triggered (Behavioural) Feedback
• Use of Features, Cancellation, Abandonment
• 4Q Task Gap Analysis is very good
• Kampyle www.kampyle.com
• Qualaroo www.qualaroo.com
• Feedback Daddy www.feedbackdaddy.com
• 4Q 4q.iperceptions.com
• Usabilla www.usabilla.com
#4 : GET THEIR VOICE
20. • Make contact and feedback easy & encouraged
• Add contact & feedback to everything (e.g. all mails)
• Read Caroline Jarrett, run surveys (remember them?)
• Run regular NPS and behaviourally triggered surveys
• Get ratings on Service Metrics
• Find what drives the ‘level’ of delight
• Ask your frequent, high spend, zealous users questions
• Make the team spend ½ a day a month at the Call Centre
• Meet with your Sales and Support teams ALL the time
• Tip : Take them for Beers and encourage bitching
#4 : GET THEIR VOICE
21. Insight - Inputs #FAIL
Where test ideas too often come from: competitor copying, guessing, dice rolling, an article the CEO read, a competitor change, panic, ego, opinion, cherished notions, marketing whims, cosmic rays, not ‘on brand’ enough, IT inflexibility, internal company needs, some dumbass consultant, shiny feature blindness and knee-jerk reactions.
#5 : Your inputs are all wrong
@OptimiseOrDie
22. Insight - Inputs
Insight comes from: segmentation, surveys, sales and call centre, session replay, social analytics, customer contact, eye tracking, usability testing, forms analytics, search analytics, voice of customer, market research, A/B and MVT testing, big & unstructured data, web analytics, competitor evals and customer services.
#5 : These are the inputs you need…
@OptimiseOrDie
23. • For your brand(s) and competitors
• Check review sites, Discussion boards, News
• Use Google Alerts on various brands & keywords
• See what tools they’re using (www.ghostery.com)
• Sign up for all competitor emails
• Run Cross Competitor surveys
• This was VITAL for LOVEFiLM
• Use Social & Competitor Monitoring tools :
slidesha.re/1k7bflG
#6 : ACT LIKE A PI
24. #4 – Test or Die!
Email testing www.litmus.com
www.returnpath.com
www.lyris.com
Browser testing www.crossbrowsertesting.com
www.browserstack.com
www.spoon.net
www.saucelabs.com
www.multibrowserviewer.com
Mobile devices www.appthwack.com
www.deviceanywhere.com
www.mobilexweb.com/emulators
www.opendevicelab.com
@OptimiseOrDie
#7 : MAKE MONEY FROM TESTING!
25. • Google Content Experiments bit.ly/Ljg7Ds
• Optimizely www.optimizely.com
• Visual Website Optimizer www.visualwebsiteoptimizer.com
• Multi Armed Bandit Explanation bit.ly/Xa80O8
• New Machine Learning Tools: www.conductrics.com, www.rekko.com
@OptimiseOrDie
#8 : MAKE MORE MONEY FROM TESTING!
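The multi-armed bandit link above explains the idea; as a companion, here is a minimal epsilon-greedy sketch (the variant names and conversion rates are invented for illustration) of how bandit-style tools shift traffic toward the better variant while still exploring:

```python
# Minimal epsilon-greedy bandit: explore 10% of the time,
# otherwise send traffic to the best-performing variant so far.
import random

true_rates = {"A": 0.05, "B": 0.07}   # hidden "real" conversion rates
shows = {v: 0 for v in true_rates}
wins = {v: 0 for v in true_rates}

def choose(epsilon: float = 0.1) -> str:
    if random.random() < epsilon:
        return random.choice(list(true_rates))            # explore
    return max(shows, key=lambda v: wins[v] / max(shows[v], 1))  # exploit

random.seed(1)
for _ in range(20_000):
    v = choose()
    shows[v] += 1
    wins[v] += random.random() < true_rates[v]

for v in true_rates:
    print(f"{v}: shown {shows[v]:>6}, observed rate {wins[v] / shows[v]:.3f}")
```

Run it and most traffic ends up on variant B, without ever fully abandoning A – the trade-off the explainer above describes.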
30. #10 : Your analytics tool is broken!
@OptimiseOrDie
31. • Get a Health Check for your Analytics
– Mail me for a free pack
• Invest continually in instrumentation
– Aim for at least 5% of dev time to fix + improve
• Stop shrugging : plug your insight gaps
– Change ‘I don’t know’ to ‘I’ll find out’
• Look at event tracking (Google Analytics)
– If set up correctly, you get wonderful insights
• Would you use paper instead of a till?
– You wouldn’t do it in retail so stop doing it online!
• How do you win F1 races?
– With the wrong performance data, you won‟t
@OptimiseOrDie
#10 : Your analytics tool is broken!
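As an illustration of the event tracking point above, here is a minimal sketch of firing a custom event at Google Analytics via the Universal Analytics Measurement Protocol – the tracking ID, category, action and label are placeholders, not values from the deck:

```python
# Send one custom event hit to the (Universal Analytics) collect endpoint.
import requests

requests.post("https://www.google-analytics.com/collect", data={
    "v": "1",                # protocol version
    "tid": "UA-XXXXXXX-1",   # your property ID (placeholder)
    "cid": "555",            # anonymous client ID
    "t": "event",            # hit type
    "ec": "checkout",        # event category (illustrative)
    "ea": "payment_error",   # event action (illustrative)
    "el": "card_declined",   # event label (illustrative)
    "ev": "1",               # event value
})
```

Instrumented like this, every pain point becomes a countable report instead of a shrug.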
33. Methodologies - Lean UX
Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas
Negative
– Often needs user test feedback to steer the development, as data alone is not enough
– Bosses distrust stuff where the outcome isn’t known
“The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.”
34. Agile UX / UCD / Collaborative Design
Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when
using Agile iterations)
Negative
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn’t work with waterfall, IMHO
Cycle: Research → Concept → Wireframe → Prototype → Test → Analyse
“An integration of User Experience Design and Agile* Software Development Methodologies”
*Sometimes
36. Lean Optimisation
Positive
– A blend of several techniques
– Multiple sources of qual and quant data aid triangulation
– Focus on priority opportunities drives unearned value and customer delight for all products
Negative
– Needs a one-team approach with a strong PM who is a polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be surprised though!
“A blend of User Experience Design, Agile PM, rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.”
38. We believe that doing [A] for people [B] will make outcome [C] happen. We’ll know this when we observe data [D] and obtain feedback [E]. (reverse)
@OptimiseOrDie
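The template above can be treated as a structured record. A minimal sketch – the field names and the worked example are mine, not from the deck:

```python
# Structured version of the hypothesis template: fill [A]..[E], get a sentence.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    change: str      # [A] what we will do
    audience: str    # [B] who it is for
    outcome: str     # [C] the outcome we expect
    data: str        # [D] the data that will confirm it
    feedback: str    # [E] the qualitative feedback that will confirm it

    def __str__(self) -> str:
        return (f"We believe that doing {self.change} for {self.audience} "
                f"will make {self.outcome} happen. We'll know this when we "
                f"observe {self.data} and obtain {self.feedback}.")

print(Hypothesis("a one-page checkout", "mobile visitors",
                 "a higher completion rate", "a lift in funnel conversion",
                 "fewer support complaints"))
```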
39. agile - Summary
• Design your own methodology
Experiment and optimise with your team
• Don’t be a slave
The methodology is the slave, not your master
http://tcrn.ch/1gPpUNo
• Collaborative working
– Harvard study into teams – it’s an all-the-time thing
• Ask me later…
Questions – see me on Twitter, G+ or ask by mail
@OptimiseOrDie
42. • 20M+ visitor tests with People Images
• Some interesting stuff at Autoglass (Belron)
• Negative body language is a turnoff
• Uniforms and branding a positive (ball cap)
• Eye gaze and smile are crucial
• Hands are awkward without a prop
• Best prop tested was a clipboard
• Single image better than groups
• In most countries (out of 33) where strong female and male images were both tested, the female image won
• So – a question about this test
@OptimiseOrDie
#2 : SPLIT TESTING PEOPLE
44. Terrible Stock Photos : headsethotties.com & awkwardstockphotos.com
Laughing at Salads : womenlaughingwithsalad.tumblr.com
Other Stock Memes : linkli.st/optimiseordie/7Fdxz
BBC Fake Smile Test : bbc.in/5rtnv @OptimiseOrDie
46. TV advert test: TV off vs TV on
Isi went on to star in the TV slot and helped Autoglass grow recruitment of female technicians, as well as proving a point!
#3 : TV ADVERTISING
48. @OptimiseOrDie
#4 : VOC, NPS, EXPERIMENTS
• Belron NPS programme is huge
• Millions of people every year, across the world
• 35% survey takeup, 6% dropout rate!
• (Try @lukew and @cjforms and @stickycontent)
• Higher scores than some consumer products
• Why? Measuring the drivers of delight
• Even on A/B tests, we could split NPS data
• We could see a new funnel drove a 5.5% rise
• Lovefilm beat their competitors using NPS
• How? Measuring key service metrics
• Regression to find high value investment areas
• Contact deflection using self service
• Analytics, split testing, UX
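On the regression point above, a minimal sketch of using least squares to see which service metrics drive NPS – all data and metric names here are invented for illustration, not Belron’s:

```python
# Fit NPS against service-metric scores; the largest coefficients point at
# the highest-value investment areas.
import numpy as np

# Columns: wait time score, technician friendliness, ease of booking (1-10)
X = np.array([[7, 9, 8], [4, 6, 5], [9, 9, 9], [3, 8, 4], [6, 7, 7]], float)
nps = np.array([60, 10, 75, 5, 40], float)

# Ordinary least squares with an intercept column appended
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, nps, rcond=None)

for name, c in zip(["wait time", "friendliness", "ease of booking"], coef):
    print(f"{name}: {c:+.1f} NPS points per unit of score")
```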
49. How is it working out for Craig?
• Methodologies are not Real Life ™
• It’s mainly about the mindset of the team and
managers, not the tools or methodologies used
• Not all my clients have all the working parts
• Use some, any techniques instead of ‘guessing’
• Bringing together UX techniques with the excellent tools available – along with analytics investment – will bring you successful and well-loved products
• Blending Lean and Agile UX with conversion optimisation techniques (analytics, split testing, Kaizen, Kano) is my critical insight from the last 5 years
• UX got hitched to numbers, they ran away and lived happily ever after
50. If it isn’t working, you’re not doing it right
@OptimiseOrDie
52. RESOURCE PACK
• Maturity model
• Best CRO people on twitter
• Best Web resources
• Good recent books to read
• Triage and Triangulation
• The Bucket outcome methodology
• Belron methodology example
• CRO and testing resources
• Companies and people to watch
• Building a ring model
• Manual Models for Analytics
@OptimiseOrDie
53. Maturity model – five levels, assessed on Mission, Culture, Process, Testing focus, Analytics focus and Insight methods:

Level 1 – Starter Level
• Mission: Get buy-in
• Culture: Ad hoc, local heroes, chaotic good
• Process: Outline process
• Testing focus: Guessing, A/B testing, basic tools
• Analytics focus: Bounce rates, big-volume landing pages, no segments
• Insight methods: Analytics, surveys, contact centre, low-budget usability

Level 2 – Early maturity
• Mission: Prove ROI
• Culture & process: Small team working the low-hanging fruit
• Testing focus: + Multivariate, session replay
• Analytics focus: + Funnel analysis, low-converting & high-loss pages
• Insight methods: + Regular usability testing/research, prototyping, session replay, onsite feedback

Level 3 – Serious testing
• Mission: Scale the testing
• Culture: Dedicated team
• Process: Well developed
• Testing focus: Systematic tests, volume opportunities, micro testing
• Analytics focus: + Funnel optimisation, call tracking, some segments
• Insight methods: + User-centred design, layered feedback, mini product tests

Level 4 – Core business value
• Mission: Mine value
• Culture: Cross-silo team
• Process: Streamlined
• Testing focus: + Cross-channel testing, integrated CRO and analytics, segmentation
• Analytics focus: + Funnel fixes, forms analytics, channel switches, offline integration, single-channel picture
• Insight methods: + Customer sat scores tied to UX, rapid iterative testing and design

Level 5 – You rock, awesomely
• Mission: Continual improvement
• Culture: Ninja team – testing in the DNA
• Process: Company wide
• Testing focus: + Spread tool use, dynamic adaptive targeting, machine learning, realtime
• Analytics focus: Multichannel funnels, cross-channel synergy, + all-channel view of customer, driving offline using online
• Insight methods: All promotion driven by testing
57. Triage and Triangulation
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports:
• Segmented reporting, Traffic sources, Device viewport and
browser, Platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something
as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
“This is where the smarts of CRO are – in identifying the
easiest stuff to test or fix that will drive the largest uplift.”
@OptimiseOrDie
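A minimal sketch of the uplift-vs-difficulty quadrant described above – the opportunities and scores are hypothetical:

```python
# Score each opportunity on estimated uplift and difficulty, then sort so the
# high-uplift, low-effort items (the ones to WORK first) come out on top.
opportunities = {
    # name: (estimated uplift 1-10, difficulty 1-10)
    "Fix broken error messages on checkout": (8, 2),
    "Redesign homepage hero": (4, 7),
    "Add guest checkout": (9, 6),
    "Tweak button colour": (1, 1),
}

def quadrant(uplift: int, effort: int) -> str:
    high_uplift, low_effort = uplift >= 5, effort <= 5
    if high_uplift and low_effort:
        return "DO FIRST"
    if high_uplift:
        return "plan carefully"
    if low_effort:
        return "quick win, low value"
    return "avoid"

for name, (uplift, effort) in sorted(opportunities.items(),
                                     key=lambda kv: kv[1][1] - kv[1][0]):
    print(f"{quadrant(uplift, effort):>20} | uplift {uplift}, effort {effort} | {name}")
```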
58. The Bucket Methodology
“Helps you to stream actions from the insights and prioritisation work.
Forces an action for every issue, a counter for every opportunity being lost.”
Test
If there is an obvious opportunity to shift behaviour, expose insight or
increase conversion – this bucket is where you place stuff for testing. If
you have traffic and leakage, this is the bucket for that issue.
Instrument
If an issue is placed in this bucket, it means we need to beef up the
analytics reporting. This can involve fixing, adding or improving tag or
event handling on the analytics configuration. We instrument both
structurally and for insight in the pain points we’ve found.
Hypothesise
This is where we’ve found a page, widget or process that’s just not working
well but we don’t see a clear single solution. Since we need to really shift
the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by
evidence and data, we’ll create test plans to find the answers to the
questions and change the conversion or KPI figure in the desired direction.
Just Do It
JFDI (Just Do It) – is a bucket for issues where a fix is easy to identify or the
change is a no-brainer. Items marked with this flag can either be deployed
in a batch or as part of a controlled test. Stuff in here requires low effort
or are micro-opportunities to increase conversion and should be fixed.
Investigate
You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
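A minimal sketch (with hypothetical issues) of streaming prioritised findings into the five buckets just described:

```python
# Each issue from triage gets exactly one bucket, forcing an action per issue.
from enum import Enum

class Bucket(Enum):
    TEST = "traffic + leakage + a clear opportunity to shift behaviour"
    INSTRUMENT = "analytics tagging needs fixing or extending first"
    HYPOTHESISE = "clearly broken, but no single obvious fix - brainstorm"
    JFDI = "no-brainer fix, low effort - just deploy it"
    INVESTIGATE = "needs more digging or device testing to triangulate"

issues = [
    ("Basket page leaks 40% of mobile visitors", Bucket.TEST),
    ("No event tracking on form errors", Bucket.INSTRUMENT),
    ("Delivery widget underperforms, cause unclear", Bucket.HYPOTHESISE),
    ("Typo in checkout button label", Bucket.JFDI),
    ("Crash reports only on one Android build", Bucket.INVESTIGATE),
]

for description, bucket in issues:
    print(f"{bucket.name:<12} {description}")
```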
59. 5 - Belron example – Funnel replacement
Final prototype → Usability issues left → Final changes → Release build
Kickoffs: Legal review · Customer services review · Marketing review
Test plan signoff (Legal, Marketing, Contact Centre)
Instrument analytics · Instrument contact centre · Offline tagging
QA testing → End-to-end testing
Launch 90/10% → Monitor → Launch 80/20% → Monitor < 1 week → Launch 50/50% → Go live 100%
Analytics review → Washup and actions → New hypotheses → New test design → Rinse and repeat!
61. So you want examples?
Examples of companies putting this stuff together in a good way:
• Belron – Ed Colley
• Dell – Nazli Yuzak
• Shop Direct – Paul Postance (now with EE)
• Expedia – Oliver Paton
• Schuh – Stuart McMillan
• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann
• Gov.uk – Adam Bailin (now with the BBC)
Read the gov.uk principles : www.gov.uk/designprinciples
And my personal favourite of 2013 – Airbnb!
@OptimiseOrDie
62. #1 Building a Model
#1 Avinash article
#2 The Ring Model
#3 3 examples
#4 Benefits
#5 Summary
63. 6.1 – Avinash “See-Think-Do”
• Avinash Kaushik, analytics guru, proposes a very nice model for marketing. A brilliant article can be read here:
• http://www.kaushik.net/avinash/see-think-do-content-marketing-measurement-business-framework/
• But this sort of thinking is also relevant to optimisation
• CRO often focuses purely on the ‘Do’ stage – rather than the ‘See’ or ‘Think’ stages.
65. 6.2 – The Ring Model
• Simply looking at conversion points is not enough
• We need a way to look at the ‘layers’ or ‘levels’ reached
• So I developed a ring or engagement model
• This works for many (but not all) websites
• Focuses on depth of engagement, not pages viewed
• Helps to see the key loss steps, like a funnel
• It’s not a replacement for funnel diagrams
• It helps to see the ‘big picture’ involved
• So – let’s try some examples
71. 6.4 – Benefits
• Helps you see where flow is ‘stuck’
• Sorts out small opportunities from big wins
• Ignores pages in favour of ‘Macro’ and ‘Micro’ conversions
• Lets you show the client where focus should be
• Helps flush flow or traffic through to lower levels
• Avoids prioritising the wrong part of the model!
• Example – the Shoprush problem is basket adds, not checkout
• If you had 300k product page views, 5k basket adds and 1k checkouts – where would your problem be?
• If you had 300k product page views, 100k basket adds and 1k checkouts – it’s a different place! (Both scenarios are worked through below.)
• Example – a Google AdWords site has a traffic problem, not a conversion problem
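Working the two scenarios above through in code (the counts are the ones on the slide):

```python
# Layer-to-layer conversion shows where the ring model says the leak is.
views, adds_a, adds_b, checkouts = 300_000, 5_000, 100_000, 1_000

# Scenario 1: 300k product views -> 5k basket adds -> 1k checkouts
print(f"add rate {adds_a/views:.1%}, checkout rate {checkouts/adds_a:.0%}")
# -> add rate 1.7%, checkout rate 20%: the leak is basket adds

# Scenario 2: 300k product views -> 100k basket adds -> 1k checkouts
print(f"add rate {adds_b/views:.1%}, checkout rate {checkouts/adds_b:.0%}")
# -> add rate 33.3%, checkout rate 1%: the leak is the checkout
```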
72. 6.5 – Benefits contd.
• A nice simple way to visualise complex websites
• Does not rely on pages – more ‘steps’ or ‘layers’
• Helps you see where traffic is ‘stuck’ or ‘failing to engage more deeply’
• The combination of traffic potential, UX and persuasion issues identifies opportunity
• Avoids visual bias when doing an expert review
• In the e-commerce example, Rush optimised the product page first, not the homepage
• Questions?
74. #5 By Hand Analytics
#1 When to use this method?
#2 How to use it
#3 Demo
#4 Limitations
75. 5.1 – When to use this method
• If goals are unreliable / broken / have no data
• If flows are mixed in funnels (mid stage joiners)
• If the conceptual model does not match site config
• When the data you need does not exist
76. 5.2 – How to use this method
• For example, with a funnel
• Use UNIQUE PAGEVIEWS (and events, if available)
• Do NOT mix with pageviews or visitor counts
• Step 1 – Basket UPVs
• Step 2 – Customer details
• Step 3 – Shipping
• Step 4 – Payment
• Step 5 – Thank you
• Use regex / advanced segments to aggregate or filter
• Gives you a unique count of people at steps
• Always be aware of time periods!
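A minimal sketch of the hand-built funnel described above, using unique pageviews per step – the counts are made up for illustration:

```python
# Hand-built funnel from UNIQUE pageviews (UPVs) per step.
# Never mix UPVs with pageviews or visitor counts, and keep every step
# to the same date range.
steps = [
    ("Basket",           42_000),
    ("Customer details", 21_500),
    ("Shipping",         18_200),
    ("Payment",          15_900),
    ("Thank you",        14_700),
]

top = steps[0][1]
prev = top
for name, upv in steps:
    print(f"{name:<16} {upv:>7}  step: {upv/prev:6.1%}  cumulative: {upv/top:6.1%}")
    prev = upv
```

The step column shows where people fall out between adjacent steps; the cumulative column is the view a client usually asks for.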
77. 5.3 – Limitations & Benefits
• Mixing and matching data can look nice but causes issues
• Time consuming and more complex
• Try to use in-page filters not advanced segments (sampling)
• Is not readily replayed by client
Some benefits though:
• Construct segmented funnels
• Split by other data attributes
• Very good way to spot variances inside funnels
• Vital for multi-device category websites
78. END SLIDES
Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck.
If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear!
Regards,
Craig.
Editor’s Notes
And here’s a boring slide about me – and where I’ve been driving over 400M of additional revenue in the last few years. In two months this year alone, I’ve found an additional ¾M pounds of annual profit for clients. For the sharp-eyed among you, you’ll notice that Lean UX hasn’t been around since 2008 – many startups and teams were doing this stuff before it got a new name, even if the approach was slightly different. For the last 4 years, I’ve been optimising sites using the combination of techniques I’ll show you today.
These are the results of a live test on a site, where an artificial delay is introduced in the performance testing. I’ve done some testing like this myself on desktop and mobile sites and can confirm this is true – you’re increasing bounce rate and decreasing conversion and site engagement. It doesn’t matter what metric you use: performance equals MONEY – or, if not measured, a HUGE LOSS.
Performance also harms the lifeblood of e-commerce and revenue generating websites – repeat visitors! The gap here in one second of delay is enormous over time. You’re basically sucking a huge portion of potential business out of your site, with every additional bit of waiting time you add.
These are all people on twitter who cover hybrid stuff – where usability, psychology, analytics and persuasive writing collide. If you follow this lot, you’ll be much smarter within a month, guaranteed.
And here are the most useful resources I regularly use or share with people. They have the best and most practical advice – cool insights but with practical applications. A special mention here to my friends at PRWD, who are one of the few companies blending Psychology, Split Testing and UX for superb gains in rapid time. Check out the resources section on their website.
So – what’s driving this change then? Well, there have been great books on selling and persuading people – all the way back to ‘Scientific Advertising’ in 1923. And my favourite here is the Cialdini work – simply because it’s a great help for people to find practical uses for these techniques. I’ve also included some analytics and testing books here – primarily because they help so MUCH in augmenting our customer insight, testing and measurement efforts. There are lots of books with really cool examples, great stories and absolutely no fucking useful information you can use on your website – if you’ve read some of these, you’ll know exactly what I mean. These are the tomes I got most practical use from and I’d recommend you buy the whole lot – worth every penny.