Craig Sullivan, Senior Optimisation Consultant, covers the top 10 reasons why your conversion rate might suck. Packed with actionable tips and resources, this presentation is for anyone wanting to improve their conversion optimisation. Craig covers common problems and topic areas such as Google Analytics setup, inputs, tools, testing and testing cycles, product cycles, photo UX, how to analyse statistics and data, segmentation, and multi-channel optimisation. The resource pack also includes a maturity model, crowd-sourced UX, collaborative tools, testing tools for CRO & QA, a Belron methodology example, and CRO and testing resources.
17. Insight - Inputs
Opinion, cherished notions, marketing whims, cosmic rays, not ‘on brand’ enough, ego, IT inflexibility, panic, internal company needs, #FAIL, competitor change, an article the CEO read, some dumbass consultant, competitor copying, dice rolling, guessing, knee-jerk reactions, shiny feature blindness.
18. Insight - Inputs
Usability testing, forms analytics, search analytics, voice of customer, market research, eye tracking, customer contact, A/B and MVT testing, big & unstructured data, social analytics, session replay, web analytics, segmentation, sales and call centre, surveys, customer services, competitor evals.
41.
• Invest continually in Analytics instrumentation, tools & people
• Use an Agile, iterative, cross-silo, one-team project culture
• Prefer collaborative tools to having lots of meetings
• Prioritise development based on numbers and insight
• Practice real continuous product improvement, not SLED
• Source photos and copy that support persuasion and utility
• Have cross-channel, cross-device design, testing and QA
• Segment their data for valuable insights, every test or change
• Continually try to reduce cycle (iteration) time in their process
• Blend ‘long’ design, continuous improvement AND split tests
• Make optimisation the engine of change, not the slave of ego
• See the Maturity Model in the resource pack
42.
• Belron – Ed Colley
• Dell – Nazli Yuzak
• Shop Direct – Paul Postance (now with EE)
• Expedia – Oliver Paton
• Schuh – Stuart McMillan
• TSR Group – Pete Taylor
• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann
• Gov.uk – Adam Bailin (now with the BBC)
Read the gov.uk principles: www.gov.uk/designprinciples
And my personal favourites of 2013 – Airbnb and Expensify
56. The Maturity Model
Level 1: Starter Level | Level 2: Early maturity | Level 3: Serious testing | Level 4: Core business value | Level 5: You rock, awesomely
• Culture: Local heroes, chaotic good → Small team, low-hanging fruit → Dedicated team, volume opportunities → Cross-silo team, systematic tests → Ninja team, company-wide, testing in the DNA
• Process: Ad hoc → Outline process → Well developed → Streamlined
• Testing focus: Guessing → A/B testing, basic tools, session replay → + Spread tool use, funnel optimisation → + Multivariate testing → + Cross-channel testing
• Segmentation: No segments → Some segments, call tracking → Micro testing → Integrated CRO and analytics, realtime → Dynamic adaptive targeting, machine learning
• Analytics focus: Bounce rates, low-converting landing pages, single-channel picture → + Funnel analysis, big-volume & high-loss pages → + Funnel fixes, forms analytics → + Cross-channel funnels, channel switches → Multichannel + offline integration
• Insight methods: Analytics, surveys, contact centre → Low-budget usability testing/research, session replay, onsite feedback → + Regular usability, prototyping, voice of customer, layered feedback → + User-centred design, customer-sat scores tied to UX, mini product tests → Rapid iterative testing and design, cross-channel synergy, all-channel view, driving offline using online, all promotion driven by testing
• Mission: Get buy-in → Prove ROI → Scale the testing → Mine value → Continual improvement
57. 2 - UX Crowd tools
Remote UX tools (P = Panel, S = Site recruited, B = Both):
• Usertesting (B) – www.usertesting.com
• Userlytics (B) – www.userlytics.com
• Userzoom (S) – www.userzoom.com
• Intuition HQ (S) – www.intuitionhq.com
• Mechanical Turk (S) – www.mechanicalturk.com
• Loop11 (S) – www.loop11.com
• Open Hallway (S) – www.openhallway.com
• What Users Do (P) – www.whatusersdo.com
• Feedback Army (P) – www.feedbackarmy.com
• Userfeel (P) – www.userfeel.com
• Ethnio (for recruiting) – www.ethnio.com
Feedback on prototypes / mockups:
• Pidoco – www.pidoco.com
• Verify from Zurb – www.verifyapp.com
• Five Second Test – www.fivesecondtest.com
• Conceptshare – www.conceptshare.com
• Usabilla – www.usabilla.com
63. 3.5 - Google Docs and Automation
• Lots of people don’t know this
• Serious time is getting wasted on pulling and preparing data
• Use the Google API to roll your own reports straight into Big G
• Google Analytics + API + Google Docs integration = A BETTER LIFE!
• Hack your way to having more productive weeks
• Learn how to do this to make completely custom reports (a sketch follows below)
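As one concrete illustration of the idea, here is a minimal Python sketch that pulls a report through the Google Analytics (v3) Core Reporting API with google-api-python-client and writes it out as CSV. The view ID, key file and metric choices are placeholders; the same rows could equally be pushed into a Google Sheet (e.g. via Apps Script or gspread) as the slide suggests.

```python
import csv

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders: your GA view (profile) ID and a service-account key file.
VIEW_ID = "ga:12345678"
KEY_FILE = "service-account.json"

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"]
)
analytics = build("analytics", "v3", credentials=credentials)

# One API call replaces a weekly copy-paste session: sessions and
# transactions for the last 30 days, split by traffic medium.
report = analytics.data().ga().get(
    ids=VIEW_ID,
    start_date="30daysAgo",
    end_date="today",
    metrics="ga:sessions,ga:transactions",
    dimensions="ga:medium",
).execute()

# Write the rows somewhere a spreadsheet can pick them up.
with open("report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([h["name"] for h in report["columnHeaders"]])
    writer.writerows(report.get("rows", []))
```

Once this runs on a schedule, the "pulling and preparing data" step disappears and every report becomes a template you refresh, not a chore you repeat.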
69. 5 – Methodologies - Lean UX
“The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.”
Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas
Negative
– Often needs user test feedback to steer the development, as data alone is not enough
– Bosses distrust stuff where the outcome isn’t known
70. 5 - Agile UX / UCD / Collaborative Design
“An integration of User Experience Design and Agile* Software Development Methodologies” (*Sometimes)
(Diagram: an iterative cycle of Concept, Research, Wireframe, Prototype, Test, Analyse)
Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when using Agile iterations)
Negative
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn’t work with waterfall IMHO
72. 5 - Lean Conversion Optimisation
“A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.”
Positive
– A blend of several techniques
– Multiple sources of Qual and Quant data aids triangulation
– CRO analytics focus drives unearned value inside all products
Negative
– Needs a one-team approach with a strong PM who is a polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be surprised though!
74. 5 - Triage and Triangulation
“This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift.”
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports: segmented reporting, traffic sources, device viewport and browser, platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters (sketched below)
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
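A hypothetical sketch of that quadrant step in Python: score each opportunity on estimated uplift and on difficulty, then bucket it so the high-uplift, low-difficulty cluster surfaces first. The opportunities, scores and quadrant labels here are made up for illustration, not taken from the deck.

```python
# Illustrative only: opportunities scored 1-10 for estimated uplift and
# for difficulty (time/resource/complexity/risk). The entries are made up.
opportunities = [
    {"name": "Fix mobile checkout button", "uplift": 8, "difficulty": 2},
    {"name": "Rebuild quote funnel",       "uplift": 9, "difficulty": 9},
    {"name": "Reword error messages",      "uplift": 3, "difficulty": 1},
    {"name": "New recommendation engine",  "uplift": 4, "difficulty": 8},
]

def quadrant(opp, threshold=5):
    """Place an opportunity in one of four value clusters."""
    high_value = opp["uplift"] >= threshold
    easy = opp["difficulty"] < threshold
    if high_value and easy:
        return "1. Quick wins - WORK these first"
    if high_value:
        return "2. Big bets - plan and resource properly"
    if easy:
        return "3. Tidy-ups - batch into quick fixes"
    return "4. Money pits - park or drop"

# Sort so the easiest, highest-uplift items come out on top.
for opp in sorted(opportunities,
                  key=lambda o: o["uplift"] - o["difficulty"],
                  reverse=True):
    print(f'{quadrant(opp)}: {opp["name"]}')
```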
75. 5 - The Bucket Methodology
“Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost.”
Test – If there is an obvious opportunity to shift behaviour, expose insight or increase conversion, this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.
Instrument – If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling in the analytics configuration. We instrument both structurally and for insight in the pain points we’ve found.
Hypothesise – This is where we’ve found a page, widget or process that’s just not working well but we don’t see a clear single solution. Since we need to really shift the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by evidence and data, we’ll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.
Just Do It – JFDI is the bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can be deployed either in a batch or as part of a controlled test. Stuff in here requires low effort or represents micro-opportunities to increase conversion, and should simply be fixed.
Investigate – You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
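One way to read the buckets is as a simple decision rule. The sketch below is my own illustrative interpretation: the predicate names and their ordering are assumptions, not a prescribed algorithm from the deck.

```python
def assign_bucket(issue):
    """Illustrative triage rule: route an issue to one of the five buckets.

    `issue` is a dict of booleans; the field names are hypothetical.
    """
    if issue.get("needs_more_information"):
        return "Investigate"        # dig further before acting
    if not issue.get("analytics_coverage_ok"):
        return "Instrument"         # beef up tagging/event handling first
    if issue.get("fix_is_obvious") and issue.get("low_effort"):
        return "Just Do It"         # deploy in a batch or a controlled test
    if issue.get("clear_single_solution"):
        return "Test"               # obvious opportunity: put it under test
    return "Hypothesise"            # brainstorm, then build a test plan

print(assign_bucket({"analytics_coverage_ok": True,
                     "fix_is_obvious": True,
                     "low_effort": True}))   # -> Just Do It
```

The point of forcing every issue through a rule like this is exactly what the slide says: no issue leaves the triage session without an action attached.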
76. 5 - Belron example – Funnel replacement
Final prototype → Usability issues left → Final changes → Release build → Legal review kickoff → Marketing review → Cust services review kickoff → Instrument analytics → Instrument Contact Centre → Offline tagging → Test Plan → Signoff (Legal, Mktng, CCC) → QA testing → End-End testing → Launch 90/10% → Monitor → Launch 80/20% → Launch 50/50% → Monitor < 1 week → Go live 100% → Analytics review → Washup and actions → New hypotheses → New test design → Rinse and Repeat!
78. END SLIDES
Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck.
If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear!
Regards,
Craig.