As many as 72% of your "successful" A/B tests may be driving no business benefit, or may actually be harming your bottom line, because poor methodology prevents you from detecting the truly successful ideas.
In this webinar, Qubit explains how to avoid this problem. With special guests from Forrester, who explain how A/B testing fits into the digital landscape, and Staples, who give practical advice on setting up an A/B testing campaign, this webinar will change the way you think about A/B testing forever.
Would you bet your job on your A/B test results?
1. Would you bet your job on
your A/B test results?
@QubitGroup #QubitABtest
Friday, 28 February 14
2. Agenda
• 5 digital experience technology trends that will guide
your technology investment
— Anjali Yakkundi, Forrester Research, Inc.
• 5 fundamental A/B testing mistakes and how to
avoid them
—Will Browne, Qubit
• How to run a successful A/B testing program
—Melanie Kyrklund, Staples
• Q&A
3. 5 Digital Experience Technology Trends
To Guide Your Investment
Anjali Yakkundi, Analyst
February 27, 2014
26. 5 fundamental
A/B testing mistakes
and how to avoid them
Will Browne, Qubit
will.browne@qubitdigital.com
30. A/B testing is an essential tool
• It is the best, simplest way to prove the business value of
your digital technology
• It is the only way to get close to proving cause and effect
in what you do
@TheGrahamCooke
34. But there are costs associated with A/B testing
• The monthly cost of your testing platform
• The salaries and time of your developers and team
• The lost revenue from implementing changes that do
nothing for your site (or actively harm it)
38. We’re here to show you how to
reduce these costs, and increase ROI
• Find the ideas that are likely to have a real
positive effect on user behavior (true
winners)
• Detect as many true winners as possible
• Minimize the chance of finding tests that
look like winners, but have no true effect
39. Find the ideas that are likely to have a real positive effect
on user behavior
41. Detect as many true winners as possible
We’ve found that tests based on data are 7.7x more effective than tests based on intuition alone (Qubit, 2013)
44. Imagine testing for weighted coins
• How many coin flips would be enough to prove that one coin was weighted?
• It is impossible to be 100% certain, but the more data you have, the closer you can get
• We follow scientific standards and wait until there is an 80% chance of detecting a difference
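The coin-flip logic can be sketched with the standard two-proportion sample-size formula (a textbook normal-approximation formula, not necessarily the calculation Qubit uses). The 1% baseline conversion rate and 5% relative uplift are borrowed from the Mr Bean example later in the deck:

```python
from statistics import NormalDist

def sample_size_per_arm(p_base, rel_uplift, alpha=0.05, power=0.80):
    """Standard two-proportion sample-size formula (normal approximation).
    Returns visitors needed per arm to detect the uplift with the given
    power at a two-sided significance level alpha."""
    p1 = p_base
    p2 = p_base * (1 + rel_uplift)
    p_bar = (p1 + p2) / 2
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96
    z_b = NormalDist().inv_cdf(power)           # ~0.84
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return num / (p2 - p1) ** 2

# A 1% baseline and a 5% relative uplift need hundreds of
# thousands of visitors per arm -- far more than intuition suggests.
n = sample_size_per_arm(0.01, 0.05)
```

This is why waiting for the planned sample size matters: with low conversion rates and small uplifts, stopping early almost guarantees an underpowered test.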
48. Minimize the chance of finding tests that look like winners, but have no true effect
• We generally assume that with good testing there is a 5% false positive rate
• We can test this using A/A tests (showing both groups the exact same thing)
• So out of 100 A/A tests, 5 would be winners, despite doing absolutely nothing
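The A/A claim is easy to check by simulation. The sketch below is my own illustration (assuming a 10% conversion rate and 2,000 visitors per arm): it runs 1,000 A/A tests through a standard two-proportion z-test and counts how many come out "significant" at the 5% level:

```python
import random
from statistics import NormalDist

def aa_test_pvalue(n, p, rng):
    """Simulate one A/A test: both groups see the identical page,
    then run a two-sided two-proportion z-test on the results."""
    a = sum(rng.random() < p for _ in range(n))
    b = sum(rng.random() < p for _ in range(n))
    pooled = (a + b) / (2 * n)
    se = (2 * pooled * (1 - pooled) / n) ** 0.5
    if se == 0:
        return 1.0
    z = (a / n - b / n) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

rng = random.Random(42)
trials = 1000
false_positives = sum(aa_test_pvalue(2000, 0.10, rng) < 0.05
                      for _ in range(trials))
# Roughly 5% of identical-vs-identical tests "win" by chance alone.
```

Around 50 of the 1,000 A/A tests declare a winner despite there being nothing to find, matching the deck's 5-in-100 figure.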
51. Validate your results
We recommend a 95/5 split to confirm your winning variation is still winning
“Most winning A/B test results are illusory”, Qubit 2014
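A minimal sketch of what a 95/5 validation split could look like (my own illustration; a real system would assign visitors deterministically, e.g. by hashing a user ID, so each visitor always sees the same variant):

```python
import random

def assign(visitor_rng):
    """95/5 validation split: roll out the winner to 95% of traffic and
    keep a 5% holdback on the original, so you can confirm the winning
    variation is still winning after implementation."""
    return "winner" if visitor_rng.random() < 0.95 else "holdback"

rng = random.Random(7)
counts = {"winner": 0, "holdback": 0}
for _ in range(10_000):
    counts[assign(rng)] += 1
# The holdback stays small enough not to cost much revenue, but large
# enough to detect whether the "win" was real.
```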
53. A worked example: Imagine Mr Bean’s jelly beans shop
• Mr Bean has a 1% conversion rate
• He runs 20 different tests in one year
• There are 2 good tests with uplifts of 5%
• There are also 2 bad tests with uplifts of -5%
“Most winning A/B test results are illusory”, Qubit 2014
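Combining Mr Bean's numbers with the deck's earlier assumptions (80% power, 5% false positive rate), a rough expected-value calculation shows why many reported winners are illusory even with sound methodology. The simplification that every no-better-than-control test has an alpha chance of a spurious win is mine:

```python
# Mr Bean's 20 tests: 2 truly good (+5%), 2 truly bad (-5%), 16 null.
good, bad, null = 2, 2, 16
power, alpha = 0.80, 0.05   # 80% power, 5% false positive rate

# Expected real wins detected: each good test is found 80% of the time.
true_winners = good * power             # 1.6
# Expected illusory wins: each of the 18 not-actually-better tests
# has roughly an alpha chance of a spurious "win" (simplification).
false_winners = (bad + null) * alpha    # 0.9

share_illusory = false_winners / (true_winners + false_winners)  # ~0.36
```

Even with correct methodology, roughly a third of expected "winners" are illusory, which is why the 95/5 validation step above is worth the traffic it costs.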
61. Correct method vs. naive method

                      Correct method   Naive method
Reported uplift       2 x 5%           9 x (5-40)%
Delivered uplift      2 x 5%           2 x 5% + (? x -5%)
Wasted resource       None             7 unnecessary tests
Customer experience   Improved         Degraded
Your understanding    Improved         Confused
Your manager          Happy            Angry
62. So, to conclude, those
5 mistakes are:
• Mistake 1: Not A/B testing
• Mistake 2: Not prioritizing tests properly
• Mistake 3: Not waiting for significance and
sample size
• Mistake 4: Testing too many things at once
• Mistake 5: Not validating the tests after
implementation
63. Questions you should ask your
testing provider
• How do you prioritize tests?
• How do you calculate the sample
size required?
• How do you account for the multiple
testing problem?
• How do you validate winning tests?
64. Thank you
If you have any detailed questions please contact
research@qubitdigital.com
Will Browne, Qubit
will.browne@qubitdigital.com
65. How to run a successful A/B testing program
Melanie Kyrklund
Senior Ecommerce Analyst
66. Testing programs have the potential to drive the
online customer experience
• Understand your customers and respond to their needs
• Optimize the user journey
• Create personalized experiences
• Quickly respond to market pressures
• Rapidly align marketing & onsite experience
• Validate development items
Areas covered: customer experience; personalization; persuasion; targeted offers; user experience, functionality & content
Every visit represents an opportunity to establish a relationship with a customer
67. The reality of running a testing program
For many organizations, testing represents a cultural change with
many political and operational barriers to be overcome:
• Resource & operational factors
• Buy-in / multiple stakeholders
• Alignment with other business priorities and development plans
• Instilling a data-driven culture
68. A lot of testing is required to deliver results
Only 1 in 5 to 1 in 10 tests will deliver statistically significant wins
A rapid pace of testing is required to build a program that delivers consistent results month over month
69. Two factors are critical in building a successful
testing program
• Approach: methodology & idea generation
• Speed: operational process & prioritization
70. The right approach to optimization
• Be inspired – every visit is an opportunity to establish a relationship with a customer
• Use data to understand how visitors are using your website, what is working and what isn’t
• Use customer feedback to understand why visitors are not purchasing on your website
• If working over multiple websites & countries, look for insight at local level first and then identify commonalities at a later stage
• Persuasion research: understand the psychology of your customers
• Supplement with quick UX testing
• Thoroughly research & quantify each opportunity (visitors reached, expected uplift and incremental revenue)
Areas covered: UX optimization, personalization, persuasion
75. Speed – Operational Processes
Operational efficiency
• Testing tool selection: map out the most efficient implementation process
• Pick managed services (analysis, implementation) as required
Prioritization
• Build a test pipeline logging: idea & hypothesis; expected uplift & incremental revenue; implementation estimates; test duration
• Prioritize tests weekly, to keep up the momentum of testing and focus on the biggest wins
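The test pipeline described above might be logged and ranked like this. The scoring rule (expected incremental revenue per day of total effort) and all the field names are my own illustration, not Staples' actual process:

```python
from dataclasses import dataclass

@dataclass
class TestIdea:
    hypothesis: str
    expected_revenue: float        # expected incremental revenue if it wins
    implementation_days: float     # developer effort estimate
    duration_days: float           # run time needed to reach sample size

def prioritize(pipeline):
    """Rank ideas by expected incremental revenue per day of total effort
    (implementation + run time). One simple scoring rule among many;
    teams weight these factors differently."""
    return sorted(
        pipeline,
        key=lambda t: t.expected_revenue
                      / (t.implementation_days + t.duration_days),
        reverse=True,
    )

pipeline = [
    TestIdea("Simplify checkout form", 40_000, 5, 28),
    TestIdea("New homepage hero", 10_000, 2, 14),
]
ranked = prioritize(pipeline)
```

Re-running the ranking weekly, as the slide suggests, keeps the focus on the biggest wins as estimates are updated.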
76. Conclusion: A rapid, data-driven approach is required
Testing programs have the potential to drive the online customer experience; however, a rapid, data-driven approach is required to deliver results:
• Plan the required resources upfront
• Consider ease of use, test time to market and managed services in tool selection
• Map out the most efficient end-to-end process to get tests live
• Support the process with data-driven idea generation & prioritization