Even after all of these years, more media is bought on the cheapness of the click than on any other KPI, even though the click has no correlation with real success. Using the latest measurement techniques across both mobile and desktop, Eyereturn will show the effects of the industry's long-standing addiction to the click, and what we can all do to help break the habit.
3. Clients are still focused on clicks after all these years!
In 2014, 74% of campaigns that included our DSP, eyeDemand, were client-directed to deliver either the highest CTR or the lowest CPC.
In 2015 YTD we're at 79%, so the number is actually trending up.
These clients included the largest holding companies, direct brands, and small agencies: everyone. But does anyone really think about the bigger picture?
4. So what's wrong with clicks?
More clicks on an ad = more clicks to my site.
More clicks to my site = more sales, newsletter signups, etc.
…and there isn't actually anything wrong with clicks. It's only problematic when they're used as the primary KPI. Or when it's just habit…
5. Eyereturn's part in the history of the click
We've been in the third-party ad-serving business since 2000. When we first started serving ads, they were GIFs, and they had to work on Netscape 4.
We're an ad server, DSP, and DMP, and we have the in-house data scientists who optimize campaigns, look for fraud, and do studies like this one.
10. The lowest-hanging fruit of fraud
Despite all the talk, all of the verification vendors, and all of the whitelists, click fraud is still out there.
Optimizing to a very easily gamed KPI like clicks will attract fraud.
Fraud is decreasing faster for clients who don't optimize to the click.
11. A lot of clicks are unintentional
In September 2015, YouTube changed its click policy: you used to be able to click anywhere on the video, but after the change, clicking through required hitting the tiny link at the bottom.
12. But what does this actually mean?
A small call to action = fewer clicks: 7,700 fewer clicks per 1 million impressions, a 0.77-point drop in CTR.
What happened to all those clicks? If users were really looking to click, wouldn't they have found the new link?
13. Some sites / ads are designed to generate clicks
Do you actively buy this kind of inventory?
15. The web has adapted to feed clicks to advertisers, whether those clicks are real or not
16. Clicks don't equal brand-site engagement
60.66% of ad clicks that make it to a landing page don't go any further than the landing page.
And the average number of page views is far lower for users who clicked on an ad.
21. Most campaigns that are optimized to the click end up on a lot of mobile devices
Why? Fat fingers: mobile devices generate a far higher CTR.
22. Clickers come from certain browsers
[Charts: click distribution by browser on a campaign optimized to the click vs. impression distribution by browser across all campaigns (Chrome, IE, Firefox, Safari, others). Safari's share increases 362% among clickers.]
23. The App Effect
More mobile = more app content.
Apps are harder to track and harder to measure.
App inventory is not fraud-free.
24. A lot of mobile clicks don't make it
Users change their minds after accidental clicks.
Mobile network latency and heavy landing pages mean many users drop off during redirects, before the landing page ever loads.
25. What happens when you optimize to CTR
[Chart: raw clicks vs. dropped redirects vs. "actual" performance.]
26. Side effects of skewing mobile
If your creative is not mobile-optimized but ends up on mobile inventory: not good.
If the brand site is not mobile-optimized but ends up on the screens of mobile users: not good.
If conversions are a secondary metric, more mobile and more Safari = more difficult to measure.
28. To recap:
• Most clicks are generated by a small number of users
• Fraud actively targets click-optimized campaigns
• Clicks do not equal conversions
• A lot of clicks are unintentional
• Inventory can be designed to trick users into clicking
• Clickers visit fewer pages on brand sites
• Mobile devices deliver high CTRs but low conversions
• More mobile = more discrepancies between clicks and site visits
29. The hamster wheel of clicks
High CTRs and low CPCs generated by poor-quality clicks become benchmarks that are hard to back away from.
If a campaign is evaluated purely on CPC or CTR, vendors who deliver those KPIs are rewarded with more business, perpetuating the cycle.
Who wants to tell a client that their past successes were built on a shaky foundation?
30. Strategies for getting away from CTR and CPC
1) Take stock:
- Are any of the issues described earlier affecting your campaigns?
- Take inventory of the kind of data you have.
- Determine the real KPIs that will drive the advertiser's business.
- How can you measure those KPIs?
31. What are your alternatives?
- What levers do you have available to pull?
- Is there data you can collect during a CTR/CPC campaign to support trying a different strategy?
- For example, compare CTR to CPA: does pulling the optimization lever on one affect the other? (A minimal sketch of that comparison follows.)
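A minimal sketch of that CTR-versus-CPA comparison. The phase names, impression counts, spend, and conversion figures below are all hypothetical placeholders, not data from this deck:

```python
# Compute CTR, CPC, and CPA per optimization phase and see whether
# pushing one lever moves the others in the same direction.

phases = {
    # phase name: (impressions, clicks, spend in $, conversions) - all hypothetical
    "optimize_to_CPC": (1_000_000, 9_500, 9_000.0, 40),
    "optimize_to_CPA": (1_000_000, 4_200, 12_000.0, 110),
}

for name, (imps, clicks, spend, convs) in phases.items():
    ctr = clicks / imps * 100   # click-through rate, %
    cpc = spend / clicks        # cost per click, $
    cpa = spend / convs         # cost per acquisition, $
    print(f"{name}: CTR {ctr:.2f}%  CPC ${cpc:.2f}  CPA ${cpa:.2f}")
```

If CPA worsens as CPC improves, the cheap clicks are not the clicks that convert, and that is exactly the evidence you need to support trying a different strategy.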
32. Provincial Travel Campaign
Phase 1 results:
Directed to optimize to conversions, eyeDemand achieved a 12% conversion rate (all other vendors were between 6% and 8%).
Phase 2 results:
The client changed the optimization target to a $1.25 CPC. Behavioural segments and conversion-lookalike targeting were removed to reduce CPMs. Conversions dropped by 50%.
33. Provincial Travel Campaign
Phase 3 results:
Reverting to the original CPA tactic increased CPM and CPC, but outperformed Phase 1 on CPA.
Moral of the story: disciplined testing against real KPIs is the best way to improve results.
34. Another experiment
A retail client was concerned that retargeting was hitting users who were going to purchase anyway, and that their budget was being wasted.
- Using the same targeting parameters, impressions were split 50/50 between brand ads and PSAs.
- CPA for the brand ads: $14.28
- CPA for the PSAs: $157.67
Moral of the story: assumptions can be flawed. Only disciplined experimentation can show the true relationship between campaign variables. (A sketch of the holdout arithmetic follows.)
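A minimal sketch of that 50/50 PSA holdout. The deck reports only the resulting CPAs, so the spend and conversion counts below are hypothetical placeholders chosen to land near those figures:

```python
# The PSA arm sees the same targeting, so its conversions approximate the
# organic baseline: users who would have converted without the brand ad.

arms = {
    # arm: (spend in $, conversions attributed to the arm) - hypothetical
    "brand_ads": (10_000.0, 700),
    "psa_control": (10_000.0, 63),
}

cpa = {name: spend / convs for name, (spend, convs) in arms.items()}
for name, value in cpa.items():
    print(f"{name}: CPA ${value:.2f}")

# A brand CPA far below the control CPA means the ads are driving
# incremental conversions, not merely tagging users who would buy anyway.
ratio = cpa["psa_control"] / cpa["brand_ads"]
print(f"Control CPA is {ratio:.1f}x the brand CPA")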
35. The marketer must become the master of data
There has never been a better time to use data science to crunch the numbers.
Think about what is causation versus what is correlation.
Experiment. Test, and test again. (A minimal significance-test sketch follows.)
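When you do run those tests, check that an observed difference is real before acting on it. A minimal sketch, assuming a simple two-arm split and a standard two-proportion z-test; all counts below are hypothetical:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between two arms."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical arms: a 50/50 split, counting conversions rather than clicks.
z, p = two_proportion_ztest(conv_a=110, n_a=50_000, conv_b=70, n_b=50_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Measuring conversions per arm rather than clicks per arm is the point: it tests the KPI that actually drives the business instead of the one that is easiest to game.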
36. Ian Hewetson
ihewetson@eyereturn.com
@ianhew