2. Today's Agenda
• Integrated Media Evaluation
o Offline Vs. Online
o Changes to the Purchase Funnel
o Best Practices
o Case study: How to attribute credit for success?
• Interpreting Plans and Reports
o Success factors
o What to look for
o Reporting typology
• Creative Testing
o Testing Methodologies
o Best practices
4. Media Integration Background
• Offline media is a lean-back medium; online is a lean-forward
medium.
• TV is often a shared experience, digital is a personal
experience.
• Many people are online while they consume other
media.
• Research shows a strong link between offline media and online
activity, e.g. a TV ad triggers curiosity in a product and the
consumer uses the internet (often a search engine) to find out
more.
• In China, TV content is often consumed online, e.g. Tudou,
Youku. Many under-30s don't consume analog media.
6. Offline vs. Online Advertising: Key Differences
Offline
• Limited channels – typically few free-to-air stations
• Local-centric
• Estimated measurements
• Measures reach & frequency (R&F)
Online
• Vast number of sites – the long tail
• Global
• Precise measurements
• Measures quality
7. Offline vs. Online Reach & Frequency
Reach – the number of different people who are exposed to an advertising
message at least once
Frequency – the number of times they are exposed to the message
Offline advertising
o Buy “Media Impressions” – based on overall reach
o Calculate frequency by estimating how many opportunities each person
will have to see the ad
Online advertising
o Buy “Ad Impressions” – actual unique impressions of the ad
o Calculate frequency by tracking repeat visits using cookie
technology
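The reach and frequency arithmetic above can be sketched in code. This is a minimal illustration, assuming a hypothetical cookie-level impression log (the function name and data shape are not from any particular ad server):

```python
from collections import Counter

def reach_and_frequency(impression_log):
    """Compute reach and average frequency from a list of cookie IDs,
    one entry per served ad impression (hypothetical data shape)."""
    counts = Counter(impression_log)      # impressions per unique cookie
    reach = len(counts)                   # distinct people (cookies) exposed
    impressions = sum(counts.values())    # total ad impressions served
    frequency = impressions / reach if reach else 0.0
    return reach, frequency

# 7 impressions spread across 3 unique cookies
log = ["u1", "u1", "u2", "u3", "u3", "u3", "u1"]
print(reach_and_frequency(log))  # reach=3, frequency of 7/3
```

Note this equates one cookie with one person, which the issues listed on the next slide (deleted cookies, shared terminals) show is only an approximation.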
8. Online R & F Methodology
Measuring Online Share of Voice (SOV)
SOV = a campaign's percentage of the publisher's total inventory (based on
impressions) in a given period of time.
• Measurement might not be consistent from site to site – is SOV
relative to:
Impressions?
Unique visitors?
Particular media targets?
Total impression volume available on the site?
Issues:
• Disabled / deleted “Cookies” in browsers
• Shared computer terminals
• Publisher counting tactics and methodology differences
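The SOV definition above reduces to a single ratio. A minimal sketch (hypothetical numbers; as the slide notes, the right denominator varies by publisher):

```python
def share_of_voice(campaign_impressions, total_inventory_impressions):
    """SOV as a percentage of the publisher's impression inventory over
    the same period. Confirm the denominator per publisher: total
    impressions, unique visitors, or a particular media target."""
    return 100.0 * campaign_impressions / total_inventory_impressions

# 2M campaign impressions against 10M of publisher inventory
print(share_of_voice(2_000_000, 10_000_000))  # 20.0
```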
9. Social Media & the Purchase Funnel
• While the shape of today's funnel
can take many forms depending on
the author, it's clear that a feedback
loop powered by consumer-generated
content is an ever-present
addition to this model.
• Word of mouth has always been
important in influencing consumer
decisions; the difference is that
digital media has amplified this voice
and empowered consumers to hear it
at far greater volume.
• Instead of asking two neighbors which
soap they use, you can go online and
find out what thousands of people
think.
10. Digital Age Model is Non-Linear
• Today the impact of digital media has created multiple new tools for
consumers to find out more information about a potential purchase on
their own terms.
• Blogs, search engines, BBS, video, comments/rankings and podcasts all
offer opinions about brands that are separate from the messages of the
brand owners.
11. The New McKinsey Model
• McKinsey rejects the linear approach in favor of a circular model of
ongoing exposure leading to a trigger moment.
• Consumers begin with a narrow consideration set because of ad clutter
and being time-poor. Then a trigger occurs when the consumer decides
they want to buy something new.
• Once they decide to buy, they move into active evaluation. Here the
number of brands considered increases (the opposite of the linear model)
and intense research is undertaken: checking out sites and paying more
attention to certain kinds of relevant media they already consume.
• The third stage is closure. This is where the consumer goes to purchase,
although many have not made a final decision on which item to buy.
Post-purchase experience dictates the degree of loyalty the consumer has:
• Active loyalty: prefers the same brand and will recommend it to others.
• Passive loyalty: is comfortable with repurchase but also open to other choices.
13. Making 2+2=5 via Media Integration
• Effectiveness of offline media can be measured by
immediate performance of online media.
• Both mediums should ALWAYS be synchronized
(flighting, messaging, placement) to deliver optimal
synergies.
• Online media can be used to increase ROI of offline
media.
• The critical link in this approach is SEM: keyword
bidding, ad copy and landing pages must be comprehensive
enough to catch and redirect qualified traffic from offline.
• In this way online media supports and leverages the
larger offline media spend to make it more effective.
14. How the integrated road to purchase works
TV Ad: 8pm to 10pm June 4th
Internet Ad: 8pm to 10pm June 4th?
For the treasure hunt to succeed, clues must be left along the way.
15. Measuring Success
• Media channels must be evaluated together to determine how they
affect one another.
• Performance should be monitored before, during & after a
campaign to establish baseline performance and to enable
benchmarking.
• If an ongoing PPC campaign usually yields 5,000 clicks per month
(the baseline) but records 10,000 clicks during an offline media
burst, then the offline campaign's contribution is 5,000 incremental
clicks. Scrutiny should also be paid to changes in other digital
media performance to assign credit appropriately.
• An ongoing challenge is to determine the impact of offline
awareness building which drives subsequent online media
response.
• If a TV campaign is driving a lot of brand searches on Baidu, why run
a PPC campaign when you can get organic clicks for free?
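The baseline arithmetic used in the PPC example above can be sketched as follows (the function name is illustrative, not part of any reporting tool):

```python
def incremental_clicks(observed_clicks, baseline_clicks):
    """Credit attributed to an offline burst: clicks observed during the
    burst minus the established pre-campaign baseline, floored at zero."""
    return max(observed_clicks - baseline_clicks, 0)

# PPC baseline of 5,000 clicks/month; 10,000 clicks during a TV burst
print(incremental_clicks(10_000, 5_000))  # 5000
```

The same subtraction should be applied to every digital channel that could have absorbed the offline-driven demand before assigning credit.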
18. Critical Success Factors
Effectiveness and ROI depend on…
• Creative strategy
• Targeting
• Ad type
• Ad Size
• Frequency
• Website and landing pages
Optimizing these variables makes the difference
between success and failure
19. What to look for in a media plan
• There must be a clear strategy that explains how the
tactics will work together in harmony to achieve the
objective.
• The online strategy should roll up into the overall
integrated campaign strategy.
• Does the media approach support the creative
strategy? Often they do not complement one another.
• Is the plan a ‘cut and paste job’ from a previous
campaign or a different client?
• Does this agency/team consistently recommend the
same publishers? If so, why, and are there better
alternatives?
20. What to look for in a creative strategy
• Is there a clear rationale justifying the creative
concept?
• Ask yourself: does this strategy make YOU think more
highly of this product/brand? What about someone you
know well who fits the target audience profile?
• Does the online creative approach support and
leverage the offline thinking so that it translates online?
• If larger sizes are underperforming smaller sizes, it's
likely the concept or execution is sub-optimal.
• Multiple resizes are common in this market; each
additional size should take no more than 20–30 minutes
of billable hours.
• If the creative concept is a western import, once initial
translation is done there should be no extra charges.
21. What to look for in tracking reports
• GAP Analysis: Did the campaign begin and end on time, and if it's
a CPD campaign, was each flighting verified?
• Formatting: Manually check the formulas and that the fields all
tally. Look closely at ‘Gross’ and ‘Net’ media costs.
• Value added: How much of the final delivery is bonus, and what is
the nature of it? Often it's remnant inventory that's dumped at the
back end of a campaign to meet quotas.
• Ad serving: This is usually marked up by the agency. It should be
no more than $0.07 CPM. If the agency is using Allyes it may be
free in return for media booking fees collected from media owners.
• Quality score: Media owners will usually pad a proposal with
unsellable inventory to get to a desirable eCPM. Evaluate each line
item for prominence, exclusivity and scale.
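Evaluating each line item against the proposal's blended price comes down to the eCPM formula. A minimal sketch with hypothetical numbers:

```python
def ecpm(total_cost, impressions):
    """Effective CPM: cost per thousand impressions, computable for a
    single line item or for a whole plan. Padding a plan with cheap,
    unsellable inventory lowers the blended eCPM without adding value."""
    return total_cost / impressions * 1000

# A $3,500 line item delivering 700,000 impressions
print(ecpm(3500, 700_000))  # 5.0 (i.e. $5.00 eCPM)
```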
• Optimization: All publishers allow mid-campaign changes to
placements and creative. Ideally performance should always
improve over time.
22. Tips on interpreting reports
Report Type – Purpose
Daily (or weekly) – Monitor delivery; assist in identifying delivery issues
Fortnightly – Monitor performance; analyze performance across a two-week
span on sites and placements
End of campaign – In-depth analysis of entire campaign performance; take
learnings into future campaigns
23. Type of Charts
Typical Charts
• By Banner Size
• By Placement
• By Day Part
• By Unique Users
• Post Click Tracking
• Impressions vs. CTR
• Interaction Rate vs. Impressions
• Interaction Rate vs. CTR
• CPC vs. CPM
33. Tips on interpreting reports
Snapshot in time (1 month cumulative): Placement Analysis
• measures Impressions vs. Clicks
• assists in analyzing placements / creative
Over time (across 1 month): Run Charts
• measures Impressions vs. CTR across a set time period
• assists in identifying trends or patterns
34. Tips on interpreting reports
Placement Analysis by Site
• Some site/creative combinations will be more effective in reach/impressions
(e.g. in the example below, “Gamer ROS”: ad placements may appear on any pages of the
target site)
• Some site/creative combinations will be more efficient in generating click-through
(e.g. below, “Gamer Home Page”)
35. Tips on interpreting reports
Placement Analysis by Banner Type
• Performance of banner types can be compared by Impressions vs. CTR
• CTR is typically reflective of the effectiveness of placement and/or creative
• CPC can be included in the analysis to measure placement efficiency
(In the chart, banner types are grouped together for easier reading.)
36. Tips on interpreting reports
Run Charts – identify problems and trends
• Is there a significant trend or pattern that should be investigated?
• Are they just normal process variations?
o Look at data for a long enough period of time (at least 2 weeks is
recommended for online), so that a “usual” range of variation is evident
o Is the data within the usual range of variation?
o Is there a daily pattern? Weekly? Monthly?
37. Tips on interpreting reports
Identifying Trends
• Upward and downward (seesaw) movements are common
• Typically, six or more consecutive moves in the same direction (up or down)
indicate a trend
• In display ads, a downward trend typically signifies “creative wear-out”
(Chart annotation: “Creative change”)
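The six-consecutive-moves rule of thumb above can be sketched as a simple check over a daily metric series (hypothetical function and data):

```python
def has_trend(series, run_length=6):
    """Flag a trend when `run_length` or more consecutive moves go in
    the same direction, per the rule of thumb cited above."""
    run, direction = 0, 0
    for prev, curr in zip(series, series[1:]):
        step = (curr > prev) - (curr < prev)   # +1 up, -1 down, 0 flat
        if step != 0 and step == direction:
            run += 1                           # run continues
        else:
            direction, run = step, 1 if step else 0  # run restarts
        if run >= run_length:
            return True
    return False

daily_ctr = [0.42, 0.40, 0.39, 0.37, 0.36, 0.33, 0.31]  # steady decline
print(has_trend(daily_ctr))  # True: possible creative wear-out
```

A flagged downward run in display CTR is the usual cue to rotate in fresh creative.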
38. Tips on interpreting reports
Report date range
• Notice behaviour shifts – over weekends, holidays, etc.
In the example above, we can observe that business tech sites gain more
impressions over weekends.
39. Tips on interpreting reports
Adopt a holistic view
• Do not interpret data in isolation
In the example above, CTR was extremely high on Oct 24th.
On closer examination, we find that impressions were very low that day,
resulting in an anomaly in the reports.
As it turns out, there were technical difficulties on that site on that particular day.
40. Example of Testing for Banner
• Effectiveness and efficiency vary by site and format
• A high CTR does not equal high action
42. Background to Testing
• Digital is a highly measurable medium: everything can be tracked
and analyzed.
• The more tracking data you have, the more learnings you have;
however, data overload can obscure this.
• Rigorous testing and analysis is time consuming and is often seen
as only a ‘nice to have’ beyond basic metrics.
• The key question is how much testing is optimal to provide the most
cost effective outcomes.
• There are eye tracking studies, creative A/B and multivariate
testing, media trials and panel based surveys all to consider.
• This aspect of digital marketing is the most important and
undervalued: why invest in a medium famous for its unique
measurability if you don't leverage its core advantage?
• Offline thinking (i.e. focus-group thinking) at the C-level still
prevents this elephant in the room from ever being properly acknowledged.
43. 8 Step Closed Loop Creative Testing Approach
1. The Create Step
Create six different ideas, executed in four sizes each
2. The Measurement Step
Measure the success of the banners.
3. The Selection and Speculation Step
Evaluate each banner and try to assess why some worked
and others didn't by size, idea, call-to-action, execution, etc.
Why are the top/bottom performers winning/losing?
4. Codify Learnings and Assumptions
Build on industry best practices to develop a set of client-specific and
audience-specific best practices. Refine and record what is
working.
5. Create and Measure Again
Create a second raft of banners, based on learnings from the previous
phase. Evaluate performance using the same criteria.
6. Select Top Performers Again
Annotate learnings and optimize.
7. Leveraging the Library
Rerun top-performing banners from rounds 1 and 2. Run the tweaked or
optimized mid-level banners from rounds 1 and 2. Mix in new banners.
8. Leveraging the Library Long Term
Perform bi-weekly or monthly measurement reviews. Ongoing measurement and analytics,
followed by optimization and codified learning, combine to continue driving performance.
44. Main Types of Landing Page Testing
• A/B Testing
• Multivariate Testing
Multivariate tests (MVT) allow you to test multiple elements of a single page
at the same time; A/B tests can be seen as simplified versions of them.
45. 6 Key Web Page Testing Mistakes
1. Using the wrong type of test (A/B vs. multivariate test)
A/B tests are good for alternate designs of an entire page, whereas multivariate tests are helpful in
determining the most successful elements at different locations on a page.
2. Testing too many elements
Software solutions permit thousands or even millions of combinations on a single page. Testing
fewer than 100 scenarios can boost conversion rates from 4% to 15% in a matter of a few months.
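The combinatorial explosion behind mistake #2 is easy to see: a full-factorial multivariate test must cover the product of the variant counts for every element. A sketch with hypothetical element counts:

```python
from math import prod

def mvt_combinations(variants_per_element):
    """Number of page versions a full-factorial multivariate test
    must cover: the product of the variants for each element."""
    return prod(variants_per_element)

# 4 headlines x 3 hero images x 3 CTA buttons x 2 layouts
print(mvt_combinations([4, 3, 3, 2]))  # 72 scenarios
```

Even four modestly varied elements already yield 72 scenarios, so trimming variants per element is how the under-100 guideline is met in practice.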
3. Running tests that take too long
Keep track of external factors and understand that they might have an impact on your results.
Minimize the time it takes to run your experiments to four weeks at the most.
4. Failing to monitor your test as it takes place
Digital is not a set and forget medium. Closely monitor experiments to decide if you should
eliminate some elements when there is enough evidence.
5. Failing to conduct follow-up experiments
Designing follow up experiments, and learning which elements worked and which elements did
not, is at the heart of conversion optimization.
6. Thinking that testing is a silver bullet
Testing is a great tool to increase your conversion rates. However, it is only one step in the
process.
Bottom Line: Select the right type of test to conduct based on the number of visitors and other
data collected from analytics. Limit the number of scenarios through a holistic approach to testing.
Get beyond single-digit improvement by continuing to test. Approach any project from a number of
different angles, because testing alone will not maximize your results.
46. Eye Tracking Testing
Eye tracking works by reflecting invisible infrared light onto an eye, recording the reflection
pattern with a sensor system, and then calculating the exact point of gaze using a geometrical
model. Once the point of gaze is determined, it can be visualized and shown on a computer
monitor. The point of gaze can also be used to control and interface with different machines.
This technique is referred to as eye control.
47. Summary
• Offline and online media are fundamentally different mediums that
are only truly effective when strategically synchronized.
• Everything is measurable with digital; the key is to determine the
most important metrics to focus on and then apply them as
actionable learnings.
• The ‘devil is in the details’ with digital: strive to look beyond the
obvious and question every single detail RUTHLESSLY to
uncover the true value of a program.
• If a chart, statistic or concept does not make sense to you in less
than 10 seconds, it's not meaningful (KISS).
• ABT (Always Be Testing): digital thinking changes every 6 months,
and usable learnings must be deployed to make 2+2=5.
• Each of you together at R3 are China's most important thought
leaders in making our marketing industry more effective, more
respected and more important. It's up to you to make it happen!
Editor's Notes
• Vertical portal (e.g. fashion site): use a functional message.
• Mass-media portal (e.g. news site): use an eye-catching creative.
• “Crazy Ads” are good for branding but poor on conversion.