As customer moments become increasingly important in the search journey, the emotions that your audiences experience can dramatically influence their decision making throughout that journey.
This session - which you can view on-demand here https://www.brighttalk.com/webcast/16065/342947 - will discuss how brands can use artificial intelligence to understand the emotions that customers are experiencing and the sentiment of their query, allowing them to deliver much more effective communications.
Speaker: Richard Page, Data, Insights and Technology Manager, Reprise Digital
2. REPRISE /
WHAT WE’LL COVER
1. Introduction to AI, ML and NLP – what are they and how do they work?
2. Current implementations in digital – how is artificial intelligence being used?
3. HEART: Total Emotional Listening – an introduction to HEART, our emotional analysis tool
4. Analysing the 2018 heatwave & future use cases – how we tested the mood of the nation, and where we go next
3. WHAT IS AI?
Artificial intelligence is the concept of machines carrying out tasks in a way that we would consider “smart”.
AI has been around for a long time: the first recorded paper on neural networks was published in 1943.
In the early days of computer development we tried to teach machines to handle complex situations in explicitly programmed, specific ways.
Algorithms have progressed and become more intuitive, but for a long time the dream of mimicking general human decision making remained out of reach.
4. TWO TYPES OF ARTIFICIAL INTELLIGENCE: APPLIED & GENERALISED
5. SO WHAT IS MACHINE LEARNING?
“Machine Learning at its most basic is the practice of using algorithms to parse data, learn from it, and then make a determination or prediction about something in the world.” – NVIDIA
“Machine learning research is part of research on artificial intelligence, seeking to provide knowledge to computers through data, observations and interacting with the world. That acquired knowledge allows computers to correctly generalize to new settings.” – Dr Yoshua Bengio, Université de Montréal
6. HOW DOES IT WORK?
REPRESENTATION: a set of classifiers; the language that a computer understands.
EVALUATION: the objective, or scoring function.
OPTIMISATION: the search method, often a search for the highest-scoring classifier.
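The three components above can be sketched in a few lines. This is a toy illustration on a one-dimensional dataset, assuming a family of simple threshold rules; it is not Reprise's actual tooling.

```python
# Representation / evaluation / optimisation on a toy 1-D problem.

def make_classifier(threshold):
    """Representation: the family of models the machine can express –
    here, simple threshold rules over a single feature."""
    return lambda x: 1 if x >= threshold else 0

def evaluate(classifier, data):
    """Evaluation: a scoring function – here, accuracy on labelled data."""
    return sum(classifier(x) == y for x, y in data) / len(data)

def optimise(data, candidates):
    """Optimisation: search the representation space for the
    highest-scoring classifier."""
    return max(candidates, key=lambda t: evaluate(make_classifier(t), data))

# Toy labelled data: feature value -> class
data = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]
best = optimise(data, candidates=[0.2, 0.5, 0.8])
print(best)  # the threshold whose classifier scores highest
```

Real systems differ only in scale: richer representations (e.g. neural networks), richer scoring functions, and smarter search (e.g. gradient descent).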
8. ONE ANSWER: NEURAL NETWORKS
Designed to classify information in a way inspired by how the human brain does.
Works on a system of probability: based on the data it learns from, a neural network will make a statement, decision or prediction with a degree of confidence.
Repeated training on new data and the use of feedback loops help to increase the accuracy of a model; hence the importance of data.
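The ideas above can be shown with the smallest possible network: a single artificial neuron trained by a feedback loop on a toy task. This is an illustrative sketch, not the production models discussed in the talk.

```python
import math
import random

def sigmoid(z):
    """Squash a raw score into a 0-1 probability."""
    return 1 / (1 + math.exp(-z))

def predict(weights, bias, x):
    """The network outputs a probability – its degree of confidence."""
    return sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)

def train(data, epochs=2000, lr=0.5):
    """The feedback loop: compare each prediction to its label and
    nudge the weights to reduce the error, pass after pass."""
    random.seed(0)
    weights = [random.uniform(-1, 1) for _ in range(2)]
    bias = 0.0
    for _ in range(epochs):
        for x, y in data:
            err = predict(weights, bias, x) - y  # prediction error
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
            bias -= lr * err
    return weights, bias

# Toy data: learn the logical OR function
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train(data)
print(round(predict(w, b, [1, 0]), 2))  # confident prediction near 1
```

More training data and more feedback passes push the confidences closer to the true labels, which is exactly why data volume matters so much.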
9. NATURAL LANGUAGE PROCESSING
The broad field of study that attempts to understand human conversation, written or spoken, through computers.
It tries to understand nuances such as topic, emotion, semantics and sentiment that we understand implicitly.
We can use this understanding in multiple ways, such as triage for crisis communications, providing natural responses through chatbots, or improved sentiment analysis.
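One of the simplest sentiment techniques behind analysis like this is lexicon-based scoring: count positive and negative words and average them. The word lists here are illustrative, not a production lexicon.

```python
# Minimal lexicon-based sentiment scoring (illustrative word lists).

POSITIVE = {"love", "great", "happy", "joy", "sunny"}
NEGATIVE = {"hate", "awful", "sad", "angry", "miserable"}

def sentiment(text):
    """Score a text between -1 (fully negative) and +1 (fully positive)."""
    words = text.lower().split()
    hits = [(w in POSITIVE) - (w in NEGATIVE) for w in words]
    scored = [h for h in hits if h != 0]  # ignore neutral words
    return sum(scored) / len(scored) if scored else 0.0

print(sentiment("I love this sunny weather"))  # 1.0
print(sentiment("awful sad day"))              # -1.0
```

Modern NLP models go far beyond word counting, but this shows why short texts are hard: with only a few words, there is very little signal to score.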
11. IMAGE RECOGNITION & “VISUAL LISTENING”
Apply the power of image intelligence at the scale of social listening.
Train a model to recognise scenes, objects and logos, returning social posts based on the content of their images, not just their accompanying text.
GumGum estimates that 80% of brand mentions on social are missed because they appear in the image but are not tagged in the text.
14. A LOT OF THIS IS BUILT FOR US ALREADY
FastText: text classification
Detectron: image recognition
AutoML: automated machine learning platform
Cloud Vision: image annotation
Video Intelligence: video annotation
Azure Vision: face & emotion detection
15. INCLUDING EMOTIONAL ANALYSIS
IBM Watson Tone Analyser uses linguistic analysis to detect emotional, social and language tones in written text. The service can analyse tone at both the document and sentence level.
No longer identifying just positive or negative, we can now look for anger, disgust, joy, confidence, openness, extraversion and agreeableness, to name but a few.
17. THEY’RE NOT ALWAYS PERFECT
Watson is great for long-form text, but struggles when less data is available.
Tweets and social comments in particular are naturally shorter, and often don’t give Watson the context to understand them fully.
Typical failure modes: not enough context, misinterpreted, or simply not seen.
To circumvent this we looked into applying another layer: emojis.
18. THE UNLIKELY WINDOW INTO THE SOUL
Emojis are used to express nuanced emotion, and there are enough of them now to be used very specifically.
They provide a layer of context that is lost in written text, as they’re intended to replicate our facial expressions: what we use in real life to determine the context in which something is said.
Some are very polarised in their use and therefore give extremely accurate indications of emotion.
19. HOW EMOTIONAL IS AN EMOJI?
We teamed up with UM to conduct a survey into the UK’s usage of emojis.
UM asked a nationally representative sample of 4,000 adults what emotions they wanted to convey when using an emoji.
We made sure that respondents could answer with multiple emotions, as not all emojis are binary.
We then quantified the results into a confidence score which we could combine with Watson’s emotional scores.
We tested the combined scores against Watson’s original scores, and the results showed a dramatic improvement in accuracy.
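A combination of text-based and emoji-based scores could look something like the sketch below. The emoji lexicon, the blending weight and the score shapes are illustrative assumptions, not the HEART implementation.

```python
# Blending text-derived emotion scores with emoji-derived confidence.
# Survey-style lexicon: emoji -> {emotion: respondent confidence}
EMOJI_EMOTIONS = {
    "😂": {"joy": 0.9},
    "😡": {"anger": 0.95},
    "😢": {"sadness": 0.85},
}

def blend(text_scores, text, emoji_weight=0.5):
    """Combine per-emotion text scores with any emoji evidence in `text`."""
    combined = dict(text_scores)
    for ch in text:
        for emotion, conf in EMOJI_EMOTIONS.get(ch, {}).items():
            base = combined.get(emotion, 0.0)
            # Weighted blend: emojis supply the context short posts lack
            combined[emotion] = (1 - emoji_weight) * base + emoji_weight * conf
    return combined

# A short post where the words alone read as near-neutral
text_scores = {"joy": 0.2, "anger": 0.1}
print(blend(text_scores, "great, just great 😡"))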
21. THE BEST WAY TO UNDERSTAND BRITAIN’S MOOD? THE WEATHER
We analysed over 135k mentions of the record-breaking 2018 summer to identify how it affected the nation’s emotions.
We saw distinct patterns: the rises and falls in temperature tracked the rises and falls in emotion.
We were able to add a geographic element to understand which areas of the UK were happiest and angriest about the weather, as well as looking at gender to understand how men and women compared.
[Chart: mention volume for Joy and Sadness plotted against temperature over the summer. Happiest: Robin Hood’s Bay; most vocal: Watford.]
22. HOW EMOTIONAL IS THE AUTOMOTIVE PURCHASE FUNNEL?
We used social listening to identify users at different stages of the purchase funnel when buying a car, then split their mentions by funnel stage and ran an emotional analysis on each stage.
We identified that consumers were more tentative than confident at the purchase stage.
These results allowed the team to propose a new content strategy focused on addressing the emotions most present in each stage.
INSPIRATION: 22% Joy, 21% Extraversion
CONSIDERATION: 24% Agreeableness, 22% Tentative
PURCHASE: 34% Tentative, 10% Confident
ADVOCACY: 26% Joy
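Splitting mentions by funnel stage and summarising the emotions per stage is a simple aggregation. This sketch uses made-up mention data and stage labels, purely to show the shape of the analysis.

```python
from collections import Counter, defaultdict

# Illustrative (mention_stage, detected_emotion) pairs, not real campaign data
mentions = [
    ("purchase", "tentative"), ("purchase", "tentative"),
    ("purchase", "confident"), ("advocacy", "joy"),
    ("inspiration", "joy"), ("inspiration", "extraversion"),
]

def emotions_by_stage(mentions):
    """Return, per funnel stage, the share of each detected emotion."""
    stages = defaultdict(Counter)
    for stage, emotion in mentions:
        stages[stage][emotion] += 1
    return {
        stage: {e: n / sum(counts.values()) for e, n in counts.items()}
        for stage, counts in stages.items()
    }

shares = emotions_by_stage(mentions)
print(shares["purchase"])  # tentative outweighs confident
```

On real data the emotion labels would come from the blended text-plus-emoji scoring, and the stages from social-listening queries.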
23. ONGOING CAMPAIGN ANALYSIS
We’re able to identify which stages of a campaign drove higher emotional values; we then use this information to tailor new campaigns to elicit these specific emotions, based on previous insight.
[Chart: monthly emotion volumes (Anger, Disgust, Fear, Joy, Neutral, Sadness) from May 2017 to May 2018, annotated with campaign events: Influencer 1, Influencer 1, negative press, Influencer 2.]
24. HOW WE’RE LOOKING TO IMPLEMENT HEART FURTHER
PERFORMANCE OPTIMISATION: analyse the messaging we’ve published and cross-reference it with performance metrics.
AUDIENCE ENRICHMENT: segment audiences by their emotions, then look at what content is driving their positive/negative emotion.
PERSONALISED MESSAGING: determine whom to target by their emotional state, then provide them with personalised messaging.
25. THANK YOU
This webinar is part of our 'Mapping out your digital strategy for 2019' webinar series. You can view the rest here: https://www.brighttalk.com/channel/16065/straight-talking-digital-marketing
Feel free to email me your questions at:
richard.page@reprisedigital.com