UX STRAT Online 2021 Presentation by Josephine Scholtes, Microsoft
These slides are for the following session presented at the UX STRAT Online 2021 Conference:
"Designing Conversational AI at Microsoft: A Design Toolkit"
Josephine Scholtes
Microsoft: User Experience Consultant
Conversational UX (CUX)
- a modality of interaction that’s based on natural language. When interacting with each
other, human beings use conversation to communicate ideas, concepts, data, and emotional
information. CUX allows us to interact with our devices, apps, and digital services the way we
communicate with each other, using phrasing and syntax via voice and text or chat that
come naturally.
Reference: Microsoft CUX Guide
Context dependent
- Conversational UX is very context dependent, for example:
Voice-only ↔ Hybrid ↔ Chat-only
Single-user ↔ Multi-user
Private ↔ Public
Quiet ↔ Noisy
Please note that this is an example. These factors are not strict determinants.
Character Design
Dave
How do I look?
Just as with a mobile app,
your assistant's icon is
really important.
What’s my name?
Think about your name and
how it reflects your brand
and the type of assistant
you are designing.
How should I sound?
Tone of voice is the primary
way you will inject your
brand into an assistant.
When should I ask for help?
No matter how smart your
assistant is, there will be
cases where you still need to
let a human take control of
the conversation.
Conversation Design
- Dialogs are for bots what screens are for apps. They separate concerns and organize flows in
a similar way.
Traditional application: Home page → Sub page / Search feature
Bot: Root dialog → Sub dialog / Search dialog
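The screen/dialog analogy can be sketched in code. This is a hypothetical, SDK-agnostic illustration; the names (`Dialog`, `DialogStack`, "root", "search") are made up for this example and do not come from any specific bot framework.

```python
class Dialog:
    """A self-contained piece of conversation, like a screen in an app."""
    def __init__(self, name: str):
        self.name = name


class DialogStack:
    """Tracks the active dialog, the way a navigation stack tracks screens."""
    def __init__(self, root: Dialog):
        self._stack = [root]

    def push(self, dialog: Dialog) -> None:
        # Entering a sub dialog, like navigating to a sub page.
        self._stack.append(dialog)

    def pop(self) -> None:
        # Sub dialog finished; control returns to its parent.
        if len(self._stack) > 1:
            self._stack.pop()

    @property
    def active(self) -> Dialog:
        return self._stack[-1]


stack = DialogStack(Dialog("root"))
stack.push(Dialog("search"))
print(stack.active.name)  # search
stack.pop()
print(stack.active.name)  # root
```

Just as a mobile app's back button pops a screen off the navigation stack, ending a sub dialog returns the user to the dialog that opened it.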
Understanding intents
- You can think of user intent as the reason why a person is interacting with your conversational experience.
Identifying the true intent ensures your users are matched with the most accurate content to help them
complete their goal.
User query:
“The jeans I bought last week are too big.
Can I get them one size smaller?”
Intent: exchange an item for a different size
Reference: Microsoft CUX Guide
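The query-to-intent mapping above can be illustrated with a deliberately simple sketch. Real systems use trained language-understanding models; the keyword scoring and intent names below are stand-ins invented for this example, only to show the concept of routing a user query to an intent.

```python
# Illustrative keyword sets per intent (not a real model).
INTENT_KEYWORDS = {
    "exchange_item": {"exchange", "swap", "size", "smaller", "bigger"},
    "return_item": {"return", "refund"},
    "track_order": {"track", "shipping", "delivery"},
}


def detect_intent(query: str) -> str:
    """Pick the intent whose keywords overlap most with the query."""
    words = set(query.lower().replace("?", "").replace(".", "").split())
    scores = {intent: len(words & kws) for intent, kws in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"


print(detect_intent("The jeans I bought last week are too big. "
                    "Can I get them one size smaller?"))  # exchange_item
```

Note how the true intent here is "exchange", even though the user never says that word; production systems need far more than keyword overlap to catch this reliably.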
Disambiguating intents
- Disambiguation is the process of narrowing down a user intent by asking clarifying questions, or in some cases
presenting options to choose from, to get a better understanding of the true intent and direct to the correct content.
Ask more questions to narrow
down the intent
Provide options to reduce the
need to ask multiple questions
Ask targeted questions that inform the
user of what information is necessary
to fully understand their intent.
Reference: Microsoft CUX Guide
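One way to sketch the "provide options" pattern: when a query plausibly matches several intents, present them as numbered choices rather than guessing. The intents and their labels below are hypothetical.

```python
# Human-readable labels for candidate intents (illustrative only).
INTENT_LABELS = {
    "exchange_item": "Exchange for a different size",
    "return_item": "Return for a refund",
}


def disambiguate(candidate_intents: list[str]) -> str:
    """Build one clarifying question offering the candidates as options."""
    choices = [INTENT_LABELS[i] for i in candidate_intents if i in INTENT_LABELS]
    lines = [f"  {n}. {label}" for n, label in enumerate(choices, start=1)]
    return ("Just to make sure I help with the right thing, "
            "did you want to:\n" + "\n".join(lines))


print(disambiguate(["exchange_item", "return_item"]))
```

A single question with options reduces the number of back-and-forth turns compared with asking one clarifying question per ambiguity.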
Why this toolkit?
- Enable people who are new to Conversational UX to design bots, virtual agents and digital assistants
Tone of voice
The primary way to inject your brand in the bot
Saying hello
First impressions matter
Rich user controls
Buttons, carousels, forms, etc.
Avatar
The visual representation of your brand
The “stubborn & clueless” bot
Help, cancel and limit retry options
Ask for help
Even the most intelligent bots need human intervention
Attendees of UX STRAT Online
- Text-only bot for the UX STRAT Online website
- Can quickly point attendees to preferred talks
- Can give customized information & send calendar invites
- Practical event info – refer to pages
- Maintenance based on event cadence
- Maximum of three errors
- Personal data should not be recorded
- English (US)
UX STRAT 2021 01
Do the talks have
live closed
captions?
Will there be live
closed captions?
Yes, please
Sorry, I do not
understand your question.
Could you repeat?
Hmm, I think I'm not
smart enough to
answer this question.
Would you like me to
forward you to my
human colleague?
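The "stubborn & clueless" pattern above, together with the case study's "maximum of three errors" rule, can be sketched as a simple retry counter that escalates to a human. The threshold and messages are illustrative.

```python
MAX_RETRIES = 3  # case-study rule: maximum of three errors


def respond(understood: bool, failures: int) -> tuple[str, int]:
    """Return (bot_message, updated_failure_count) for one turn."""
    if understood:
        return "Here is your answer.", 0  # success resets the counter
    failures += 1
    if failures >= MAX_RETRIES:
        # Stop retrying: offer a human instead of looping forever.
        return ("Would you like me to forward you to my "
                "human colleague?"), failures
    return "Sorry, I do not understand. Could you rephrase?", failures


failures = 0
for understood in (False, False, False):
    message, failures = respond(understood, failures)
print(message)  # escalation offer after the third failure
```

Capping retries keeps the bot from trapping users in a loop of "I don't understand", which is one of the fastest ways to lose their trust.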
In 2016, Microsoft launched Tay on Twitter as an AI bot; it was taken offline within 24 hours.
Microsoft's intent: Tay, the creation of Microsoft's Technology and Research and Bing teams, was an experiment aimed at learning through conversations. She was targeted at American 18- to 24-year-olds -- primary social media users, according to Microsoft -- and "designed to engage and entertain people where they connect with each other online through casual and playful conversation." Tay was designed to learn from interactions it had with real people on Twitter. Seizing an opportunity, some users decided to feed it racist, offensive information.
And in less than 24 hours after her arrival on Twitter, Tay gained more than 50,000 followers and produced nearly 100,000 tweets. The problem? She started mimicking her followers.
According to Microsoft, Tay is "as much a social and cultural experiment, as it is technical." But instead of shouldering the blame for Tay's unraveling, Microsoft targeted the users: "we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways."
"Any AI system learning from bad examples could end up socially inappropriate," Yampolskiy said, "like a human raised by wolves."
The failure of Tay, he believes, was inevitable, and will help produce insight that can improve AI systems.
Incorporated in our products (Surface laptops, HoloLens, M365)
Similar to Siri, Alexa, Google Assistant
Back-end > offer as a B2B product
Make use of Cognitive Services (includes LUIS – Language Understanding Intelligent Service)
BMW, together with Microsoft, is looking to make future conversations with the Bavarian automaker’s Intelligent Personal Assistant even more personalized, not to mention more natural-sounding and “multi-modal”.
BMW customers could for example make an appointment at their preferred BMW dealership, simply by talking with the personal assistant – a conversation that might start with a reminder that service is due, and finish with the system arranging the appointment. Users could also manage their personal e-mails and calendar appointments the same way while on the move.
If the user isn’t matched with the correct intent, they may receive wrong or misleading information, causing frustration and the potential loss of trust in your brand.