Designing for Privacy in an Increasingly Public World
1. Designing for Privacy in an Increasingly Public World
Product Design Meetup | 21 July 2021 | Robert Stribley
Street art by JR
2. I'm Robert Stribley
I'm a user experience designer and manager at
Publicis Sapient, and I teach user experience design
at the School of Visual Arts in New York.
Introduction
3. • Corporations and non-profits consider the privacy of their
users' data, their content, even their browsing behavior for
their clients' benefit and safety
• But they also do it for their own personal and financial self-
interest
• It's increasingly important that they consider the privacy and
security issues affecting their customers and clients
• So how do we design these digital experiences – apps,
websites, etc. – to ensure people's privacy?
Background
4. Privacy and security are different concepts
Privacy: your ability to control your personal information and
how it's used
Security: how your personal information is protected by those
holding on to it
These concepts often overlap, so we'll refer to both
Our focus: how we can ensure people's privacy is maintained
as we design experiences for them
Purpose
6. "Arguing that you don't
care about the right to
privacy because you
have nothing to hide is
no different than saying
you don't care about
free speech because
you have nothing to
say."
– Edward Snowden, former CIA
employee, infamous NSA leaker
Why Privacy?
7. • If we're not concerned with a particular privacy
issue, remember, we're not designing for
ourselves
• If we're designing with empathy, we'll consider
the needs of people not like ourselves – people
with different backgrounds and experiences
• That means researching privacy issues, but also
engaging with people with diverse backgrounds
and lived experiences
Why Privacy?
8. Examples:
DayOne, a non-profit, provides services for
young people in abusive dating relationships.
These clients may worry about their partners
tracking their online activity or stalking them in
real life.
Similarly, LGBTQ youth need to feel their privacy
is secure when reaching out for help online.
In this sense, privacy issues are often diversity
issues.
Why Privacy?
10. In April, Facebook, the
largest social media platform
on the planet, was hacked.
533 million users' phone
numbers and personal data
were leaked online.
Data for half a billion people.
Data Security
11. Fraud & identity theft on the
rise during the pandemic.
FTC: 1.4 million reports of
identity theft in 2020 – double
the number from 2019.
Leaks of personal data can be
catastrophic to people's lives.
Fraud & Identity
Theft
Photo by Kyle Glenn
12. Stores such as Albertsons,
Rite-Aid, Macy's, and ACE
Hardware are using facial
recognition programs to
identify customers.
Some also use apps to track
customers around their stores
to present them with ads
online later.
Facial
Recognition
13. Amazon required delivery
drivers to sign consent
forms, which allowed the
company to collect their
biometric data and to use AI
cameras to monitor their
location, movement, and
driving patterns.
At least one driver quit over
this form of "AI surveillance."
Biometric Data
14. • A donation site for Donald Trump
deployed "dark patterns" to trick
supporters into agreeing to recurring
donations
• Designers rolled out iterations of this
feature with increasingly confusing
language, fine print, bold text, all-
caps, and a pre-selected check box
• They referred to the feature as a
"money bomb"
• Donations grew astronomically – as
did fraud complaints from angry
supporters
• One 78-year-old supporter summed
up his thoughts: "Bandits!"
Dark Patterns
15. • Demand for personalized content, which benefits from
personal data, seems higher than ever
• People say they want personalized ads, so you'd think
they enjoy sharing their data
• But a 2019 survey by network security company RSA
found only 17% of respondents said it was ethical to
track their online activity to personalize ads
• Earlier, Pew Research found 91% of adults believe
consumers have lost control over how their personal
information is collected and used by companies
Data Sharing
16. Data Sharing
Apple rolled out a new iPhone
feature called "App Tracking
Transparency," an anti-tracking
shield, which prevents apps from
shadowing you across the internet.
Now, they have to ask first.
Only ~15% of iOS users worldwide
have allowed apps to track them so far –
The Register, May 2021.
18. GDPR stands for …
The General Data Protection Regulation.
A law finalized in 2016, it came into effect in
2018.
It regulates how apps and sites can gather,
transfer, or process personal data when
operating within the European Union.
It also governs what happens to that data when
it's transferred outside of the EU.
Impact of Regulations
Remember a while back
when you suddenly got a
gazillion emails from
companies telling you they
had updated their privacy
policies?
That was a result of the
GDPR.
19. Some things GDPR requires …
• Ask people to opt in to sharing their data
• Communicate with people in the moment, when
you're collecting their personal data
• Be transparent about what you're doing with it
• Allow people to download their data and
delete it – a "right to erasure" or "right to be
forgotten"
Impact of Regulations
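The requirements above translate naturally into data-handling defaults. Here is a minimal sketch in TypeScript of what privacy-by-default consent handling might look like; the names (`ConsentStore`, `recordConsent`) are invented for illustration and don't come from any real library or from the GDPR text itself:

```typescript
// Illustrative sketch of GDPR-friendly defaults: explicit opt-in,
// a timestamped consent record, and a "right to erasure".
type ConsentRecord = {
  purpose: string;   // e.g. "marketing-emails"
  granted: boolean;  // false until the person explicitly opts in
  timestamp: string; // when consent was given or withdrawn
};

class ConsentStore {
  private records = new Map<string, ConsentRecord>();

  // Privacy by default: any purpose we haven't asked about is "not granted".
  hasConsent(purpose: string): boolean {
    return this.records.get(purpose)?.granted ?? false;
  }

  // Record an explicit, timestamped opt-in (or opt-out).
  recordConsent(purpose: string, granted: boolean): void {
    this.records.set(purpose, {
      purpose,
      granted,
      timestamp: new Date().toISOString(),
    });
  }

  // Data portability: let people download what we hold on them.
  export(): ConsentRecord[] {
    return [...this.records.values()];
  }

  // "Right to erasure": delete everything we hold.
  eraseAll(): void {
    this.records.clear();
  }
}

const store = new ConsentStore();
console.log(store.hasConsent("marketing-emails")); // false: opt-in, not opt-out
store.recordConsent("marketing-emails", true);
console.log(store.hasConsent("marketing-emails")); // true only after explicit opt-in
store.eraseAll();
console.log(store.export().length); // 0 after erasure
```

The key design choice is that `hasConsent` returns `false` for anything not explicitly granted, mirroring the opt-in-by-default requirement rather than the CCPA-style opt-out model discussed later.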
20. California passed its own version of the GDPR –
the California Consumer Privacy Act.
It gives Californians more control over how their
personal data is used.
Its requirements are very similar to those in the GDPR.
The CCPA differs in that it (currently) allows businesses
to collect your information by default – though they
do have to offer the ability to opt out.
California Consumer Privacy Act 2018
Impact of Regulations
21. In March, California announced it is
banning "dark patterns."
It also introduced a new "Privacy Options" icon for
businesses to show you where to opt out of
data collection.
The icon was designed by Carnegie Mellon's
CyLab and the University of Michigan's School
of Information.
Impact of Regulations
22. New York, Maryland, Massachusetts, and Hawaii
are developing their own privacy laws, too.
So, if you're designing for the GDPR, California's
privacy laws, and more, you may as well design
for all – design for the highest common good.
Impact of Regulations
24. Our Role
"You were not hired to get approval
or to have your work pinned to the
company fridge."
"People hire you to be the expert, so
you might as well be the expert."
– Mike Monteiro, designer, co-founder of Mule Design,
in Ruined by Design
25. More specifically?
We have a responsibility to act as the
advocate for users – but even that's
too abstract.
The term "user" tends to strip people of
their individual circumstances, their
personality, their history, even their
lives.
We have a responsibility to real human
beings.
We may need to push back where
necessary, in terms our clients
understand.
Our Role
Photo by Vince Fleming
26. We may have to explain to our clients the impacts of ignoring privacy and security concerns.
What are these impacts, specifically?
• Civic responsibility: As user-centered designers, we really should be encouraging our clients to treat
their "end users" as human beings who are members of their community
• Reputation management: We may have to remind our clients that what companies do can
undermine their brands
• Using dark patterns may anger people and cause them to abandon your site in favor of another with
a more transparent experience
• Data breaches and sloppy treatment of data may lead to the loss of their user base – likely affecting
their profits
• Financial considerations: Keep in mind the increasing number of laws and regulations and the
resulting fines for not following them
Even if there's an up-front cost to designing for privacy and security, the long-term costs of ignoring
them can be devastating
Our Role
27. In 2019, five employees quit their jobs at
GitHub after learning the company
shared its data with ICE, the
government agency which has been
accused repeatedly of human rights
violations – especially related to its
treatment of immigrants.
It might be tough to speak up in such
a situation, but we got into this
business to help people – and what
we do has a real-world impact.
Our Role
28. In the 1940s, a Frenchman, René Carmille, was working on the
French census.
He and his team have been dubbed the first "ethical
hackers." They decided to sabotage their own machines so
the punch cards couldn't register people's religion properly.
The team was discovered, arrested by the Nazis, and
tortured. Carmille died at Dachau.
But they prevented the Nazis from discovering the identities
of tens of thousands of Jewish people living in France, saving
their lives in the process.
They did so by changing an experience to maintain people's
privacy.
René Carmille
30. In Privacy by Design, Dr. Ann Cavoukian
lays out seven foundational principles for
Fair Information Practices.
She recommends making privacy the
"default setting" in our designs and
says privacy should be "embedded"
into design.
So, what are some practical ways to
ensure we're doing that?
Best Practices
Self-Study:
"Privacy by Design: The 7 Foundational Principles"
by Dr. Ann Cavoukian
Founder of Global Privacy & Security by Design and the former Information and Privacy Commissioner
for the Canadian province of Ontario
32. Dark Patterns
UX designer Harry Brignull coined
the term "dark pattern" in 2010.
He defines a dark pattern as a "user
interface that has been carefully
crafted to trick users into doing
things" they didn't mean to do –
like buying or signing up for
something.
Another researcher described dark
patterns as supplanting user value
"in favor of shareholder value."
33. Brignull identified about a dozen types of
dark patterns.
Bait and Switch – You set out to accomplish
one thing, but something else completely
undesirable happens.
Confirmshaming – You try to unsubscribe
from something, for example, and the
opt-out feature uses language to guilt
you out of taking action.
Friend Spamming – A site asks to access
your contacts so you can find your friends,
then it emails all your friends without your
permission.
Dark Patterns
Example of confirmshaming
34. Dark Patterns
"Dark patterns are the canaries in the
coal mine of unethical design.
A company who's willing to keep a
customer hostage is willing to do
worse."
– Mike Monteiro, Ruined by Design
35. Dark patterns can expose users' personal
information.
When you make a payment on Venmo, it
defaults to public, so you automatically share
your payments with … everyone.
The opposite of designing with privacy as the
default.
Somebody created Vicemo, which scraped
payments listed with words associated with
drugs, alcohol, or sex and posted them online
for all to see.
Dark Patterns
36. Strava automatically tagged other runners when you
passed them if they didn't change their settings.
This feature even had a name: Flyby.
If you clicked on a face, it showed the user's full name,
picture, and a map of their running route – effectively
revealing where they lived.
This happened without you following those users and without
them knowing they were sharing their activity.
After receiving criticism, Strava did change the default
setting to private.
It should have always been private.
"Stalkerware": apps which allow people to be tracked –
intentionally or not
Dark Patterns
38. It's important to be very specific –
especially when sharing PII.
Personally identifiable information:
data points such as name, email, phone
number, Social Security number, or mother's
maiden name, which can be used to steal
people's identities and commit fraud.
87% of the U.S. population can be
uniquely identified by just their date of
birth, gender, and ZIP code. (Those items
aren't even considered PII.)
Imagine how much damage a bad actor
can do with just three data points of PII.
What Data Is Used?
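That re-identification risk is easy to demonstrate: fields that are harmless on their own often combine into a unique fingerprint per person. A minimal TypeScript sketch, using a small invented dataset (the records and the `uniquenessRate` helper are illustrative, not real data or a real library):

```typescript
// Toy demonstration of quasi-identifier risk: fields that are not PII
// individually (birth date, gender, ZIP) can combine into a unique key.
type Person = { birthDate: string; gender: string; zip: string };

// Invented sample records, for illustration only.
const people: Person[] = [
  { birthDate: "1984-03-12", gender: "F", zip: "10001" },
  { birthDate: "1984-03-12", gender: "M", zip: "10001" },
  { birthDate: "1991-07-04", gender: "F", zip: "94110" },
  { birthDate: "1984-03-12", gender: "F", zip: "60614" },
];

// Fraction of records whose (birthDate, gender, zip) combination is unique.
function uniquenessRate(records: Person[]): number {
  const counts = new Map<string, number>();
  for (const p of records) {
    const key = `${p.birthDate}|${p.gender}|${p.zip}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  const unique = records.filter(
    (p) => counts.get(`${p.birthDate}|${p.gender}|${p.zip}`) === 1
  ).length;
  return unique / records.length;
}

console.log(uniquenessRate(people)); // 1: every sample record is uniquely identifiable
```

Even though two records share a birth date and ZIP, and two share a birth date and gender, no two share all three fields, so every person in this toy set is re-identifiable from "non-PII" alone.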
40. Consider this an opportunity to explain the benefits of
sharing their data:
• Does it ensure a better experience in the future?
• Does it personalize ads and offers for them?
Be prepared to explain those benefits in detail.
If you can't, are you designing the right sort of product?
Why Is Data Used?
41. Why Is Data Used?
Lemonade includes an itemized,
detailed explanation of what
personal information you're
sharing, and they also explain
why.
They also promise never to sell
your information to third parties.
"TL;DR: We will never, ever, sell your data to anyone."
43. Clear Language
The New York Times studied 150 privacy policies
from various tech and media platforms. They
described what they found as an "incomprehensible
disaster."
They described Airbnb's privacy policy as
"particularly inscrutable":
"This information is necessary for the adequate performance of
the contract between you and us and to allow us to comply with
our legal obligations."
Vague language and jargon allow for a wide range
of interpretation, making it easy for companies to
defend their practices in a lawsuit while making it
harder for us to understand what's really going on
with their data.
44. Twitter advises you to read
their privacy policy in full but
highlights key aspects of it up
front – in a dedicated section –
advising you to pay
attention to those particular
things.
Clear Language
45. Guidelines:
• Avoid legalese and jargon: even your
terms and conditions content doesn't
have to sound like a legal contract
• Consider different age groups and levels
of savviness
• Most adult Americans read at about a
basic or intermediate literacy level
• 50% can't read a book written at
an 8th-grade level
• The Content Marketing Institute
recommends writing for about a 14- or
15-year-old (about 8th grade)
• Carefully crafted personas can help
determine if an experience's reading level
should vary from that range
Clear Language
Photo by John-Mark Smith
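Reading level can even be checked programmatically. As a rough sketch, the standard Flesch-Kincaid grade-level formula flags copy that reads above the ~8th-grade target; the syllable counter below is a crude vowel-group heuristic I'm using for illustration, so treat the scores as approximate rather than authoritative:

```typescript
// Rough Flesch-Kincaid grade-level check for privacy-policy copy.
// Syllables are approximated by counting vowel groups, so scores
// are indicative only.

function countSyllables(word: string): number {
  const groups = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 0);
}

function fleschKincaidGrade(text: string): number {
  // Sentence count: runs of terminal punctuation (at least 1).
  const sentences = Math.max(1, (text.match(/[.!?]+/g) ?? []).length);
  const words = text.split(/\s+/).filter((w) => /[a-z]/i.test(w));
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  // Standard Flesch-Kincaid grade-level formula.
  return (
    0.39 * (words.length / sentences) +
    11.8 * (syllables / words.length) -
    15.59
  );
}

const plain = "We use your email to send your receipt. You can opt out.";
const legalese =
  "This information is necessary for the adequate performance of the " +
  "contract between you and us and to allow us to comply with our legal obligations.";

console.log(fleschKincaidGrade(plain) < fleschKincaidGrade(legalese)); // true
```

The second sample is the Airbnb sentence quoted earlier; even this crude check scores it well above the plain-language alternative, which is the point of the guideline.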
47. User Controls
Google offers a Privacy Checkup with high-
level descriptions of how your personal data is
being used and why.
This links to specific privacy controls, which
allow you to adjust how that data is accessed.
They allow you to turn off activity tracking,
location history, and your YouTube history, adjust
your Google Photos settings, check which third
parties have access to your account
information, and access other key settings, all
in one privacy dashboard.
48. This seems like a good moment to
recall Dr. Cavoukian's maxim:
keep these settings private by
default.
User Controls
50. Easy to Find
Such important information
shouldn't be placed in 8-point
font …
buried in the Terms &
Conditions …
hidden in the footer …
or several levels of navigation
deep in your app
– and yet, that's often where
we find it.
A feature like California's
new "Privacy Options" icon
could prove effective at
drawing attention to these
privacy options.
51. Easy to Find
Contextual and easy to find also means …
Onboarding – explaining in detail how you use people's
data when they're using your app for the very first time.
"Just in time" alerts – alerting users in the moment, when
they're about to share data in a new way, even if they have a
history of using your experience.
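The "just in time" pattern above can be sketched as a simple gate: before data is used for a purpose the person hasn't already approved, prompt them at that moment. The names here (`AskUser`, `shareData`) are invented for illustration and don't belong to any real framework:

```typescript
// Sketch of a "just in time" privacy alert: prompt at the moment of a
// new kind of data use, even for long-time users of the experience.

type AskUser = (purpose: string) => boolean; // stands in for a UI prompt

const approvedPurposes = new Set<string>();

// Gate every new data use behind an in-the-moment prompt.
function shareData(purpose: string, ask: AskUser): "shared" | "declined" {
  if (!approvedPurposes.has(purpose)) {
    if (!ask(purpose)) return "declined"; // respect the refusal; don't nag
    approvedPurposes.add(purpose);        // remember the explicit opt-in
  }
  return "shared";
}

// Stand-in prompt that approves only location sharing.
const demoAsk: AskUser = (purpose) => purpose === "location";

console.log(shareData("location", demoAsk)); // "shared": asked and approved
console.log(shareData("contacts", demoAsk)); // "declined": asked and refused
```

Note that the prompt fires only for purposes not yet approved, so people aren't re-asked for things they've already consented to; that keeps the alert contextual rather than noisy.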
52. Easy to Find
Mozilla displays robust
Privacy information by
default in a dedicated tab
when you download and
open their Firefox
browser for the first time.
53. Remind users regularly about their privacy
options
And actively encourage them to take
advantage of them
Reminders
54. Reminders
Facebook allows you to set
reminders to do a privacy
checkup every week, month, six
months, or year.
Google also has a feature that
will send you a reminder to check
your privacy settings.
55. Never change users' privacy settings
without telling them in advance.
They should also have the option to opt
out of such changes.
Never Change Without Notice
56. A few years ago, Facebook made users' "likes" visible overnight,
which consequently may have outed some people in the
LGBTQ community or revealed people's personal, political, or
religious beliefs.
When I asked an employee how they justified this change, they
responded that the company valued transparency and wanted
people to be transparent about their interests.
The company's founder, Mark Zuckerberg, had even famously
said privacy was no longer a "social norm."
Never Change Without Notice
57. We don't have the right to make decisions about other
people's personal data and interests on their behalf.
Assuming everyone's information can safely be made
public is a belief that comes from a position of privilege.
We should never make decisions like this, which can
profoundly affect people's privacy, without their explicit
consent.
Never Change Without Notice
59. We talk a lot about "empathy" in
design.
If we design with empathy, we
won't design experiences we
wouldn't want to use ourselves.
And we won't design using "dark
patterns" either.
Conclusion
Photo by Josh Calabrese
60. "Privacy is not about secrecy.
It's all about control."
– Dr. Ann Cavoukian
If we want to ensure people have control over their
own personal information,
and if we want to ensure the experiences we design are
user friendly and truly "user-centered,"
we'll keep these best practices in mind.
Conclusion
Photo by Zanardi, Unsplash
63. Further Study
• California Consumer Privacy Act
• GDPR.eu
• "Privacy by Design: The 7 Foundational Principles" –
Dr. Ann Cavoukian
• The Privacy Project – New York Times
• "We Read 150 Privacy Policies. They Were an
Incomprehensible Disaster" – Kevin Litman-Navarro,
New York Times
• "Privacy UX – Common Concerns and Privacy in Web
Forms" – Vitaly Friedman, Smashing Magazine
• "What GDPR Means for UX" – Claire Barrett
• www.darkpatterns.org – Harry Brignull
• "How Dark Patterns Trick You Online" – YouTube
• Ruined by Design – Mike Monteiro