12. Wrangle 2015
Digital Technology and Scale
How
repeatable tasks are automated
humans decide which tasks the computer does, and how
productivity is limited by how fast people can make decisions
13. Inference Technology and Scale
How
human-like decisions are automated
humans are being “automated away”
decisions are made faster, at scale, and without humans
16. Bias & prescription
(not everyone gets their cornucopia of rainbows)
Where
examples as defined by
17. Bias
holding prejudicial favor,
usually learned implicitly and socially.
Every one of us is biased,
and we can’t observe our own biases
Where
18. Bias in Data
bias in human thought leaves bias in data,
skews that we can’t directly observe
Where
19. Bias, scale, harm
inference technology scales human
decisions — any flawed decisions
or biases it is built on are scaled, too.
Where
25. Dating: Subtle Bias and Race
Where
Building a matching algorithm based on these scores
would reinforce a racial bias:
the ratings men typically gave to women skew by race,
and the effect is apparent in aggregate.
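The aggregate effect can be sketched with synthetic numbers (the profiles, scores, and group labels below are invented for illustration; this is not the site’s real data or algorithm):

```python
# Hypothetical sketch: if the scores a matcher ranks on carry an
# aggregate racial skew from raters, ranking by score alone
# reproduces that skew in who gets recommended.
profiles = {
    # average received rating (invented numbers; the skew is assumed)
    "profile_a": {"score": 4.2, "group": "majority"},
    "profile_b": {"score": 4.1, "group": "majority"},
    "profile_c": {"score": 3.6, "group": "minority"},
    "profile_d": {"score": 3.5, "group": "minority"},
}

# A naive matcher: recommend the top-2 profiles by score.
top2 = sorted(profiles, key=lambda p: profiles[p]["score"], reverse=True)[:2]
groups_shown = [profiles[p]["group"] for p in top2]
print(top2)          # ['profile_a', 'profile_b']
print(groups_shown)  # ['majority', 'majority'] — the rater skew decides exposure
```

No individual rating here is an act of overt prejudice; the matcher simply inherits the aggregate skew and turns it into who gets seen at all.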
26. future startup Founders: Institutional Bias
Where
Decision Tree with inputs:
• College Education
• Computer Science major
• Years of experience
• Last position title
• Approximate age
• Work experience in a venture-backed company
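One way to see how bias slips into such a model is a sketch with synthetic data (the numbers and the gender correlation are assumptions for illustration, not the talk’s dataset). A rule pattern-matched from biased history selects on a feature that correlates with gender, so outcomes differ by gender even though gender is never an input:

```python
import random
random.seed(0)

# Synthetic history (assumption for illustration: women were historically
# less likely to have prior experience at a venture-backed company).
people = []
for _ in range(1000):
    gender = "m" if random.random() < 0.5 else "f"
    vc_experience = random.random() < (0.6 if gender == "m" else 0.2)
    people.append((gender, vc_experience))

# A rule "pattern matched" from past funded founders: it never sees
# gender, only the seemingly meritocratic feature.
def looks_like_a_founder(vc_experience):
    return vc_experience

# Selection rate by gender: the proxy feature leaks the bias through.
rate = {}
for g in ("m", "f"):
    group = [v for gg, v in people if gg == g]
    rate[g] = sum(looks_like_a_founder(v) for v in group) / len(group)
print(rate)  # men are selected at roughly three times the rate of women
```

The same mechanism applies to a full decision tree: any split on a feature correlated with a protected attribute imports that attribute by proxy.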
27. future startup Founders: Institutional Bias
Where
institutional bias comes through in the data
— though it seemed meritocratic at the outset.
The features say nothing about gender!
Yet in literally pattern matching founders,
we see bias.
28. future startup Founders: Institutional Bias
Where
The problem is not that this doesn’t reflect the
real world — but rather that it doesn’t reflect
the world we want to live in.
29. loan assessment: The long history of bias
Where
There is a long history of loan officers issuing loans
based on measurable values such as income, assets,
education, and zip code.
Problem: in aggregate, loan officers have been historically biased.
So loan algorithms perpetuate and reinforce an unfair
past in the real world today.
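A minimal sketch of that perpetuation (all zip codes, incomes, and approval decisions below are invented): a model fit to historically biased approvals learns the bias as if it were signal.

```python
from collections import defaultdict

# Invented history: officers approved zip code "A" far more often than
# "B", even at comparable incomes.
history = [
    # (zip_code, income_k, approved)
    ("A", 50, True), ("A", 40, True), ("A", 30, True), ("A", 20, False),
    ("B", 50, True), ("B", 40, False), ("B", 30, False), ("B", 20, False),
]

# "Learn" the historical approval rate per zip code...
approvals = defaultdict(list)
for zc, _, ok in history:
    approvals[zc].append(ok)
zip_rate = {zc: sum(oks) / len(oks) for zc, oks in approvals.items()}

# ...and use it to score new applicants with identical incomes.
def score(zip_code, income_k):
    # income is ignored here: the past bias is the only "signal" learned
    return zip_rate[zip_code]

print(score("A", 40), score("B", 40))  # 0.75 0.25 — same income, unequal odds
```

Nothing in the pipeline is malicious; it faithfully reproduces an unfair past, which is exactly the problem.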
35. Finding Bias
Bias is difficult to understand because it
lives deep within your data
and deep within the context of the real world.
36. Action: Two ways to combat bias
1. Talk to people
2. Construct fairness
37. Action: talk to people
Seek to understand:
who they are
what they value
what they need
what potential harm can affect them
38. Action: Construct fairness
for example:
to mitigate gender bias, include gender so you
can actively enforce fairness
(what doesn’t get measured can’t be managed)
45. If an algorithm makes something cheaper for the
majority but harmful for a minority —
are you comfortable with that impact?
46. We’re at the forefront of a new age
governed by algorithms.
We must be deliberate in managing them
ethically, strategically, and tactically.
50. huge thanks to
@clarecorthell
clare@summer.ai
sources of note
• “Fairness Through Awareness”, Dwork et al.
• “Fortune-Tellers, Step Aside: Big Data Looks For Future Entrepreneurs”, NPR
• Harvard Implicit Bias Test
Manuel Ebert
Cynthia Dwork
Wade Vaughn
Marta Hanson