3. liberation of power from the
constricted group of the few to its
rightful place as the wealth of the
information sharing community
charging a mere 25 cents to post
a message significantly raised the
level of discourse, eliminating
many trivial or rude messages
Bruckman, A. (1998). Finding one's own in cyberspace. In C. Haynes & J. R. Holmevik (Eds.),
High Wired: On the Design, Use, and Theory of Educational MOOs (p. 24).
Ann Arbor, MI: University of Michigan Press.
4. Kiesler, S., Siegel, J., & McGuire, T. W. (1984). Social psychological aspects of computer-mediated
communication. American Psychologist, 39(10), 1123.
FLAMING
refers to the practice of expressing oneself
more strongly on the
computer than one would in other
communication settings
5. Kiesler, S., Siegel, J., & McGuire, T. W. (1984). Social psychological aspects of computer-mediated
communication. American Psychologist, 39(10), 1123.
a "GRIPENET" emerged—organized
complaints against management
practices and policies whose form and
substance deviate considerably from
standard IBM culture
ORGANIZED LABOR
6. The two sides were supposed to listen to expert testimony and come to a friendly agreement;
cartoon from the Cleveland Plain Dealer. Source: Wikipedia
7. Lotan, G., Graeff, E., Ananny, M., Gaffney, D., & Pearce, I. (2011). The Arab Spring | The revolutions were
tweeted: Information flows during the 2011 Tunisian and Egyptian revolutions. International Journal of Communication, 5.
15. Leavitt, A. (2015, February). This is a Throwaway Account: Temporary Technical Identities and
Perceptions of Anonymity in a Massive Online Community. In Proceedings of the 18th ACM
Conference on Computer Supported Cooperative Work & Social Computing (pp. 317-327). ACM.
16. Andalibi, N., Haimson, O. L., De Choudhury, M., & Forte, A. (2016, May). Understanding Social Media
Disclosures of Sexual Abuse Through the Lenses of Support Seeking and Anonymity. In Proceedings
of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 3906-3918). ACM.
when people seek support in stigmatized
traumatic contexts such as sexual abuse,
they face a trade-off between choosing
anonymity and being potentially identified
17. Doleac, J. L., & Stein, L. C. (2013). The visible hand: Race and online market
outcomes. The Economic Journal, 123(572), F469-F492.
In local online classified advertisements throughout
the US… Black sellers do worse than white sellers
on a variety of market outcome measures:
• They receive 13% fewer responses and 17% fewer offers
• Conditional on receiving at least one offer, Black sellers receive 2–4% lower offers
• Buyers corresponding with Black sellers exhibit lower trust: they are
17% less likely to include their name in e-mails,
44% less likely to accept delivery by mail, and
56% more likely to express concern about making a long-distance payment
18. Duggan, M. (2014). Online harassment. Pew Research Center.
half of those who have experienced online
harassment did not know the person
involved in their most recent incident
19. McMillen, Andrew. Wikipedia Is Not
Therapy: How the online encyclopedia
manages mental illness and suicide
threats in its volunteer community.
Backchannel. Illustration by Laurent Hrybyk
22. Citron, D. K., & Norton, H. L. (2011).
Intermediaries and hate speech: Fostering
digital citizenship for our information age.
Boston University Law Review, 91, 1435.
“a thoughtful and nuanced
intermediary-based
approach to hate speech
can foster respectful
online discourse without
suppressing valuable
expression”
23. MacKinnon, R. (2012). Consent of the
networked: The worldwide struggle for
Internet freedom. Basic Books
“How do citizens make
sure that private agendas
and pursuit of profit do not
erode… democratic
expression?”
27. Cheng, J., Danescu-Niculescu-Mizil, C. & Leskovec, J. (2014). How Community Feedback Shapes User
Behavior. ICWSM 2014.
negative feedback leads to significant
behavioral changes that are detrimental to
the community.
Not only do authors of negatively-evaluated
content contribute more, but also their future
posts are of lower quality, and are perceived
by the community as such.
28. Halfaker, A., Geiger, R. S., Morgan, J. T., & Riedl, J. (2012). The rise and decline of an open collaboration
system: How Wikipedia’s reaction to popularity is causing its decline. American Behavioral Scientist
AI VANDALISM
DETECTION
29. Wulczyn, Ellery, Nithum Thain. Understanding Personal Attacks on Wikipedia.
July 2016 Wikimedia Research Showcase
30. Wulczyn, Ellery, Nithum Thain. Understanding Personal Attacks on Wikipedia.
July 2016 Wikimedia Research Showcase
31. Munger, Kevin. Tweetment Effects on the Tweeted: An Experiment to Reduce Twitter Harassment.
Society for Political Methodology Conference, Washington University, St Louis, July 2015.
32. Kohavi, R., Deng, A., Frasca, B., Walker, T., Xu, Y., & Pohlmann, N. (2013, August). Online controlled
experiments at large scale. In Proceedings of the 19th ACM SIGKDD international conference on
Knowledge discovery and data mining (pp. 1168-1176). ACM.
34. it was the
complications—the
incomplete data sets, the
trolls’ endless
prevarications, the
incessant march of
subcultural change—that
gave rise to my basic
argument
Phillips, Whitney. The Ethnography of
Trolling. Ethnography Matters, January 8, 2013.
1973. Idealists at Berkeley install computer terminals in libraries and record stores to create public-access digital communications for their neighborhood.
They talk about supporting the “liberation of power” and the “wealth of the information sharing community.” But let’s look closer at their terminal. It charges 25 cents per post!
As Amy Bruckman, who’s here in the room, has pointed out, even in the earliest days of Internet idealism, they had a problem with people being terrible to each other online.
Ten years later, Sara Kiesler and colleagues start worrying that people might be worse to each other on computers than elsewhere.
And what was the specter that haunted computer communications?
“Gripenet,” the possibility that workers might do labor organizing or try to change their workplace culture using the internet.
Because of course, as we learned in the Coal Strike of 1902, conversations with employers in a democracy always involve considering the evidence and arriving at a friendly agreement over the treatment of workers.
In the thirty years since Kiesler's article, we've come to see the deep power of social technologies to communicate political change.
And to make visible to us profound social changes in our respect for human dignity and rights as a society.
But we’ve also gained remarkable capacities to organize harm, as many in this room have personally experienced, whether you’ve been a target of networks like GamerGate, or some other coordinated harassment.
As digital communications have become a basic part of human life, so has harassment, as Maeve Duggan's research at Pew has documented.
As much as we need to acknowledge the risk to women and marginalized groups, we also need to recognize that everyone shares these risks— with a greater percentage of men than women reporting that they face physical threats online.
And it's not just online risks, as Katherine Clark has pointed out.
1993 New Yorker Cartoon by Peter Steiner
As we debate whether anonymity is part of the problem, we should acknowledge new research that shows how important it is for the most vulnerable— with women on reddit being more likely to use strong anonymity protections.
And anonymity plays a key role for people who seek support for sexual abuse.
Many choose to be anonymous to be treated fairly and equally.
In half of online harassment, anonymity might not be an issue at all — in cases like domestic abuse and bullying, where people know their harassers.
And we have come to know and care for strangers on the Internet. As those bonds grow, we feel their suffering, and we want to protect each other, as Andrew McMillen's moving article about Wikipedia's response to mental illness has reminded us.
And so we ask how to respond.
Berkman Klein colleague Larry Lessig's work on how to regulate the Internet was prompted by Julian Dibbell's article "A Rape in Cyberspace."
and 17 years later, we have a complex set of layers that govern what we’re allowed to do online. Here in the room, Alice Marwick and Andy Sellars have written about the relationship between law and online harassment. Many people from platforms are constantly working out policies, often at a rate of several new ones a day. And many of us think about community responses, including Lindsay Blackwell, who’s doing research on peer support with HeartMob.
Right now, platforms are under incredible pressure over the policies and decisions that they make— expected to “solve” online harassment.
And yet at the same time, we’re still in the early days of figuring out what it means for democracies to have our speech regulated by private corporations.
Often, when people come to designers, this is what they want.
They think of us as Santa, and what they want might very well escalate our conflicts rather than resolve them.
And what about that dislike button? Ratings and voting are common online, but do they reduce problems?
Cliff Lampe did work in the early 2000s on voting systems, showing how volunteers can actually be good at rating participation. But it’s also possible that these voting systems may not achieve the ends that we hope, as Justin Cheng has found.
Jigsaw and Wikimedia
Any response is ultimately a blend of social and technical, as Kevin Munger’s ongoing experiments show us. Kevin is using social bots to test the effect of responding to racial slurs that people share on Twitter. And much of Susan Benesch’s research has examined this kind of “counter-speech” in the wild.
As we talk about these issues, we need to be thinking about the scale and speed at which online harassment evolves, and the scale of the tools we now have at our disposal for addressing them.
At Microsoft, Ron Kohavi and colleagues have built infrastructure to run online controlled experiments at large scale. On day two, we will discuss infrastructures of experimentation.
Infrastructures of Understanding. Many of our questions have parallels in debates over digital copyright, which is another fast-evolving area. Last night, I looked up Olympic copyright takedowns on Lumen and saw requests from the Olympic Committee to take down livestreams from the Olympics. Periscope is only 18 months old. And Lumen opens up this emerging area of regulation to further discussion.
Adam Holland from Lumen, which was founded 15 years ago as an archive for copyright takedown requests, is helping us understand emerging questions.
However, online harassment isn't just something we can put in a test tube. It's a social phenomenon that is part of wider structures of society and culture. Whitney Phillips started out wanting to understand why trolls did what they did, and instead wrote about factors in attention economies and contemporary culture that make trolling mainstream. Her book, which is deeply valuable, came out of wrestling with the incomplete data of her subject.
All of you know that I’ve just *begun* to outline *part* of the conversation about online harassment, the conversation we are here to have. And not just to gain intellectual clarity— though that is one of our goals — but to advance concrete projects that could transform people’s lives in fundamental ways toward collective well-being, justice, and potential as human beings.
Day One: Projects
Day Two: Infrastructures & Enablers
The collective experience, resources, expertise, and wisdom in this room is profound. Each of you has been invited because we believe that you can do something amazing over the next two days. And you already have — I've loved the experience of hearing you share your ideas and passions. As we join in conversations, let's be proactive in looking for points of connection or ways that we can support someone else's work by opening our networks, knowledge, and organizations.
And that means paying attention to who’s NOT in this room as much as who *is* here.
Since not all of us can have our names listed publicly, we are creating a document out of this event that represents the public record of our conversations. That way, if you want to link someone up with a person or a resource, advocate for a project in your organization, or incorporate feedback in your designs, our report will be a tool to help you bring others into the conversation.
This becomes especially important on day two. By listing out and defining the impacts and projects enabled by a particular infrastructure or collaboration, we support each other to scope those projects well, make them more concrete, and articulate the impact that can come from them.