At Scale and under Pressure: How Social Media Moderate, Choreograph, and Censor Public Discourse
Problematic forms of online expression and activity are forcing social media platforms to confront difficult questions. While many platforms continue to present themselves as open spaces for public participation, in fact they have always had to police inappropriate speech and anti-social behavior. I will discuss the array of challenges social media platforms face, the justifications they offer for their interventions, and some of the implications of their responses. As these platforms weigh possible measures, including removing content or users, hiding or blocking content from other users' view, or defending content as protected speech, they revive and extend a century of questions about the role of private intermediaries in sorting out what should be made visible and what should be obscured. And in implementing these measures, they have built a complex system of sociotechnical mechanisms and distributed labor forces that now structures and drives the logic of the platform. Finally, because these decisions must be enacted at enormous scale, they may depend on a logic fundamentally orthogonal to how users experience them: as specific incidents and impositions.
Tarleton Gillespie is a principal researcher at Microsoft Research, an affiliated associate professor in Cornell's Department of Communication and Department of Information Science, a faculty associate at the Berkman Klein Center for Internet & Society at Harvard University, a co-founder of the blog Culture Digitally, the author of Wired Shut: Copyright and the Shape of Digital Culture (MIT, 2007), and a co-editor of Media Technologies: Essays on Communication, Materiality, and Society (MIT, 2014). His next book (Yale University Press, forthcoming 2018) examines how the governance of cultural values by social media platforms has broader implications for freedom of expression and the character of public discourse.