Beyond “Bots and Trolls” — Understanding Disinformation as Collaborative Work
Sponsored by the Algorithmic Fairness and Opacity Group (AFOG)
Strategic information operations (e.g., disinformation, political propaganda, and other forms of online manipulation) are a critical concern for democratic societies — as they destabilize the “common ground” that we need to stand upon to govern ourselves. In this talk, I argue that defending against strategic information operations will require a more nuanced understanding of the problem.
In particular, we will need to move beyond focusing on “bots” and “trolls” to looking at the collaborative nature of disinformation campaigns that target, infiltrate, shape, and leverage online communities. Drawing from three distinct case studies, I describe how orchestrated campaigns can become deeply entangled within “organic” online crowds and I highlight a persistent challenge for researchers, platform designers, and policy makers — distinguishing between orchestrated, explicitly-coordinated information operations and the emergent, organic behaviors of an online crowd.
Kate Starbird is an associate professor in the Department of Human Centered Design & Engineering (HCDE) at the University of Washington (UW). Starbird’s research is situated within human-computer interaction (HCI) and the emerging field of crisis informatics — the study of how information and communication technologies are used during crisis events. One aspect of her research focuses on how online rumors spread during natural disasters and man-made crisis events. More recently, she has begun to focus on disinformation and other forms of strategic information operations online. She is a co-founder and executive council member of the UW Center for an Informed Public. Starbird earned her Ph.D. in technology, media and society from the University of Colorado Boulder and holds a B.S. in computer science from Stanford University.