Toxic speech and limited demand for content moderation on social media

Pradel, F., Zilinsky, J., Kosmidis, S., & Theocharis, Y. (2024). Toxic speech and limited demand for content moderation on social media. American Political Science Review, 1–18.

Link: https://doi.org/10.1017/S000305542300134X

Open access: Yes

Notes: It is unclear whether ‘civility’ ought to be a standard for participation on social media; indeed, responding ‘uncivilly’ to unjust contexts can sometimes be a legitimate path of resistance. Navigating this tension, Pradel and colleagues examine whether users’ perceptions of toxic content align with their preferences for content moderation, that is, whether identifying speech as toxic leads to a corresponding call for its moderation. To study this, they conducted two experimental studies in the US in which participants encountered and evaluated three variants of toxic speech: incivility, intolerance, and violent threats. Two findings stand out. First, different types of toxic speech have different consequences; incivility, for instance, is perceived differently than violent threats. Second, users mostly do not support the moderation of uncivil and intolerant content, as they do not necessarily see it as harming democracy or their communities. Overall, this study invites a nuanced and critical reading of content moderation that better accounts for the communicative and cultural processes of online community building.

Abstract: When is speech on social media toxic enough to warrant content moderation? Platforms impose limits on what can be posted online, but also rely on users’ reports of potentially harmful content. Yet we know little about what users consider inadmissible to public discourse and what measures they wish to see implemented. Building on past work, we conceptualize three variants of toxic speech: incivility, intolerance, and violent threats. We present results from two studies with pre-registered randomized experiments (Study 1 and Study 2) to examine how these variants causally affect users’ content moderation preferences. We find that while both the severity of toxicity and the target of the attack matter, the demand for content moderation of toxic speech is limited. We discuss implications for the study of toxicity and content moderation as an emerging area of research in political science with critical implications for platforms, policymakers, and democracy more broadly.
