Hasinoff, A. A., & Schneider, N. (2022). From scalability to subsidiarity in addressing online harm. Social Media + Society, 8(3), 205630512211260.
Open access: Yes
Notes: Traditional approaches to content moderation rely on punitive models, in which the offending user is served a warning, a suspension, or a ban from the service. In this article, however, the authors argue that such a model (which aims to mimic criminal-legal systems) lacks context sensitivity—and is thus unable to address harm effectively. In response, Hasinoff and Schneider propose that social media platforms address harm via restorative (focused on repairing harm for those who have experienced it) and transformative (focused on changing the underlying conditions that led to the harm) models of justice. These models, however, are incompatible with the scalability logic that currently dominates social media platforms, as “while scalability demands quick resolution to incidents, both restorative and transformative justice call for slower, more individualized care and negotiation” (p. 5). Their proposed response is to center subsidiarity—that is, to grant local social units meaningful autonomy within larger systems.
While such an approach has important implications for platforms’ responses to harm, it is hard to imagine current large social media companies switching models. Instead, it would be promising to see emergent platforms (such as Bluesky) adopt restorative and transformative justice models for addressing harm.
Abstract: Large social media platforms are generally designed for scalability—the ambition to increase in size without a fundamental change in form. This means that to address harm among users, they favor automated moderation wherever possible and typically apply a uniform set of rules. This article contrasts scalability with restorative and transformative justice approaches to harm, which are usually context-sensitive, relational, and individualized. We argue that subsidiarity—the principle that local social units should have meaningful autonomy within larger systems—might foster the balance between context and scale that is needed for improving responses to harm.