Best Practices for Deletion of Harmful Content on Social Media: Choosing between hard and soft deletion policies

Christian Djeffal, Hannah Tilsch, Lisa Mette, Chithra Madhusudhanan

Research output: Other contribution

Abstract

Social media platforms must rethink their approach to content moderation, moving beyond the binary question of whether to remove harmful content and instead focusing on how content is removed. One design choice in this area is between hard and soft deletion. The current norm of hard deletion, where offending posts are entirely erased without any indication they ever existed, causes conversations to lose important context. We propose that platforms shift to a policy of soft deletion as the default. With soft deletion, when a post is removed for violating content guidelines or laws, a notice is put in its place indicating that it was deleted and why. This preserves the flow and coherence of discussions while still removing the harmful content itself. However, we believe impacted users should be given a choice. Platforms should allow those affected by harmful posts to opt out of soft deletion in favor of hard deletion on a case-by-case basis. The key is providing agency to those most directly impacted. When implementing soft deletion notices, platforms must be thoughtful about what information to include. At a minimum, notices should indicate that a post was removed, specify which rule was violated, and ideally provide a link to the relevant content policy. Notices could also include the username of the poster and the date of the original post. This additional context promotes transparency and accountability.
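
A minimal sketch of what such a soft-deletion notice could look like in practice, assuming a hypothetical `RemovalNotice` structure; the field names and rendering logic are illustrative assumptions, not prescribed by the paper:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical sketch of a soft-deletion notice as described in the abstract.
# Field names are illustrative assumptions, not taken from the paper.
@dataclass
class RemovalNotice:
    rule_violated: str                         # which content rule was broken (minimum)
    policy_link: Optional[str] = None          # link to the relevant content policy (recommended)
    poster_username: Optional[str] = None      # optional: username of the original poster
    original_post_date: Optional[date] = None  # optional: date of the original post

    def render(self) -> str:
        """Render the placeholder text shown where the removed post used to appear."""
        parts = [f"This post was removed for violating: {self.rule_violated}."]
        if self.policy_link:
            parts.append(f"See the policy: {self.policy_link}")
        if self.poster_username and self.original_post_date:
            parts.append(
                f"Originally posted by {self.poster_username} "
                f"on {self.original_post_date:%Y-%m-%d}."
            )
        return " ".join(parts)


# Example usage: the minimum notice plus the optional context fields.
notice = RemovalNotice(
    rule_violated="hate speech",
    policy_link="https://example.com/content-policy",  # placeholder URL
    poster_username="example_user",
    original_post_date=date(2024, 1, 15),
)
print(notice.render())
```

Under a soft-deletion default, this placeholder would replace the post in the thread; under the proposed opt-out, an affected user could instead request hard deletion, in which case no notice (and no residual record of the post) would be shown.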
Original language: American English
Type: Policy Paper
State: Published - 2024
