Black hole moderation

If someone violates community guidelines, they can be given a suspension or ban.

A suspension is a temporary hold. It can serve as a meaningful warning for someone who is pushing the boundaries too far and needs a digital rap on the knuckles with a ruler. A ban, forbidding access to the site, can be appropriate for people who are malicious and harming a web site’s community. Once they are pushed out, their influence will fade over time.

Black hole moderation is a more demonstrative way of rejecting an account. To implement a black hole moderation decision, all content created by the user is erased from the site. This eliminates the incentive for such users to leave a “mark on the trees.”
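At its simplest, the decision amounts to deleting every record attributed to the banned account. A minimal sketch, using a hypothetical in-memory `Site` with a post store standing in for a real database:

```python
from dataclasses import dataclass, field


@dataclass
class Site:
    # post id -> (author id, text); a stand-in for a real datastore
    posts: dict[int, tuple[str, str]] = field(default_factory=dict)

    def black_hole(self, user_id: str) -> int:
        """Erase every post the user ever created; return the count removed."""
        doomed = [pid for pid, (author, _) in self.posts.items() if author == user_id]
        for pid in doomed:
            del self.posts[pid]
        return len(doomed)


site = Site()
site.posts = {
    1: ("troll", "first spam"),
    2: ("alice", "a normal post"),
    3: ("troll", "more spam"),
}
removed = site.black_hole("troll")  # only alice's post survives
```

A production version would, of course, touch every table that references the user (comments, uploads, votes), ideally inside one transaction.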

Although black hole moderation could be a disincentive to bad actors, it might be painful for the site, so it is not something to be done lightly. For example, content that is quoted in a reply or reposted might also need to be deleted. Technical solutions for such a search-and-destroy mission would be interesting to develop.
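The search-and-destroy step might start with something as naive as this sketch, which drops any surviving post that quotes a deleted post verbatim (the function name and substring-matching approach are illustrative assumptions; a real system would need fuzzy matching, quote markup parsing, and human review):

```python
def purge_quotes(posts: dict[int, str], deleted_texts: list[str]) -> dict[int, str]:
    """Naive search-and-destroy: remove any post containing a verbatim
    copy of text that was erased in a black hole decision."""
    return {
        pid: text
        for pid, text in posts.items()
        # keep the post only if no deleted snippet appears inside it
        if not any(snippet and snippet in text for snippet in deleted_texts)
    }


posts = {
    1: "an innocent post",
    2: "> offensive spam\nI strongly disagree with this",
    3: "another unrelated post",
}
remaining = purge_quotes(posts, ["offensive spam"])
```

The painful trade-off is visible even here: post 2 is an innocent reply, yet it disappears because it carried a copy of the banned material.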

In a simple example, black hole moderation on DeviantArt would remove all of the user’s messages, art, and interactions. On a message board, the hosts would remove all of the user’s messages and interactions with other users. The most difficult cases would occur on crowdsourced sites like Wikipedia and Fandom.com, where a user’s contributions are interwoven with everyone else’s.

Adding black hole moderation to social media sites might be more useful and less difficult.

The purpose of black hole moderation is to create a disincentive for the troll who incessantly adds bad content that doesn’t quite breach community standards in any single instance but, taken as a whole, is harmful.

If an individual’s privacy can justify the right to be forgotten, black hole moderation embodies the right to forget.