Article | Open Access
The Moral Gatekeeper? Moderation and Deletion of User-Generated Content in a Leading News Forum
Views: 8382 | Downloads: 5232
Abstract: Participatory formats in online journalism offer increased opportunities for user comments to reach a mass audience, but they also enable the spread of incivility. As a result, journalists feel the need to moderate offensive user comments in order to prevent discussion threads from derailing. However, little is known about the principles on which forum moderation is based. The current study aims to fill this void by examining 673,361 user comments (including all incoming and rejected comments) from the largest newspaper forum in Germany (Spiegel Online) using automated content analysis, focusing on the moderation decision, the topic addressed, and the use of insulting language. The analyses revealed that deleting user comments is a frequently used moderation strategy: overall, more than one-third of the comments studied were rejected. Further, users mostly engaged with political topics. The use of swear words alone was not a reason to block a comment, except when offensive language was used in connection with politically sensitive topics. We discuss the results in light of the need for journalists to establish consistent and transparent moderation strategies.
Keywords: community management; computational methods; forum moderation; gatekeeping; journalism; participatory media; Spiegel Online; topic modeling; user comments; user participation
© Svenja Boberg, Tim Schatto-Eckrodt, Lena Frischlich, Thorsten Quandt. This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 license (http://creativecommons.org/licenses/by/4.0), which permits any use, distribution, and reproduction of the work without further permission provided the original author(s) and source are credited.