As research by German media outlets shows, comments containing certain terms are systematically filtered on the video platform. The commenters are not informed of this.
Tiktok is one of the most popular social networks among young people. In addition to the almost never-ending stream of short videos, the extremely lively comments section likely contributes to the app's great popularity with a young audience. It offers space for humorous reactions, critical feedback and open debate. Research by the German "Tagesschau" and the North German and West German public broadcasters (NDR and WDR) now shows, however, that the comments section is not as open as users expect it to be.
In a self-experiment, journalists using different profiles commented on short videos on the platform. A total of 70 terms and word combinations were tested. It turned out that at least 20 words were held back by automated filters: comments containing one of these terms were not publicly visible in at least four attempts from different profiles, as the "Tagesschau" writes.
The words "sex", "porn" and "heroin" were among those blocked. For reasons of youth protection, it is understandable that comments with these terms are suppressed. It is not understandable, however, that terms such as "LGBTQ", "climate change" or "gay" are consistently or at least partially blocked.
Automated mechanism to protect the community
"In order to protect our community, we use technologies that allow us to proactively search for comments that violate our guidelines," said a Tiktok spokeswoman when asked by the NZZ. The fact that terms that did not violate the guidelines were also blocked in the course of the "Tagesschau" research was regrettable, she said, and should not have happened.
However, the spokeswoman also pointed out that the blocking of terms in the self-experiment was at least partly due to unnatural user behavior. The journalists had posted a large number of comments from just a few accounts within a very short time, which matches the classic behavior of accounts that spread spam or false information. This is also why Tiktok's automated content moderation withheld the comments.
However, Tiktok does not moderate comments only automatically. The company also employs thousands of human moderators around the world who additionally review posts and comments, the spokeswoman said. The company wants to ensure that comments are blocked only if they actually contain insults or false information. How well this human moderation works for the German-speaking region, however, is unclear. Even after repeated inquiries, the company spokeswoman declined to provide figures on the number of German-speaking moderators.
Not the first allegations of censorship
It is not the first time that Tiktok has been criticized for its opaque moderation practices. Research by NDR, WDR and the "Tagesschau" already pointed to the word filters in use back in March.
At that time, comments were blocked that contained, among other things, the words "climate change" and "Auschwitz" or the name of the Chinese tennis player Peng Shuai. Tiktok then admitted mistakes and announced that it would review its guidelines. Some of the terms blocked at the time, as well as the tennis player's name, can now be used again in comments. However, terms such as "LGBTQ", "gay" or "heterosexual" continue to be suppressed. That raises questions about how much the company's moderation strategy has really changed over the past few months.
Incidentally, users are not told whether and for what reasons their comments have been hidden. Instead, the platform gives the impression that all comments are publicly visible, the research revealed. This opaque method of controlling discourse is known as "shadow banning". Given Tiktok's promise to keep the discourse open to everyone, this leaves a bitter aftertaste.