TikTok: self-harm videos recommended for teenagers


TikTok is pushing self-destructive content to young teenagers. That is the finding of a study by the Center for Countering Digital Hate (CCDH), an international NGO that fights online hate, published Thursday, December 15.

A real-world test

As part of the study, the organization’s researchers created several new accounts on the app in the United States, the United Kingdom, Canada and Australia, each set to the minimum permitted age of 13. The researchers then “liked,” or watched for a few seconds, videos on topics such as body image and mental health.

Within three minutes, the app began suggesting suicide-related videos to the test accounts. A few moments later, “TikTok recommended content related to eating disorders,” says the CCDH report. Content promoting self-harm and anorexia-adjacent practices was then pushed to the accounts.

“Vulnerable accounts in our study received 12 times more self-harm and suicide video recommendations than standard accounts,” the NGO states.

Blame it on a self-learning algorithm

The researchers’ findings are hardly surprising. TikTok’s algorithm is designed to maximize how precisely videos are targeted to each user. A like, a comment, a pause of a few seconds on a video: every micro-action is tracked in order to serve “relevant” clips. This self-learning algorithm is the engine of the application. The more time a user spends on the app watching videos, the more closely the recommended content is tailored to their tastes.
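The feedback loop described above can be sketched in a toy model. Note that this is purely illustrative: the topic names, action weights, and scoring function are hypothetical inventions for the sketch, not TikTok's actual system.

```python
# Toy engagement-weighted recommender loop, for illustration only.
# All weights, topics, and function names are hypothetical.
from collections import defaultdict

# Hypothetical weights for the "micro-actions" the article mentions.
ACTION_WEIGHTS = {"like": 3.0, "comment": 2.0, "watch_pause": 1.0}

def update_profile(profile, topic, action):
    """Accumulate interest in a topic from a single micro-action."""
    profile[topic] += ACTION_WEIGHTS[action]

def recommend(profile, catalog, k=3):
    """Rank candidate videos by the viewer's accumulated topic interest."""
    return sorted(catalog, key=lambda v: profile[v["topic"]], reverse=True)[:k]

profile = defaultdict(float)
catalog = [
    {"id": 1, "topic": "body_image"},
    {"id": 2, "topic": "cooking"},
    {"id": 3, "topic": "mental_health"},
    {"id": 4, "topic": "sports"},
]

# A handful of micro-actions on sensitive topics quickly dominate the ranking,
# so the next batch of recommendations skews toward those same topics.
update_profile(profile, "body_image", "like")
update_profile(profile, "mental_health", "watch_pause")

top = recommend(profile, catalog)
print([v["topic"] for v in top])  # sensitive topics rise to the top
```

Because each recommendation invites further micro-actions on the same topics, the loop is self-reinforcing, which is the "spiral" the CCDH describes.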

It is a vicious spiral that can end up serving self-destructive content to a user who is already struggling. “TikTok identifies the user’s vulnerability and takes advantage of it. […] Rather than entertainment and safety, our findings reveal a toxic environment for younger TikTok users, heightened for the most vulnerable,” says the Center for Countering Digital Hate.

TikTok denounces a distorted picture

Contacted by The Guardian, TikTok maintains that the CCDH study does not reflect reality on the platform. “We regularly consult with health experts, remove violations of our policies, and provide access to support resources for anyone who needs them,” says a spokesperson for the application.

The spokesperson added: “We realize that trigger content is unique to each individual and remain focused on promoting a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others about these important topics.”
