Social network addiction among the youngest: Zuckerberg and his teams knew


Mathilde Rochefort

March 14, 2023 at 10:00 a.m.


Social media addiction © Julia M Cameron / Pexels

According to a complaint document, the management of Meta (formerly Facebook) was aware of the harms its social networks cause to young people, but chose not to act.

Filed in Oakland, California, the case consolidates dozens of complaints brought across the United States on behalf of teenagers and young adults claiming that Facebook, Instagram, TikTok, Snapchat and YouTube caused them anxiety, depression, eating disorders and insomnia.

Meta turned a deaf ear

One of the documents attests that Mark Zuckerberg, CEO of Meta, was personally made aware of the dangers posed by his own platforms, in these words: "We are not on track to succeed in our core areas of wellbeing, and we are exposed to increased regulatory risk and external criticism. These issues affect everyone, especially young people and creators; if they are not resolved, they will follow us into the metaverse."

Despite this warning, the company chose to cut funding for its mental health team rather than actually tackle the problem, according to the plaintiffs. While Meta denies these allegations, they corroborate the warning raised by Frances Haugen, a former Facebook employee. In 2021, she revealed the existence of an internal study showing that Instagram was harmful to the mental health of adolescents. According to her, Meta wanted to hide these results from the authorities and did not take them seriously.

As one of the complaints states, an employee of the platform wrote that same year: "No one wakes up thinking they want to maximize the number of times they open Instagram that day. But that's exactly what our product teams are trying to do."

The end of impunity soon?

Meta isn't the only company targeted by the plaintiffs. TikTok is as well: its parent company, ByteDance, is said to be aware that young people are more likely to be drawn to the dangerous challenges they see on the platform because their ability to assess risk is not fully developed, yet it allegedly did nothing to address the problem. As a reminder, several young people have died after attempting challenges that went viral on the Chinese social network.

In their defense, the platforms invoke Section 230, the American legislation that shields social networks from liability for the content they host. The text is nevertheless being challenged in another case currently before the Supreme Court. That ruling could completely reshuffle the deck in content moderation, pushing these companies to do much more or face significant penalties.

The complaint document further alleges that more than a dozen suicides in the United States can be traced to the major online platforms, on the grounds that they knowingly designed algorithms that lead children down dangerous and addictive paths.

Source: Bloomberg
