Meta has the tools to better protect children but does not use them, according to a former engineer


According to an engineer who worked at Meta on the protection of minors, Facebook and Instagram could protect children effectively and easily. The problem is that Meta does not seem to want to use the tools it already has at its disposal to do so.

[Image: a child in front of social networks. Credit: 123RF]

With billions of users on social networks, it is inevitable that some of them post content prohibited by the platforms. Beyond its illegality, being exposed to such posts can have a real impact. So much so that Facebook and Instagram have been accused of harming the mental health of minors, and their parent company Meta is being sued by several American states. The consequences are very real and often tragic. But according to Arturo Béjar, a former engineer and later consultant for Meta, this is far from pushing the firm to solve the problem.

“If they had learned the lessons of Molly Russell, they would create a product safe for 13-15 year olds where, in the last week, 1 in 12 do not see someone harm themselves or threaten to do so, and where the vast majority of them feel supported when they do come across content depicting self-harm.”


He is referring to the suicide of a 14-year-old girl in November 2017. Molly Russell’s behavior changed during the last year of her life: she became more solitary, staying locked in her room more often, time she spent on social networks, Instagram first and foremost. In the 6 months before her death, she was exposed to at least 2,100 posts relating to suicide, self-harm or depression.

Two minutes before taking her own life, she saved a post bearing a sentence related to depression. In 2022, the inquest concluded that Molly “died as a result of an act of self-harm while suffering from depression and the negative effects of online content”.

Meta could better protect children on its social networks, but the firm is not doing so, according to a former engineer

Arturo Béjar became a consultant for Meta in 2019. Until 2021, he conducted research showing that on Instagram, 12.5% of children aged 13 to 15 had received unsolicited sexual advances, 20% had been victims of cyberbullying and 8% had seen content about self-harm. Armed with these alarming findings, the engineer urged Meta to significantly reduce exposure to sensitive content. He also proposed several avenues:

  • Make it easier to report unwanted posts, allowing users to indicate why they do not want to see them.
  • Regularly survey users about their experiences.
  • Make it easier for users to submit reports of this kind on their own initiative.

But for Béjar, Mark Zuckerberg’s company is in no hurry to tackle the phenomenon, even though it already has the tools to do so. “Either they need a different director, or they need him to wake up tomorrow morning and say, ‘This type of content is not allowed on the platform,’ because they already have the infrastructure and the tools for that content to be impossible to find.”

According to the ex-engineer, Meta could quickly eradicate sensitive content from its platforms if the firm wanted to

Arturo Béjar’s studies and recommendations appear in a complaint filed against Meta by New Mexico Attorney General Raúl Torrez in December 2023. Other documents show that after Molly Russell’s death, employees warned that Meta was “defending the status quo” while “the status quo [was] clearly unacceptable for the media, the many families impacted and […] would also be for the wider public.”


The company’s apparent inaction is all the more incomprehensible since, according to Béjar, it would take only 3 months to act effectively against sensitive content. “They have all the tools they need to do this. What it takes is the will […] to say that, for adolescents, [they will] create a truly safe environment,” he explains, adding that the effectiveness of these measures should also be measured and reported publicly.

A Meta spokesperson responded: “Every day, many people inside and outside Meta are working on how to help keep young people safe online. Working with parents and experts, we’ve introduced more than 30 tools and resources to help teens and their families have safe and positive online experiences. All this work continues.” At the beginning of January 2024, Meta announced the implementation of several measures to better protect minors from sensitive content on Facebook and Instagram.

Source: The Guardian


