
Twitter: the team fighting child sexual abuse decimated


Musk has cleaned house, but probably in the wrong place. The Tesla owner's arrival at the head of the social network was accompanied by a wave of layoffs. The problem: one of Twitter's most sensitive moderation services has been hit hard. According to Bloomberg's revelations, Twitter's child sexual exploitation team has been cut in half, "leaving behind an overwhelmed skeleton crew," according to one of the outlet's sources.

Also according to Bloomberg, the team now reportedly comprises fewer than 10 specialists, responsible for reviewing and escalating reports of child sexual exploitation. At the start of the year, the team numbered nearly 20 people, and it was already stretched thin: even before the (forced) departures, employees were working long hours to handle user reports and legal cases.

Comprised of former law enforcement officers and child safety experts, the team was tasked with stopping the dissemination of child sexual exploitation material, identifying and addressing cases of psychological manipulation of minors (known as grooming), and removing content and imagery seeking to portray attraction to minors as a sexual identity or orientation.


AI is good, humans are better

Last week, the new Twitter boss tweeted that "suppressing the exploitation of children is the number one priority" and prompted users to "answer in the comments [if they see] something Twitter needs to address". And while certain hashtags associated with the sexual exploitation of minors have disappeared since his arrival, Elon Musk's choices risk having serious consequences for moderation on his social network.

Although moderation is aided by artificial intelligence to detect illegal content, human beings and their judgment remain irreplaceable: first, because an AI cannot accurately analyze cases of grooming, and second, because an AI is not a good witness in a judicial investigation. Some parts of the globe will be more affected than others: the loss of European and Singaporean specialists "will be a particular challenge" for maintaining "order in non-English-speaking markets," again according to Bloomberg's sources.

The downsizing of such a critical team comes as the European Union and the United Kingdom prepare to tighten the regulations imposed on digital platforms regarding the protection of users, and by extension of minors. Under the Digital Services Act (DSA), large companies will have to be meticulous about their users' safety, on pain of significant fines of up to 6% of their worldwide turnover, or even a ban on operating on the Old Continent in the event of repeated violations.

This hemorrhaging of the team fighting sexual exploitation is seen in more ways than one as an alarming signal: the social network risks becoming less and less safe, especially for its youngest users.



