AI-generated child pornography images are a growing problem


Mallory Delicourt

May 25, 2023 at 6:10 p.m.


© Stability AI

The start-up ActiveFence, which specializes in moderating internet content, has announced that a number of users are exploiting generative AI tools to create and distribute child pornography content.

Production has not (yet) exploded, but a clear increase has already been observed on several monitored forums.

A very clear increase in two quarters

Despite updates to some AI models intended to block the creation of pornographic content, flaws remain. In the United States, the National Center for Missing and Exploited Children (NCMEC) recently warned that sexual predators are now using AI-driven generative tools on a large scale to create and publish increasingly realistic child pornography images.

These predators also share guides so that others can achieve convincing results while avoiding detection. According to ActiveFence, whose findings were presented by Avi Jager, users of a major child pornography forum shared 68 batches of images during the first four months of the year. The increase is significant: the same start-up had identified only 25 batches during the final four months of the previous year.

For obvious security reasons, the forum has not been named. Unfortunately, that is not all: the criminals also use it to exchange tips and scripts for generating fictitious personas capable of gaining the trust of potential victims.

© Stability AI

“It’s only just begun”

The phenomenon is therefore undoubtedly on the rise, although Yiota Souras, legal director of the NCMEC, says it cannot yet be called an explosion, but that one will not be long in coming. According to her, “we are on the verge of something new.” For its part, Bloomberg uncovered a forum using Stable Diffusion, the model developed by Stability AI. The platforms are doing their best to remove content that violates their rules and have effective tools:

“Over the past seven months, we have taken numerous steps to significantly mitigate the risk of exposure to Not Safe For Work content from our models. This includes installing filters that block dangerous and inappropriate content in all our apps […]” said Motez Bishara, director of communications for Stability AI.

However, both the companies and the NCMEC expect a massive amount of AI-generated child pornography imagery to emerge in the coming weeks and months. According to Avi Jager, the phenomenon has only just begun and will soon get worse. By way of illustration, ActiveFence estimates that there are 80,000 daily reports related to the abuse or dissemination of child pornography photographs. For the time being, generating fake child pornography images is not illegal in the United States, but it is already illegal in several countries, including the United Kingdom, Canada, and Australia.

Source: Bloomberg


