"Deepfake porn", a tech that undresses women online

Over 100,000 fake photos of naked women and girls have been created from photos stolen from social media. To blame: a "deepfake bot", a recent and deeply disturbing technology …


Imagine finding yourself in a nude photo on the web, without ever having wanted it … or even having posed for the photo in question. Sensity, a Dutch security monitoring company, has discovered more than 100,000 photographs of naked women circulating on the encrypted messaging app Telegram. Fake photos, explains The Verge, a site specializing in new technology, fabricated from scratch. The culprit: an artificial intelligence in the form of a "bot", that is, a computer program capable of sending automated messages, and here capable of removing the clothing from any photo. This program, then named DeepNude, was first identified by Vice in the summer of 2019. The outcry was such that its developer "killed" his creation, but not before others had copied it. It resurfaced on Telegram this summer and was spotted at the end of October 2020 by Sensity.

According to Sensity, doctored photos of real women are shared in dedicated channels with more than 75,000 users, and they can be produced without any computer skills. The only requirement is to upload a photo of a celebrity (16% of the content identified) or of a woman the user knows personally (63% of the content). And the result can be very realistic. Worse, some of the photos appear to show underage girls. Between July 2019 and 2020, approximately 104,852 women and girls fell victim to these manipulations. "It's a pretty unique case, because it's not just about people chatting and sharing content," Giorgio Patrini, head of Sensity, told BuzzFeed News. "This computer program is really built into Telegram, and we've never seen anything like it." The secure messaging application, used by more than 400 million people every month, has not yet reacted to this latest "deepfake" scandal.

Dangerous technology

The term "deepfake" is used to describe computer-generated images and videos, most often strikingly realistic, obtained from a real model. These doctored photos or videos can be used for humorous, political (for example, creating a video of a false speech by a leader) but also for pornographic purposes. Until now, the targets spotted have been mainly actresses, models and celebrities. Anonymous women and girls are therefore also affected on a large scale, and risk the revenge porn or blackmail, with users sometimes asking for money to remove the fake photo.

Russia and several Eastern European countries are currently affected, but the danger of this type of program spreading worldwide is very real. What does the law say in France? It simply prohibits the dissemination of pornographic montages. The penalty can be up to one year in prison and a 15,000-euro fine for publishing, "by any means whatsoever, a montage made with the words or the image of a person without their consent, if it is not obvious that it is a montage or if this is not expressly stated." Another bulwark against "deepfakes" is the law of December 22, 2018 on the fight against the manipulation of information. But for now, the government relies mainly on web giants such as Pornhub, Twitter, Gfycat and Reddit to fight this scourge: they are expected to apply a policy of systematically removing "deepfakes" and to close any accounts and exchange spaces that offer them. Which, as Sensity's revelations show, does not prevent these montages from circulating.
