Deepfake porn: Photo theft on the net

It can happen to any woman who posts photos of herself on the web: her portraits are edited into porn videos. Taking action to counteract this is complicated, expensive – and often hopeless. An alliance of activists wants to change that.

“Just surreal” – that’s how it felt, says Bella, when she discovered herself in a porn video on the internet in autumn 2022. A reporter alerted the 30-year-old influencer to the fact that strangers had apparently edited photos and videos of her face onto the body of an actress in a sex film. On closer inspection the fake was recognizable, “but it was still frighteningly well done,” says Bella, known online as “Mrs. Bella”. “The fact that people specifically search for videos like this featuring me and actually watch them shocked me.”

A portrait photo of the kind almost everyone shares on social media: that’s all it takes to fake someone’s face into a porn video – so convincingly that you cannot tell, or at least not immediately, whether the face really belongs to the body shown. This face swapping usually targets women: influencers like Bella or top politicians like Annalena Baerbock, but non-celebrities also find themselves in so-called deepfake porn. “The danger of becoming a victim yourself hangs over women like the sword of Damocles,” says Anna-Lena von Hodenberg. She leads HateAid, an organization that supports victims of violence and hate on the internet.

A deepfake is an image or video manipulated with artificial intelligence. The technology is as fascinating as it is terrifying: it can, for example, help bring deceased actors back to life in films – but it can also be abused. And that is happening more and more often. “It used to take a lot of different images and a lot of technical understanding to put a person’s face into a porn video,” says von Hodenberg. Today, a photo, a face-swap app and a few clicks are all it takes to destroy someone’s reputation.

Why are these videos made?

The collages end up on porn websites or with providers like Mr. Deepfakes; there are vast numbers of such videos. Those affected usually find out about them by accident – for example, when friends recognize them in a video, or when employers demand that they give up the supposed shady side job.

The motives of the perpetrators vary: they want revenge on ex-partners, they want to satisfy sexual fantasies, some act out of pure misogyny. The patterns are often similar, says Anna Lehrmann, who also advises victims of digital violence for the association “Women help women”. “They create manipulated images or videos and threaten the women with publishing them if they don’t do what the perpetrators want.”

Those affected are often in shock; many feel ashamed and blame themselves. Yet there is almost nothing you can do to guarantee you will never appear in deepfake porn. “For that, you would have to stay out of the digital space entirely and publish no photos of yourself on the internet,” says Lehrmann – not even on the website of the company you work for. That sounds almost impossible in the 21st century, and according to Anna-Lena von Hodenberg it can, at worst, deter women from advancing in their careers or aspiring to political office. “It’s a weapon against women,” she says.

The victims are often powerless

Once a manipulated video is online, there are few ways to defend yourself against it. You can submit a deletion request to Google so that the film no longer appears in the results of Google’s image search. However, not all platforms respond to requests to delete videos. And even when one does, the video may reappear elsewhere at a later date.

“Mrs. Bella” had the fake video deleted with the help of her lawyer. “I have the means to take legal action against something like that,” she says. She has also grown a thick skin: as an internet personality, she receives lewd messages and unsolicited photos almost every day. “For people who are not in the public eye, the consequences are much worse.”

To make victims of deepfake porn less powerless in the future, HateAid and organizations such as the “Federal Association of Women’s Counseling Centers and Women’s Emergency Call Centers – Women Against Violence” launched a petition in November on the platform Weact, which collected more than 60,000 signatures in its first four weeks. With prominent supporters such as the SPD politician Sawsan Chebli and Mrs. Bella, they are calling on the responsible digital minister, Volker Wissing, to amend the criminal law.

Law enforcement: This needs to change

Anyone who appears in deepfake porn against their will today can take legal action for defamation or for violation of the right to their own image. But the public prosecutor’s office usually concludes that criminal prosecution is not in the public interest. Those affected then bear the costs and the risk of the proceedings. “That has to change as soon as possible,” demands von Hodenberg. In addition, the initiators are pushing for apps that enable such pornographic manipulations – or even advertise them – to disappear from the app stores.

Great Britain is currently showing that political action against deepfake porn is entirely possible. The government there is planning a new online safety law that would criminalize the publication of intimate photos and pornographic fake videos without the consent of those depicted.

The influencer Bella, known online as “Mrs. Bella”, is one of many victims of deepfake porn. “I find it unbelievable,” she says, “that there are even portals that distribute such videos with the faces of real people against their will.”
