Deepfake pornography: The new weapon against women?

AI-generated pornographic images of Taylor Swift circulated on the X platform for 17 hours at the end of January. The deepfakes were reportedly viewed up to 47 million times before searches for Taylor Swift were blocked on the platform. Deepfakes are images or videos that have been manipulated, usually with the help of artificial intelligence (AI).

The case made headlines around the world. Yet Taylor Swift’s deepfakes are just the tip of the iceberg: cases of so-called image-based sexual violence in the digital space have existed for years. Victims have included K-pop stars, TikTokers, journalists and girls at US high schools.

New form of sexual violence – in the digital space

According to a study by Deeptrace, a company specializing in deepfakes, 96 percent of all deepfakes are pornographic in nature, and in 99 out of 100 cases they target women. Despite all the fears that deepfakes will undermine political processes such as elections, they are mainly used as a weapon against women.

“Not surprising,” says Jolanda Spiess-Hegglin of #NetzCourage. Her association advises people who have experienced sexual violence in the digital space. It is an instrument of power in the patriarchy: “With pornography you can hit a woman at her core,” says Spiess-Hegglin. It is one of the oldest means of silencing a woman: turning her into an object and sexualizing her.

The Swift scandal drew the attention of many activists to the issue. But it has always been possible to target any woman this way – whether a pop star or not.

The more pictures, the easier

With Taylor Swift, however, it is easier to train an AI, because there is a great deal of image material of the pop star on the internet.


How good is the technology today? “You have to differentiate between deepfake photos and videos,” says SRF digital editor Jürg Tschirren. Photos can be created very convincingly using today’s online image generators. Videos, on the other hand, are still of poor quality.

“The perfidious thing is that deepfake porn doesn’t have to be technically well made,” says Tschirren. No matter how realistic or unrealistic the montage looks, the victim is still humiliated. That is why he concludes: “It’s not an AI problem. It’s a sexism problem that’s been around for centuries.”

Thousands of views, a life ruined

“It pulls the rug out from under you when you discover such content,” says Jolanda Spiess-Hegglin, who herself has experienced sexual violence online. “You feel – and unfortunately often are – powerless.”

Because as soon as pornographic content finds its way into chat forums or social networks, it spreads rapidly. “It’s hard to remove content like this from the internet,” says Tschirren.

Job interview topic

Suppose that among the thousands upon thousands of porn videos on the internet there were a deepfake porn video of me, the author of this article, a person who is not publicly known. The likelihood that many of my friends would see it is small.

“But the possibility is still there,” says Tschirren. This can trigger paranoia: you know this material is out there, and potentially anyone could see it – your new partner, your new tennis teacher, your new employer.

“It acts as a pure deterrent. As a company, you immediately hesitate to hire this person, don’t you?” says Spiess-Hegglin. She herself has had to explain such material to clients and sponsors.

In principle, deepfakes are not illegal in themselves. However, anyone who creates deepfakes of a person may be violating that person’s right to their own image, their personality rights or copyright.

In addition, when pornographic images created with artificial intelligence are distributed, Article 197 paragraphs 1 and 2 of the Swiss Criminal Code (StGB, Pornography) may apply. Among other things, anyone who “publicly exhibits or shows pornographic images or otherwise offers them to someone without being asked” is liable to punishment.

Whether cases like that of Taylor Swift are actually “pornographic” within the meaning of Article 197 would have to be decided by the responsible criminal authority, the Federal Office of Justice said upon request.

Finally, the principle of data accuracy under Article 6 paragraph 5 of the new Data Protection Act (nDSG; SR 235.1) applies. Accordingly, anyone who processes personal data must ensure its accuracy.

In the event of unlawful data processing, the data subject is entitled to various legal rights, such as the deletion or destruction of the data (Art. 30 ff. DSG).

The burden therefore falls on the victim – who must address deepfake experiences during a job interview or on a first date. Those affected have neither good legal protection nor sophisticated technical options to react quickly, says Spiess-Hegglin. “These women are often left to their own devices.”

What can you do?

Victim support services and associations like #NetzCourage try to support those affected and provide them with psychological care. “When it comes to digital violence, the loss of trust plays a major role. You feel alone when people don’t stand up for you when such content appears,” says Spiess-Hegglin.

It is important that those affected have someone at their side who takes them seriously and supports them. Specifically, this means: if pornographic content appears, everything should be documented as quickly as possible – take screenshots, including the URL and a timestamp.

“It’s best to ask someone you know, but are not too close to, for help,” says Spiess-Hegglin – so that you don’t have to view the content yourself and risk being re-traumatized.


Spiess-Hegglin recommends contacting specialist agencies and the police. “It’s best to go to a police station in a larger city.” In smaller towns, she says, the police often lack the expertise and capacity to deal with victims of digital violence.

Recognition and commitment

Jolanda Spiess-Hegglin hopes that in the future digital violence will be recognized as an act of violence by the general public and that more will happen at the legal level. “Digital violence can lead people to commit suicide.”

As long as society does not address the root causes, such as sexism and misogyny, sexual violence against women will not stop – it will merely take on new forms.
