Sextortion: from your videos and photos, scammers are now turning to deepfakes


Alexander Boero

June 07, 2023 at 11:10 a.m.


© Who is Danny / Shutterstock

Hackers can push the scourge of "sextortion", or sexual blackmail, even further with deepfake technology. And they don't hesitate to do so.

What happens when hackers hold ingredient number one (here, the deepfake) and Internet users hand them ingredient number two on a plate (photos and videos posted on social networks)? You get everything needed to cook up a truly embarrassing sextortion attempt, with the seasoning that makes all the difference: the deepfake. Let us explain.

Sextortion and deepfakes, a combination that appeals to hackers

Forgive us the culinary comparison, but it shows how easy it is today to be trapped by two simple things: content that is part of our daily lives, and technology powered by artificial intelligence.

As a reminder, sextortion (or sexual blackmail) is not a new crime, but today it consists of blackmailing a potential victim by threatening to reveal intimate information about them. Sometimes the hacker really does hold intimate photos, videos, or messages of an adults-only nature. Other times, in fact most of the time, the attempt is pure bluff, like the infamous e-mail claiming you were filmed through your webcam while visiting a porn site. In exchange for your tormentor's supposed silence, you are then expected to pay up, in other words hand over the ransom.


With that said, on to the deepfake. This technology makes it possible to create or alter an audio or video recording using artificial intelligence, and to make a person say or do almost anything you want, using only their audio recordings, videos, or photos, even if you have never met them in your life.

With the number of victims on the rise, blackmailers are no longer waiting

These photos and videos can be found absolutely everywhere: on Facebook, Instagram, TikTok, Twitter, and elsewhere. And unless you lock down your accounts with strict privacy settings and a limited circle of followers and subscribers, there is a good chance you could one day be the subject of a deepfake.

The problem, and it is none other than the FBI sounding the alarm here, is that "sextortionists" now feed on all this accessible and often innocuous content and, using deepfake creation tools, turn it into sexually explicit material that can look quite believable. The victim then fears that these videos or photos, fake as they are, could still be compromising if shown to a wider audience, and becomes far more likely to pay the hacker.

In any case, the FBI has observed a rise in the number of victims of this type of sextortion in recent weeks. Worse still is how the method is evolving: some hackers now skip the extortion stage altogether and publish the fake content directly on adult sites to put even more pressure on their victims. Some platforms, such as Twitch, have made this grounds for an immediate ban. So always be careful when posting photos or videos online, especially where younger people are concerned.

Sources: Bleeping Computer, Clubic
