Sextortion based on deepfakes on the rise


The malicious cocktail of sextortion and deepfakes was long anticipated, and it is now materializing. In a recent alert, the US Federal Bureau of Investigation reported an increase since April 2023 in sextortion victims reporting the use of fake images or videos.

New Malicious Opportunities

As the FBI notes, blackmailers are increasingly relying on AI-based image and video generation tools to create compromising photos or videos. "Advances in technology offer new opportunities for malicious actors to find and target victims," the law enforcement and intelligence agency warns.

Part of the industry promotes this kind of trickery quite openly. One application hijacked the likenesses of Emma Watson and Scarlett Johansson to advertise its deepfake service on Facebook, depicting the two actresses initiating a sexual act. There is also a black market for AI tools stripped of the safeguards designed to prevent this kind of abuse. According to Deeptrace, 96% of deepfakes online are pornographic in nature.

Recommendations

These images are then shared on social networks or pornographic sites for harassment or blackmail, and are very difficult to remove entirely from the Internet afterward. The FBI recommends caution when posting personal photos or videos online, advises limiting the public exposure of your content on social networks, and suggests regularly running searches to check whether your photos or videos have been shared without your knowledge.

In 2022, the bureau counted more than 3,000 underage victims of sextortion. Most were boys, whom cybercriminals blackmailed after convincing them to record themselves in explicit videos or photos.
