iPhone: these applications could generate pornographic images, Apple finally corrects the situation


As part of the fight against the misuse of artificial intelligence, Apple removed several AI image generation apps from its App Store after learning they could be used to create non-consensual nude images.

Image credit: 123rf

404 Media investigated how some app developers were using Instagram ads to promote their apps’ ability to “undress any girl for free” by generating explicit AI-created images.

According to the report, some of these Instagram ads linked directly to apps listed on the Apple App Store under innocuous titles such as “art generator,” thereby masking their true purpose: enabling the non-consensual production of fake pornography through AI.

Also read – Android: Google removes thousands of apps from the Play Store, here’s why

Apple finally removes apps from the App Store

Although Apple did not initially respond to 404 Media’s request for comment, the company quickly took action after the publication provided direct links to the offending apps and their associated ad campaigns.

In total, Apple removed three apps from its App Store following the alert, but the report states that this is likely a “cat and mouse game,” as Apple apparently cannot independently identify apps that violate its policies without outside assistance.

Apple, like Google, took some time to react. Reports from 2022 had already warned about these apps’ capabilities. Both technology giants were therefore alerted several years ago, but at the time they declined to remove the apps. Instead, developers were asked to stop running their ads on adult sites, and the apps were allowed to remain in the App Store.

Despite the injunction, one of the apps continued to market its features until 2024, when it was removed from the Google Play Store. Several more months passed before Apple finally decided to remove these apps from its storefront. As AI-created porn deepfakes claim more and more victims, there is hope that companies in the industry will respond more quickly in the future.

Source: 404Media
