Apple abandons its plan to scan iPhone photos for child sexual abuse material


NeuralMatch, the tool Apple presented in 2021 to fight the spread of child sexual abuse images, is no longer in development. The software sparked strong controversy when it was announced, and the company has since adopted a different strategy.

It is a decision that could easily go unnoticed, overshadowed by the announcement of upcoming end-to-end encryption for iCloud, and it fits in a single sentence. In an interview with the Wall Street Journal on December 7, 2022, Craig Federighi, Apple’s senior vice president of software engineering, confirmed the end of a project that generated considerable discussion in 2021.

A tool to fight child sexual abuse material

The tool in question was meant to combat the dissemination and storage of child sexual abuse material (known as CSAM). It was presented in the summer of 2021. The American company’s ambition? To deploy software capable of scanning content on the iPhone in order to detect this kind of file.

A binary signature is associated with each image sent to iCloud. // Source: Apple

The tool, called NeuralMatch, worked by computing a signature for each image uploaded to iCloud (as part of the iPhone’s backup) and comparing it with the signatures of known child sexual abuse images held in a database. If multiple matches were found, an alert would be raised for further investigation.
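To make the principle concrete, here is a minimal sketch of this kind of threshold-based signature matching, written in Swift. It is not Apple’s actual NeuralMatch implementation: the signature format (plain hex strings), the function and type names, and the threshold value are all illustrative assumptions.

```swift
// Minimal sketch of threshold-based signature matching.
// Signatures are represented as opaque hex strings; in the real system
// they would come from a perceptual hash computed on the device.
struct MatchResult {
    let matchCount: Int
    let exceedsThreshold: Bool
}

// Counts how many uploaded image signatures appear in the known database
// and reports whether the number of matches crosses the review threshold.
func evaluateUploads(uploadedSignatures: [String],
                     knownSignatures: Set<String>,
                     threshold: Int) -> MatchResult {
    let matches = uploadedSignatures.filter { knownSignatures.contains($0) }.count
    return MatchResult(matchCount: matches,
                       exceedsThreshold: matches >= threshold)
}

// Example usage with made-up signatures and an illustrative threshold of 2.
let known: Set<String> = ["a3f1c2", "9bc24d"]
let uploads = ["a3f1c2", "000000", "9bc24d"]
let result = evaluateUploads(uploadedSignatures: uploads,
                             knownSignatures: known,
                             threshold: 2)
print("matches: \(result.matchCount), flag for review: \(result.exceedsThreshold)")
```

In Apple’s actual design, the signature was a perceptual hash (NeuralHash) computed on the device, and cryptographic techniques were meant to prevent Apple from learning anything about a user’s photos until the match threshold was exceeded; the sketch above only illustrates the counting-and-threshold logic.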

When presenting the tool, Apple took care to describe the safeguards designed to avoid false positives and abuse. That did not quell the criticism of NeuralMatch, which emerged immediately after its presentation: critics argued that the Cupertino company was starting down a slippery slope from which it would not be able to step back.

NeuralMatch was criticized from the outset

Faced with the outcry, with fears that a tool built for legitimate purposes would one day be misused (for other types of illegal content at the request of authorities, with the scope of requests gradually widening) or would weaken the confidentiality of content on the iPhone, the company ended up putting everything on hold.

At the time, the company justified the postponement by saying it needed more time to refine certain aspects of NeuralMatch and to take into account feedback from the public as well as from specialists and privacy organizations. More than a year later, the plan has been reversed outright: NeuralMatch is no longer being developed at all.

The end of NeuralMatch does not mean Apple is giving up the fight against child sexual abuse content, nor abandoning its efforts to protect individuals, especially minors. Proof of this: iOS 16, iPadOS 16 and macOS Ventura now include an option that operates locally on the device to filter out unwanted nudity.

Apple would not have been the first tech company to explore this kind of analysis against CSAM. In 2014, Google reported to the authorities a user who had sent child abuse photos via Gmail, and the Mountain View firm has since deployed other tools to combat such content. In Europe, the subject is also under discussion: an action plan was presented in 2022, but it too sparked lively controversy, for the same reasons.


