Apple rolls out measures to tackle child pornography in the UK


Louise Jean

April 25, 2022 at 1:45 p.m.


On-device machine learning will scan all messages received and sent on iPhones belonging to children to ensure that sexual content is blurred. For now, the feature is only available in the UK.

The feature allows parents to ensure that their children's iPhones do not display sexually explicit content. The analysis is carried out on the iPhone itself rather than on Apple's servers, which helps guarantee the confidentiality of the scanned data.

Protect children from sexual content

Once parents activate the feature, all photos received on a child's iPhone are scanned. The system looks for images containing nudity: when one is detected, the image is blurred and the user is warned of its sexual nature. The child is then directed to online child-safety resources. The same protection applies to children who send photos containing nudity: in that case, the user is encouraged not to send the photo and to speak to a trusted adult.

The tool preserves end-to-end message encryption, and according to Apple, the scans of received and sent content never leave the device. The feature was originally intended to alert parents directly, but it now appears it will simply warn children about the sexual nature of the content, the aim being to protect them from predators online. Apple also plans to intervene when users search for content related to child exploitation via Spotlight, Safari or Siri. The best protection remains keeping the youngest away from unsecured devices, and educating and informing children who use the internet regularly.

Source: The Guardian
