Worried about an infection contracted by his young child, a father sent photos to a doctor. Two days later, Google blocked his account and notified the police as part of its fight against the possession and sharing of child pornography.
The limits of artificial intelligence. An article in The New York Times reports that Google's AI is proving overzealous in some situations, flagging content that isn't child pornography as if it were. A father paid the price after taking photos of his child's groin infection to send to a doctor.
Google judged the content of the photos to violate its terms of service and to be potentially illegal. Consequently, the American firm closed the Google account of the user concerned and alerted the authorities, triggering a police investigation against him.
The fight against child pornography poses interpretation problems for AI
In this case, a nurse had asked the father to send the photos ahead of a video consultation so that the doctor could examine them. The events date back to February 2021, when some medical practices in the United States were no longer offering in-person consultations because of COVID-19. Two days after the photos were taken, the man received an alert from Google telling him that his account had been locked. He then lost access to his emails, contacts, and photos… and even his phone number, because he was a customer of Google Fi, Google's US-only mobile virtual network operator service.
After several months, the police investigation concluded that the user had not committed any crime, but the victim of the misunderstanding had to resign himself to letting investigators access all of his data and content stored with Google. The surveillance systems put in place by GAFAM to detect illegal content are strongly criticized by privacy advocates. Recently, Apple shelved a controversial anti-child-pornography system it had previously announced in order to rework it.
Source: The New York Times