European plan strongly criticized by the European equivalents of the CNIL, in the name of privacy

Two European authorities responsible for the protection of personal data recently warned Brussels. In a report published on Friday, July 29, the European Data Protection Supervisor and the European Data Protection Board expressed concern about the privacy “risks” posed by a draft European Commission regulation that would make it possible to oblige a platform or an online messaging service to detect pedophile content.

Presented in May, this project, centered on the fight against child pornography, would establish a “European center for combating the sexual exploitation of children” and would require large online platforms to “measure the risks of distribution of child pornography or solicitation of a sexual nature”, quickly remove illegal content, and report all detected child pornography content.

Openly denounced in recent months by privacy NGOs, by the head of WhatsApp, Will Cathcart, and by other technology companies, the regulation has now been publicly disavowed by the European equivalents of the CNIL. At issue is a provision that would impose scanning of all messages, including encrypted ones, to detect any child sexual abuse material (CSAM) and, where necessary, take appropriate action.


Measures deemed “extremely worrying”

The European text thus provides for a “targeted detection obligation”, which would force the services of major platforms to remove content reported by a national authority. Companies would therefore have to “deploy technologies that are as non-intrusive as possible, in accordance with the law and existing technologies, and must limit false positive rates as much as possible”.

A formulation that is far too vague, according to its detractors, for whom the text creates a contradictory injunction between monitoring the messages exchanged and respect for privacy. “The measures proposed to detect solicitations of children in the context of interpersonal communications services [messaging services] are extremely worrying”, judges the European Data Protection Supervisor, Wojciech Wiewiorowski.

These concerns are in line with those expressed notably by Germany, which has warned against monitoring encrypted messaging services such as WhatsApp. For German Pirate Party MEP Patrick Breyer, “controlling instant messages would be equivalent to asking La Poste to open and read all mail”. This is “a step towards Chinese-style state surveillance”, he added in a press release.

A possible risk for the right to respect for private life

“While supporting the objectives and intentions behind this proposal”, the European Data Protection Board and the European Data Protection Supervisor (EDPS) “express their serious concerns about the impact of the planned measures on privacy and personal data”.

According to these European authorities, “there is a risk that the proposal could give rise to a generalized and indiscriminate analysis of the content of almost any type of electronic communication”. They therefore call for “clarification” of the conditions under which detection injunctions may be addressed to service providers.

“The use of technologies (…) such as artificial intelligence is likely to generate errors and represents a high degree of intrusion into the privacy of individuals. [Consequently, the] text may pose more risk to individuals, and by extension to society as a whole, than to criminals.”


The European Commission defends itself

The European Commissioner for Home Affairs, Ylva Johansson, defended her draft regulation in the face of criticism in early July, in a blog post. She recalled that any detection would be framed by safeguards and would be “targeted and time-limited”. A new European center would be responsible for checking that content has not been wrongly reported before it is sent to the police.

“No innocent image will end up in police databases”, the Swedish commissioner assured. Ylva Johansson also noted that the detection of child pornography images is already carried out on a voluntary basis by platforms such as Facebook and Twitter, and by messaging services such as Gmail. The proposed regulation must still be negotiated with the European Parliament and the Council, which represents the Member States.

Le Monde with AFP
