Yes, Google scans your emails and Drive documents for child pornography content


Thibaut Keutchayan

December 28, 2021 at 4:15 p.m.


© Clubic.com

In the sprawling fight against child pornography, the main players in online storage have an active part to play in detecting potentially illegal content stored through their services.

This is particularly the case for Google which, like Apple for example, is actively fighting this scourge. To that end, both YouTube and Drive are combed through.

Child pornography, a sadly increasing trend

This is where the "rise," as estimated by Google, of files falling under "Child Sexual Abuse Material" (CSAM) stored on its services comes in. What makes it all the more distressing is, first, that the practice is unfortunately not new and, for decades, was not the target of a particularly aggressive crackdown. It should also be noted that the software capable of detecting potential child pornography content is itself becoming more and more precise. The result is figures that send chills down the spine.

According to Google, which regularly transmits transparency reports to the National Center for Missing and Exploited Children (NCMEC), 1.5 million pieces of potential CSAM content, grouped into some 180,000 reports, were sent to NCMEC for the period January-June 2020. For the second half of 2020, more than 2.9 million pieces of content in over 365,000 reports were flagged by Google. Finally, for the first six months of 2021, 3.4 million pieces of detected content, transmitted in 410,000 reports, reached NCMEC. That is more than double in just a year and a half.

But how exactly does Google do it?

Tracking child pornography content is not easy, even if Google has, for its services, the "advantage" of not using end-to-end encryption, which allows it to scan its users' accounts. But the question raises many issues, particularly regarding the right to privacy. The backlash from users, of Apple's services among others, against Apple's plan to scan files for potential CSAM shows that such measures are far from unanimously accepted.

For its part, Google says it works in two ways. First, using software called "CSAI Match," notably deployed on YouTube, it searches for the alphanumeric fingerprints of files previously declared illegal. This allows the firm to report CSAM content already identified by NCMEC and found, for example, in an email stored in Gmail. In addition, Google uses analysis tools capable of detecting abuse of minors in various files, including photos and videos.
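As a rough illustration only (not Google's actual implementation, whose hashing scheme is proprietary), this kind of matching boils down to comparing a fingerprint of each stored file against a database of fingerprints of already-flagged material. The minimal Python sketch below uses a plain SHA-256 digest as the fingerprint; real systems such as CSAI Match rely on perceptual hashes that also survive re-encoding and cropping. The hash value and function names here are hypothetical placeholders.

```python
import hashlib

# Placeholder digest for illustration only; in reality the list of
# fingerprints of flagged material is maintained by NCMEC and industry partners.
KNOWN_FLAGGED_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: str) -> str:
    """Return a SHA-256 digest of the file's bytes (a stand-in for a
    perceptual hash, which would also match re-encoded copies)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def should_report(path: str) -> bool:
    """True if the file's fingerprint matches previously flagged content."""
    return fingerprint(path) in KNOWN_FLAGGED_HASHES
```

Note that this exact-hash approach only catches copies of already-known content; previously unseen material is what the separate analysis tools mentioned above are meant to detect.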

Going beyond photos and videos

Google also tracks cartoons, films, and various works of art that could constitute CSAM. In the specific case of art, the line becomes as porous as it is difficult to draw. One concrete case involved a man in Kansas whose cartoons were detected by Google's services at the end of 2020. The files were first transmitted to NCMEC, but they did not lead to a conviction of the man, who turned out to be an artist well known in his community and whose identity was (logically) not revealed by Forbes, since no charges were brought against him.

In fact, to be illegal in the United States, cartoons must be particularly "obscene" or be devoid "of literary, scientific, artistic or political value." This case only puts into perspective the figures mentioned at the beginning of this article: on the one hand debatable, on the other darkly revealing of the infamy that child pornography represents.

The Mountain View firm, like many others, still has its work cut out for it: on the one hand, improving how it finds and reports this illegal content, and on the other, limiting the impact these measures have on the privacy of ordinary people, so that they are better accepted and, in turn, all the more effective. As a reminder, in 2019 France was the third-largest host of child pornography content in the world, behind the Netherlands and... the United States.

On the same subject :
Photo scan: Apple responds to controversy

Sources: Google, Forbes


