Facial recognition service Clearview AI fights Canadian ban


Biometric facial recognition service Clearview AI is reluctant to delete facial photos it collected in Canada without consent. In December, the data protection authorities of the provinces of Alberta, British Columbia and Quebec ordered Clearview to delete the images and permanently banned the operation of the facial recognition service. Clearview AI is now taking legal action against these orders.

The company does not want Canadian authorities to ban Clearview’s facial recognition, nor does it want to delete the photos it has already collected; it considers deleting the images “impossible”. The Supreme Court of British Columbia is responsible for reviewing the decisions of the data protection authority in the first instance. The request for review filed there automatically suspends the authority’s decision.

The New York company Clearview AI has collected more than three billion facial photos from the Internet and used them to train a facial recognition algorithm, which it rents out. The company never even attempted to obtain the consent of those affected. Clearview AI claims to be nothing more than a search engine for images, like Google Images: the copied images are publicly available, it argues, so everything is legal.

In February 2021, Canada’s federal privacy authority officially found that Clearview’s facial recognition is illegal in Canada. The authorities of Australia, France and the UK have made similar findings. The company has paused its service for Canadian customers, but wants to resume operations.

Until June 2020, Clearview AI had provided its facial recognition service to the Royal Canadian Mounted Police (RCMP) for a fee, and to thousands of users in Canada via free trial accounts. These included, for example, pharmacy chains, dozens of local police forces and insurers.

The local police forces initially denied using the service, but after a data leak on a Clearview server they had to admit that they had used the trial accounts – allegedly without the knowledge of their police chiefs. The RCMP’s use is particularly sensitive because the federal police force had assured the federal privacy authority that facial recognition would only be introduced after a privacy-focused technology assessment had been completed – and then simply bought the service from Clearview AI.

In its submission to the Supreme Court of British Columbia published on Monday, Clearview puts forward a long series of arguments: because the company has no facilities or employees in British Columbia, the authority has no jurisdiction over it at all; its decision is insufficiently reasoned; and it is worded so vaguely that it is unclear how Clearview AI could even comply with it. Contrary to the authority’s findings, Clearview AI says it pursues a legitimate purpose with its facial recognition service: tracking down criminals and victims.

Contrary to further statements by the authority, the photos that Clearview scraped from social networks are public data, which means that British Columbia’s data protection law PIPA does not apply, the company says. (What Clearview doesn’t mention: Google, LinkedIn, Meta, Twitter, and YouTube have all sent cease-and-desist letters to the photo collector because grabbing user images violates their respective terms of service.)

In general, Clearview argues, the decision of the data protection authority was “unreasonable” – a term of art in Canadian administrative law. And even if the authority did interpret the Data Protection Act correctly, the relevant provisions of the Act should be struck down as unconstitutional. Specifically, Clearview AI relies on Section 2(b) of the Canadian Charter of Rights and Freedoms, which guarantees freedom of expression and freedom of the press.

The term “(un)reasonable” is important in Canadian administrative law but not precisely defined. Crucially for the present context, Canadian courts give administrative agencies wide latitude in interpreting the laws that fall within their core competence. Unless the respective law provides otherwise, the court does not examine whether the administrative authority interpreted its specialized rules correctly, but only whether its interpretation was “reasonable”.

This saves the court from delving into specialist matters; it assumes that the experts at the administrative authority are better versed in them. It also makes it easier for administrative authorities to defend their decisions in court. Clearview AI will therefore have to show not only that the data protection authority interpreted the data protection law incorrectly, but that its interpretation was outright “unreasonable”.

Because the Charter of Rights and Freedoms is not a specialized statute, this high hurdle does not apply to the constitutional argument: there it would suffice to show that the provisions of the Data Protection Act are in fact unconstitutional. However, freedom of speech in Canada is not interpreted as expansively as in the USA. Laws restricting free speech are permissible as long as they are “reasonable” and demonstrably justified in a free and democratic society.

The case is called Clearview AI v. Information and Privacy Commissioner for British Columbia and is pending before the Supreme Court of British Columbia in Vancouver under case number S-220204.


(ds)
