What will algorithmic video surveillance track during the Paris Olympics?


Algorithmic video surveillance will be deployed in France during the Paris Olympics in 2024. But the legal texts rule out using it for facial recognition or for tracking individuals in the crowd.

The Paris Olympic Games will take place from July 26 to August 11, 2024 and, as with any major event, substantial resources will be devoted to securing the stadiums and their surroundings. Part of this protection will be entrusted to technology: it is during the Olympic Games that algorithmic video surveillance will be deployed, in support of the police.

This algorithmic video surveillance will consist, as its name suggests, of connecting the streams from video surveillance cameras to algorithms capable of processing and analyzing the images, in order to detect events considered risky. This analysis will be done in real time, so that intervention can happen as quickly as possible if necessary.

Algorithmic video surveillance will pass through the cameras. // Source: Alex Knight

Throughout the legislative process for the text, adopted by Parliament at the end of May and published in the Official Journal shortly afterward, the government assured that this video surveillance by artificial intelligence (AI) would not perform any facial recognition of people appearing on screen. The aim is to focus only on "abnormal" situations.

The decree setting the conditions for implementing this algorithmic video surveillance, published in the Official Journal at the end of August, now makes it possible to know more precisely which unusual situations the authorities wish to monitor during the competition – and beyond, since the experiment runs until March 31, 2025.

What algorithmic video surveillance will look for

Eight predetermined events have been selected:

  • presence of abandoned objects;
  • presence or use of weapons, among those mentioned in article R. 311-2 of the internal security code;
  • failure of a person or a vehicle to respect the common direction of traffic;
  • crossing or presence of a person or a vehicle in a prohibited or sensitive area;
  • presence of a person on the ground following a fall;
  • crowd movement;
  • excessive density of people;
  • outbreaks of fire.

These events "have been selected in that they are likely to present or reveal a risk of an act of terrorism or a serious threat to the safety of persons," the decree states. They can be observed not only from cameras, but also from drones deployed according to circumstances and operational needs.

This list, restricted to eight predetermined events, is viewed favorably in the deliberation of the CNIL (the French data protection authority). Thus, "no algorithmic processing may be designed, acquired by the State or implemented in the operating phase to detect events other than those listed therein."

Nor will the artificial intelligence have the last word. The AI only reports scenes that match what it was designed to detect (with the possible biases and errors inherent to any such system). Authorized human agents are then "responsible for viewing the captured images" in order "to confirm the report or to remove the doubt."
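The division of labor described above – the algorithm flags, a human agent confirms or dismisses – can be sketched in a few lines of Python. Everything here (class names, the confidence score, the threshold) is purely illustrative and not part of the decree; only the list of eight event types comes from the source.

```python
from dataclasses import dataclass
from enum import Enum

# The eight event types listed in the decree (identifiers are illustrative).
class EventType(Enum):
    ABANDONED_OBJECT = "abandoned object"
    WEAPON = "presence or use of a weapon"
    WRONG_WAY = "wrong-way person or vehicle"
    RESTRICTED_AREA = "person or vehicle in a prohibited area"
    PERSON_DOWN = "person on the ground after a fall"
    CROWD_SURGE = "crowd movement"
    OVERCROWDING = "excessive density of people"
    FIRE = "outbreak of fire"

@dataclass
class Alert:
    camera_id: str
    event: EventType
    confidence: float  # hypothetical score produced by the detection model

def triage(alert: Alert, human_confirms) -> str:
    """The AI only reports; the human operator has the last word."""
    if human_confirms(alert):
        return f"dispatch: {alert.event.value} on {alert.camera_id}"
    return "dismissed: doubt removed by the human operator"

# Example: an operator reviews a flagged scene before any intervention.
alert = Alert(camera_id="CAM-42", event=EventType.ABANDONED_OBJECT, confidence=0.91)
print(triage(alert, human_confirms=lambda a: a.confidence > 0.9))
# → dispatch: abandoned object on CAM-42
```

In this sketch, no decision is taken automatically: `triage` only produces an outcome once the `human_confirms` callback (standing in for the authorized agent) has ruled on the report.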

The services that will be able to benefit from the analysis of these images, and intervene if necessary, are the police, the gendarmerie, the fire and rescue services, and the security units of the SNCF and the RATP, for safety in public transport. All will receive training on the protection of personal data and on the use of the system.

Identification and facial recognition ruled out

To eliminate the risk of misuse of this algorithmic video surveillance, the decree bans its use for the purpose of identifying people. Article 2 of the decree specifically prohibits the implementation of any facial recognition technique, the processing of biometric data, and the cross-referencing of this processing with other data.

"These processing operations do not use any biometric identification system, do not process any biometric data and do not implement any facial recognition technique. They cannot carry out any reconciliation, interconnection or automated linking with other processing of personal data," the decree warns.

The decree rules out facial recognition. // Source: EFF (cropped image)

Nor can these tools "produce" any other result, or "found, by themselves, any individual decision or any act of prosecution." They therefore cannot track a particular individual through the crowd, even if that person is not identified. This prohibition on re-identification also applies to other data, such as clothing recognition.

Before the scheduled end of the experiment, the government must, no later than December 31, 2024, submit an evaluation report on its implementation, the content of which will be determined by a decree in the Council of State issued after the CNIL's opinion. As for the Commission, it must be kept informed of the state of the experiment every three months.

