On November 23rd, 2022, an article in Le Parisien, a French newspaper, revealed that the French Government had dropped its plan to deploy facial recognition to support security arrangements at the 2024 Paris Olympics. The debate over the possible use of facial recognition systems during the Olympic Games is part of a broader controversy dividing political leaders over whether AI-driven biometric systems should be used to monitor public spaces.
The use of facial recognition technologies for criminal investigation purposes has been under the spotlight for many years in France and in the European Union. In this article accepted for publication in the European Review of Digital Administration & Law, T. Christakis & A. Lodie discuss a major decision issued last year by the French Conseil d’Etat.
The Italian ‘Garante per la protezione dei dati personali’ (Italian data protection authority) published a press release on November 14th, 2022, in which it announced that it had opened two separate investigations into the use of ‘smart video systems’ by two Italian municipalities.
The ‘Commission Nationale de l’Informatique et des Libertés’ (CNIL – the French DPA) released its final decision on October 20th, 2022, sanctioning Clearview AI for its unlawful activity, which consisted of collecting images of millions of individuals from the open web without any legal basis under the GDPR.
On September 24th, 2022, the French NGO ‘La Quadrature du Net’ challenged the use of technology-driven tools by French police forces before the Commission Nationale de l’Informatique et des Libertés (CNIL – the French DPA). Through three separate complaints, the NGO seeks to raise awareness of what it calls the ‘technopolice’: the use by police of technological methods that may pose risks to privacy. These complaints follow a petition published by LQDN, which collected 15,248 signatures.
An Argentinian court has declared the use of the ‘Facial Recognition for Fugitives System’ (Sistema de Reconocimiento Facial de Prófugos), deployed in Buenos Aires in 2019, unconstitutional. In April 2022, in response to the legal challenge brought by the Observatorio de Derecho Informático Argentino (ODIA) and several other human rights organisations, the judge had suspended use of the system.
On July 13th, 2022, the Greek Data Protection Authority (DPA) released its decision on the complaints brought by the NGO Homo Digitalis concerning the processing of individuals’ biometric data by Clearview, and fined Clearview 20 million euros. This complaint follows many similar cases filed by Noyb, Privacy International, and the Hermes Center for Transparency and Digital Human Rights before the French, British, Italian and Austrian DPAs to stop the American start-up from collecting biometric data belonging to European citizens.
In June 2022, the Ada Lovelace Institute published an ‘Independent legal review of the governance of biometric data in England and Wales’ written by Matthew Ryder. This review aims to address the current legal uncertainty concerning the collection, use and processing of biometric data in England and Wales. It also puts forward 10 recommendations to improve the legal framework and the governance of biometrics there.
At a time when ad hoc legislation on AI is being negotiated at the European level, the French Senate published, on May 10th, 2022, a report proposing regulations on biometric recognition in public spaces. The purpose of this report is to put forward a framework for facial recognition experimentation and to reinforce French and European technological sovereignty.
In 2019, Buenos Aires City Council introduced the ‘Facial Recognition for Fugitives System’ (Sistema de Reconocimiento Facial de Prófugos (SNRP)), which involved deploying 9,500 surveillance cameras equipped with facial recognition technology.
This is the first ever detailed analysis of the most widespread use of facial recognition in public (and private) spaces: authorising access to a place or a service. The 3rd Report in our #MAPFRE series should be of great interest to lawyers interested in data protection; AI ethics specialists; the private sector; data controllers; DPAs and the EDPB; policymakers; and the general public, who will find here an accessible way to understand these issues.
The French DPA, CNIL, stressed that “the current debate on facial recognition is sometimes distorted by a poor grasp of this technology and how it works”. This 2nd of 6 Reports in our MAPFRE series provides a path to understanding, with a classification table presenting, in the most accessible way, the different facial processing functionalities and applications used in public spaces.