By Theodore Christakis (project leader), Karine Bannelier, Claude Castelluccia and Daniel Le Métayer.
This is the first-ever detailed analysis of the most widespread way in which Facial Recognition is used in public (and private) spaces: to authorise access to a place or to a service. The 3rd Report in our #MAPFRE series should be of great interest to lawyers interested in data protection; AI ethics specialists; the private sector; data controllers; DPAs and the EDPB; policymakers; and the general public, who will find here an accessible way to understand all of these issues.
Part 1 of our “MAPping the use of Facial Recognition in public spaces in Europe” (MAPFRE) project reports explained in detail what “facial recognition” means, addressed the issues surrounding definitions, presented the political landscape and set out the exact material and geographical scope of the study. Part 2 of our Reports presented, in the most accessible way possible, how facial recognition works and produced a “Classification Table” with illustrations, explanations and examples, detailing the uses of facial recognition/analysis in public spaces, in order to help avoid conflating the diverse ways in which facial recognition is used and to bring nuance and precision to the public debate.
This 3rd Report focuses on what is, undoubtedly, the most widespread way in which Facial Recognition Technologies (FRT) are used in public (and private) spaces: Facial Recognition for authorisation purposes.
Facial recognition is often used to authorise access to a space (e.g. access control) or to a service (e.g. to make a payment). Depending on the situation, both verification and identification functionalities (terms that are explained in our 2nd Report) can be used. Millions of people use FRT to unlock their phones every day. Private entities (such as banks) and public authorities (such as the French government with its now-abandoned ALICEM project) increasingly envisage using FRT as a means of providing strong authentication to control access to private or public online services, such as e-banking or administrative websites that concern income, health or other personal matters. FRT is also increasingly being considered as a means of improving security when controlling and managing access to private areas (building entrances, goods warehouses, etc.).
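To illustrate the distinction between these two functionalities (explained in detail in our 2nd Report), the following is a minimal, purely illustrative sketch in Python. It assumes a hypothetical embed() function that converts a face image into a feature vector and compares faces with a simple distance threshold; the names and the threshold value are placeholders, not a description of any deployed system, and real FRT systems are considerably more complex.

```python
import numpy as np

def embed(face_image) -> np.ndarray:
    """Placeholder for a trained face-embedding model (hypothetical)."""
    raise NotImplementedError("plug in a real face-embedding model here")

def verify(probe_image, enrolled_template: np.ndarray, threshold: float = 0.6) -> bool:
    """Verification (1:1): is this person who they claim to be?
    The probe face is compared against a single stored template."""
    distance = np.linalg.norm(embed(probe_image) - enrolled_template)
    return distance < threshold

def identify(probe_image, database: dict, threshold: float = 0.6):
    """Identification (1:N): who is this person?
    The probe face is searched against every template in a database."""
    probe = embed(probe_image)
    best_id, best_distance = None, float("inf")
    for person_id, template in database.items():
        distance = np.linalg.norm(probe - template)
        if distance < best_distance:
            best_id, best_distance = person_id, distance
    return best_id if best_distance < threshold else None
```

The key difference for authorisation scenarios is that verification compares one probe against one stored template, whereas identification searches the probe against an entire database, which typically raises additional data protection questions (for example around centralised databases, discussed below).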
In public spaces, FRT is being used as an authentication tool for automated international border controls (for example at airports) and to manage access to places as diverse as airports, stadiums and schools. Before Covid-19, there were already many projects planning to use FRT to “accelerate people flows”, “improve the customer experience”, “speed up operations” and “reduce queuing time” for users of different services (e.g. passengers boarding a plane or shoppers), and the advent of the Covid-19 pandemic has further boosted calls for investment in FRTs as a way to provide contactless services and reduce the risk of contamination. Facial recognition payment systems, which permit consumers to complete transactions by simply having their faces scanned, have been implemented by supermarkets such as Carrefour, which was involved in a pilot project in Romania, and by transport operators in “smart cities”, such as the EMT bus network in Madrid, which teamed up with Mastercard on a pilot project enabling users to pay on EMT buses using FRT. In Europe, similar pilot projects enabling payments in restaurants, cafés and shops are currently being tested.
Despite this widespread existing or projected use of FRT for authorisation purposes, we are not aware of any detailed study focusing on this specific issue. We hope that the present analytical study will help fill this gap by examining the use of FRT for authorisation purposes in public spaces in Europe.
We have examined in detail seven “emblematic” cases of FRT being used for authorisation purposes in public spaces in Europe. We have reviewed the documents disseminated by data controllers concerning all of these cases (and several others). We have sought out the reactions of civil society and other actors. We have dived into EU and Member State laws. We have analysed a number of Data Protection Authority (DPA) opinions. We have identified Court decisions of relevance to this matter.
Our panoramic analysis enables the identification of convergences among EU Member States, but also of risks of divergence with regard to certain specific, important ways in which FRTs are used. It also permits an assessment of whether the GDPR, as interpreted by DPAs and Courts around Europe, is a sufficient means of regulating the use of FRT for authorisation purposes in public spaces in Europe, or whether new rules are needed.
What are the main issues in practice in terms of the legal basis invoked by data controllers? What is the difference between “consent” and “voluntary” use in relation to FRT? Are the proposed “alternative (non-biometric) solutions” satisfactory? What are the positions of DPAs and Courts around Europe on the key questions of necessity and proportionality, including the crucial “less intrusive means” criterion? Where do DPAs diverge on these issues? Is harmonisation needed and, if so, how is it to be achieved? What are the lessons learned concerning DPIAs and evaluations? These are some of the questions examined in this report.
Our study ends with a series of specific recommendations addressed to data controllers, the EDPB and stakeholders making proposals for new FRT rules.
We make three recommendations vis-à-vis those data controllers wishing to use facial recognition applications for authorisation purposes:
1) Data controllers should understand that they bear the burden of proving that they meet all of the GDPR requirements, which includes understanding exactly how the necessity and proportionality principles, as well as the principles relating to the processing of personal data, should be applied in this field.
2) Data controllers should understand the limits of the “cooperative” use of facial recognition when used for authorisation purposes. Deployments of FRT systems for authorisation purposes in public spaces in Europe have almost always been based on consent or have been used in a “voluntary” way. However, this does not mean that consent is almighty. First, there are situations (such as the various failed attempts to introduce FRT in schools in Europe) where consent could not be justified as being “freely given” because of an imbalance of power between users and data controllers. Second, consensual and other “voluntary” uses of FRT imply the existence of alternative solutions, which must be as available and as effective as those that involve the use of FRT.
3) Data controllers should conduct DPIAs and evaluation reports and publish them to the extent possible and compatible with industrial secrets and property rights. Our study found that there is a serious lack of information available on DPIAs and evaluations of the effectiveness of FRT systems. As we explain, this is regrettable for several reasons.
We make two recommendations in relation to the EDPB:
1) The EDPB should ensure harmonisation on issues such as the use of centralised databases and the principles relating to the processing of personal data. Diverging interpretations of the GDPR on issues such as the implementation of IATA’s “One ID” concept for air travel or “pay by face” applications in Europe could create legal tension and operational difficulties.
2) The EDPB could also produce guidance on the approach to be followed for both DPIAs and evaluation reports where FRT authorisation applications are concerned.
Finally, a recommendation regarding policymakers and other stakeholders formulating new legislative proposals: there is often a great deal of confusion about the different proposals concerning the regulation of facial recognition. It is therefore important for all stakeholders to distinguish the numerous ways in which FRT is used for authorisation purposes from other use cases and to target their proposals accordingly. For instance, proposals calling for a broad ban on “biometric recognition in public spaces” are likely to result in all of the ways in which FRT is used for authorisation purposes being prohibited. Policymakers should take this into consideration, and make sure that this is indeed their intention, before making such proposals.
To download the full report, please click here.
The first report entitled “Mapping the Use of Facial Recognition in Public Spaces in Europe – Part 1: A Quest For Clarity: Unpicking The “Catch-All” Term”, is available here.
The second report entitled “Mapping the Use of Facial Recognition in Public Spaces in Europe – Part 2: Classification”, is available here.