On May 17th, 2023, the European Data Protection Board (EDPB) published the final version of its guidelines on the use of facial recognition technologies (FRTs) by law enforcement authorities (LEAs). The guidelines oppose mass surveillance: according to the EDPB, ‘the use of facial recognition by law enforcement agencies must be necessary, limited, and proportionate’.
The final guidelines follow a provisional version presented a year earlier. The changes mostly concern ensuring that there is human intervention in the use of FRTs, that the systems are reliable, and that the risks posed to the presumption of innocence (Articles 47 and 48 of the EU Charter of Fundamental Rights) are addressed. A reference to the need for effective supervision by data protection supervisory authorities has also been added.
The purpose of the guidelines is to provide information about FRTs and the legal framework that applies to their use in law enforcement.
In this document, the EDPB first outlines what facial recognition technology is, along with its functions, applications, and associated risks and benefits. It then proposes a suitable legal framework, bringing together the Law Enforcement Directive (LED), the EU Charter of Fundamental Rights and the European Convention on Human Rights (ECHR).
In addition, the appendices contain a tool for provisionally classifying the sensitivity of a given use case, practical advice for law enforcement authorities wishing to acquire and operate an FRT system, and a set of typical use cases (the EDPB emphasises that FRTs should only be used in ways that meet the necessity and proportionality test).
The EDPB concludes that the way in which LEAs use FRT remains largely dependent on whether the personal data processing required for particular operations is authorised by the competent authorities. It also states that certain uses of AI-assisted systems are contrary to the Charter.
Despite its “softness” as a set of guidelines, this document commands attention for its bearing on how facial recognition is implemented on a day-to-day basis. That softness leaves room for interpretation and, therefore, the risk of disparities in how the guidelines are applied.