On January 28, 2020, the Council of Europe (CoE) adopted a new set of guidelines on facial recognition addressed to governments, legislators and businesses. The guidelines were developed by the Consultative Committee of the Council of Europe after a seven-year process that resulted in the updating of Convention 108 (the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, ETS No. 108), commonly called Convention 108+. Convention 108+ recognizes the sensitivity of biometric data through “the inclusion of data uniquely identifying a person under the special categories of data in Article 6 of the modernised Convention for the Protection of Individuals with regard to the Processing of Personal Data”.
“At its best, facial recognition can be convenient, helping us to navigate obstacles in our everyday lives. At its worst, it threatens our essential human rights, including privacy, equal treatment and non-discrimination, empowering state authorities and others to monitor and control important aspects of our lives – often without our knowledge or consent.
“But this can be stopped. These guidelines ensure the protection of people’s personal dignity, human rights and fundamental freedoms, including the security of their personal data.”
Source: Council of Europe Secretary General Marija Pejčinović Burić, Facial recognition: strict regulation is needed to prevent human rights violations.
In the first section of the Guidelines, addressed to legislators and decision-makers, the CoE focuses on the principle of lawfulness, highlighting that the processing of biometric data “shall only be authorized if such processing relies on an appropriate legal basis, and complementary and appropriate safeguards are enshrined in domestic law” and that the “necessity of the use of facial recognition technologies (FRTs) has to be assessed together with the proportionality to the purpose and the impact on the rights of the data subjects”. This section also discusses the use of facial recognition in the public and private sectors. With regard to the use of FRTs in the public sector, the CoE asserts that “consent should not, as a rule, be the legal ground used for facial recognition […] considering the imbalance of powers between data subjects and public authorities”. With respect to its use in the private sector, the CoE recalls the requirement to collect the “explicit, specific, free and informed consent of data subjects whose biometric data is processed” and stresses that “data subjects should be offered alternative solutions to the use of facial recognition technologies”. Finally, the CoE also addresses the involvement of supervisory authorities, considering that they should “[…] be consulted on proposals for any legislative or administrative measures”, as well as the certification of such technologies and the need to improve the awareness of the persons concerned.
In the second part of the Guidelines, addressed to developers, manufacturers and service providers of FRTs, the CoE focuses on issues relating to the development and manufacturing phases of facial recognition technologies. The guidelines pay particular attention to the representativeness of the data used, recalling the obligation for the data to be accurate: developers and manufacturers “will have to take steps to ensure that facial recognition data are accurate” to avoid “unintended discrimination” or bias. Beyond representativeness, the guidelines also address data life expectancy and reliability. In this respect, the CoE underlines that “a facial recognition system requires periodic renewal of data (the photos of faces to be recognized) in order to train and improve the algorithm used”, the objective being to ensure the best possible level of reliability in the system. Finally, the CoE provides guidance to developers and manufacturers on awareness and responsibility.
In the third part of the Guidelines, addressed to entities that use facial recognition technologies, the CoE recalls that these entities:
- “have to comply with all the applicable data protection principles and provisions while processing biometric data in their use of facial recognition technologies”;
- “have to be able to demonstrate that this use is strictly necessary, and proportionate, in the specific context of their use and that it does not interfere with the rights of the data subjects”;
- “have to assure that the voluntary use of the technology will not have an impact on individuals who happen to unintentionally come into contact with it”.
Finally, the CoE focuses on the legitimacy of data processing and on data quality. The guidelines elaborate on core principles that should apply to FRTs, such as transparency and fairness; purpose limitation, data minimisation and limited data retention periods; and accuracy. The CoE also mentions the importance of data security, highlighting that entities using facial recognition should implement “strong security measures, both at the technical and organizational levels (…) to protect facial recognition data and image sets”.
The focus then moves on to the question of accountability; a list of organisational measures is provided so that the entities deploying these technologies can “comply with their obligations” and are “able to demonstrate that the data processing (is) under their control”. The new guidelines also stress the importance of data protection impact assessments and data protection by design by fostering FRT compliance while adding that “in addition to the respect of legal obligations, giving an ethical framework to the use of this technology is also crucial”.
In the final section of the Guidelines, the CoE focuses on the rights of data subjects, reminding us that because “facial recognition is based on the processing of personal data, all the rights provided for in Article 9 of Convention 108+ are guaranteed to the data subjects”, and then underlining that any restriction of these rights must comply with the conditions set out in Article 11 of Convention 108+.