On February 18th, 2021, the President of the CNIL issued a warning to a sports club that was considering using a facial recognition system to prevent people with stadium bans from attending its games. According to the Data Protection Authority, this use of facial recognition technology does not comply with the General Data Protection Regulation (GDPR) or the French Data Protection Act (“Loi Informatique et Libertés”).
Facial Recognition in a French Stadium Challenged
Following reports that a sports club intended to use a facial recognition system on spectators, the President of the CNIL decided to investigate. This system, which was in an experimental phase, was intended to be used to identify people with stadium bans, detect abandoned objects and combat terrorism.
Such experimentation has already been carried out, for instance at the Metz football stadium. Although the CNIL does not refer directly to this club, it appears to be the only publicly known ongoing trial of facial recognition technology in France.
In January 2021, the media reported that the Metz football stadium was experimenting with the facial recognition technology Two-i. The technology was tested solely on volunteers drawn from club staff, but as there had been no prior consultation with supporters (the intended targets of the system), the news angered the Supporters Association, which publicly expressed its refusal to “become the laboratory rats” of facial recognition technology. In response, the club’s owners justified the project by explaining that it was initially aimed at identifying people with stadium bans and helping with the fight against terrorism.
On a more global scale, facial recognition used for security purposes in football stadiums and other sports facilities is a particularly pressing issue at the moment. Facial recognition technology was considered for the stadiums of the 2016 European Football Championship and other international tournament venues following the terrorist attacks that occurred in France on November 13th, 2015, which included an attempt by a suicide bomber to break into the Stade de France.
In the years to come, these issues will take on greater importance, given that France is due to host the Rugby World Cup in 2023 and the Olympic Games in 2024. Following the aforementioned attacks, the French Minister for Sport, Roxana Maracineanu, stated in an interview that she was in favour of using facial recognition technology for security purposes in stadiums, especially during international tournaments and events.
CNIL’s Position
Following the DPA’s investigation and analysis of the nature of the system that was intended to be used in the stadium, it was “shown that it [the system] was based on the processing of biometric data”. However, the CNIL highlighted that “the collection and use of this sensitive data is, with some exceptions, prohibited by the General Data Protection Regulation (GDPR) and the French Data Protection Act” (Loi Informatique et Libertés).
Consequently, the DPA found that:
“In the absence of a special legislative (e.g. law) or regulatory (e.g. decree, order, etc.) provision, the implementation of such a scheme by a sports club for “counter-terrorism” purposes is unlawful. The President of the CNIL has therefore warned the sports club that, in the current legal framework, the envisaged processing cannot be implemented in a lawful manner.”
In particular, the CNIL explained that even though processing operations relating to commercial stadium bans “contribute to the security of sports events by allowing the organisers of such events to prevent certain persons from gaining access to their sports venues due to dangerous behaviour corresponding to breaches of obligations of a contractual nature”, such processing must still comply with the GDPR.
The CNIL referred to article R. 332-15 of the Sport Code, which “provides that the photograph associated with a person’s season ticket must be processed as part of the management of stadium bans, but it does not allow the implementation of a biometric device based on these photographs in particular”. The DPA also underlined that the same Code, in article R. 332-18, “explicitly provides that data subjects may not object to stadium commercial ban processing operations”.
The President of the CNIL therefore warned the sports club that “in the current legal framework, the envisaged processing could not be implemented in a lawful manner” and that “if, despite this warning, the sports club concerned proceeds with the effective implementation of the facial recognition system, it will be exposed to one or more of the corrective measures (…) including a financial penalty.”
Maéva EB.