“Bridges 2”: The Court of Appeal Finds that the Use of Facial Recognition by the South Wales Police is Unlawful

On 11 August 2020, the Court of Appeal of England and Wales handed down its judgment on appeal in the so-called “Bridges” case.

The facts concern a challenge to the use of facial recognition systems by the South Wales Police (SWP) in public spaces on two occasions, in December 2017 and March 2018. The claimant, Edward Bridges, argued that the use of facial recognition violated the European Convention on Human Rights and UK data protection legislation, as well as the non-discrimination obligations imposed on the public sector.

In the first judgment, the Divisional Court had upheld the lawfulness of the use of facial recognition technology (FRT), but the Court of Appeal reversed that decision, allowing the appeal on three of the five grounds put forward by the claimant.

Indeed, the arguments concerning the legal basis for the processing, the validity of the Data Protection Impact Assessment (DPIA) and the requirement of non-discrimination led the court to rule that the deployment was unlawful. On the other hand, the court found that the use of FRT by the SWP did not violate the principle of proportionality, as the impact on Mr Bridges in this specific case was “negligible”. On the fourth ground, it held that the documents required for the processing of sensitive data did indeed exist. The main arguments of the Court are set out below.

1/ On the legal basis for the processing.

The Court started by stating that the use of facial recognition by police authorities interferes with the right to privacy set out in Article 8(1) of the Convention for the Protection of Human Rights and Fundamental Freedoms. However, the Court recalled that the exceptions provided under the second paragraph of Article 8 may allow the use of this technology. It considered, nevertheless, that the conditions laid down by Article 8(2) were not met. The Court relied on well-established standards concerning the Article 8(2) requirement that such interference “must be in accordance with the law” and thus have a clear basis in domestic law. It focused on the requirement of foreseeability, defined as not ‘conferring a discretion so broad that its scope is in practice dependent on the will of those who apply it, rather than on the law itself’. The Court identified two aspects of the processing that it found too arbitrary. The first relates to the rules governing the compilation of the watchlist databases, where the discretion left to police authorities is too broad. The second relates to the choice of locations where the system may be deployed, which is likewise left too open (§91). To support its reasoning, the Court indicated that neither the Data Protection Act 2018, nor the Surveillance Camera Code of Practice, nor the SWP’s local policies are sufficient to frame the compilation of the databases and the choice of deployment locations with foreseeable rules of law. On this basis, the Court considered that these instruments ‘do not have the necessary quality of law’ (§94) and therefore declared the processing unlawful.

2/ On failures in the DPIA.

The Court notes on this point that the DPIA did acknowledge the possible tension between Article 8 ECHR and the use of facial recognition, but the document proceeded on the basis that there was no interference with that right. The Court’s analysis under the first ground shows the opposite. Consequently, by failing to address this interference, the DPIA was not properly carried out and therefore does not meet the requirements of Section 64 of the Data Protection Act 2018.

3/ On the Public Sector Equality Duty.

The Court noted that, in general, uncertainties exist as to whether facial recognition algorithms may discriminate on the basis of gender or ethnicity. Without asserting that the software in question was in fact discriminatory, the Court recalled that the duty of non-discrimination is a positive obligation: the SWP had to take proactive steps to satisfy itself that the algorithm was not discriminatory. This was not done, since it was only in the course of the proceedings that technical inquiries into the characteristics of the algorithm, in particular its training data, were made. Those requests went unanswered because the information was regarded as commercially confidential. Consequently, the Court held that the obligation laid down by the Public Sector Equality Duty had not been complied with.

4/ On the issue of proportionality.

The Court started by saying that it was unnecessary to consider Ground 2 of the appeal, which related to the question of proportionality, given that it had already found that the processing lacked an adequate legal basis (§131). The Court nevertheless addressed the point and found that the use of FRT by the SWP did not violate the principle of proportionality, as the impact on Mr Bridges was “negligible” (§143). More precisely, the Court said the following:

“we accept the submission made by Mr Beer on behalf of SWP that the impact on each of the other members of the public who were in an analogous situation to this Appellant on the two occasions with which we are concerned for present purposes (in December 2017 and March 2018) was as negligible as the impact on the Appellant’s Article 8 rights. An impact that has very little weight cannot become weightier simply because other people were also affected. It is not a question of simple multiplication. The balancing exercise which the principle of proportionality requires is not a mathematical one; it is an exercise which calls for judgement.”

The NGO Liberty welcomed the decision as ‘a major victory in the fight against discriminatory and oppressive facial recognition’.

For his part, the Surveillance Camera Commissioner stated that the decision did not mean the end of the technology, which he considers provides citizens with greater security.

TC & MB
