Argentinian court finds facial recognition system unconstitutional

An Argentinian court has declared the use of the ‘Facial Recognition for Fugitives System’ (Sistema de Reconocimiento Facial de Prófugos), deployed in Buenos Aires in 2019, unconstitutional. In April 2022, in response to a legal challenge brought by Observatorio de Derecho Informático Argentino (ODIA) and several other human rights organisations, the judge suspended use of the system. The court has now issued its verdict, finding that, among other issues, the system suffered from a number of irremediable legal flaws that made its implementation impossible.

As previously explained, the ‘Facial Recognition for Fugitives System’ (FRFS) involved the deployment of 9,500 surveillance cameras equipped with facial recognition technology across Buenos Aires. The system was introduced by a ruling of the local government and, without what appears to have been any public debate concerning the regulation of FR systems in public spaces, was approved by the Congress of Buenos Aires. The stated objective of the system was to help detect fugitives.

The court noted a number of issues with the system, some concerning its legality and others concerning the problematic ways in which it could be used.

With regard to the legal issues surrounding the deployment of the system, the judge noted that it was deployed without any debate having taken place on its relevance and security, jeopardising constitutional rights, including the guarantees of non-discrimination, privacy, and data protection, among others. In addition, as stated by the defendants, the Buenos Aires government did not carry out the required privacy impact assessment to determine whether use of the system was justified, legitimate, necessary, and proportionate. Furthermore, the fact that the ‘Special Monitoring Commission for Video Surveillance Systems’ (Comisión Especial de Seguimiento de los Sistemas de Video Vigilancia) was never established, together with the lack of public debate regarding the implementation of the system, leaves individuals without effective or adequate guarantees concerning their privacy.

Moreover, the judge noted a number of issues with the system itself. Firstly, the judge pointed out that the database on which the FR system relies has serious flaws that could lead to erroneous arrests, with detrimental consequences for the individuals involved. To support this claim, the judge cited a number of cases of individuals who were wrongly arrested because of false positive matches.

Secondly, expert reports cited in the resolution explained that the biometric data of 15,459 individuals were loaded into the FR system, despite those individuals not appearing in the fugitives database, and despite this action neither being authorised by a judicial authority nor having any legal basis. This meant that biometric data belonging to people who had not been prosecuted were entered into, and subsequently processed by, the FR system. This was the case for 84 individuals who were added to the system at the request of Interpol.

Given the aforementioned issues, the court declared the FRFS unconstitutional, since it did not comply with the legal requirements for protecting the human rights of the inhabitants of Buenos Aires.

However, it appears that the judge did not rule out deploying the system in future, stating that, if the FRFS is to be implemented again, a number of requirements would need to be met: control mechanisms in the form of a Special Commission to monitor the video surveillance system, a supervisory body, a register of the data from the video surveillance system, an impact assessment regarding personal data carried out before the system is deployed, and a public debate on the issues surrounding the FR system.

This case, which involves the misuse of facial recognition, illustrates some of the risks that this type of technology can pose to societies when it is deployed without adequate oversight. Moreover, these systems produce errors, which calls into question the effectiveness of FR technology in public spaces, hence the importance of holding a public debate on the matter.
