On April 8, 2020, the Committee of Ministers of the Council of Europe (CoE) released a new regulatory framework: the Recommendation on the human rights impacts of algorithmic systems.
In its recommendation, the CoE called on its 47 Member States to take a precautionary approach to the development and use of algorithmic systems and to adopt legislation, policies and practices that fully respect human rights.
The CoE’s recommendation issued a set of guidelines on addressing the human rights impacts of algorithmic systems. It called on governments to ensure that they do not breach human rights through their own use, development or procurement of algorithmic systems.
‘‘ The functionality of algorithmic systems is frequently based on the systematic aggregation and analysis of data collected through the digital tracking at scale of online and offline identity and behaviour of individuals and groups. In addition to the intrusion on individuals’ privacy and the increasing potential of highly personalised manipulation, tracking at scale can have a serious adverse effect on the exercise of human rights, which must be considered throughout the entire life cycle of an algorithmic system, from the proposal stage onward’’.
To prevent human rights abuses arising from algorithmic systems or automation, the CoE recommends that regulators establish effective and ‘‘predictable legislative, regulatory and supervisory frameworks that prevent, detect, prohibit and remedy human rights violations, whether stemming from public or private actors’’.
The CoE’s guidelines recalled Member States’ general obligations, but also provided “analysis and modelling” recommendations for implementing and safeguarding human rights at the core of such systems, as well as indications of the levels of transparency, accountability and effective remedy that should be expected in the case of a human rights breach caused by an algorithmic system.
The guidelines also set out recommendations on private sector actors’ responsibilities with respect to human rights and fundamental freedoms in the context of algorithmic systems.
Furthermore, as Member States have to ‘‘co-operate with each other and with all relevant stakeholders, including civil society’’ (precautionary measures), the CoE also invited them to foster democratic participation in, and awareness of, these matters in order to ensure the full exercise of human rights and democratic freedoms.
The recommendation acknowledges the vast potential of algorithmic processes to foster innovation and economic development in numerous fields, including communication, education, transportation, governance and health systems. In the current COVID-19 pandemic, algorithmic systems are being used for prediction, diagnosis and research on vaccines and treatments. Enhanced digital tracking measures are being discussed in a growing number of member States – relying, again, on algorithms and automation.
At the same time, the recommendation warns of significant challenges to human rights related to the use of algorithmic systems, mostly concerning the right to a fair trial; privacy and data protection; freedom of thought, conscience and religion; the freedoms of expression and assembly; the right to equal treatment; and economic and social rights.
‘‘While it is often argued that the costs are offset by gains in rationalisation and accuracy, it is important to note that most algorithmic systems are based on statistical models in which errors form an inevitable part, sometimes with feedback loops that maintain, replicate and reinforce pre-existing biases, errors and assumptions. (…) As a result of the large number of people affected by algorithmic systems, the number of errors in the form of false positives and false negatives, and of people who are affected by these errors and inbuilt bias, will also expand, triggering additional interferences with the exercise of human rights in multiple ways’’.
Source: Recommendation CM/Rec(2020)1 of the Committee of Ministers to member States on the human rights impacts of algorithmic systems, Council of Europe.
MEB.