French Defense Ethics Committee’s Opinion: Need for a Clear Distinction Between LAWS and PALWS

On April 29, 2021, the French Defense Ethics Committee issued an Opinion on the integration of autonomy into lethal weapon systems. Through this work, the Committee aims to establish a clear definition of autonomy and a clear distinction between fully autonomous (LAWS) and partially autonomous lethal weapon systems (PALWS).

As a reminder, the Defense Ethics Committee is a permanent body that considers the ethical issues raised by new technologies in the field of defense. It was set up in January 2020 by the French Minister for the Armed Forces, Florence Parly, and brings together qualified civilian and military figures with expertise in the fields of operations, science, medicine, philosophy, history and law. The Committee is responsible for providing insights to the Minister for the Armed Forces on the ethical issues raised by scientific and technical innovations intended for the military, as well as on issues concerning the evolution of the military field.

Following the publication of its first report, on the “augmented soldier”, in December 2020, the Defense Ethics Committee submitted a new opinion on the integration of autonomy into lethal weapon systems. According to the press release, the French Ministry of Defense identified the issue as an important one because it is “aware of the strategic, legal and ethical issues raised by the development of military applications of artificial intelligence (AI), and in particular by the potential emergence of fully autonomous lethal weapon systems (LAWS)”.

France had already expressed its position on LAWS in 2019, stating that it does not wish to develop so-called “killer robots”. This position is reflected in the report’s conclusions: the Committee confirmed France’s decision not to develop or use fully autonomous lethal weapon systems and “regards this decision as unquestionably sound in light of the ethical, legal and operational principles governing the action of the French armed forces”. As the report highlights, the use of LAWS would indeed:

  • “break the chain of command;
  • run counter to the constitutional principle of having liberty of action to dispose of the armed forces;
  • not provide any assurance as to compliance with the principles of international humanitarian law (IHL); 
  • be contrary to French military ethics and the fundamental commitments made by French soldiers, i.e. honour, dignity, controlled use of force and humanity.”

However, although France does not wish to develop or deploy LAWS, the Committee highlights that “the definition of LAWS is still subject to debate” and that no universally accepted definition exists. The report therefore aims “to identify what is meant by autonomy” through an “in-depth analysis” of “the pitfalls of using certain terms such as “autonomy”, “autonomous” or “intelligent system” in relation to machines and, more generally, all terms in the lexical field of anthropomorphism when they are used to describe objects or systems”. This analysis led the Committee to distinguish between lethal autonomous weapon systems (LAWS), which should be prohibited, and partially autonomous lethal weapon systems (PALWS). According to the report:

  • “LAWS are lethal weapon systems programmed to be capable of changing their rules of operation and therefore are likely to depart from the employment framework initially defined. Their software may compute decisions in order to perform actions without any assessment of the situation by the command.”
  • “PALWS are lethal weapon systems integrating automation and software:
      ◦ to which, after assessing the situation and under their responsibility, the military command can assign the computation and execution of tasks related to critical functions such as identification, classification, interception and engagement within time and space limits and under conditions;
      ◦ which include technical safeguards or intrinsic characteristics to prevent failures, misuse and relinquishment by the command of two essential duties, namely situation assessment and reporting.”

This distinction led the Defense Committee to emphasize that the prohibition of LAWS “should not be extended to the integration of automation into low-level functions of certain lethal weapon systems”, as these technologies may have operational benefits, identified as the “5 Ps” (performance, pertinence, precision, protection and permanence). Moreover, the Committee believes that the integration of these technologies, through their design and implementation, should “only be contemplated if all the guarantees of their correct use are met”, namely the “5 Cs” (command, risk control, compliance, competence, confidence). As a result of this analysis, the Committee has “identified six guiding principles and set out 25 guidelines relating to methodology, research, use, design and training” with regard to PALWS.

In addition, the Committee underlines that, while France does not intend to develop or use fully autonomous weapon systems, it “should continue research in the fields of defence artificial intelligence and weapon systems automation” in order to avoid losing ground in the scientific and technological fields, to counter enemy development of LAWS, and to defend itself “against this type of weapon in the likely event of their use by an enemy State or terrorist group against [their] troops or population”. The Committee emphasizes that “such research must be governed by a strict ethical and legal framework and be conducted in compliance with legal review mechanisms.”

This approach is in line with civil society organisations’ repeated call for a ban on fully autonomous lethal weapons.

LAWS have indeed been at the centre of crucial debates since the late 2000s. Civil society organisations in particular have called for a pre-emptive ban on these technologies. Action on Armed Violence (formerly Landmine Action) was one of the first to do so, in 2008, when it called for an international treaty to stop robots from being used to autonomously exercise lethal force. In September 2009, the International Committee for Robot Arms Control (ICRAC) called for a ban on the development and operational use of unmanned autonomous armed systems. Over the years, numerous organisations have made similar calls: Article 36 called for a ban on all military systems capable of autonomous targeting and firing, while Human Rights Watch and the Harvard Law School International Human Rights Clinic called for a pre-emptive ban on these weapons in their 2012 report Losing Humanity: The Case Against Killer Robots. So far, the most influential initiative has been the Campaign to Stop Killer Robots, a coalition of NGOs working to bring about an international ban on all fully autonomous weapons and to retain meaningful human control over the use of lethal force. Created in 2013, the coalition continues to call on United Nations Member States to develop and establish policies that will prevent the development of future LAWS.

To date, no legally binding international instrument has been adopted for these technologies, despite the existence of a dedicated Group of Governmental Experts (GGE) on LAWS and the publication of its guiding principles in 2019. The European Commission’s legislative proposal on AI of April 2021, for its part, clearly states that the regulation “shall not apply to AI systems developed or used exclusively for military purposes”.

States are therefore adopting their own position on the matter and the French Committee provides some interesting food for thought by stating that there is a clear “difference in nature between LAWS, and the notion of PALWS (…) which designates a system that cannot be deployed without human intervention: the human remains at the heart of decisions on the use of lethal force”.

It remains to be seen how this independent opinion will be received by the Minister for the Armed Forces once the report’s conclusions have been taken up by the Ministry’s departments.

MEB.

Source: https://www.defense.gouv.fr/salle-de-presse/communiques/communique_le-comite-d-ethique-de-la-defense-publie-son-rapport-sur-l-integration-de-l-autonomie-des-systemes-d-armes-letaux
