Weapon Systems with Autonomous Functions and the Martens Clause: Is the use of these weapons in line with the principles of humanity and the dictates of public conscience?

By Clea Strydom

[Clea Strydom completed her B.A. (Law) and LL.B. at Stellenbosch University, South Africa, before writing her LL.M. dissertation on the International Humanitarian Law implications of weapon systems with autonomous functions through the University of Johannesburg, South Africa.]

Introduction

States are increasingly implementing artificial intelligence (AI) to pursue autonomy in weapon systems for armed conflict, for reasons including faster reaction times, faster data collection and processing, and the ability to field robots instead of risking human combatants’ lives. There are, however, concerns that weapon systems with autonomous functions cannot be used in compliance with International Humanitarian Law (IHL), that it is unethical for machines to lethally target humans, and that their use could lead to an accountability gap. There has therefore been an ongoing debate about whether to ban the development of these weapon systems. The mere fact that these systems have autonomy is not what the legal debate is focused on; rather, it is the delegation of critical functions, i.e. acquiring, tracking, selecting, and attacking targets, to weapon systems that is of concern. The International Committee of the Red Cross (ICRC) has correctly identified that “ethics, humanity and the dictates of the public conscience are at the heart of the debate about the acceptability of autonomous weapon systems.”

Weapon Systems with Autonomous Functions

Autonomy in weapon systems should not be seen as a mere development of conventional weapons; it is a paradigm shift in weapons technology that could change warfare drastically. Autonomy in weapon systems does not denote a specific new weapon, but rather a shift of control over critical functions from humans to the weapon system, and with it a change in how warfare is conducted. While the most widely used terms are Lethal Autonomous Weapon Systems (LAWS) or Autonomous Weapon Systems (AWS), ascribing autonomy to the whole system is problematic. Autonomy is not a type of technology but a characteristic of technology: it relates to certain functions rather than attaching to the object itself. Because of this, Andrew Williams suggests referring to “autonomous functioning in a system” in general, or “systems with autonomous functions” when referring to a specific platform or system. The author has therefore adopted the term weapon systems with autonomous functions (WSAF), as it indicates that the machine as a whole is not autonomous, but rather that it can perform certain functions with varying degrees of human intervention. The degree of intervention will depend on various factors, such as the system’s design or intelligence, the external environmental conditions in which the system will be required to operate, the nature and complexity of the mission, and policy and legal regulations. It must be kept in mind that, while several States are pursuing autonomy in weapon systems, weapon systems that can perform critical functions autonomously are still a thing of the future. The debate, including over the advantages and disadvantages of autonomy in weapon systems, is therefore at this stage still speculative.

The Martens Clause

The Martens Clause made its first appearance in the 1899 Hague Convention II and has since been included in Additional Protocol I to the Geneva Conventions, Article 1(2):

“In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience”.

The International Court of Justice, in its Legality of the Threat or Use of Nuclear Weapons Advisory Opinion, confirmed the principle contained in the Martens Clause as customary IHL and held that it “proved to be an effective means of addressing the rapid evolution of military technology”. Concerning WSAF, the crux is whether the delegation of life and death decisions to a robot would be in line with the dictates of public conscience and the principles of humanity.

Professor Michel Veuthey has highlighted the importance of public conscience in IHL, identifying that it can trigger the codification of IHL principles, act as an impetus for the implementation and enforcement of IHL, and provide a safeguard for situations not provided for or considered in the law. On the other side of the argument, Michael Schmitt argues that the Martens Clause only applies in the absence of applicable law in the Geneva Conventions and Additional Protocols or in international agreements such as treaties, and that since 1899, when the Martens Clause first appeared, the law relating to weapons has developed to such an extent that it covers all existing and future weapons, diminishing the role of the Clause. On this view, it is unlikely that any weapon found to comply with IHL and applicable treaties would be held to contravene the Martens Clause. However, Robin Geiss points out that the IHL principles applicable to weapons are framed in a human-centric manner and might not be able to deal adequately with autonomy in weapon systems; the Martens Clause could therefore be used to create new law or act as a safety net, as Veuthey suggests.

Even if it is accepted that a weapon could be banned based on the Martens Clause, several questions with no clear answers arise: first, how does one determine what the public conscience is, and secondly, which public? It is unlikely that the global public will share a common ‘conscience’; the public conscience and the principles of humanity are neither timeless nor universal. Several researchers have conducted surveys to try to determine public opinion on the weapon systems in question. Political scientist Michael Horowitz found that public opinion depends on context: in the first round of questions, 48% of participants in his survey were opposed to “autonomous weapons”, but once he put the use of the weapons in context and highlighted their benefits, opposition dropped to 27%. In a survey by the American roboticist and robo-ethicist Ronald Arkin, participants acknowledged that “autonomous weapon systems” have a role to play, but the majority felt that they should not be allowed to use force. Ipsos, a global market research and public opinion company, has conducted various surveys on views of “killer robots” for Human Rights Watch (HRW) and the Campaign to Stop Killer Robots, which have called for a ban on weapon systems that can perform critical functions autonomously. Interestingly, the latest survey, conducted between November 2020 and January 2021 across 28 countries, shows a correlation between opposition and the age of the respondents: average opposition was 54% among those under 35 years of age and 69% among those aged 50 to 74. This may be indicative of several factors, including that the younger generation is more accepting of technology and that the older population is more likely to have had first-hand experience of the horrors of war.

HRW believes that States should consider these views when reviewing “autonomous weapons”. Such perspectives do not create binding rules, but they may influence treaties and decisions to deploy the weapons. It is also important to keep in mind that opinions change over time. Fifty years ago we could not have imagined unmanned remote-controlled systems becoming an integral part of military arsenals, as they are today, yet we have come to accept them to a large extent. Surveys need to be seen in the context of their time, the way the questions are framed, and, in this case, advances in technology. As autonomy in weapon systems develops and the technology becomes more advanced, views will change. Armin Krishnan notes in his book Killer Robots: Legality and Ethicality of Autonomous Weapons that, with “social conditioning”, views on WSAF will evolve.

Regarding the principles of humanity, the central concern is the importance of human agency in life and death decisions. Much anxiety exists about losing human control over weapon systems, and over war in general, which raises questions that go beyond compliance with the law to whether the deployment of such weapon systems is in line with our values. Delegating decisions about life and death may dehumanize armed conflict even further. The concern is that allowing weapon systems to lethally target humans means that those targeted are not treated as unique human beings, which is an affront to human dignity; the late Professor Christof Heyns referred to this as “death by algorithm”. It has also been argued that the anthropocentric formulation of IHL principles implicitly requires human judgment over decisions regarding force.

Conclusion

To date, the Martens Clause has never been used to ban a weapon, and at this stage the debate remains largely speculative. Weapon systems that can perform critical functions autonomously offer numerous advantages, and it is unlikely that States will refrain, on the basis of public sentiment alone, from developing and deploying weapons that would give them the upper hand. What the Martens Clause does do is remind us that, in deciding whether and how to design, develop, and use WSAF, we must do so in a way that safeguards our values instead of rendering them unsustainable.

Views expressed in this article are the author’s own and are not representative of the official views of Jus Cogens Blog or any other institute or organization that the author may be affiliated with.
