Fire and Forget: A Moral Defense of the Use of Autonomous Weapons Systems in War and Peace

Author(s):  
Duncan MacIntosh

Setting aside for a moment the military advantages offered by Autonomous Weapons Systems, international debate continues to feature the argument that the use of lethal force by “killer robots” inherently violates human dignity. The purpose of this chapter is to refute this assumption of inherent immorality and to demonstrate situations in which deploying autonomous systems would be strategically, morally, and rationally appropriate. The second part of the chapter objects to the argument that the use of robots in warfare is somehow inherently offensive to human dignity. Overall, this chapter demonstrates that, contrary to arguments made by some within civil society, the moral employment of force is possible even without proximate human decision-making. As discussions continue to swirl around autonomous weapons systems, it is important not to lose sight of the fact that fire-and-forget weapons are not morally exceptional or inherently evil. If an engagement complies with the established ethical framework, it is not morally invalidated by the absence of a human at the point of violence. As this chapter argues, the decision to employ lethal force becomes problematic only when a more thorough consideration would have demanded restraint. Assuming a legitimate target, therefore, the distance between human agency in the target-authorization process and the delivery of force is a matter of degree. A morally justifiable decision to engage a target with rifle fire would not be ethically invalidated simply because the lethal force was delivered by a commander-authorized robotic carrier.

2021
pp. 237-258
Author(s):  
S. Kate Devitt

The rise of human-information systems, cybernetic systems, and increasingly autonomous systems requires the application of epistemic frameworks to machines and human-machine teams. This chapter discusses higher-order design principles to guide the design, evaluation, deployment, and iteration of Lethal Autonomous Weapons Systems (LAWS) based on epistemic models. Epistemology is the study of knowledge. Epistemic models consider the role of accuracy, likelihoods, beliefs, competencies, capabilities, context, and luck in the justification of actions and the attribution of knowledge. The aim is not to provide ethical justification for or against LAWS, but to illustrate how epistemological frameworks can be used in conjunction with moral apparatus to guide the design and deployment of future systems. The models discussed in this chapter aim to make Article 36 reviews of LAWS systematic, expedient, and evaluable. A Bayesian virtue epistemology is proposed to enable justified actions under uncertainty that meet the requirements of the Laws of Armed Conflict and International Humanitarian Law. Epistemic concepts can provide some of the apparatus to meet explainability and transparency requirements in the development, evaluation, deployment, and review of ethical AI.


Author(s):  
Peter Asaro

As the militaries of technologically advanced nations seek to apply increasingly sophisticated AI and automation to weapons technologies, a host of ethical, legal, social, and political questions arise. Central among these is whether it is ethical to delegate the decision to use lethal force to an autonomous system that is not under meaningful human control. Further questions arise as to who or what could or should be held responsible when lethal force is used improperly by such systems. This chapter argues that current autonomous weapons are not legal or moral agents that can be held morally responsible or legally accountable for their choices and actions, and that therefore humans need to maintain control over such weapons systems.


2018
pp. 289-316
Author(s):  
Michael W. Meier

Over the past decade, there has been a proliferation of remotely piloted aircraft or “drones” on the battlefield. Advances in technology will continue to drive changes in how future conflicts are waged. Technological innovation, however, is not without its detractors, as various groups are calling for a moratorium or ban on the development and use of autonomous weapons systems. Some groups have called for a prohibition on the development, production, and use of fully autonomous weapons through an international legally binding instrument, while others view advances in the use of technology on the battlefield as a natural progression that will continue to make weapons systems more discriminate. The unanswered question is which point of view will prove right. This chapter approaches this question by addressing the meaning of “autonomy” and “autonomous weapons systems.” In addition, it examines the U.S. Department of Defense’s vision for the potential employment of autonomous systems, the legal principles applicable to these systems, and the weapons review process.


2020
Author(s):  
Daniele Amoroso

Recent advances in robotics and AI have paved the way to robots autonomously performing a wide variety of tasks in ethically and legally sensitive domains. Among them, a prominent place is occupied by Autonomous Weapons Systems (or AWS), whose legality under international law is currently at the center of a heated academic and diplomatic debate. The AWS debate provides a uniquely representative sample of the (potentially) disruptive impact of new technologies on norms and principles of international law, in that it touches on key questions of international humanitarian law, international human rights law, international criminal law, and State responsibility. Against this backdrop, this book’s primary aim is to explore the international legal implications of autonomy in weapons systems, by inquiring what existing international law has to say in this respect, to what extent the persisting validity of its principles and categories is challenged, and what could be a way forward for future international regulation on the matter. From a broader perspective, the research carried out on the issue of the legality of AWS under international law aspires to offer some more general insights on the normative aspects of the shared control relationship between human decision-makers and artificial agents. Daniele Amoroso is Professor of International Law at the Law Department of the University of Cagliari and member of the International Committee for Robot Arms Control (ICRAC).


2019
pp. 412-432
Author(s):  
Tim McFarland
Jai Galliott

While some are reluctant to admit it, we are witnessing a fundamental shift in the way that advanced militaries conduct their core business of fighting. Increasingly autonomous ‘unmanned’ systems are taking on the ‘dull, dirty and dangerous’ roles in the military, leaving human war fighters to assume an oversight role or focus on what are often more cognitively demanding tasks. To this end, many military forces already hold unmanned systems that crawl, swim and fly, performing mine disposal, surveillance and more direct combat roles. Having found their way into the military force structure quite rapidly, especially in the United States, there has been extensive debate concerning the legality and ethicality of their use. These topics often converge, but what is legal will not necessarily be moral, and vice versa. The authors’ contribution comes in clearly separating the two parts. In this paper, they provide a detailed survey of the legality of employing autonomous weapons systems in a military context.


2020
Vol 17 (1)
pp. 167-196
Author(s):  
Ozlem Ulgen

Military investment in robotics technology is leading to the development and use of autonomous weapons, which are machines with varying degrees of autonomy in targeting, attack, and infliction of lethal harm (that is, injury, suffering or death). Examples of autonomous weapons include weapons systems involving levels of automation and remotely controlled human input, unmanned armed aerial vehicles (uav), remotely controlled robotic soldiers, bio-augmentation, and 3D printed weapons. Autonomous weapons generally fall into one of two categories: semi-autonomous, involving some degree of autonomy in certain critical functions such as acquiring, tracking, selecting, and attacking targets, along with a degree of human input or remote control (for example, uav or ‘drones’); and autonomous, involving higher levels of independent thinking as regards critical functions without the need for human input or control (for example, the US Navy X-47B uav with autonomous take-off, landing, and aerial refuelling capability). The trend is clearly towards developing autonomous weapons. Development of new weapons aimed at reducing costs and casualties is not a new phenomenon in warfare. Technological advances have created greater distance between the soldier and the battlefield. A bullet fired from a rifle handled by a human has been superseded by a missile fired from a remotely controlled or autonomous machine. So what makes autonomous weapons different? What particular challenge do they pose to international law? Although autonomous weapons may be employed to attack nonhuman targets, such as state infrastructure, here I am primarily concerned with their use for lethal attacks against humans. In this chapter I focus on autonomous weapons (both semi-autonomous and fully autonomous) and their impact on human dignity under two of Kant’s conceptual strands: (1) human dignity as a status entailing rights and duties; and (2) human dignity as respectful treatment.
Under the first strand I explore how use of autonomous weapons denies the right of equality of persons and diminishes the duty not to harm others. In the second strand I consider how replacing human combatants with autonomous weapons debases human life and does not provide respectful treatment. Reference is made to contemporary development of Kant’s conceptual strands in icj and other international jurisprudence recognising human dignity as part of ‘elementary considerations of humanity’ in war and peace.


Author(s):  
Steven J. Barela
Avery Plaw

The possibility of allowing a machine agency over killing human beings is a justifiably concerning development, particularly when we consider the challenge of accountability in the case of illegal or unethical employment of lethal force. We have already seen how key information can be hidden or contested by deploying authorities, as in the case of lethal drone strikes. Therefore, this chapter argues that any effective response to autonomous weapons systems (AWS) must be underpinned by a comprehensive transparency regime that is fed by robust and reliable reporting mechanisms. The chapter offers a three-part argument in favor of such a regime. First, there is a preexisting transparency gap in the deployment of core weapon systems that would be automated (such as currently remote-operated UCAVs). Second, while the Pentagon has made initial plans for addressing moral, ethical, and legal issues raised against AWS, there remains a need for effective transparency measures. Third, transparency is vital to ensure that AWS are only used with traceable lines of accountability and within established parameters. Overall, this chapter argues that there is an overwhelming interest and duty for actors to ensure robust, comprehensive transparency and accountability mechanisms. The more aggressively AWS are used, the more rigorous these mechanisms should be.


2022
Vol 13 (4)
pp. 182-204
Author(s):  
A. Yu. Lipova

In recent years, debates surrounding the development and regulation of autonomous weapons systems have gained new momentum. Although the development of such weapons has continued since the twentieth century, recent technological advances open up new possibilities for completely autonomous combat systems that will operate without human intervention. In this context, the international community faces a number of ethical, legal, and regulatory issues. This paper examines the ongoing debates in both the Western and the Russian expert communities on the challenges and prospects of using lethal autonomous systems. The author notes that the Russian and Western discourses have much in common on most of the issues, and differences are found mainly in the intensity of the debates, which are much more active in the West. In both cases the debates focus on two issues: the potential implications of fully autonomous weapons systems, including the unclear line of accountability, and the prospects for international legal regulation of the use of lethal autonomous weapons. Both Russian and Western experts agree that contemporary international humanitarian law is unable to handle the challenges posed by the aggressive development of lethal autonomous weapons. All this points to the need to adapt international humanitarian law to the new realities, which, in turn, requires concerted action from leading states and international organizations.

