Normative Epistemology for Lethal Autonomous Weapons Systems

2021, pp. 237-258
Author(s): S. Kate Devitt

The rise of human-information systems, cybernetic systems, and increasingly autonomous systems requires the application of epistemic frameworks to machines and human-machine teams. This chapter discusses higher-order design principles to guide the design, evaluation, deployment, and iteration of Lethal Autonomous Weapons Systems (LAWS) based on epistemic models. Epistemology is the study of knowledge. Epistemic models consider the role of accuracy, likelihoods, beliefs, competencies, capabilities, context, and luck in the justification of actions and the attribution of knowledge. The aim is not to provide ethical justification for or against LAWS, but to illustrate how epistemological frameworks can be used in conjunction with moral apparatus to guide the design and deployment of future systems. The models discussed in this chapter aim to make Article 36 reviews of LAWS systematic, expedient, and evaluable. A Bayesian virtue epistemology is proposed to enable justified actions under uncertainty that meet the requirements of the Laws of Armed Conflict and International Humanitarian Law. Epistemic concepts can provide some of the apparatus to meet explainability and transparency requirements in the development, evaluation, deployment, and review of ethical AI.
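
The chapter describes its Bayesian virtue epistemology only at the level of design principles. Purely as an illustration of the kind of reasoning involved, the short Python sketch below shows Bayesian belief updating feeding an evidential threshold before an engagement decision; every prior, likelihood, sensor cue, and the 0.95 threshold are hypothetical placeholders for illustration, not values from the chapter.

# Illustrative sketch only (not the chapter's model): Bayesian belief updating as one
# ingredient of an epistemic justification check before an engagement decision.
# Every number, sensor cue, and the 0.95 threshold below is an assumed placeholder.

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | evidence) for a binary hypothesis H via Bayes' rule."""
    numerator = p_e_given_h * prior
    marginal = numerator + p_e_given_not_h * (1.0 - prior)
    return numerator / marginal

# Hypothesis H: "the observed object is a lawful military objective".
belief = 0.30  # assumed prior from intelligence preparation of the battlespace

# Each tuple: (likelihood of this report if H is true, likelihood if H is false).
sensor_reports = [
    (0.90, 0.20),  # radar signature match (assumed values)
    (0.80, 0.30),  # image classifier output (assumed values)
    (0.70, 0.40),  # communications intercept cue (assumed values)
]

for p_true, p_false in sensor_reports:
    belief = bayes_update(belief, p_true, p_false)

JUSTIFICATION_THRESHOLD = 0.95  # assumed evidential standard set during legal review

print(f"posterior belief that the target is lawful: {belief:.3f}")
if belief >= JUSTIFICATION_THRESHOLD:
    print("action epistemically justified under the assumed standard")
else:
    print("insufficient justification: defer to human review")

On these assumed numbers the posterior reaches roughly 0.90, below the assumed threshold, so the sketch defers to human review; the point is only that an explicit evidential standard makes the justification for action inspectable, in the spirit of the transparency requirements the chapter discusses.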

2022, Vol 13 (4), pp. 182-204
Author(s): A. Yu. Lipova

In recent years, debates surrounding the development and regulation of autonomous weapons systems have gained new momentum. Although such weapons have been under development since the twentieth century, recent technological advances open up new possibilities for completely autonomous combat systems that operate without human intervention. In this context, the international community faces a number of ethical, legal, and regulatory issues. This paper examines the ongoing debates in both the Western and the Russian expert communities on the challenges and prospects of using lethal autonomous systems. The author notes that the Russian and Western discourses have much in common on most issues; differences lie mainly in the intensity of the debates, which are much more active in the West. In both cases the debates focus on two issues: the potential implications of fully autonomous weapons systems, including the unclear line of accountability, and the prospects for international legal regulation of the use of lethal autonomous weapons. Both Russian and Western experts agree that contemporary international humanitarian law is unable to handle the challenges posed by the rapid development of lethal autonomous weapons. All this points to the need to adapt international humanitarian law to the new realities, which in turn requires concerted action by leading states and international organizations.


2020, Vol 13 (2), p. 115
Author(s): Roman Dremliuga

This article focuses on the regulation of autonomous weapons systems from the perspective of the norms and principles of international humanitarian law. It asks what restrictions international humanitarian law imposes on the use of such weapons. The article presents a number of principles that both the weapons themselves and the manner of their use must satisfy: distinction between civilians and combatants, military necessity, proportionality, the prohibition on causing unnecessary suffering, and humanity. The author concludes that, from the perspective of these principles, it is doubtful whether autonomous systems would be able to comply with them. Weapons that hit targets without human intervention have long been in use, but they have never had the independence they have now. The question of whether autonomous weapons systems comply with international humanitarian law can only be settled once sufficient experience of their use in real conditions has been accumulated. The study demonstrates that it cannot be said that autonomous weapons systems fail to comply with the principles of humanitarian law in general. The paper provides policy recommendations and assessments for each of the principles under consideration. The author also concludes that the appropriate response is not to prohibit autonomous weapons on the grounds that they do not comply with the principles of international humanitarian law, but to develop rules for their use and for human participation in their functioning. A significant challenge to developing such rules is the opacity of autonomous weapons systems when viewed as complex intelligent computer systems.


2019, Vol 7 (1), pp. 124-131
Author(s): Sai Venkatesh

The objective of this paper is to legally analyze the issues surrounding the use and regulation of Autonomous Weapons Systems (AWS) and their implications for the existing principles of International Humanitarian Law (IHL). The research approach follows the New Haven School of International Legal Thought. The paper begins by defining the terms ‘AWS’ and ‘New Haven School’ for the purpose of this study. It then highlights the notable issues of contention in relation to existing principles of IHL, framing them within the scope of the New Haven method and drawing conclusions exclusively within that school of thought. In its conclusion, the paper emphasizes the need for AWS in today’s world and argues that regulation, rather than prohibition, would be the ideal solution to the conundrum of their legality. It also distinguishes the key elements of the New Haven School and shows how these were incorporated into the paper to arrive at the stated resolution, emphasizing the legality of AWS as a means of attaining world peace and order.


Author(s): Natella Sinyaeva

The article examines, from the standpoint of international humanitarian law, the possibilities for control at the stage of developing autonomous weapons systems. The author notes that the development of autonomous weapons systems raises serious social and ethical concerns, and considers the existing norms and principles of international humanitarian law that apply to controlling the development and use of such systems. Autonomous weapons systems are examined from the perspective of the distinction between civilians (civilian objects) and combatants (military objectives), as well as the requirements of precautions in attack and proportionality.


Author(s): Laura A. Dickinson

The rise of lethal autonomous weapons systems creates numerous problems for legal regimes meant to ensure public accountability for unlawful uses of force. In particular, international humanitarian law has long relied on enforcement through individual criminal responsibility, which is complicated by autonomous weapons that fragment responsibility for decisions to deploy violence. Accordingly, there may often be no human being with the requisite level of intent to trigger individual responsibility under existing doctrine. In response, international criminal law could perhaps be reformed to account for such issues. Alternatively, greater emphasis on other forms of accountability, such as tort liability and state responsibility, might be a useful supplement. Another form of accountability that often gets overlooked or dismissed as inconsequential is one that could be termed “administrative accountability.” This chapter provides a close look at this type of accountability and its potential.


2015, Vol 6 (2), pp. 247-283
Author(s): Jeroen van den Boogaard

Given swift technological development, it may be expected that the availability of the first truly autonomous weapons systems is fast approaching. Once deployed, these weapons will use artificial intelligence to select and attack targets without further human intervention. Autonomous weapons systems raise the question of whether they could comply with international humanitarian law. The principle of proportionality is sometimes cited as an important obstacle to the lawful use of autonomous weapons systems. This article assesses whether the rule on proportionality in attacks would preclude the legal use of autonomous weapons. It analyses aspects of the proportionality rule that would militate against the use of autonomous weapons systems, as well as aspects that would appear to benefit the protection of the civilian population if such systems were used. The article concludes that autonomous weapons are unable to make proportionality assessments at the operational or strategic level on their own, and that humans should not be expected to be completely absent from the battlefield in the near future.


2019, Vol 7 (3), pp. 351-368
Author(s): Yordan Gunawan, Mohammad Haris Aulawi, Andi Rizal Ramadhan

War and technological development have been linked for centuries. States and military leaders have searched for weapon systems that minimize the risk to the soldier, as technology has enabled the destruction of combatants and non-combatants at levels not previously seen in human history. Autonomous Weapons Systems are not specifically regulated by IHL treaties. Three main principles must be considered in their use: distinction, proportionality, and the prohibition of unnecessary suffering. Autonomous weapon systems may provide a military advantage because they can operate free of the human emotions and bias that cloud judgement. In addition, these systems can operate free from the need for self-preservation and can make decisions far more quickly. It is therefore important to examine whether the commander can be held responsible when an Autonomous Weapon System commits a crime.
Keywords: Command Responsibility, Autonomous Weapons Systems, International Humanitarian Law

