Man against machine: Debates in the expert community on lethal autonomous systems

2022 ◽  
Vol 13 (4) ◽  
pp. 182-204
Author(s):  
A. Yu. Lipova

In recent years, debates surrounding the development and regulation of autonomous weapons systems have gained new momentum. Although such weapons have been under development since the twentieth century, recent technological advances open up new possibilities for completely autonomous combat systems that operate without human intervention. In this context, the international community faces a number of ethical, legal, and regulatory issues. This paper examines the ongoing debates in both the Western and the Russian expert communities on the challenges and prospects of using lethal autonomous systems. The author notes that the Russian and Western discourses have much in common on most issues; differences lie mainly in the intensity of the debates, which are far more active in the West. In both cases the debates center on two issues: the potential implications of fully autonomous weapons systems, including the unclear line of accountability, and the prospects for international legal regulation of the use of lethal autonomous weapons. Both Russian and Western experts agree that contemporary international humanitarian law is unable to handle the challenges posed by the rapid development of lethal autonomous weapons. All this points to the need to adapt international humanitarian law to the new realities, which, in turn, requires concerted action from leading states and international organizations.

2021 ◽  
pp. 237-258
Author(s):  
S. Kate Devitt

The rise of human-information systems, cybernetic systems, and increasingly autonomous systems requires the application of epistemic frameworks to machines and human-machine teams. This chapter discusses higher-order design principles to guide the design, evaluation, deployment, and iteration of Lethal Autonomous Weapons Systems (LAWS) based on epistemic models. Epistemology is the study of knowledge. Epistemic models consider the role of accuracy, likelihoods, beliefs, competencies, capabilities, context, and luck in the justification of actions and the attribution of knowledge. The aim is not to provide ethical justification for or against LAWS, but to illustrate how epistemological frameworks can be used in conjunction with moral apparatus to guide the design and deployment of future systems. The models discussed in this chapter aim to make Article 36 reviews of LAWS systematic, expedient, and evaluable. A Bayesian virtue epistemology is proposed to enable justified actions under uncertainty that meet the requirements of the Laws of Armed Conflict and International Humanitarian Law. Epistemic concepts can provide some of the apparatus to meet explainability and transparency requirements in the development, evaluation, deployment, and review of ethical AI.


2020 ◽  
Vol 13 (2) ◽  
pp. 115
Author(s):  
Roman Dremliuga

This article focuses on the regulation of autonomous weapons systems from the perspective of the norms and principles of international humanitarian law. It discusses what restrictions international humanitarian law imposes on the application of such weapons and presents a number of principles that both the weapons and the method of their application must satisfy: distinction between civilians and combatants, military necessity, proportionality, the prohibition on causing unnecessary suffering, and humanity. The author concludes that, from the perspective of these principles, it is doubtful whether autonomous systems would be able to comply with them. Weapons that hit targets without human intervention have been in use for a long time, but they have never had the independence they have now. The question of whether autonomous weapons systems comply with international humanitarian law can only be settled once sufficient experience of their application in real conditions has been accumulated. This study demonstrates that it cannot be said that autonomous weapons systems fail to comply with the principles of humanitarian law in general. The paper provides policy recommendations and assessments for each of the principles under consideration. The author also concludes that autonomous weapons should not be prohibited on the grounds of non-compliance with the principles of international humanitarian law; instead, rules should be developed for their application and for human participation in their functioning. A significant challenge to the development of such rules is the opacity of autonomous weapons systems when viewed as complex intelligent computer systems.


2019 ◽  
Vol 7 (1) ◽  
pp. 124-131
Author(s):  
Sai Venkatesh

The objective of this paper is to legally analyze the issues surrounding the use and regulation of Autonomous Weapons Systems (AWS) and their implications for the existing principles of International Humanitarian Law (IHL). The research approaches this issue in consonance with the New Haven School of International Legal Thought. The paper begins by defining the terms ‘AWS’ and ‘New Haven School’ for the purpose of this study. Subsequently, it highlights the notable issues of contention in relation to existing principles of IHL, examining each through the lens of the New Haven method and drawing conclusions exclusively within that school of thought. In its conclusion, the paper emphasizes the need for AWS in today’s world and argues that regulation, rather than prohibition, would be the ideal solution to the conundrum of their legality. It also identifies the key elements of the New Haven School and how these were directly incorporated into the paper so as to arrive at the proposed resolution, emphasizing the need for the legality of AWS to attain world peace and order.


Author(s):  
Natella Sinyaeva

The article examines, from the standpoint of international humanitarian law, the possibilities for control at the stage of developing autonomous weapons systems. The author notes that the development of autonomous weapons systems raises serious social and ethical concerns. She considers the existing norms and principles of international humanitarian law that apply to controlling the development and use of such systems, and examines autonomous weapons systems from the perspective of the distinction between civilians (civilian objects) and combatants (military objectives), which entails the requirements of precautions in attack and proportionality.


Author(s):  
Laura A. Dickinson

The rise of lethal autonomous weapons systems creates numerous problems for legal regimes meant to ensure public accountability for unlawful uses of force. In particular, international humanitarian law has long relied on enforcement through individual criminal responsibility, which is complicated by autonomous weapons that fragment responsibility for decisions to deploy violence. Accordingly, there may often be no human being with the requisite level of intent to trigger individual responsibility under existing doctrine. In response, perhaps international criminal law could be reformed to account for such issues. Or, in the alternative, greater emphasis on other forms of accountability, such as tort liability and state responsibility might be useful supplements. Another form of accountability that often gets overlooked or dismissed as inconsequential is one that could be termed “administrative accountability.” This chapter provides a close look at this type of accountability and its potential.


2015 ◽  
Vol 6 (2) ◽  
pp. 247-283 ◽  
Author(s):  
Jeroen van den Boogaard

Given the swift pace of technological development, it may be expected that the first truly autonomous weapons systems will soon be available. Once deployed, these weapons will use artificial intelligence to select and attack targets without further human intervention. Autonomous weapons systems raise the question of whether they can comply with international humanitarian law. The principle of proportionality is sometimes cited as an important obstacle to the lawful use of autonomous weapons systems. This article assesses whether the rule on proportionality in attacks would preclude the legal use of autonomous weapons. It analyses aspects of the proportionality rule that would militate against the use of autonomous weapons systems, as well as aspects that would appear to benefit the protection of the civilian population if such weapons systems were used. The article concludes that autonomous weapons are unable to make proportionality assessments at the operational or strategic level on their own, and that humans should not be expected to be completely absent from the battlefield in the near future.


Author(s):  
Ruslan Melykov

The purpose of the article is to identify the methodology used in international humanitarian law for the regulation of new types of weapons. For the purposes of this article, regulation is understood as the establishment of permissions, prohibitions, and restrictions on the use of a given type of weapon in accordance with the basic principles of international humanitarian law. The article draws methodologically on the works of foreign and Ukrainian researchers devoted to the regulation of new weapons systems in international humanitarian law. Its empirical basis is formed by international treaties in the field of international humanitarian law and the codified customs of this branch of law, as reflected in the codifications developed by the International Committee of the Red Cross. The article establishes that international humanitarian law obliges states to assess the compliance of new weapons systems with its norms. This norm, however, has two shortcomings. First, it is too abstract, allowing states to evade the obligation to assess by claiming that a certain weapon does not fall under the definition of a new type of weapon. Second, international humanitarian law contains no specific mechanisms for holding violating states accountable. The author concludes that the current international legal regulation of the obligation to assess new weapons systems should be revised toward greater concretization and stronger responsibility for non-compliance. Corresponding changes could be made to the 1977 Additional Protocol I to the Geneva Conventions or introduced through a separate protocol.

