Friend or frenemy? The role of trust in human-machine teaming and lethal autonomous weapons systems

Author(s): Aiden Warren, Alek Hillas

2021, pp. 237-258
Author(s): S. Kate Devitt

The rise of human-information systems, cybernetic systems, and increasingly autonomous systems requires the application of epistemic frameworks to machines and human-machine teams. This chapter discusses higher-order design principles to guide the design, evaluation, deployment, and iteration of Lethal Autonomous Weapons Systems (LAWS) based on epistemic models. Epistemology is the study of knowledge. Epistemic models consider the role of accuracy, likelihoods, beliefs, competencies, capabilities, context, and luck in the justification of actions and the attribution of knowledge. The aim is not to provide ethical justification for or against LAWS, but to illustrate how epistemological frameworks can be used in conjunction with moral apparatus to guide the design and deployment of future systems. The models discussed in this chapter aim to make Article 36 reviews of LAWS systematic, expedient, and evaluable. A Bayesian virtue epistemology is proposed to enable justified actions under uncertainty that meet the requirements of the Laws of Armed Conflict and International Humanitarian Law. Epistemic concepts can provide some of the apparatus to meet explainability and transparency requirements in the development, evaluation, deployment, and review of ethical AI.
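To give a concrete sense of what "justified actions under uncertainty" could look like computationally, the sketch below shows a minimal Bayesian belief update combined with a justification threshold. It is an illustrative assumption on my part, not a method taken from the chapter: the scenario, likelihoods, and threshold value are all hypothetical.

```python
# Illustrative sketch only: a minimal Bayesian update of the general kind an
# epistemic model might use. The scenario, priors, likelihoods, and threshold
# below are hypothetical and not drawn from the chapter.

def bayes_update(prior: float, likelihood_true: float, likelihood_false: float) -> float:
    """Posterior credence that a hypothesis (e.g. 'object is a lawful military
    objective') is true, after observing one piece of evidence."""
    numerator = likelihood_true * prior
    marginal = numerator + likelihood_false * (1.0 - prior)
    return numerator / marginal

def action_justified(posterior: float, threshold: float = 0.95) -> bool:
    """A crude stand-in for a justification condition: act only when the
    posterior credence clears a pre-reviewed threshold."""
    return posterior >= threshold

# Example: start from a cautious prior and update on two sensor cues.
belief = 0.10                              # prior credence
belief = bayes_update(belief, 0.90, 0.05)  # cue A: strong, discriminating
belief = bayes_update(belief, 0.70, 0.30)  # cue B: weaker, noisier
print(f"posterior = {belief:.3f}, justified = {action_justified(belief)}")
```

Run as written, the posterior reaches roughly 0.82, so the threshold is not met and the action would not be justified; the point of the sketch is only that such a framework makes the evidential standard explicit and therefore reviewable.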





2019, Vol. 14 (2), pp. 111-128
Author(s): Hendrik Huelss

The emergence of autonomous weapons systems (AWS) is increasingly a focus of academic and public attention. Research largely concentrates on the legal and ethical implications of AWS as a new weapons category set to revolutionize the use of force. However, the debate on AWS neglects the question of what introducing these weapons systems could mean for how decisions are made. Pursuing this question from a theoretical-conceptual perspective, the article critically analyzes the impact AWS can have on norms as standards of appropriate action. The article draws on the Foucauldian "apparatus of security" to develop a concept that accommodates the role of security technologies in the conceptualization of norms guiding the use of force. It discusses to what extent a technologically mediated construction of a normal reality emerges in the interplay of machinic and human agency, and how this leads to the development of norms. The article argues that AWS provide a specific construction of reality in their operation and thereby define procedural norms that tend to replace the deliberative, normative-political decision on when, how, and why to use force. The article is a theoretical-conceptual contribution to the question of why AWS matter and why we should further consider the implications of new arrangements of human-machine interaction in International Relations (IR).




