2017
DOI: 10.1515/gj-2017-0012

The Ethical and Legal Case Against Autonomy in Weapons Systems

Abstract: In order to be counted as autonomous, a weapons system must perform the critical functions of target selection and engagement without any intervention by human operators. Human rights organizations, as well as a growing number of States, have been arguing for banning weapons systems satisfying this condition – which are usually referred to as autonomous weapons systems (AWS) in this account – and for maintaining meaningful human control (MHC) over any weapons system. This twofold goal has been pursued by levera…

Cited by 14 publications (10 citation statements) | References 1 publication
“…Discussion of what MHC should amount to in the case of AWS is well under way in both academic [62][63][64] and international policy fora [65]. Moreover, applications of the MHC concept to increasingly autonomous driverless vehicles and other autonomous robotic and computational systems are being actively explored [6].…”
Section: Meaningful Human Control of Surgery Robot Autonomy
confidence: 99%
“…The structure of law and the processes of justice require the presence of a human as a legal agent, and the case against AWS is both a legal and a moral one. Amoroso and Tamburrini (2017) distinguish two kinds of argument made about AWS: deontological and consequentialist. They identify three main deontological arguments: (a) that AWS would be unable to conform to IHL and IHRL rules governing the use of lethal force; (b) that AWS would create an accountability gap; and (c) that deployment of AWS would be contrary to human dignity and the requirement that 'the taking of human life should be reserved to human decision-makers'.…”
Section: Against Autonomous Weapons Systems
confidence: 99%
“…Tamburrini (2016) argues that swarms of AWS could weaken traditional nuclear deterrent factors based on mutually assured destruction (swarms of AWS could be used to deliver destructive attacks on strategic nuclear forces and eliminate an opponent's second-strike nuclear capability, increasing the preference for first-strike strategies). Amoroso and Tamburrini (2017) point out that AWS, even without the 'lethal' element, if used to destroy buildings or infrastructure, would still have a globally destabilising effect.…”
Section: Against Autonomous Weapons Systems
confidence: 99%