2015
DOI: 10.1007/s10551-015-2950-4

Making Drones to Kill Civilians: Is it Ethical?

Abstract: A drone industry has emerged in the US, initially funded almost exclusively for military applications. There are now also other uses both governmental and commercial (in the US and abroad). Many military drones are still being made, however, especially for surveillance and targeted killings. Regarding the latter, this essay calls into question their legality and morality. It recognizes that the issues are complex and controversial, but less so as to the killing of non-combatant civilians. The government using d…

Cited by 11 publications (5 citation statements)
References 11 publications
Citing publications: 2017–2023
“…Besides LAWS, there are other considerably autonomous AI robots, such as military drones and Big Dog (Lin et al., 2014), which is considered a carrier of military equipment rather than an attacker robot, yet still supports the war effort. Highly autonomous AI killer robots make decisions on their own; we could consider their manufacturers facilitators but, according to Byrne (2018), not murderers themselves. Intergovernmental regimes are required to collaborate to hinder the illegal use of military AI robots.…”
Section: Cluster 4 Supra-territorial Regulations (mentioning)
confidence: 99%
“…Developing these autonomous weapons is beyond international peace agreements, yet some countries have invested in their development. A major ethical concern is that humans barely have any chance against highly accurate and intelligent killer robots (Byrne, 2018). Applying AI solutions originally developed for the military in non-military settings, for instance for lifesaving in emergency situations (e.g., identifying and saving people and animals in the event of a major flood), should be further encouraged, as these fall under supererogatory applications.…”
Section: Regulatory Considerations for the Ethical Use of AI Robots (mentioning)
confidence: 99%
“…Furthermore, research has demonstrated that applications of Unmanned Aerial Vehicles (UAVs) and drones constitute complex systems augmented by a variety of technologies [69], necessitating the development of specialized legal frameworks to address civil liberties and privacy concerns [18]. Non-combatant civilians are being killed by military drones, igniting much controversy [6]. Previous research has raised concerns regarding armed and autonomous robots' ability to accurately discern between combatants and non-combatants, or to distinguish between dangerous and non-threatening conduct [28].…”
Section: Topic Modeling and Content Analysis (mentioning)
confidence: 99%
“…This interest was initially sparked by the increased usage of armed drones in military operations, particularly by the US in its 'War on Terror' in Afghanistan and Pakistan (Chamayou, 2015). Consequently, much work has focused on their history and emergence (Kaplan, 2018; Richardson, 2020), as well as key issues related to their usage, including implications for warfare and surveillance (Boyle, 2015; Neve, 2015), the ethics of 'killing from above' (Byrne, 2018; Kirkpatrick, 2015) and the expansion of conflict spatialities, territory and sovereignty (Kindervater, 2017; Shaw, 2013).…”
(mentioning)
confidence: 99%