
Title Lethal autonomous weapons : re-examining the law and ethics of robotic warfare / edited by Jai Galliott, Duncan MacIntosh, and Jens David Ohlin
Published New York, NY : Oxford University Press, [2021]

Description 1 online resource
Series The Oxford series in ethics, national security, and the rule of law
Ethics, national security, and the rule of law series.
Contents An effort to balance the lopsided autonomous weapons debate -- Fire and forget : a moral defense of the use of autonomous weapons systems in war and peace / Duncan MacIntosh -- The robot dogs of war / Deane-Peter Baker -- Understanding AI & autonomy : problematizing the meaningful human control argument against killer robots / Tim McFarland & Jai Galliott -- The humanitarian imperative for minimally-just AI in weapons / Jason Scholz and Jai Galliott -- Programming precision? Requiring robust transparency for AWS / Steven J. Barela & Avery Plaw -- May machines take lives to save lives? Human perceptions of autonomous robots (with the capacity to kill) / Matthias Scheutz and Bertram F. Malle -- The better instincts of humanity : humanitarian arguments in defense of international arms control / Natalia Jevglevskaja and Rain Liivoja -- Toward a positive statement of ethical principles for military AI / Jai Galliott -- Empirical data on attitudes towards autonomous systems / Jai Galliott, Bianca Baggiarini, and Sean Rupka -- The automation of authority : discrepancies with jus ad bellum principles / Donovan Phillips -- Autonomous weapons and the future of armed conflict / Alex Leveringhaus -- Autonomous weapons and reactive attitudes / Jens David Ohlin -- Blind brains and moral machines : neuroscience and autonomous weapon systems / Nicholas G. Evans -- Enforced transparency : a solution to autonomous weapons as potentially uncontrollable weapons similar to bioweapons / Armin Krishnan -- Normative epistemology for lethal autonomous weapons systems / Kate Devitt -- Proposing a regional normative framework for limiting the potential for unintentional or escalatory engagements with increasingly autonomous weapon systems / Austin Wyatt and Jai Galliott -- The human role in autonomous weapon design and deployment / M.L. Cummings
Summary "Because of the increasing use of Unmanned Aerial Vehicles (UAVs, also commonly known as drones) in various military and para-military (i.e., CIA) settings, there has been increasing debate in the international community as to whether it is morally and ethically permissible to allow robots (flying or otherwise) the ability to decide when and where to take human life. In addition, there has been intense debate as to the legal aspects, particularly from a humanitarian law framework. In response to this growing international debate, the United States government released the Department of Defense (DoD) 3000.09 Directive (2011), which sets a policy for if and when autonomous weapons would be used in US military and para-military engagements. This US policy asserts that only "human-supervised autonomous weapon systems may be used to select and engage targets, with the exception of selecting humans as targets, for local defense ... ". This statement implies that outside of defensive applications, autonomous weapons will not be allowed to independently select and then fire upon targets without explicit approval from a human supervising the autonomous weapon system. Such a control architecture is known as human supervisory control, where a human remotely supervises an automated system (Sheridan 1992). The defense caveat in this policy is needed because the United States currently uses highly automated systems for defensive purposes, e.g., Counter Rocket, Artillery, and Mortar (C-RAM) systems and Patriot anti-missile missiles. Due to the time-critical nature of such environments (e.g., soldiers sleeping in barracks within easy reach of insurgent shoulder-launched missiles), these automated defensive systems cannot rely upon a human supervisor for permission because of the short engagement times and the inherent human neuromuscular lag which means that even if a person is paying attention, there is approximately a half-second delay in hitting a firing button, which can mean the difference for life and death for the soldiers in the barracks. So as of now, no US UAV (or any robot) will be able to launch any kind of weapon in an offensive environment without human direction and approval. However, the 3000.09 Directive does contain a clause that allows for this possibility in the future. This caveat states that the development of a weapon system that independently decides to launch a weapon is possible but first must be approved by the Under Secretary of Defense for Policy (USD(P)); the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT & L)); and the Chairman of the Joint Chiefs of Staff. Not all stakeholders are happy with this policy that leaves the door open for what used to be considered science fiction. Many opponents of such uses of technologies call for either an outright ban on autonomous weaponized systems, or in some cases, autonomous systems in general (Human Rights Watch 2013, Future of Life Institute 2015, Chairperson of the Informal Meeting of Experts 2016). Such groups take the position that weapons systems should always be under "meaningful human control," but do not give a precise definition of what this means. One issue in this debate that often is overlooked is that autonomy is not a discrete state, rather it is a continuum, and various weapons with different levels of autonomy have been in the US inventory for some time. Because of these ambiguities, it is often hard to draw the line between automated and autonomous systems. 
Present-day UAVs use the very same guidance, navigation, and control technology flown on commercial aircraft. Tomahawk missiles, which have been in the US inventory for more than 30 years, are highly automated weapons with accuracies of less than a meter. These offensive missiles can navigate by themselves with no GPS, thus exhibiting some autonomy by today's definitions. Global Hawk UAVs can find their way home and land on their own without any human intervention in the case of a communication failure. The growth of the civilian UAV market is also a critical consideration in the debate as to whether these technologies should be banned outright. A $144.38B industry is emerging for the commercial use of drones in agricultural settings, cargo delivery, first response, commercial photography, and the entertainment industry (Adroit Market Research 2019). More than $100 billion has been spent on driverless car development in the past 10 years (Eisenstein 2018), and the autonomy used in driverless cars mirrors that inside autonomous weapons. So it is an important distinction that UAVs are simply the platform for weapon delivery (autonomous or conventional), and that autonomous systems have many peaceful and commercial uses independent of military applications"-- Provided by publisher
Bibliography Includes bibliographical references and index
Notes Online resource; title from digital cover page (viewed on April 29, 2021)
Subject Military weapons (International law)
Military weapons -- Law and legislation -- United States
Weapons systems -- Automation
Autonomous robots -- Law and legislation
Uninhabited combat aerial vehicles (International law)
Autonomous robots -- Moral and ethical aspects
Drone aircraft -- Moral and ethical aspects
Humanitarian law
Weapons
Drones
Robots
Ethical aspects
United States
Form Electronic book
Author Galliott, Jai, editor.
Ohlin, Jens David, editor.
MacIntosh, Duncan (Writer on autonomous weapons), editor.
LC no. 2020032679
ISBN 9780197546079
0197546072
0197546064
9780197546055
0197546056
9780197546062