NSDA Topic: Lethal Autonomous Weapons
The next topic (and the TOC topic for the year) is now out! Read our free topic analysis and access our free case below!
The first topic of 2021 has been announced (and, consequently, the TOC topic as well). Resolved: States ought to ban lethal autonomous weapons. This topic mirrors the many discussions of drones and military intervention that have occurred over the years but introduces a new issue to the overall debate. There are many definitions of lethal autonomous weapons (LAWs), and questions of whether existing weapons meet those interpretations, and if so under what conditions, can make the specification of scenarios more difficult. That said, the general literature around LAWs is strong enough that such issues are only a true obstacle for more plan-based LD debate.
Lethal Autonomous Weapons: Definition
The United States Congressional Research Service defines LAWs as “a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system” (Sayler, 2019). While this is a fairly straightforward concept, the issue comes in deploying it in actual conflict scenarios. Can there be a halt function after a LAW is launched? How long must the system be autonomous overall? How strict can its targeting parameters be? Each of these questions becomes more important when considering that many of the automated systems currently employed by the military could be considered at least somewhat autonomous (Krishnan, 2003).
This spectrum of autonomy complicates the picture, especially because many of the systems with the most potential for autonomy also have semi-autonomous or automated functions, allowing them to seamlessly shift modes through software in ways that only complicate the resolution further (Trumbull, 2020). At the same time, countries such as China have defined LAWs, for the purposes of a ban, in increasingly narrow ways that would preclude only a theoretical weapon that may never come into existence at all (Kania, 2018). Taken together, even if a ban on LAWs were passed in reality, there would still be questions of whether many of the critiqued processes and actions of LAWs could still occur. That ignores entirely that many weapons that could be considered LAWs would not even be banned under China’s definition.
To access the rest of the free topic analysis + the free case, click here!
Purchase the full file and/or the yearlong subscription to our files below.