Autonomous weapon systems are often described either as more independent versions of weapons already in use or as humanoid robotic soldiers. In many ways, these analogies are useful. Analogies and allusions to popular culture make new technologies seem accessible, identify potential dangers, and buttress desired narratives. Most importantly from a legal perspective, analogical reasoning helps stretch existing law to cover developing technologies and minimize law-free zones.

But all potential analogies—weapon, combatant, child soldier, animal combatant—fail to address the legal issues raised by autonomous weapon systems, largely because they all misrepresent legally salient traits. Conceiving of autonomous weapon systems as weapons minimizes their capacity for independent and self-determined action, while the combatant, child soldier, and animal combatant comparisons overemphasize it. Furthermore, these discrete and embodied analogies limit our ability to think imaginatively about this new technology and anticipate how it might develop, thereby impeding our ability to properly regulate it.

We cannot simply graft legal regimes crafted to regulate other entities onto autonomous weapon systems. Instead, as is often the case when analogical reasoning cannot justifiably stretch extant law to answer novel legal questions, new supplemental law is needed. The sooner we escape the confines of these insufficient analogies, the sooner we can create appropriate and effective regulations for autonomous weapon systems.