This chapter was written in collaboration with one of my all-time favorite coauthors and friends, Katie Szilagyi.
Lethal autonomous weapons, machines that might one day target and kill people without human intervention or oversight, are gaining attention on the world stage. While their development and deployment, and their perceived superiority over human soldiers, are widely presumed to be inevitable, in this chapter we challenge that prevalent view, arguing that the adoption of these technologies is not a fait accompli.
We begin by canvassing the state of the art in robotic warfare and the military advantages that autonomous weapons offer, aiming to scratch beneath the surface-level success of these systems and to consider the drastic effects their implementation could have on international humanitarian law, adherence to humanitarian principles, and notions of technological neutrality. International humanitarian law governs the use of particular weapons and advancing technologies in order to ensure that the imperative of humanity modulates how war is waged. In the interest of protecting civilians, military action is therefore restricted through compliance with humanitarian principles, including proportionality between collateral injury and military advantage, discrimination between combatants and non-combatants, and military necessity in the pursuit of concrete objectives.
This chapter suggests that serious, even catastrophic, consequences become foreseeable when robots are given full autonomy to pull the trigger in complicated, context-dependent situations, and that technological neutrality is not a safe presumption. We also argue that when a disruptive technology changes the nature of what is possible, there is a corresponding expansion in what can be perceived as “necessary,” allowing lethal autonomous robots to become a force multiplier of military necessity. Ultimately, we ask our readers to consider the consequences of a future with lethal autonomous robots, one in which the power to deploy them lies in the hands of those who have not fully come to terms with their implications.