Autonomous weapon systems: the anatomy of autonomy and the legality of lethality.

  I. INTRODUCTION
 II. THE TECHNOLOGY OF THE FUTURE
     A. Definitions
     B. Modern Weapon Systems
III. THE LAW OF THE PRESENT
     A. Weapons Law
     B. Targeting Law
 IV. CONCLUSION


[T]he art of war is simple enough; find out where your enemy is, get at him as soon as you can, and strike him as hard as you can, and keep moving on.

--Ulysses S. Grant (1)

I. INTRODUCTION

Perhaps warfare of the twenty-first century is not as simple as it was in the throes of the American Civil War. (2) Although the unmanned weapon system can hardly be considered a novel invention--its concept predates even the 1863 Battle of Chancellorsville (3)--its efficacy, use, and repercussions have precipitated intense contemporary disputation. (4) In the wake of the September 11, 2001 terrorist attacks, there has been a dramatic upsurge in the presence and use of unmanned aerial vehicles, (5) commonly identified by the media and the public as "drones." (6) The drone's rapid combat application not only gave rise to various legal issues and implications, (7) but also set the stage for the development and eventual use of fully autonomous weapon systems. (8) Taking that stage in the latter half of 2013, the international debate on autonomous weapon systems has gained significant momentum in the legal field, as well as in a variety of other disciplines. (9) There is presently no indication that the surrounding discussion will lose its impetus, as commentators, scholars, non-governmental organizations, and international groups continue to show great interest in this topic. (10)

Put generally, the term "autonomous weapon system" refers to a category of weapons that are capable of operating and launching attacks without human input or guidance. (11) Although such weapons do not yet exist in any practicable form, (12) even a cursory glance at the trends and developments in weapons technologies, and in the field of robotics generally, reveals that the concept of fully autonomous weapons is not as farfetched as it might have once seemed. (13) Indeed, several nations currently use technologies that can be considered precursors of fully autonomous weapon systems. (14) As was aptly demonstrated by drones, advancements in modern robotics technology have the potential to redefine the essence and dynamics of modern armed conflicts. (15) Unfortunately, the international laws that govern such hardware and warfare do not yet address or support any legal standards specific to fully autonomous robotic weapons use. (16)

Although the debate on autonomous weapon systems is in its early stages, opponents and proponents alike have made known the challenges, advantages, values, and dangers of a worldwide trend toward robotic autonomy on the battlefield. (17) Spearheaded by Human Rights Watch and Harvard Law School's International Human Rights Clinic, critics of the development, production, and utilization of autonomous weapon systems called for their preemptive ban in November 2012. (18) Opponents assert that, because they are devoid of certain human qualities that are essential during armed conflict, these so-called "killer robots" would be incapable of complying with core International Humanitarian Law ("IHL") standards. (19) Similarly, the lack of human emotion, cognition, and situational awareness is cited as a limitation of autonomous weapon systems. (20) Objectors also warn that wars will become easier to declare and wage as humans grow increasingly removed from decisions to use lethal force. (21)

Supporters of fully autonomous weapon systems, who are usually members of or consultants to a nation's military, argue that a preemptive ban would be a shortsighted forfeiture of any potential gains in humanitarian protection that might emerge from such burgeoning technologies. (22) A ban, proponents assert, would also hamper the efficiency and competency of military forces and their operations. (23) Advocates insist that the call for an international ban is premature and unwarranted, asserting that attempts to thwart such technological developments are made by ill-informed parties that conflate and obfuscate the relevant legal issues. (24)

This Comment is intended to contribute an additional perspective to the medley of voices on the legality and future of fully autonomous weapon systems. It puts forth the argument that a preemptive ban is inapposite and urges the appropriate authorities to develop a modern legal framework that is tailored to embrace these state-of-the-art weapons. Part II offers a brief overview of the technology under consideration by defining, discussing, and providing choice examples of autonomous weapon systems. Part III surveys the current IHL standards applicable to wartime weaponry. It also assesses the propriety of evaluating autonomous weapon systems by those standards, addresses potential shortcomings of such application, and proffers alternative avenues to refine current weapons laws. The Conclusion offers final thoughts and hopes for the manner in which autonomous weapon systems will be addressed as IHL develops.

II. THE TECHNOLOGY OF THE FUTURE

In discussing the legal implications of autonomous weapon system usage on the battlefield, it is indispensable to have a rudimentary knowledge of at least some of the technology at issue. The term "autonomy" is a nebulous concept to neophytes, a fact that can all too easily transform well-intentioned discussions into misguided contretemps. (25) Even a broad, foundational understanding of key terms and technologies will infuse the important considerations at hand with coherence and reason. It is thus appropriate to consider broadly applicable principles and currently existing technologies before turning to the law that governs.

A. Definitions

In order to enunciate the legal issues that underlie the use of autonomous weapon systems, the relevant technology must first be discussed. (26) A weapon system can be defined broadly as a weapon and all related materiel and personnel required for its employment. (27) The term "robots" refers to manmade machines that can sense, think, and act. (28) The level of independence a robot has with respect to the initiation and execution of actions falls somewhere within a spectrum of autonomy. (29) At one end of this spectrum lie "automated" robots, which possess some level of independence but ultimately are not self-directed, do not possess decision-making capabilities, and may require human participation. (30) The General Atomics Aeronautical Systems MQ-1 Predator drone is an often-cited example of an automated weapon system. (31) While the Predator is unmanned, it is remotely controlled by a pilot on the ground and therefore remains under human control. (32)

At the opposite end of the spectrum of autonomy are fully "autonomous" robots. (33) While a simple definition of the term might seem appropriate to the uninitiated, the term "autonomy" is rather ambiguous. (34) On November 21, 2012, the U.S. Department of Defense released Directive Number 3000.09 ("Directive"), entitled "Autonomy in Weapon Systems." (35) The Directive establishes and expounds on the United States' policy and framework for the development, testing, international sale and transfer, and employment of the gamut of autonomous weapon systems. (36) The Directive defines an "autonomous weapon system" as:
   A weapon system that, once activated, can select and
   engage targets without further intervention by a human
   operator. This includes human-supervised autonomous
   weapon systems that are designed to allow human
   operators to override operation of the weapon system,
   but can select and engage targets without further
   human input after activation. (37)


Thus, autonomous weapon systems are those capable of independently initiating and executing an attack, without being prompted by a human operator. (38)

It is important to note that autonomous robots, whether weaponized or not, do not require human-operator input, nor do they preclude such input. (39) Additionally, full autonomy does not represent the strict notion that a human will never be involved in a robot's functioning. (40) Indeed, a human necessarily will be involved in the production and programming of even the most autonomous weapon system. (41)

Just two days before the Department of Defense released the Directive, Human Rights Watch released Losing Humanity: The Case against Killer Robots. (42) In discussing the dangers and predicted unlawfulness of autonomous weapon systems, Losing Humanity separates unmanned robotic weapons into three tiers of autonomy: "human-in-the-loop weapons," "human-on-the-loop weapons," and "human-out-of-the-loop weapons." (43) In this context, the "loop" is a reference to the OODA (observe, orient, decide, act) loop. (44) Under the OODA loop concept, combatants seek to reduce their processing and decision-making times so as to gain an advantage over their foes. (45) An increase in the autonomy a robot has with respect to the decision-making process reduces the input required from human operators. (46) Such a result could quicken the pace and efficiency of battle. (47)

Losing Humanity defines human-in-the-loop weapons as robotic weapons capable of targeting and striking solely as a result of human directive. (48) Human-on-the-loop weapons are those that are capable of independently targeting and delivering force while under the supervision of a human operator who is armed with override capabilities. (49) Finally, human-out-of-the-loop weapons are defined as "[r]obots that are capable of selecting targets and delivering force without any human input or interaction." (50) Both human-on-the-loop and human-out-of-the-loop weapons fit the description of what is typically considered an autonomous weapon system, (51) as both are capable of wholly independent functioning. (52)

B. Modern Weapon Systems

The United States' policy on autonomous weapon systems was the first announced by any government in the world. (53) As detailed in the Directive, it is U.S. policy that autonomous weapon systems "shall be designed" to keep humans at least "on" the loop and thus in possession of the ultimate decision to exercise lethal force. (54) In compliance with this policy, there are no reported plans to develop fully autonomous weapon systems other than human-on-the-loop defensive weapons. (55) Such weapon systems, which have been termed "automatic weapons defense systems," are designed to respond to incoming threats with limited human directive. (56)

While the United States maintains its policy of non-development, its Department of Defense is currently creating a variety of autonomous features for use in its unmanned weapon systems, (57) as has been its practice for some years. (58) Additionally, in 2012, the Department of Defense prescribed that autonomy be more "aggressively" incorporated into military missions. (59) Indeed, it has also advocated for an increase in autonomy as recently as December 2013, stating in a report that although "much more work needs to occur," one of its goals is to "[t]ake the 'man' out of unmanned." (60) A 2013 Human Rights Watch review of the United States' policy briefly celebrated what appeared to be an effective moratorium on fully autonomous weapons, and then acutely elucidated various ambiguities and inadequacies found therein. (61)

Autonomous weapon systems are currently in development worldwide. (62) For a maritime example, consider the U.S. Navy's MK 15 Phalanx Close-In Weapons System. As a "rapid-fire, computer-controlled" weapon system, the Phalanx is capable of automatically and independently acquiring, tracking, and engaging a hostile threat. (63) The U.S. Navy describes the Phalanx as being capable of "autonomously performing its own search, detect, evaluation, track, engage and kill assessment functions." (64) A terrestrial example is the Samsung Techwin SGR-1, which was developed for South Korea's use in the Korean Demilitarized Zone. (65) This system is designed to select and track targets automatically and is capable of making the decision to fire on a target, completely independent of human input. (66)

Finally, we take to the sky to find the U.S. Navy's X-47B. The X-47B is a fully autonomous aircraft that has successfully launched from and landed on an aircraft carrier at sea, (67) a feat considered by many to be the "single biggest test of airmanship." (68) While the X-47B is currently unarmed, its design enables it to support up to a 4,500-pound payload in its twin internal weapons bays. (69) Akin to the X-47B is the United Kingdom's "Taranis" aircraft, which is capable of autonomous flight but has not yet been weaponized. (70) Although the United Kingdom appears to be on a path toward full autonomy, a member of Parliament announced in June 2013 that the United Kingdom is not developing and will not use fully autonomous weapon systems. (71) Keen observers nevertheless note the possibility that the United Kingdom will equip its aerial arsenal with greater autonomy, even as officials represent the country as dissociated from the development of autonomous weapon systems. (72)

III. THE LAW OF THE PRESENT

The technological developments of autonomous weapon systems have far outpaced the legal developments thereof, a fact that has precipitated the recent contentions. (73) That is to say that, at present, there exists no treaty that governs autonomous weapon systems specifically. (74) The legal community has thus found itself hopelessly meandering, trying to somehow work these new-age feats of technology into the static, aged IHL framework. (75) While this undertaking has proven to be sizeable, a number of experts have chosen to pursue zealously a workable solution to this imminent problem. (76)

In considering the legal principles by which autonomous weapon systems must abide, there are two distinct areas of law that must be analyzed: weapons law and targeting law. (77) Weapons law considers the weapon system itself to determine whether it complies with international norms or is unlawful per se. (78) The primary concerns of weapons law are whether a weapon will cause unnecessary suffering and whether it is indiscriminate by nature. (79) Targeting law, on the other hand, looks at the use of a weapon system on the battlefield with respect to who may be targeted, the precautions that the weapon's operators must take during its use, and the lawful use of force. (80) Put differently, weapons law is concerned with the means of warfare, whereas targeting law is concerned with the methods of warfare. (81)

At a broad, foundational level, there are two premises that apply to autonomous weapon systems. The first premise is that, unless explicitly prohibited, states are presumed to have the right to develop and test weapons because they possess the right to defend themselves. (82) The second premise is somewhat more helpful in furnishing clarity and consensus: it is incontestable that IHL, as it currently exists and is interpreted, governs autonomous weapon systems. (83) While these premises do provide a foundation on which an analysis can be based, any attempt to confine autonomous weapon systems to current IHL principles will almost certainly be frustrated. (84) Notwithstanding the impediments inherent in doing so, however, the dictates of IHL require that, at a minimum, the use of autonomous weapon systems comport wholly with its principles. (85) A proper, thorough legal analysis of autonomous weapon systems must thus have at its core the well-established principles of IHL: humanity, military necessity, distinction, and proportionality. (86)

A. Weapons Law

In order to limit the collateral damage and calamities that accompany war, (87) international law dictates that it is impermissible to use certain weapons during periods of armed conflict regardless of against whom or in what context they would otherwise be used. (88) Inquiry into weapons law is a journey to determine the legality of a weapon system itself, without regard to its use. (89) Specifically, analysis under weapons law is concerned with preventing states from developing those weapons that cause unnecessary suffering or superfluous injury, as well as those that are incapable of distinguishing legitimate military targets from civilian objects and persons. (90) A weapon that, by its very nature, causes superfluous injury or unnecessary suffering or is fundamentally incapable of adhering to established IHL principles is unlawful per se. (91) A state's compliance with the international norms of unlawful weapon prohibition is determined by a "rigorous and multidisciplinary" review that considers both of the aforementioned factors. (92)

Chemical weapons are a prime example of unlawfulness per se, as analyzed under weapons law. (93) Following the Chemical Weapons Convention in 1993, international law completely banned the development, production, stockpiling, transfer, or use of such weapons. (94) The effects that this blanket prohibition attempted to prevent were evidenced by the events that transpired in Syria in the latter part of 2013. (95) This tragic example demonstrates that a violation of weapons law--by either the development or use of a prohibited weapon--has the potential to facilitate not only war crimes, but also public and political turbulence. (96)

Turning to the two factors of weapons law--humanity and indiscrimination by nature--it is now appropriate to consider how the current international law framework approaches weapon systems. As previously stated, weapons law considers a weapon system without regard to how forces would (or do) employ it on the battlefield. (97) The purpose of evaluation under weapons law is to prevent warring parties from developing or using weapon systems that are designed without regard to IHL, as such systems will be incapable of complying with international norms. (98) Weapons law is not likely to preclude the use or development of autonomous weapon systems, as they are designed to improve the efficiency of warfare while also minimizing collateral damage. (99)

1. Humanity (Superfluous Injury or Unnecessary Suffering)

At the outset, it is important to note that the right of the parties to any armed conflict to choose methods or means of warfare is not unlimited. (100) This notion of limited rights serves as a bedrock principle of IHL, along with the coordinate principle of humanity. (101) Addressed in article 35(2) of the Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts ("Additional Protocol I"), the principle of humanity prohibits the use of "weapons, projectiles and material and methods of warfare of a nature to cause superfluous injury or unnecessary suffering." (102) This well-established principle applies to aerial warfare. (103)

The International Court of Justice interpreted this principle in a 1996 advisory opinion, Legality of the Threat or Use of Nuclear Weapons. (104) The Court stated that it is "prohibited to cause unnecessary suffering to combatants" and that it is "accordingly prohibited to use weapons causing them such harm or uselessly aggravating their suffering." (105) A review of international treaties demonstrates that certain weapons, by their very nature and the purpose for which they were designed, automatically violate the principle of humanity by causing such unnecessary suffering and superfluous injury. (106) For example, it is prohibited to employ poison, poisoned weapons, (107) or "dum-dum" bullets. (108) Such weapons inherently inflict damage beyond what is required to disable the enemy, which is considered the "only purpose of war." (109)

The principle of humanity is distinctive in several ways. First, it bears on weapons law in particular, whereas most other IHL principles relate to targeting law. (110) Second, unlike many features of IHL, which are concerned with the protection of civilians or of those combatants recognized as hors de combat, (111) the prohibition on unnecessary suffering is aimed at minimizing the affliction that opposing combatants experience. (112) Third, the principle is one of customary law. (113) As such, even those states not party to Additional Protocol I generally recognize and abide by it. (114)

Whether a weapon causes unnecessary suffering or superfluous injury essentially hinges on the nature of the weapon itself. (115) Because the relevant prohibition is on unnecessary suffering in particular, it stands to reason that some level of suffering is considered necessary, and thus permissible, in armed conflict. (116) A weapon system is properly categorized as one that inflicts unnecessary suffering "only if it inevitably or in its normal use has a particular effect, and the injury caused thereby is considered by governments as disproportionate to the military necessity for that effect." (117) Thus, a proper balance must be struck between the military necessity (or military advantage) and the concomitant suffering that results from a state's employment of a particular weapon system. (118)

By design, autonomous weapon systems are not calculated to "cause unnecessary suffering." (119) Instead, their intended purposes include increasing the efficiency of combat, reducing casualty counts, increasing safety, and extending human capabilities generally. (120) The platform from which force is delivered is not determinative of the attack itself. Rather, this Comment argues, it is the munitions that are attached to the platform--as well as how those munitions are utilized--that will determine the type and amount of suffering that an attack inflicts. (121)

It is conceivable that a combatant or other human operator could equip an autonomous platform with non-traditional ordnance, such as blinding lasers or glass-filled bomblets. Such misuse of the autonomous platform could render that system unlawful per se simply because of the armaments the system supports. (122) However, the mere possibility that a weapon system might exact severe or disproportionate suffering or injury is, by itself, insufficient to render that weapon system unlawful per se. (123) Thus, autonomous weapon systems will likely surmount this prong of weapons law. (124)

2. Indiscriminate by Nature

Prior to the 1977 promulgation of Additional Protocol I, there existed no rule that prohibited indiscriminate weapons. (125) Modern IHL, however, governs both the legitimacy of weapon systems and their use on the battlefield. (126) Specifically, article 51(4) of Additional Protocol I prohibits "indiscriminate attacks," a term that has a bifurcated definition in the context of weapons law. (127) While this prong of weapons law is similar in name to the principle of distinction, (128) the latter is a facet of targeting law and is thus concerned with how a weapon system is utilized. (129)

The first definition of indiscrimination is found in article 51(4)(b) of Additional Protocol I, which identifies indiscriminate attacks as "those which employ a method or means of combat which cannot be directed at a specific military objective[.]" (130) The legal departments of several U.S. military forces understand this principle to mean that forces must employ weapons that are capable of being aimed at a military objective with a "reasonable degree of accuracy." (131) In practice, this translates to a prohibition against weapons that are incapable of distinguishing combatants from civilians and legitimate military objectives from protected civilian objects. (132)

An example of a weapon system that is indiscriminate by nature is Japan's World War II-era "balloon bomb." (133) Launched from the shores of Japan, these weapons were designed to traverse the Pacific Ocean and drop incendiary and antipersonnel bombs on American soil. (134) The intended effects were to strike panic in American civilians and to lay waste to any object the balloons' contents contacted. (135) Once launched, the final destination of a balloon bomb was determined solely by prevailing wind patterns, and each bomb merely "had a good chance of reaching North America." (136) While only a fraction of the balloon bombs launched reached North America, those that arrived were scattered throughout the continent, including remote Alaskan islands and parts of Mexico. (137)

As described above, Japanese forces were incapable of controlling the flight path of their bomb-toting balloons after launch. (138) While the incendiaries were designed to fall upon and destroy American forests, there existed no means by which Japanese soldiers could usher them toward their intended targets. (139) Consequently, because it was not possible for this weapon system to distinguish between appropriate military objectives, such as American infantrymen and tanks deployed on the battlefield, and the noncombatant civilian population, such as unsuspecting women and children, (140) balloon bombs are properly classified as weapons that are indiscriminate by nature and thus violative of Additional Protocol I. (141)

The second definition of indiscriminate attacks is found in article 51(4)(c) of Additional Protocol I, which labels indiscriminate attacks as "those which employ a method or means of combat the effects of which cannot be limited as required by this Protocol." (142) The import of this rule is that parties to a battle may not use weapons that have effects that cannot be administered or controlled. (143) In addition to any immediate effects, this prohibition covers any dangerous force that a weapon might precipitate. (144)

A modern example of such an uncontrollable weapon is any weapon within the broad category of biological weapons. (145) These weapons characteristically cannot be controlled after their release. (146) While they can be directed at and used against specific targets, their effects may well spread far beyond their intended targets in a number of ways. (147) Thus, the manner in which biological attacks operate and permeate is inherently intractable, for which reason warring forces are not permitted to utilize them. (148)

Although it is directly applicable to autonomous weapon systems, this prong of weapons law appears to be exceedingly narrow in practice and therefore is not likely to serve as a bar to the development and use of autonomous weapon systems. (149) As was previously noted in the discussion of the principle of humanity, an autonomous weapon system's compliance with the principle of distinction will ultimately depend on the munitions with which it is equipped. (150) That is, the platform from which an attack is launched has no effect on the force delivered, or on what is struck. (151) If a fully autonomous platform were equipped with a biological contagion or some other inherently indiscriminate weapon, however, the weapon system would invariably be unlawful per se. (152)

The current state of weapons technology would arguably allow autonomous platforms to overcome weapons law in its entirety. (153) Indeed, the International Court of Justice set forth a stringent standard for rendering a weapon system unlawful per se in its Nuclear Weapons advisory opinion. (154) After reviewing the implications, pitfalls, and effects related to the use of nuclear weapons, the Court ultimately held that there was insufficient evidence to declare the threat or use of nuclear weapons illegal in all circumstances. (155) The Court's Nuclear Weapons ruling effectively set a seemingly insurmountable standard that will pose a sizeable challenge to the opponents of autonomous weapon systems. (156)

3. Article 36 Weapons Review

Although discussed only briefly here, (157) a final consideration in the weapons law arena is the legal review of new weapon systems. (158) An autonomous weapon system is classified as a means of warfare, as the term is used in the context of IHL. (159) As such, it is to be governed by article 36 of Additional Protocol I, which sets out the framework for the international legal review of new and developing weapons. (160) Specifically, article 36 provides:
   In the study, development, acquisition or adoption of a new weapon,
   means or method of warfare, a High Contracting Party is under an
   obligation to determine whether its employment would, in some or
   all circumstances, be prohibited by this Protocol or by any other
   rule of international law applicable to the High Contracting Party.
   (161)


A preemptive, proactive purpose underlies international efforts to create and enforce legal review standards for new weapons. (162) Indeed, the purpose of such review is to encourage states to contemplate any potential undesirable effects that could flow from the acquisition or development of weapons that might contravene the principles of IHL. (163)

Theoretically, the reach of the article 36 requirements extends not only to those states that actually produce or are actively developing new weapons, but to all states party to Additional Protocol I. (164) In practice, however, few states that have ratified Additional Protocol I are believed to have set up programs or systems for the purpose of reviewing weapons before they are utilized. (165) Despite the lack of encouraging or particularly effective results with respect to the review of autonomous weapon systems, (166) several states have implemented policies so as to abide by the directive of article 36. (167) Such participating states include Canada, (168) the United Kingdom, Germany, France, Australia, (169) and even some non-ratifying states, such as the United States. (170) Indeed, the International Committee of the Red Cross ("ICRC") asserts that the systematic assessment of the legality of all new weapons is a requirement that applies to all states, regardless of their adoption of Additional Protocol I. (171)

Commentators assert that an article 36 weapons review must consider the use of a weapon system in addition to the nature of the weapon system itself, (172) rather than solely the latter. This Comment proffers an opposing view. That is to say that weapons reviews under article 36 are properly considered solely in the context of weapons law, and that new weapons should be reviewed independent of what their actual use might entail. (173) Support for this view comes from a commentary published by the ICRC. The ICRC commentary indicates that weapons reviews are "to be made on the basis of normal use of the weapon as anticipated at the time of evaluation." (174) This directly supports a weapons law-based analysis, as the reviewing body gives deference to the design and intended use of the weapon, without considering any possible misuse or abuse. (175)

Article 36 would seem to pose an insurmountable obstacle if it required that all new weapons be reviewed for any possible misuse in any conceivable situation. (176) Any such ex ante imaginings would be speculative and patently inappropriate due to the varying, unpredictable contexts in which weapons are used on the battlefield. (177) It would not be difficult to imagine how such a standard would serve to chill innovation and weapons developments. The only conceivable situation in which the use of a weapon system would have bearing on its legal review is if the intended or predicted use rendered a violation of IHL inevitable. (178) In such case, an assessment of use prior to the weapon's battlefield use would be warranted. (179) Absent such circumstances, however, weapons reviews under article 36 are properly designated a factor of weapons law and, thus, should not consider uses of weapons beyond those that are intended. (180)

B. Targeting Law

Following the above discussion of whether a weapon system is, or should be, deemed unlawful per se in the IHL context, it is appropriate to consider the uses of the weapon systems that are employed on the battlefield. Targeting law is the primary focus once a state has engaged in war. That is, targeting law is concerned with the conduct of hostilities, or jus in bello. (181) Specifically, targeting law considers how a weapon system would be used on the battlefield. (182) The inquiry into whether autonomous weapon systems will be capable of complying with international norms is a complex and intricate one. (183)

Before each of the remaining fundamental principles of IHL (184) is discussed individually, it is prudent to note that contemporary IHL strives to strike a balance between two of those principles: military necessity and humanity. (185) Once established, this delicate equilibrium "permeates the entirety of that field of law [IHL]," (186) thereby "reduc[ing] the sum total of permissible military action from that which IHL does not expressly prohibit to that which is actually necessary for the accomplishment of a legitimate military purpose in the prevailing circumstances." (187) The ascendancy of either concept can produce untoward results. (188)

1. Distinction

Among the primary goals of IHL is that of distinguishing between combatants and civilians during times of war. (189) Indeed, distinction has been characterized as "the most significant battlefield concept a combatant must observe." (190) This vitally important principle is set forth in article 48 of Additional Protocol I: "In order to ensure respect for and protection of the civilian population and civilian objects, the Parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly shall direct their operations only against military objectives." (191)

As a codification of customary law, (192) the principle of distinction is internationally recognized as "cardinal." (193) Distinction encompasses two primary rules relating to the protection of civilians:

Article 51(2): The civilian population as such, as well as individual civilians, shall not be the object of attack. Acts or threats of violence the primary purpose of which is to spread terror among the civilian population are prohibited. (194)

Article 52(1): Civilian objects shall not be the object of attack or of reprisals. Civilian objects are all objects which are not military objectives. (195)

Considered together, these rules create the principle of distinction, which presents one of the most difficult challenges to ensuring that autonomous weapon systems are capable of abiding by IHL. (196)

Experts predict that the principle of distinction will be particularly difficult for autonomous weapon systems to comply with and master. (197) To be sure, "[t]he principal legal issue with automated weapons is their ability to discriminate between lawful targets and civilians and civilian objects." (198) Roboticist Noel Sharkey has called attention to the binary nature of the robotic decision-making process and, more specifically, to the inability of robots to act outside of a pre-defined set of criteria. (199) Absent a clear, unambiguous definition of what exactly a "civilian" is--something that IHL is currently unable to provide (200)--autonomous weapon systems would be incapable of accounting for all relevant factors and making a sound decision. (201) Put simply, "[t]here are no visual or sensing systems up to that challenge." (202)

Because battlefield decisions are critically dependent upon the circumstances of immediate, unpredictable combat situations, the ability to observe and digest situational information is of the utmost importance. (203) The legal status of a person or a structure may fall within the "variously shaded gray area" of targetability due to its oscillation between combatant and civilian status. (204) One can easily conjure up a situation in which military forces regularly utilize a bridge that serves as the sole means by which citizens egress their town. (205) When the military forces occupy the bridge, it could be considered a military objective, in which case it would be targetable. When the bridge is later used to serve its intended, civil purpose, however, the structure's status reverts to that of a civilian object, thus making it an improper target of military attack.

The permissibility of an attack on the hypothetical bridge described above would turn on intimate, immediate details as they exist at the exact moment an attack is desired. (206) In the proposed example, an autonomous weapon system would need to possess and exercise judgment and decision-making abilities. (207) Ultimately, in theory, a capable autonomous weapon system would determine that the legitimacy of the proposed target is ambiguous and, as a result, would need to be able to abort its mission. (208)

As can be seen from the above-described hypothetical, the legal and ethical ramifications that accompany an autonomous weapon system's ability to make lethal decisions independently are stark reminders of technological shortcomings. The seemingly binary, emotionless nature of robots (209) lends credence to the argument that autonomous weapon systems will be unsympathetic, inadaptable war machines. (210) Given the erratic, oftentimes ambiguous situations in which combatants find themselves, autonomous weapon systems' "restricted abilities to interpret context and to make value-based calculations" will almost certainly preclude their widespread use, at least until technology improves vastly. (211) In light of the inherent inability of autonomous--and even automated--weapon systems to distinguish between combatants and civilians, in addition to the repercussions that flow from such shortcomings, (212) proponents and designers of autonomous weapon systems have much ground to cover before these machines can satisfy IHL standards. (213)

2. Military Necessity

The principle of military necessity is understood to "justif[y] those measures not forbidden by international law which are indispensable for securing the complete submission of the enemy as soon as possible." (214) Tracing its roots to Francis Lieber's 1863 definition of military necessity, (215) this interpretation of the principle limits permissible measures to "legitimate military objectives," which are those that offer a definite military advantage. (216) The use of force that is not necessary to secure a military advantage is unlawful. (217) Thus, "wanton killing or destruction" is strictly prohibited. (218) This characterization of the principle of military necessity aligns with article 52(2) of Additional Protocol I. (219)

As is obvious, compliance with the principle of military necessity will require autonomous weapon systems to be capable of identifying legitimate military targets and independently determining whether the destruction of identified targets would offer a definite military advantage. (220) Whether the destruction of a target offers some military advantage requires the attacking force to first determine that the target is a legitimate target; thus, the principle of distinction plays a key role. (221) Because this on-the-fly sort of analysis and balancing is contextual in nature, (222) it is nearly impossible to predict and prepare for what an autonomous weapon system would encounter in a war zone. (223) In practice, such uncertainty would necessarily require an autonomous weapon system's programming to be of the utmost sophistication, reliability, and dependability. (224) Even then, a fear still remains that programming glitches or a lack of due consideration to the relevant features of a context might precipitate disastrous results. (225)

Although humankind has habitually overcome seemingly impossible hurdles--especially with respect to technology--the unpredictability and ambiguity that are intrinsic to any battlefield activity present more than a simple uphill battle for proponents of autonomous weapon systems. (226) Some roboticists, such as Dr. Ronald Arkin, are confident that autonomous weapon systems will exceed human capabilities, especially with respect to ethics. (227) While Dr. Arkin strongly holds his beliefs, he appears to concede that the task of designing workable "perceptual algorithms" would pose particular difficulty to those in his field. (228) A great deal of progress must be made before autonomous weapon systems mature beyond the "formative stage" (229) and can be entrusted with tasks of such delicate differentiation. (230)

Notwithstanding these arguments and considerations, autonomous weapon systems could well prove to be excellent tools with respect to military necessity. "The goal of military necessity is to identify and pursue lawful military objectives that achieve the conflict's aims and swift termination." (231) Autonomous weapon systems would, in theory, work to accomplish this goal, as they would be designed to increase efficiency, reduce casualties, and assist with high-risk missions. (232)

3. Proportionality

In the course of an armed conflict, the force used in an attack must always be proportionate to the military advantage that may potentially be gained. (233) This requirement serves to prohibit any means of attack that is considered "unreasonable or excessive." (234) Enshrined in articles 51(5)(b) and 57(2)(a)(iii) of Additional Protocol I, the principle of proportionality mandates that attacking forces shall refrain from deciding to launch any "attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated." (235)

In practice, the principle of proportionality requires an attacker to make real-time battlefield assessments of whether the potential military advantage of an attack outweighs the potential humanitarian consequences. (236) While it is possible to imagine a target that is purely military, such as a tank that is located in a deserted field and operated by a member of a state's armed forces, targets are often found in situations fraught with potential collateral damage to civilians or civilian property. (237) This inherent unpredictability requires a sophisticated, intimate perceptiveness and understanding of immediate circumstances and prevailing concerns. (238) Such a complex, delicate decision-making process has historically been rife with uncertainty and imperfection, so much so that modern combatants continue to struggle with the challenges of applying the laws of war properly. (239) Awareness, adaptability, and the capability to react swiftly to changing circumstances are essential components of successfully navigating the legal obligations imposed by IHL, a challenge for which autonomous weapon systems are not currently prepared. (240)

Arguably, autonomous weapon systems have the potential to apply force more proportionately and appropriately than human soldiers. (241) Similarly, some argue that autonomous weapon systems could--and will--be capable of acting more ethically on the battlefield than human beings. (242) Proponents note that robots are free from emotion and thus do not act hastily out of fear and are incapable of being angered. (243) But it is precisely this lack of emotion that concerns others, who caution against any rash decisions to develop or implement autonomous weapon systems. (244) Regardless of what proponents may hope and dream with respect to the potential of autonomous weapon systems to act proportionately, reality may not reflect such ideals. The asserted advantages of a robot's ability to refrain from being influenced by emotion may not be realized due to flaws in design or programming. (245)

The ability of an autonomous weapon system to apply force proportionately will ultimately depend on the armaments with which it is equipped, (246) as well as the capabilities allowed by its programming and design. (247) Because proportionality is concerned with preventing excessive collateral damage, a factor which itself depends greatly on context, (248) human judgment is likely to be an irreplaceable requirement, (249) at least until technology develops immensely.

IV. CONCLUSION

Autonomous weapon systems are undoubtedly frontrunners with respect to contemporary developments in both technology and armed conflict. Although states have yet to employ active autonomous weapon systems, fervent discourse regarding the legal, ethical, moral, and technological ramifications of their use is nonetheless underway. This is a prime indication of the transmutation of warfare that autonomous weapon systems will assuredly bring. While numerous individuals and organizations have vocalized their support of the prospect of such a revolution, opposition of similar strength has developed swiftly. (250)

As is the case with all technological advancements that have a substantial effect on the law of armed conflict, the governing legal framework is far from proactive. (251) Some still-applicable international standards that relate to the conduct of hostilities have evolved little, if at all, from their nineteenth-century beginnings. Opponents and proponents alike opine that the current legal framework is not adequately prepared to govern autonomous weapon systems. (252) Put simply, the current legal framework is not a modern one by any means.

An analysis of the legality of autonomous weapon systems is somewhat convoluted, partially due to the fact that there is no narrowly tailored legal framework in place. Current IHL standards consider autonomous weapon systems in light of two analyses: weapons law, which considers whether a weapon system is unlawful per se on account of its design, and targeting law, which considers whether a weapon system is capable of acting and being employed in a lawful manner. Autonomous weapon systems will likely surmount the weapons-law analysis, as they are neither designed nor intended to cause unnecessary suffering. (253) Their autonomy has no direct or significant effect on the probability that they would cause superfluous injury. (254) Instead, it is the munitions with which autonomous systems are equipped that will determine whether they are unlawful per se.

Because autonomous weapon systems should not categorically be considered unlawful per se, it is prudent to consider them in light of targeting law, the second track of analysis. This framework addresses the battlefield practicalities that will flow from autonomous weapon system use. Under this analysis, the manner in which autonomous weapon systems would actually be utilized must be scrutinized strictly. A principal consideration is whether autonomous weapon systems will be capable of distinguishing legitimate military materiel and objectives from civilian personnel and objects. (255) Targeting the former is proper and expected during combat, whereas targeting the latter is strictly prohibited.

Because an autonomous weapon system's capabilities will depend primarily on its programming and design, various seemingly insurmountable obstacles exist. Drones provide a uniquely relevant and instructive demonstration of such hurdles. These state-of-the-art weapon systems are a favorite among technologically advanced states. Their use, however, has sparked worldwide debate, outrage, and devastation. (256) Drones have demonstrated imprecise targeting and general unreliability since their debut early in the twenty-first century. (257) If anything can be learned from drones, it is that technology can be a blessing, as well as a curse.

Although the pugnacious desire to operationalize autonomous weapon systems is idealistic and far from grounded in practical considerations, autonomous platforms could nevertheless prove to be invaluable, if employed for nonlethal functions. One such non-combat role for an autonomous platform could be that of casualty evacuation transport vehicle. Fully autonomous platforms, such as the United States' X-47B (258) and the United Kingdom's Taranis, (259) could be utilized to transport wounded and deceased combatants from an active warzone. The alternative, traditional means of such retrieval is to deploy human forces, which ultimately subjects additional individuals to the hazards of the battlefield.

Autonomous platforms could serve in numerous additional nonlethal roles. For example, cargo transportation would almost certainly come within the purview of autonomous robots. (260) Those autonomous platforms that aid in battlefield evacuation operations could function similarly in transporting soldiers, thereby reducing manpower and resource expenditures in daily operations. Autonomous robots could also work to reduce the risks and casualties associated with ordnance disposal operations. While remotely controlled robots such as the iRobot 510 PackBot have produced favorable results on this front, (261) fully automating these operations could greatly increase efficiency while minimizing resource consumption.

The abovementioned, nonlethal uses of autonomous robots represent a small sample of viable, prudent options for how to employ these feats of technology. In consideration of the current IHL framework, a hasty decision to utilize autonomous weapon systems could precipitate an international arms race and, perhaps, a grim sequel to the Cold War. (262) If, however, states insist on employing autonomous weapon systems prior to developing a comprehensive, accommodating legal framework, they should ensure that their use is limited and regulated. An appealing suggestion for regulating autonomous weapon systems is to confine their targeting capabilities to non-human targets. (263) This concept of "let[ting] the machines target other machines" (264) would likely curb many of the fears that opponents have regarding collateral damage and civilian casualties. (265) While the preferred avenue would be to refrain altogether from arming autonomous platforms, this machine-only targeting framework could serve as a workable, casualty-limiting middle ground. The only certainty proffered here is that autonomous weapon systems must be employed with forethought and prudence, regardless of which path is chosen.

International standards must be addressed and altered before robots outnumber humans on the battlefield. While we should let fear neither inform nor curb innovation, we must consider fully the potential consequences of developing wholly autonomous weapon systems. Many states, organizations, and individuals have expressed fears of the consequences that might flow from the development of robot armies. (266) Indeed, it is a telling fact that the states leading efforts to create and implement new weapons legal regimes are those that are the furthest from developing them. (267) This serves as a reminder of the fear and uncertainty that accompany the prospect of unprecedented weapon systems. Although the United Nations has agreed to address this issue and the European Parliament has passed a resolution calling for the ban of the development, production, and use of autonomous weapon systems, (268) more must be done. Immediate international cooperation is required in order to develop a new framework before autonomous weapon systems are able to claim their first victim.

(1.) John H. Brinton, Personal Memoirs of John H. Brinton, Major and Surgeon, U. S. V. 1861-1865, at 239 (1914).

(2.) See Markus Wagner, Autonomy in the Battlespace: Independently Operating Weapon Systems and the Law of Armed Conflict, in INTERNATIONAL HUMANITARIAN LAW AND THE CHANGING TECHNOLOGY OF WAR 99, 112 (Dan Saxon ed., 2012) (describing new complexities and challenges that have accompanied evolutions and developments of weaponry and warfare).

(3.) In 1863, an inventive Charles Perley patented an unmanned aerial instrument that was designed to "injure an enemy that is entirely out of the range of cannon-shot and too far for bombs to be thrown from mortars...." Improvement in Discharging Explosive Shells from Balloons, U.S. Patent No. 37,771 (filed Jan. 24, 1862) (issued Feb. 24, 1863).

(4.) E.g., Nidhi Subbaraman, 'Terminator' on Hold? Debate to Stop Killer Robots Takes Global Stage, NBC NEWS (Oct. 21, 2013), http://www.nbcnews.com/technology/terminator-hold-debate-stop-killer-robots-takes-global-stage-8C11433704 (reporting the interest of activists, scholars, and UN diplomats in discussing the regulation of autonomous weapon systems).

(5.) William C. Marra & Sonia K. McNeil, Understanding "The Loop": Regulating the Next Generation of War Machines, 36 HARV. J.L. & PUB. POL'Y 1139, 1165 (2013).

(6.) Philip Alston, Legal Robotic Technologies: The Implications for Human Rights and International Humanitarian Law, 21 J.L. INFO. & SCI. 35, 35 (2011).

(7.) See, e.g., Tony Rock, Yesterday's Laws, Tomorrow's Technology: The Laws of War and Unmanned Warfare, 24 N.Y. INT'L L. REV. 39, 43 (2011) (stating that the United States' use of drones "spark[ed] debates about the legality of such strikes").

(8.) Noel Sharkey, The Ethical Frontiers of Robotics, 322 SCI. 1800, 1801 (2008).

(9.) Peter Asaro, On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making, 94 INT'L REV. RED CROSS 687, 688 (2012); Gabi Siboni & Yoni Eshpar, Dilemmas in the Use of Autonomous Weapons, 16 STRATEGIC ASSESSMENT 75, 75 (2014).

(10.) See, e.g., Kenneth Anderson & Matthew Waxman, Killer Robots and the Laws of War, WALL ST. J. (Nov. 3, 2013), http://online.wsj.com/news/articles/SB10001424052702304655104579163361884479576 (addressing and rebutting proffered criticisms of autonomous weapon systems).

(11.) ARMIN KRISHNAN, KILLER ROBOTS: LEGALITY AND ETHICALITY OF AUTONOMOUS WEAPONS 3 (2009) ("[Autonomous weapons] can be defined as weapons, which are programmable, which are activated or released into the environment, and which from then on no longer require human intervention for selecting or attacking targets.").

(12.) Int'l Comm. of the Red Cross, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts 39 (2011) ("[Autonomous weapon] systems have not yet been weaponized...."); Autonomous Weapons Systems, EUR. U. INST., http://www.eui.eu/DepartmentsAndCentres/AcademyEuropeanLaw/Projects/AutonomousWeaponsSystems.aspx (last updated Apr. 23, 2014) ("Fully autonomous weapons systems do not yet exist.").

(13.) Cf. Tyler D. Evans, Note, At War with the Robots: Autonomous Weapon Systems and the Martens Clause, 41 HOFSTRA L. REV. 697, 706 (2013) ("Some military and robotics experts have predicted that the technology required to establish truly autonomous weapons could be available within a few decades."); P.W. SINGER, WIRED FOR WAR: THE ROBOTICS REVOLUTION AND CONFLICT IN THE TWENTY-FIRST CENTURY 128 (2009) ("[A]utonomous robots on the battlefield will be the norm within twenty years.").

(14.) Nick Cumming-Bruce, U.N. Expert Calls for Halt on Robots for Military, N.Y. TIMES, May 31, 2013, at A9; Eric Talbot Jensen, The Future of the Law of Armed Conflict: Ostriches, Butterflies, and Nanobots, 35 MICH. J. INT'L L. 253, 288 (2014) ("[B]esides the U.S., there are 43 other nations that are also building, buying and using military robotics today.").

(15.) See Gary Marchant et al., International Governance of Autonomous Military Robots, 12 COLUM. SCI. & TECH. L. REV. 272, 274 (2011) ("[T]he robots of today have extraordinary capabilities and are quickly changing the landscape of battle and dynamics of war."). See Julie Goodrich, Comment, Driving Miss Daisy: An Autonomous Chauffeur System, 51 HOUS. L. REV. 265, 294 (2013), for a discussion of the implications of nonmilitary autonomous robotics.

(16.) Noel Sharkey, Grounds for Discrimination: Autonomous Robot Weapons, 11 RUSI DEFENCE SYSTEMS 86, 88 (2008); Marchant et al., supra note 15, at 289; see Timothy Coughlin, The Future of Robotic Weaponry and the Law of Armed Conflict: Irreconcilable Differences?, 17 UCL JURISPRUDENCE REV. 67, 67-68 (2011) (noting that "technological development and legal structures are in a constant state of ebb and flow").

(17.) E.g., Duke Law, LENS Conference 2013, Building the Terminator? Law and Policy for Autonomous Weapons Systems, YOUTUBE (Mar. 4, 2013), https://www.youtube.com/watch?v=6PVLkLjdeog (debating the merits of autonomous weapons systems from the perspective of national security law).

(18.) Bonnie Docherty, Human Rights Watch & Int'l Human Rights Clinic, Harvard Law Sch., Losing Humanity: The Case Against Killer Robots 46 (Steve Goose ed., 2012) [hereinafter LOSING HUMANITY]. While Human Rights Watch is a leading voice in the crusade against autonomous weapon systems, its efforts are supported by several other non-governmental organizations, such as the Campaign to Stop Killer Robots. See, e.g., Campaign Launch in London, CAMPAIGN TO STOP KILLER ROBOTS (Apr. 26, 2013), http://www.stopkillerrobots.org/2013/04/campaign-launch-in-london (announcing the launch of a campaign to support the preemptive ban of fully autonomous weapon systems).

(19.) LOSING HUMANITY, supra note 18, at 2-3, 36.

(20.) See Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Rep. of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, ¶¶ 55-56, Human Rights Council, U.N. Doc. A/HRC/23/47 (Apr. 9, 2013) (by Christof Heyns) (juxtaposing mechanical calculations with human judgments).

(21.) E.g., John Markoff, War Machines: Recruiting Robots for Combat, N.Y. TIMES, Nov. 28, 2010, at A1 (warning that technological development in warfare may diminish barriers to instigating military action).

(22.) See, e.g., KENNETH ANDERSON & MATTHEW WAXMAN, LAW AND ETHICS FOR AUTONOMOUS WEAPON SYSTEMS: WHY A BAN WON'T WORK AND HOW THE LAWS OF WAR CAN 21-22 (2013) (discussing the numerous humanitarian risks of prohibiting autonomous weapons systems).

(23.) E.g., Jeffrey S. Thurnher, Naval War College, Legal Implications of Autonomous Weapon Systems Brief, YOUTUBE (May 31, 2013), https://www.youtube.com/watch?v=muQFmY8HvUA.

(24.) See, e.g., Michael N. Schmitt, Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics, HARV. NAT'L SEC. J. FEATURES 2-3 (2013), http://harvardnsj.org/wp-content/uploads/2013/02/Schmitt-Autonomous-WeaponSystems-and-IHL-Final.pdf (criticizing arguments discussed in Losing Humanity).

(25.) Marra & McNeil, supra note 5, at 1142-43.

(26.) Id. at 1143.

(27.) See U.S. DEP'T OF DEF., DEPARTMENT OF DEFENSE DICTIONARY OF MILITARY AND ASSOCIATED TERMS 285 (2010) (defining a weapon system as "[a] combination of one or more weapons with all related equipment, materials, services, personnel, and means of delivery and deployment (if applicable) required for self-sufficiency").

(28.) SINGER, supra note 13, at 67.

(29.) Id. at 74. This understanding of autonomy comports with the views generally accepted in the robotics industry. See, e.g., CELLULA ROBOTICS, ACHIEVING ROBOT AUTONOMY 3 ("The level of autonomy of a system can be measured by the level of supervision required to operate it....").

(30.) Marra & McNeil, supra note 5, at 1150.

(31.) JEREMIAH GERTLER, CONG. RESEARCH SERV., R42136, U.S. UNMANNED AERIAL SYSTEMS 33 (2012) (describing the MQ-1 Predator's high-profile use in Iraq and Afghanistan and how it has become the Department of Defense's most recognizable unmanned aircraft system).

(32.) Id. at 33-34.

(33.) See Office of the Sec'y of Def., U.S. Dep't of Def., Unmanned Aircraft Systems Roadmap 2005-2030, at 48 (2005) (displaying the varying levels of robot autonomy, with fully autonomous systems at the top of the spectrum).

(34.) Ronald Arkin, Governing Lethal Behavior in Autonomous Robots 37 (2009).

(35.) U.S. Dep't of Def., Directive No. 3000.09, Autonomy in Weapon Systems 1 (Nov. 21, 2012) [hereinafter DoD DIRECTIVE No. 3000.09].

(36.) Id. ¶¶ 1(a), 4(a)(1), 4(e). The Directive currently serves as "the most extensive public pronouncement" of any state's intentions to research, develop, and deploy autonomous weapon systems. Kenneth Anderson et al., Adapting the Law of Armed Conflict to Autonomous Weapon Systems, 90 INT'L L. STUD. 386, 387 (2014).

(37.) DoD DIRECTIVE No. 3000.09, supra note 35, at 13-14.

(38.) Markus Wagner, Taking Humans Out of the Loop: Implications for International Humanitarian Law, 21 J.L. INFO. & SCI. 155, 158-59 (2011).

(39.) Marra & McNeil, supra note 5, at 1152.

(40.) Schmitt, supra note 24, at 4 ("[A] fully autonomous system is never entirely human-free.").

(41.) See id. (reiterating the inevitability of human input required for automated weapon systems); Michael N. Schmitt & Jeffrey S. Thurnher, "Out of the Loop": Autonomous Weapon Systems and the Law of Armed Conflict, 4 HARV. NAT'L SEC. J. 231, 235 (2013).

(42.) Ban 'Killer Robots' Before It's Too Late, HUMAN RIGHTS WATCH (Nov. 19, 2012), http://www.hrw.org/news/2012/11/19/ban-killer-robots-it-s-too-late.

(43.) LOSING HUMANITY, supra note 18, at 2. The Department of Defense analogues are "semi-autonomous weapon system," "human-supervised autonomous weapon system," and "autonomous weapon system," respectively. DoD DIRECTIVE No. 3000.09, supra note 35, at 13-14. This Comment adopts Losing Humanity's terminology.

(44.) See Marra & McNeil, supra note 5, at 1144-49, for a discussion of the OODA loop and how it applies to autonomous weapon systems.

(45.) See Schmitt & Thurnher, supra note 41, at 238-39 & n.29 (explaining that victory over enemy forces requires faster OODA loop completion through the use of autonomous weapon systems).

(46.) Id. at 241.

(47.) Project Alpha, Concept Exploration Dep't, Joint Futures Lab, Joint Experimentation Directorate, U.S. Joint Forces Command, Unmanned Effects (UFX): Taking The Human Out of the Loop 4-5 (2003).

(48.) LOSING HUMANITY, supra note 18, at 2.

(49.) Id.

(50.) Id.

(51.) See id. ("The term 'fully autonomous weapon' refers to both out-of-the-loop weapons and those that allow a human on the loop, but that are effectively out-of-the-loop weapons because the supervision is so limited.").

(52.) Darren M. Stewart, New Technology and the Law of Armed Conflict, 87 INT'L L. STUD. 271, 276 (2011) (describing "autonomous systems" as independently functioning vehicle systems).

(53.) Matthew Bolton, US Must Impose Moratorium and Seek Global Ban on Killer Robots, THE HILL (Apr. 24, 2013, 2:55 PM), http://www.thehill.com/blogs/congress-blog/technology/295807-us-must-impost-moratorium-and-seek-global-ban-on-killer-robots.

(54.) See DoD DIRECTIVE No. 3000.09, supra note 35, ¶ 4(a) ("Autonomous and semiautonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.").

(55.) Schmitt & Thurnher, supra note 41, at 236-37.

(56.) LOSING HUMANITY, supra note 18, at 9.

(57.) Jeffrey S. Thurnher, No One at the Controls: Legal Implications of Fully Autonomous Targeting, JOINT FORCES Q., 4th Quarter 2012, at 77, 80.

(58.) JEFFREY S. THURNHER, LEGAL IMPLICATIONS OF AUTONOMOUS WEAPON SYSTEMS 1 (2013), available at https://www.dropbox.com/s/cxh7bpumyxtrsrl/Thurnher.pdf.

(59.) DEF. SCI. BD., U.S. DEP'T OF DEF., TASK FORCE REPORT: THE ROLE OF AUTONOMY IN DOD SYSTEMS 1 (2012) [hereinafter TASK FORCE REPORT].

(60.) Office of the Sec'y of Def., U.S. Dep't of Def., Unmanned Systems Integrated Roadmap FY 2013-2038, at 25 (2013).

(61.) Human Rights Watch & Int'l Human Rights Clinic, Harvard Law Sch., Review of the 2012 US Policy on Autonomy in Weapons Systems 2 (2013). Such shortcomings include, among others, the possibility of waiver, the limited applicability, and the moratorium's impermanency. Id. at 4-5, 7-8.

(62.) See, e.g., Frank Sauer, Autonomous Weapons Systems: Humanising or Dehumanising Warfare?, 4 GLOBAL GOVERNANCE SPOTLIGHT 1, 2 (2014) (listing Germany, the United Kingdom, and France as countries that are developing autonomous weapon systems).

(63.) News Release, Raytheon Co., Raytheon Awarded $57.8 Million Phalanx Contract (May 18, 2012), available at http://investor.raytheon.com/phoenix.zhtml?c=84193&p=irol-newsArticle&ID=1697580.

(64.) United States Navy Fact File: MK 15 - Phalanx Close-In Weapons System (CIWS), http://www.navy.mil/navydata/fact_display.asp?cid=2100&tid=487&ct=2 (last updated Nov. 15, 2013).

(65.) ARKIN, supra note 34, at 167.

(66.) Id. at 10, 167.

(67.) Brandon Vinson, X-47B Makes First Arrested Landing at Sea, U.S. NAVY (July 10, 2013), http://www.navy.mil/submit/display.asp?story_id=75298.

(68.) Usnavyhistory, Sea Legs, YOUTUBE (July 21, 2011), https://www.youtube.com/watch?v=xJG4R3bJNEM&t=55s.

(69.) X-47B UCAS: Unmanned Combat Air System, NORTHROP GRUMMAN CORP. (Apr. 1, 2014), http://www.northropgrumman.com/Capabilities/X47BUCAS/Documents/UCAS-D_Data_Sheet.pdf.

(70.) See Schmitt & Thurnher, supra note 41, at 239 (noting that the systems "are not yet designed to autonomously attack an enemy").

(71.) 17 June 2013, PARL. DEB., H.C. (2013) 732-34 (U.K.) ("As a matter of policy, Her Majesty's Government are clear that the operation of our weapons will always be under human control as an absolute guarantee of human oversight and authority and of accountability for weapons usage.").

(72.) E.g., LOSING HUMANITY, supra note 18, at 17-18; Schmitt & Thurnher, supra note 41, at 239.

(73.) Chronology, CAMPAIGN TO STOP KILLER ROBOTS, http://www.stopkillerrobots.org/chronology (last visited Nov. 13, 2014); see supra note 16 and accompanying text.

(74.) Benjamin Kastan, Autonomous Weapons Systems: A Coming Legal "Singularity"?, 2013 U. ILL. J.L. TECH. & POL'Y 45, 54; see Interview by Gerhard Dabringer with Armin Krishnan, Assistant Professor, E. Carolina Univ. (Nov. 23, 2009) ("[T]he existing legal and moral framework for war as defined by the laws of armed conflict and Just War Theory is utterly unprepared for dealing with many aspects of robotic warfare.").

(75.) See KRISHNAN, supra note 11, at 89 (describing international law as "not quite clear" with respect to autonomous weapon systems, which has necessitated reliance on fundamental principles and customs of the laws of war).

(76.) See, e.g., Vik Kanwar, Post-Human Humanitarian Law: The Law of War in the Age of Robotic Weapons, 2 HARV. NAT'L SEC. J. 577, 617-19 (2011) (book review).

(77.) Nils Melzer, Human Rights Implications of the Usage of Drones and Unmanned Robots in Warfare 27 (European Union ed., 2013).

(78.) William H. Boothby, Weapons and the Law of Armed Conflict 1 (2009).

(79.) See INT'L COMM. OF THE RED CROSS, WEAPONS THAT MAY CAUSE UNNECESSARY SUFFERING OR HAVE INDISCRIMINATE EFFECTS: REPORT ON THE WORK OF EXPERTS ¶ 20 (1973) (explaining that any legal issue regarding the use of weapons is specified by the rules that seek to prevent unnecessary suffering and indiscriminate weapons or methods of combat).

(80.) Ian Henderson, The Contemporary Law of Targeting: Military Objectives, Proportionality and Precautions in Attack under Additional Protocol I, at 1 (2009).

(81.) BOOTHBY, supra note 78, at 4. For the purposes of this Comment, "means" refers to a weapon system and "methods" refers to how a weapon system is employed on the battlefield.

(82.) Guido Den Dekker, The Law of Arms Control: International Supervision and Enforcement 44-45 (2001).

(83.) INT'L COMM. OF THE RED CROSS, supra note 12, at 36 ("There can be no doubt that IHL applies to new weaponry and to the employment in warfare of new technological developments, as recognized, inter alia, in article 36 of Additional Protocol I.").

(84.) See Hin-Yan Liu, Categorization and Legality of Autonomous and Remote Weapons Systems, 94 INT'L REV. RED CROSS 627, 632 (2012) (explaining how the decisionmaking capacity of autonomous weapon systems challenges the adequacy of modern IHL, as "its categories have not yet been adapted to accommodate non-human decisionmaking entities capable of inflicting violence").

(85.) E.g., Marchant et al., supra note 15, at 289; see generally Anderson et al., supra note 36 (discussing considerations and suggestions in charting a legal course for autonomous weapon systems).

(86.) KRISHNAN, supra note 11, at 91; A. P. V. Rogers, Law on the Battlefield 3 (1996).

(87.) Burrus M. Carnahan, Lincoln, Lieber and the Laws of War: The Origins and Limits of the Principle of Military Necessity, 92 AM. J. INT'L L. 213, 213 & n.6 (1998).

(88.) See WILLIAM H. BOOTHBY, THE LAW OF TARGETING 258-59 (2012) [hereinafter THE LAW OF TARGETING] (stating that it is impermissible to use weapons that by nature cause superfluous injury or unnecessary suffering).

(89.) See JEFF A. BOVARNICK ET AL., INT'L & OPERATIONAL LAW DEP'T, JUDGE ADVOCATE GEN.'S LEGAL CTR. & SCH., U.S. ARMY, LAW OF WAR DESKBOOK 157 (Gregory S. Musselman ed., 2011) (claiming that the concept of unnecessary suffering or humanity is targeted at the weaponry).

(90.) Rosario Dominguez-Mates, New Weaponry Technologies and International Humanitarian Law: Their Consequences on the Human Being and the Environment, in THE NEW CHALLENGES OF HUMANITARIAN LAW IN ARMED CONFLICTS 91, 107-08 (Pablo Antonio Fernandez-Sanchez ed., 2005).

(91.) James G. Foy, Autonomous Weapons Systems: Taking the Human Out of International Humanitarian Law 13 (Apr. 20, 2013), http://papers.ssrn.com/abstract=2290995.

(92.) Int'l Comm. of the Red Cross & Int'l Fed'n of Red Cross & Red Crescent Soc'ys, 28th International Conference of the Red Cross and Red Crescent, Dec. 2-6, 2003: ADOPTION OF THE DECLARATION AND AGENDA FOR HUMANITARIAN ACTION ¶ 2.5.1 (2003), available at http://www.icrc.org/eng/assets/files/other/icrc_002_1103.pdf ("Reviews should involve a multidisciplinary approach, including military, legal, environmental and health-related considerations."); see discussion infra Part III.A.3 (discussing article 36 weapons review).

(93.) See UK MINISTRY OF DEF., THE MANUAL OF THE LAW OF ARMED CONFLICT ¶ 1.35 (2004) (referring to the Chemical Weapons Convention of 1993).

(94.) Id. ¶¶ 6.8-6.8.4.

(95.) E.g., Somini Sengupta & Rick Gladstone, U.N. Reports Attacks Using Chemicals in Syria, N.Y. TIMES, Dec. 13, 2013, at A12.

(96.) See Matthew Waxman, International Law & the Politics of Urban Air Operations 6 n.6 (2000) (explaining that U.S. political and military decision-makers respect the laws of armed conflict for policy reasons, out of traditional commitment to rules of law, and because they have a "strong interest in upholding international norms, which tend to be stabilizing forces and increase the predictability of state actions").

(97.) Schmitt & Thurnher, supra note 41, at 243-44.

(98.) See MELZER, supra note 77 (distinguishing weapons law from targeting law and explaining its objectives).

(99.) Task Force Report, supra note 59, at 15-16, 19.

(100.) Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts, art. 35(1), June 8, 1977, 1125 U.N.T.S. 3 [hereinafter Additional Protocol I].

(101.) Robert Kolb & Richard Hyde, An Introduction to the International Law of Armed Conflicts 45 (2008).

(102.) Additional Protocol I, supra note 100, art. 35(2).

(103.) L. C. Green, The Contemporary Law of Armed Conflict 193 (2d ed. 2000).

(104.) Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, 1996 I.C.J. 226 (July 8).

(105.) Id. ¶ 78. The Court also made the important declaration that "States do not have unlimited freedom of choice of means in the weapons they use." Id.

(106.) KRISHNAN, supra note 11, at 96.

(107.) Gary D. Solis, The Law of Armed Conflict: International Humanitarian Law in War 270 (2010).

(108.) Id. Dum-dum bullets are those that either expand or flatten easily in the human body. UK MINISTRY OF DEF., supra note 93, ¶ 6.9.

(109.) Judith Gardam, Necessity, Proportionality and the Use of Force by States 36 (James Crawford & John S. Bell eds., 2004); see UK MINISTRY OF DEF., supra note 93, ¶ 6.9.1.

(110.) See THE LAW OF TARGETING, supra note 88, at 259 (declaring that the rule prohibiting injury that lacks military utility is "central in importance" to weapons law).

(111.) "Hors de combat" refers to the status of a combatant who is no longer directly participating in hostilities. Additional Protocol I, supra note 100, art. 41(2); MICHAEL BYERS, WAR LAW: INTERNATIONAL LAW AND ARMED CONFLICT 127 (2005) (declaring that soldiers who have been injured are classified as hors de combat and are afforded similar protections as civilians); see, e.g., Practice Relating to Rule 6. Civilians' Loss of Protection from Attack, INT'L COMM. RED CROSS, https://www.icrc.org/customary-ihl/eng/docs/v2_rul_rule6 (last visited Oct. 7, 2014) (listing various treaties in which civilians or combatants recognized as hors de combat are afforded protections).

(112.) See Christopher Greenwood, Current Issues in the Law of Armed Conflict: Weapons, Targets and International Criminal Liability, 1 SING. J. INT'L & COMP. L. 441, 443 (1997) (asserting that the principle of unnecessary suffering's primary concern "is with the protection of enemy combatants").

(113.) THE LAW OF TARGETING, supra note 88, at 259.

(114.) Jean-Marie Henckaerts & Louise Doswald-Beck, Int'l Comm. of the Red Cross, Customary International Humanitarian Law 237-38 (2005). The United States is one such state that has signed but not ratified Additional Protocol I. E.g., Curtis A. Bradley, Unratified Treaties, Domestic Politics, and the U.S. Constitution, 48 HARV. INT'L L.J. 307, 309 (2007).

(115.) See HENDERSON, supra note 80, at 12 n.64 (categorizing weapons into three classes, with varying degrees of lawfulness).

(116.) Richard P. DiMeglio et al., Int'l & Operational Law Dep't, Judge Advocate Gen.'s Legal Ctr. & Sch., U.S. Army, Law of Armed Conflict Deskbook 154 (William J. Johnson & Wayne Roberts eds., 2013).

(117.) Alan Apple et al., Int'l & Operational Law Dep't, Judge Advocate Gen.'s Legal Ctr. & Sch., U.S. Army, Operational Law Handbook 14 (William Johnson & Wayne Roberts eds., 2013).

(118.) Greenwood, supra note 112, at 446.

(119.) Kastan, supra note 74, at 62.

(120.) See TASK FORCE REPORT, supra note 59, at 15-16, 19 (discussing the realized potential of unmanned aerial vehicles).

(121.) See DIMEGLIO ET AL., supra note 116, at 153-54.

(122.) David E. Graham, The Law of Armed Conflict in Asymmetric Urban Armed Conflict, 87 INT'L L. STUD. 301, 304-05 (2011).

(123.) See Justin McClelland, The Review of Weapons in Accordance with Article 36 of Additional Protocol I, 85 INT'L REV. RED CROSS 397, 407 (2003) ("What is needed is an assessment of whether, in the normal intended use of a weapon, it would be of a nature to cause [unnecessary] injury or suffering.").

(124.) See Kastan, supra note 74, at 62 ("[Autonomous weapon systems] are, quite simply, not designed to 'cause unnecessary suffering,' therefore, they would meet the per se requirements of the humanity principle.").

(125.) Boothby, supra note 78, at 77.

(126.) See id. at 78 (describing modern IHL under Additional Protocol I and its relation to weapon systems).

(127.) Id. at 77-78.

(128.) See discussion infra Part III.B.1 (discussing the principle of distinction).

(129.) Indiscrimination by nature pertains to whether a weapon system is capable of being aimed at a legitimate military objective. See BOOTHBY, supra note 78, at 78. On the other hand, the principle of discrimination is concerned with whether a weapon is actually aimed at such a proper target, regardless of its lawfulness. See infra Part III.B.1 (discussing the principle of distinction).

(130.) Additional Protocol I, supra note 100, art. 51(4)(b).

(131.) U.S. Navy, U.S. Marine Corps & U.S. Coast Guard, The Commander's Handbook on the Law of Naval Operations ¶ 9.1.2 (2007) (contrasting "an artillery round that is capable of being directed with a reasonable degree of accuracy at a military target" with "uncontrolled balloon-borne bombs ... lack[ing] the capability of direction").

(132.) Schmitt & Thurnher, supra note 41, at 245.

(133.) See generally BERT WEBBER, RETALIATION: JAPANESE ATTACKS AND ALLIED COUNTERMEASURES ON THE PACIFIC COAST IN WORLD WAR II (1975) (describing balloon bomb attacks and resulting American casualties).

(134.) Robert C. Mikesh, Japan's World War II Balloon Bomb Attacks on North America, 9 SMITHSONIAN ANNALS OF FLIGHT 1-2 (1973).

(135.) Id. at 3.

(136.) Id. at 1, 23-24 (explaining that data on a balloon's direction and distance of travel had to be assumed beyond a certain distance from its launch site).

(137.) Id. at 1.

(138.) Id. at 1, 23-24; see supra note 136 and accompanying text (discussing wind patterns' effect on balloon bombs).

(139.) See Mikesh, supra note 134, at 1, 23-24 (stating that the flight path of each balloon bomb had to be tracked in order to see if it actually had a chance of reaching the target).

(140.) A tragic example of the balloon bomb's indiscriminate nature involved the death of a woman and five children who happened upon a grounded balloon in the woods of Oregon, some four weeks after the balloon offensive had ceased. Id. at 67.

(141.) See Additional Protocol I, supra note 100, art. 51(4) (listing what constitutes an indiscriminate attack).

(142.) Id. art. 51(4)(c).

(143.) Naval War College, CyCon 2013 | Michael Schmitt: Autonomous Weapons Systems, YOUTUBE (July 25, 2013), https://www.youtube.com/watch?v=YsdjABmimSQ.

(144.) See ROGERS, supra note 86, at 21 (noting that the rule covers situations in which the attacker cannot control the effects of an attack, such as dangerous forces it releases).

(145.) Marie Anderson et al., Int'l & Operational Law Dep't, Judge Advocate Gen.'s Legal Ctr. & Sch., U.S. Army, Operational Law Handbook 149-50 (Marie Anderson & Emily Zukauskas eds., 2008).

(146.) INT'L COMM. OF THE RED CROSS, supra note 79, ¶ 48 (noting "the poor degree of control, whether in space or in time, which the user of the weapons can exert over their effects").

(147.) See id. (listing natural processes, such as wind or drainage, and dispersion by living carriers of the agent or of the disease as examples of transmission).

(148.) Id. (noting that the factors of unmanageability "militate against the military utility of biological weapons").

(149.) See Foy, supra note 91, at 13-14 (stating most weapons will be capable of adhering to the principles of IHL because of the limited restriction).

(150.) See Schmitt & Thurnher, supra note 41, at 250 (providing examples of innocuous autonomous platforms that would be rendered unlawful if equipped with certain ordnance).

(151.) Cf. William H. Boothby, Autonomous Systems: Precautions in Attack, in International Humanitarian Law and New Weapon Technologies 119, 121 (Wolff Heintschel von Heinegg & Gian Luca Beruto eds., 2012) ("Autonomous platforms give computer based equipment the complex task of deciding what should be attacked, perhaps which weapon should be used, what the angle of attack should be, [and] the altitude from which the weapon will be released.").

(152.) Schmitt, supra note 24, at 8-9.

(153.) Recent and current weapons technologies utilized by drones have reportedly yielded high rates of accuracy when deployed. See The Ethics of Warfare: Drones and the Man, ECONOMIST, July 30, 2011, at 10 (reporting that militants comprised eighty percent of the fatalities caused by drone attacks from 2004 to 2011); but see David Kilcullen & Andrew McDonald Exum, Op-Ed., Death from Above, Outrage Down Below, N.Y. TIMES, May 17, 2009, at WK13 (reporting that drone strikes had a "hit rate" of two percent from 2006 to 2009).

(154.) Legality of the Threat or Use of Nuclear Weapons, supra note 104, ¶ 39.

(155.) Id. ¶ 95.

(156.) See Meredith Hagger & Tim McCormack, Regulating the Use of Unmanned Combat Vehicles: Are General Principles of International Humanitarian Law Sufficient?, 21 J.L. INFO. & SCI. 74, 83-84 (2011) ("Given the Court's finding that even nuclear weapons are not inherently incapable of distinguishing between combatants and civilians, it is extremely unlikely that any other category of weapon will cross this threshold of illegality.").

(157.) See Isabelle Daoust et al., New Wars, New Weapons? The Obligation of States to Assess the Legality of Means and Methods of Warfare, 84 INT'L REV. RED CROSS 345 (2002), for an in-depth discussion of article 36 and its treatment by various states.

(158.) See ANDERSON & WAXMAN, supra note 22, at 10 (describing the legal review of new weapon systems).

(159.) Schmitt & Thurnher, supra note 41, at 271.

(160.) Id.; Boothby, supra note 151, at 120-21.

(161.) Additional Protocol I, supra note 100, art. 36.

(162.) See LOSING HUMANITY, supra note 18, at 24 ("The purpose of a weapons review is to determine if the new or modified weapon would be prohibited by international law.").

(163.) Dirk Jan Carpentier, Death from Above: A Legal Analysis of Unmanned Drones 10 (Spring 2011) (unpublished LL.M. thesis, Ghent University) (on file with author).

(164.) See Additional Protocol I, supra note 100, pmbl. (declaring that "the provisions of the Geneva Conventions" and of Additional Protocol I are to be "fully applied in all circumstances to all persons who are protected by those instruments, without any adverse distinction based on the nature or origin of the armed conflict or on the causes espoused by or attributed to the Parties to the conflict").

(165.) BOOTHBY, supra note 78, at 341; see also Marie Jacobsson, Modern Weaponry and Warfare: The Application of Article 36 of Additional Protocol I by Governments, 82 INT'L L. STUD. 183, 184-86 (2006) (explaining the unsuccessful measures that the ICRC took to bring its SIrUS project into compliance with article 36).

(166.) Chantal Grut, The Challenge of Autonomous Lethal Robotics to International Humanitarian Law, 18 J. CONFLICT & SECURITY L. 5, 9 (2013).

(167.) Boothby, supra note 78, at 341.

(168.) Id.

(169.) Id.

(170.) U.S. policy mandates that "[t]he acquisition and procurement of DoD weapons and weapon systems shall be consistent with ... the law of armed conflict...." U.S. Dep't of Def., Directive No. 5000.01, The Defense Acquisition System encl. 1, ¶ E1.1.15 (Nov. 20, 2007).

(171.) Kathleen Lawand, Int'l Comm. of the Red Cross, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, at 4 (2006).

(172.) See, e.g., Liu, supra note 84, at 634-35 & n.31 ("Indeed it would be nonsensical to consider the characteristics of a weapon isolated from the context of its use....").

(173.) See INT'L COMM. OF THE RED CROSS, COMMENTARY ON THE ADDITIONAL PROTOCOLS OF 8 JUNE 1977 TO THE GENEVA CONVENTIONS OF 12 AUGUST 1949 ¶ 1466 (Yves Sandoz et al. eds., 1987) (noting that under the Protocol and other applicable international law, a weapon's legality is to be determined on the "basis of normal use").

(174.) Id.

(175.) YORAM DINSTEIN, THE CONDUCT OF HOSTILITIES UNDER THE LAW OF INTERNATIONAL ARMED CONFLICT 80 (2004) ("[T]he correct interpretation of the wording [of article 36] is that the clause applies only to the 'normal or expected use' of a new weapon.").

(176.) Lawand, supra note 171, at 10.

(177.) Schmitt & Thurnher, supra note 41, at 274.

(178.) See Schmitt, supra note 24, at 30 ("Legal reviews do not generally consider use issues since they are contextual by nature, whereas the sole context in a determination of whether a weapon is lawful per se is its intended use in the abstract.").

(179.) Id.

(180.) See id. ("Because the assessment is contextual, it is generally inappropriate to make ex ante judgments as to a weapon's compliance with the rule.").

(181.) Jus in bello is defined as the "area of law that governs the conduct of belligerents during war." HENDERSON, supra note 80, at 3.

(182.) See ANDERSON & WAXMAN, supra note 22, at 11 (stating targeting law governs any particular use of an autonomous weapon system, including the use on the battlefield environment and operational settings, in determining its lawfulness).

(183.) See INT'L COMM. OF THE RED CROSS, supra note 12, at 40 (noting that the development of an IHL-compliant autonomous system may ultimately prove impossible).

(184.) The principle of humanity was considered in the discussion of weapons law. See discussion supra Part III.A.1. The remaining principles are distinction, military necessity, and proportionality. See ROGERS, supra note 86, at 3-8 (listing the general principles of law on the battlefield).

(185.) E.g., Introduction to the Law of Armed Conflict (LOAC), GENEVA CALL, http://www.genevacall.org/wp-content/uploads/dlm_uploads/2013/11/The-Law-of-ArmedConflict.pdf (last visited Oct. 13, 2014).

(186.) Shane R. Reeves & Jeffrey S. Thurnher, Are We Reaching a Tipping Point? How Contemporary Challenges Are Affecting the Military Necessity-Humanity Balance, HARV. NAT'L SEC. J. FEATURES 1 (June 24, 2013, 6:31 AM), http://www.harvardnsj.org/wp-content/uploads/2013/06/HNSJ-Necessity-Humanity-Balance_PDF-format1.pdf.

(187.) Nils Melzer, Int'l Comm. of the Red Cross, Interpretive Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law 79 (2009).

(188.) Reeves & Thurnher, supra note 186, at 2.

(189.) BYERS, supra note 111, at 118.

(190.) SOLIS, supra note 107, at 251.

(191.) Additional Protocol I, supra note 100, art. 48.

(192.) BOOTHBY, supra note 78, at 71 n.6.

(193.) Legality of the Threat or Use of Nuclear Weapons, supra note 104, ¶ 78.

(194.) Additional Protocol I, supra note 100, art. 51(2).

(195.) Id. art. 52(1).

(196.) LOSING HUMANITY, supra note 18, at 30 (predicting that, without human supervision, weapon systems would be incapable of distinguishing combatants from civilians).

(197.) See, e.g., Kathleen Lawand, Fully Autonomous Weapon Systems (Nov. 25, 2013), http://www.icrc.org/eng/resources/documents/statement/2013/09-03-autonomousweapons.htm ("[T]he central challenge of such systems will remain how to ensure they are capable of being used in a manner that allows the distinction between military objectives and civilian objects....").

(198.) Alan Backstrom & Ian Henderson, New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews, 94 INT'L REV. RED CROSS 483, 488 (2012).

(199.) Sharkey, supra note 16, at 87-88.

(200.) But see Introduction to the Law of Armed Conflict, supra note 185 (defining "civilians" as "any persons who are not members of the state armed forces or organised armed groups.").

(201.) Sharkey, supra note 16, at 88.

(202.) Id. at 87.

(203.) See Tony Gillespie & Robin West, Requirements for Autonomous Unmanned Air Systems Set by Legal Issues, 4 INT'L C2 J. 1, 11 (2010) ("[A]s much relevant information as practical must be available and it must be clear, definable and intelligible as it may be needed later to justify the action taken.").

(204.) Matthew C. Waxman, International Law and the Politics of Urban Air Operations 10 (2000).

(205.) See Wagner, supra note 38, at 160 (noting that objects such as a bridge, used by both civilians and the military, can be civilian and military in nature).

(206.) See Lawand, supra note 197 (noting that in order to achieve compliance with IHL, autonomous weapons systems will need the capability to distinguish between military and non-military targets in a dynamic battlefield environment in which the status of targets is unclear or changing from minute to minute).

(207.) See Wagner, supra note 38, at 161 (discussing that the system "would have to be able to determine whether a particular target is civilian or military in nature").

(208.) Id.

(209.) See Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, supra note 20, ¶¶ 55-56 (citing qualities possessed by humans but not by robots).

(210.) See, e.g., id. ¶¶ 89-97 (acknowledging morality concerns related to the legal and moral implications of taking humans out of the loop).

(211.) Id. ¶ 56.

(212.) A 2010 car bombing attempt at New York City's Times Square provides an example of the negative impact that drone strike inaccuracies have had on individuals and the United States' foreign relations. The would-be perpetrator indicated that drone strikes were the primary reason for his actions. See Andrea Elliott, A Bloody Siege, A Call to Action, A Times Sq. Plot, N.Y. TIMES, June 23, 2010, at A1, for a discussion of Faisal Shahzad's attempted attack and subsequent prosecution.

(213.) See Schmitt, supra note 24, at 35-36 (noting that autonomous weaponry is still in its infancy).

(214.) SOLIS, supra note 107, at 258.

(215.) Instructions for the Government of Armies of the United States in the Field, General Orders No. 100 § 1, art. 14 (Apr. 24, 1863) ("Military necessity, as understood by modern civilized nations, consists in the necessity of those measures which are indispensable for securing the ends of the war, and which are lawful according to the modern law and usages of war."); Carnahan, supra note 87, at 215.

(216.) Bovarnick et al., supra note 89, at 140.

(217.) UK MINISTRY OF DEF., supra note 93, ¶ 2.2.1.

(218.) Id.

(219.) See Bovarnick et al., supra note 89, at 141; Additional Protocol I, supra note 100, art. 52(2) ("Attacks shall be limited strictly to military objectives.").

(220.) Kastan, supra note 74, at 58.

(221.) Id.

(222.) Schmitt & Thurnher, supra note 41, at 255.

(223.) See Jonathan David Herbach, Into the Caves of Steel: Precaution, Cognition and Robotic Weapon Systems Under the International Law of Armed Conflict, 4 AMSTERDAM L.F. 3, 18 (2012) (noting that battlefield ambiguities, such as a child picking up a fallen soldier's assault rifle, are "nearly infinite and often subtle").

(224.) Id. at 18-19.

(225.) Jordan J. Paust et al., International Criminal Law 722 (4th ed. 2013).

(226.) See BOOTHBY, supra note 78, at 71 (noting remarkable advances in weapons technology, yet acknowledging that human prowess is not unlimited).

(227.) Cornelia Dean, A Soldier, Taking Orders from Its Ethical Judgment Center, N.Y. TIMES, Nov. 25, 2008, at D1.

(228.) Id. at D4.

(229.) See Task Force Report, supra note 59, at 16.

(230.) See Stewart, supra note 52, at 282 (noting that, during an armed conflict, "the fog of war creates ambiguity and unpredictability beyond the imagination of even the most gifted programmer").

(231.) DIMEGLIO ET AL., supra note 116, at 139; Bovarnick et al., supra note 89, at 141.

(232.) See Task Force Report, supra note 59, at 15-17 (detailing potential and realized benefits of unmanned aerial, terrestrial, and maritime vehicles).

(233.) See, e.g., Nathan A. Canestaro, Legal and Policy Constraints on the Conduct of Aerial Precision Warfare, 37 VAND. J. TRANSNAT'L L. 431, 462 (2004) ("The doctrine of proportionality prohibits military action in which the negative effects of an attack outweigh the military gain caused by the damage to the enemy, as well as any means of attack that is 'unreasonable or excessive.'").

(234.) Id.

(235.) Additional Protocol I, supra note 100, arts. 51(5)(b), 57(2)(a)(iii).

(236.) The principle of proportionality is similar to the principle of military necessity in that both require that combatants balance the military goal with the potential civilian damage to ensure a justified attack and prevent unnecessary damage. KRISHNAN, supra note 11, at 92.

(237.) See DIMEGLIO ET AL., supra note 116, at 151 (noting that it is rare for a target to be purely military).

(238.) See Herbach, supra note 223, at 18 (noting the human-like "situational awareness" that an autonomous weapon would need in order to comply with IHL).

(239.) Indeed, complete guides exist to assist combatants in navigating the murkiness of battlefield decision-making. See, e.g., MORRIS GREENSPAN, THE SOLDIER'S GUIDE TO THE LAWS OF WAR (1969) (surveying the gamut of ambiguities and difficulties a soldier could encounter during times of war).

(240.) See Herbach, supra note 223, at 18-19 (detailing that "situational awareness" is required to meet the legal obligations to make autonomous robotic weapons systems legal).

(241.) KRISHNAN, supra note 11, at 92.

(242.) See, e.g., Ronald C. Arkin, Ethical Robots in Warfare, GA. INST. OF TECH., http://www.cc.gatech.edu/ai/robot-lab/online-publications/arkin-rev.pdf (last visited Nov. 13, 2014) (arguing that robots have greater ethical potential, which could result in increased proportionate use of force).

(243.) E.g., Rise of the Drones: Unmanned Systems and the Future of War: Hearing Before the Subcomm. on Nat'l Sec. & Foreign Affairs of the H. Comm. on Oversight & Gov't Reform, 111th Cong. 13-14 (2010) (statement of Edward Barrett).

(244.) See, e.g., Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, supra note 20, ¶ 56 (arguing that autonomous weapons systems have restricted abilities to interpret context and to make value-based calculations).

(245.) See Alston, supra note 6, at 49 (stating that the design and limitation of actual technologies may limit the hypothetical advantages).

(246.) See KRISHNAN, supra note 11, at 92-93 (comparing the difference in potential for collateral damage of AWS systems equipped with nuclear weapons to those with more precise weapons such as lasers or microprojectiles).

(247.) See Alston, supra note 6, at 49 (stating that hypothetical advantages may not be reflected in the design or programming of actual technologies).

(248.) LOSING HUMANITY, supra note 18, at 32.

(249.) E.g., Noel Sharkey, Automating Warfare: Lessons Learned from the Drones, 21 J.L. INFO. & SCI. 140, 144 (2011).

(250.) See, e.g., Anderson et al., supra note 36, at 395-406 (describing objections to the development of autonomous weapon systems and constructive counterpoints).

(251.) E.g., SINGER, supra note 13, at 387 ("[W]hile technological change is speeding up exponentially, legal change remains glacial. Chemical weapons were first introduced in World War I, but they weren't fully banned until eighty-two years later. Even worse, if we look back at history, the biggest developments in law only came after some catastrophe.").

(252.) E.g., Interview by Gerhard Dabringer with Armin Krishnan, supra note 74 ("[T]he existing legal and moral framework for war as defined by the laws of armed conflict and Just War Theory is utterly unprepared for dealing with many aspects of robotic warfare.").

(253.) Kastan, supra note 74, at 62.

(254.) Schmitt, supra note 24, at 35.

(255.) Schmitt & Thurnher, supra note 41, at 235 ("The crux of full autonomy, therefore, is the capability to identify, target and attack a person or object without human interface.").

(256.) See, e.g., Rock, supra note 7 (describing a drone attack that took the lives of civilians and the ensuing disputation about unmanned attacks).

(257.) E.g., Kilcullen & Exum, supra note 153 (reporting that drone strikes had a "hit rate" of two percent from 2006 to 2009); Douglas C. Lovelace Jr., Preface to 133 TERRORISM COMMENTARY ON SECURITY DOCUMENTS: THE DRONE WARS OF THE 21ST CENTURY: COSTS AND BENEFITS, at vii (Kristen E. Boon & Douglas C. Lovelace, Jr. eds., 2014).

(258.) See Unmanned Combat Air System Carrier Demonstration (UCAS-D), NORTHROP GRUMMAN CORP. 1-2, http://www.northropgrumman.com/Capabilities/X47BUCAS/Documents/X-47B_Navy_UCAS_FactSheet.pdf (last visited Nov. 5, 2014) (listing capabilities of the X-47B).

(259.) See Taranis, BAE SYSTEMS, http://www.baesystems.com/enhancedarticle/BAES_157659/taranis (last visited Oct. 9, 2014) (listing the capabilities of the Taranis).

(260.) Project Alpha, supra note 47, at 1.

(261.) iRobot 510 PackBot, IROBOT CORP., http://media.irobot.com/download/iRobot+510+PackBot.pdf (last visited Nov. 8, 2014) (explaining the PackBot's multitude of capabilities, such as neutralizing bombs and explosive devices; screening vehicles, cargo, buildings, and people; and searching buildings, bunkers, and sewers).

(262.) E.g., Bonnie Docherty, Human Rights Watch & Int'l Human Rights Clinic, Harvard Law Sch., Shaking the Foundations: The Human Rights Implications of Killer Robots 3 (Steve Goose ed., 2014).

(263.) John S. Canning, A Concept of Operations for Armed Autonomous Systems, NAVSEA WARFARE CTRS., http://www.dtic.mil/ndia/2006disruptive_tech/canning.pdf (last visited Nov. 5, 2014).

(264.) Id.

(265.) See id. (stating that this concept would enable the machines to disarm the opponent without killing them).

(266.) E.g., UN: Nations Agree to Address 'Killer Robots', HUMAN RIGHTS WATCH (Nov. 15, 2013), http://www.hrw.org/news/2013/11/15/un-nations-agree-address-killerrobots (providing a list of more than forty countries that have publicly addressed fully autonomous weapon systems).

(267.) Cf. Michael Schmitt, Foreword to NEW TECHNOLOGIES AND THE LAW OF ARMED CONFLICT, at v (Hitoshi Nasu & Robert McLaughlin eds., 2013) ("Interestingly, efforts to craft new weapons legal regimes are increasingly led either by states that have a low likelihood of ever using these weapon systems in combat or by non-governmental organisations.").

(268.) The War Report: Armed Conflict in 2013, at 270 (Stuart Casey-Maslen ed., 2014).

Bradan T. Thomas, J.D. Candidate, 2015, University of Houston Law Center. B.A., 2011, University of North Texas. This Comment received the Executive Board 36 Writing Award for an Outstanding Comment in International Law. The Author would like to thank the editors of the Houston Journal of International Law for their hard work in preparing this Comment for publication, his friends for their support, his family for their love, and Tia Leone for everything.