The prospect of fully autonomous weapons marks a new era of warfare. These weapons, also known as “killer robots” and “lethal autonomous weapons systems,” would select and engage targets without meaningful human control. Although they do not exist yet, the development of precursors and military planning documents indicate that technology is moving rapidly in that direction and is years, not decades, away. Fully autonomous weapons raise a host of moral, legal, and security concerns that have led opponents to call for a preemptive ban.
In 2013, states parties to the Convention on Conventional Weapons (CCW) first took up the issue, and they have held informal experts' meetings for two years. While fully autonomous weapons are a cutting-edge issue, the discussions so far are reminiscent of a CCW process 20 years ago that addressed blinding lasers, weapons designed to cause permanent blindness, which were then on the cusp of full-scale production and deployment. At the First CCW Review Conference in 1995, states adopted Protocol IV on Blinding Laser Weapons, which preemptively banned a weapon still in development.
This paper explores the history of the blinding lasers protocol, and highlights five areas of concern articulated by critics of both fully autonomous weapons and blinding lasers:
- Concerns under the Martens Clause
- Threats to civilians
- Risks of proliferation
- The need to clarify the legal landscape
- Protection of legitimate technology
These points of discussion helped motivate CCW states parties to adopt Protocol IV two decades ago, as did concerted outreach by Human Rights Watch, the International Committee of the Red Cross (ICRC), and national Red Cross and Red Crescent societies.
Today CCW states parties should build on that precedent and agree to a preemptive ban on fully autonomous weapons. Although there are many differences between the two types of weapons, as will be discussed below, the revolutionary nature of killer robots strengthens, rather than undermines, the case for a preemptive prohibition.
This paper is largely based on four key sources. First, it draws on the reports of four experts’ meetings on blinding lasers that the ICRC convened between 1989 and 1991 with the participation of government, military, legal, medical, and nongovernmental experts. Second, the paper looks at documents from the CCW’s Group of Governmental Experts (GGE) meeting held to help states parties prepare for the First Review Conference in 1995, particularly working papers prepared by Sweden and the ICRC. Third, it examines government statements reflected in the summary records of the First Review Conference, where states parties negotiated and adopted Protocol IV. Finally, it considers documents from regional efforts that supported the ban on blinding lasers, most notably a European Parliament resolution.
States parties to the CCW should heed the precedent of their success on blinding lasers, and increase the momentum for dealing with fully autonomous weapons. In particular, they should:
- Agree to a mandate at the Meeting of States Parties in November 2015 that creates an open-ended Group of Governmental Experts or Working Group that will devote at least three weeks to work on fully autonomous weapons in 2016.
- Agree to a formal negotiating mandate at the Review Conference in December 2016 based on the work carried out during 2016, with the aim of concluding a new protocol banning fully autonomous weapons within one or two years.
Concerns under the Martens Clause
Fully autonomous weapons raise concerns under the Martens Clause that are akin to those highlighted two decades ago during international meetings about blinding lasers. The fact that the blinding lasers discussions led to a legally binding protocol shows that the principles of humanity and dictates of public conscience enshrined in the Martens Clause can be used as a justification for a preemptive ban of a new weapon.
The Martens Clause dates back to the 1899 and 1907 Hague Conventions and was codified more recently in Article 1(2) of Additional Protocol I to the Geneva Conventions, which states:
In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.
As the US Military Tribunal at Nuremberg explained in 1948, the Martens Clause applies “if and when the specific provisions of [existing law] do not cover specific cases occurring in warfare.”
Given that there is often a dearth of law that covers new technology, the Martens Clause can be relevant to the review of emerging weapons. Indeed, the International Court of Justice stated that the clause has “proved to be an effective means of addressing the rapid evolution of military technology.”
Experts differ in how they understand the significance of the Martens Clause. Some believe it elevates the principles of humanity and the dictates of public conscience to independent legal standards against which weapons should be judged; others treat these principles as guidance for interpreting existing law when applying it to new situations. In either case, the standards of the Martens Clause should be applied when assessing emerging weapons and technologies, such as fully autonomous weapons and blinding lasers.
Fully Autonomous Weapons
While the general rules of international humanitarian law would apply to fully autonomous weapons, there is currently no specific law dedicated to the weapons. Therefore, the principles laid out in the Martens Clause are applicable.
Fully autonomous weapons raise concerns under both the principle of humanity and the dictates of public conscience. The ICRC has described humanity as requiring both compassion and the ability to protect. Law, including humanitarian law, is written for and applied by human beings, who bring to its application not only rational calculation but the full range of human attributes. Fully autonomous weapons, unlike humans, would lack attributes such as compassion or a sense of morality, which serve as checks on the killing of individuals who are not lawful targets in an armed conflict. The use of fully autonomous weapons could also interfere with the ability to protect civilians because they would face challenges in complying with international humanitarian law’s foundational rules of distinction and proportionality.
Although there is no settled definition of public conscience, both public opinion and morality can play a role in shaping it. For many people the prospect of delegating life-and-death decisions to machines is profoundly disturbing and raises significant moral questions.
Finally, some commentators have argued that human rights law should inform the understanding of both humanity and public conscience. Fully autonomous weapons would threaten to undermine several human rights, including the right to life.
Concerns about compliance with the principles of the Martens Clause also emerged during the discussions leading to Protocol IV on blinding lasers. During the second roundtable of experts convened by the ICRC in 1991, ICRC lawyer Louise Doswald-Beck argued that “[d]ecisions to impose specific restrictions on the use of certain weapons may be based on policy considerations,” and “that the criteria enshrined in the Martens clause [should] be particularly taken into account, as international humanitarian law specifies, by virtue of this clause, that persons remain protected by the principle of humanity and the dictates of public conscience in cases not covered by existing treaty law.” Another participant at the roundtable said that “the Martens clause particularly addresses the problem of human suffering so that the ‘public conscience’ refers to what is seen as inhumane or socially unacceptable.”
In the years leading up to Protocol IV, states, international organizations, and civil society described blinding lasers in terms that indicated the weapons raised concerns under the principles of humanity and dictates of public conscience. In a June 1995 resolution, the European Parliament invoked the language of the Martens Clause and declared that it “believ[ed] that deliberate blinding as a method of warfare is … in contravention of established custom, the principles of humanity and the dictates of the public conscience.” Several speakers at the ICRC-convened meetings concurred that “weapons designed to blind are … socially unacceptable.”
At the First CCW Review Conference, speakers from UN agencies and civil society described blinding lasers in ways that suggested the weapons would be counter to humanity and public conscience. They referred to blinding lasers as “inhumane,” “abhorrent to the conscience of humanity,” and “unacceptable in the modern world.” A particularly effective ICRC public awareness campaign used photographs of soldiers blinded by poison gas during World War I to emphasize the fact that permanently blinding soldiers is cruel and inhumane.
Such characterizations of blinding lasers were linked to the need for a preemptive ban. For example, in its resolution, the European Parliament, “believing that deliberate blinding as a method of warfare is abhorrent[,] … [u]rge[d] Member States to ratify the laser weapon Protocol without delays or reservations.” At the CCW’s First Review Conference, Chile expressed its hope that “the Review Conference would be able to establish guidelines for preventative action to prohibit the development of inhumane technologies and thereby to avoid the need to remedy the misery they might cause.” After the protocol was adopted, China saluted the fact that “for the first time in human history, an inhumane weapon had been declared illegal and prohibited before it had actually been used.”
While blinding lasers presented different problems under the Martens Clause than fully autonomous weapons, the discussions that led to Protocol IV reveal the importance of the clause in addressing emerging weapons. Tensions with the principles of humanity and dictates of public conscience, and more generally notions of abhorrence or social unacceptability, can help drive the adoption of a preemptive ban.
Threats to Civilians
The potential to cause harm to civilians is another concern that has been associated with both fully autonomous weapons and blinding lasers. The negotiating history of Protocol IV illustrates how the desire to minimize a humanitarian threat can motivate states to adopt a ban on emerging weapons, especially if they are willing to put humanitarian considerations ahead of theoretical military benefits.
Fully Autonomous Weapons
Fully autonomous weapons present multiple risks to civilians. The weapons would face significant obstacles in complying with international humanitarian law’s bedrock principles of distinction and proportionality. In contemporary armed conflict, combatants often blend in with the civilian population, and fully autonomous weapons would lack the sufficiently complex understanding of human behavior and thinking needed to distinguish reliably between the two groups. In addition, fully autonomous weapons would be unlikely to replicate the human judgment necessary to weigh military advantage and civilian harm under the proportionality test. Beyond an inability to meet the legal rules for protecting civilians, fully autonomous weapons would lack compassion or empathy, powerful checks on the killing of civilians.
The threat to civilians was similarly recognized during the process that produced the blinding lasers protocol. In addition to noting that these weapons could cause superfluous injury to soldiers, some states suggested that blinding lasers would cause civilian “suffering.” The ICRC echoed those concerns, stating that blinding lasers, which could be mass produced, would be “capable of blinding large numbers of soldiers or civilians.” Exemplifying the value some participants placed on civilian protection, Switzerland commented that “[t]he humanitarian motive could and should prevail over all military and strategic considerations. Nobody had ever lost a war by remaining human.”
During the negotiations of Protocol IV, states frequently highlighted the humanitarian benefits of a ban. Italy “hoped that the Conference would take an important step forward by adopting an international instrument of a preventive and humanitarian nature.” The Czech Republic “appealed to the Conference to adopt an additional protocol on the subject to meet humanitarian concerns.”
Other states highlighted that the protocol could reduce the harm to civilians in particular. In voicing its support for the protocol, Austria called for “[g]overnments to take steps to protect innocent civilians from future sufferings.” Denmark “hope[d] that the Conference would be able to act decisively and to agree on new and stricter provisions in the Convention leading to a much-needed effective protection of civilian populations.” Mexico “welcomed all those initiatives, which sought to secure the adoption of rules to alleviate the sufferings of civilians.” Even after Protocol IV was adopted, states continued to note their concern for civilians. According to China, the CCW and its new protocol “had played, and would continue to play, an irreplaceable role in limiting the cruelty of war and injuries to civilians.”
The discussions that led to Protocol IV show how potential threats to civilians can factor into states’ preemptive prohibition of a new type of weapon. Fully autonomous weapons pose a much greater threat to civilians than blinding lasers. Therefore, humanitarian arguments should play an especially important role in considerations of the proposal to ban fully autonomous weapons as soon as possible.
Risks of Proliferation
Warnings that fully autonomous weapons could reach the hands of parties with little regard for international law echo those issued 20 years earlier about blinding lasers. As demonstrated by the outcome of the blinding laser discussions, the risk of widespread proliferation can bolster the case for banning a problematic weapon preemptively.
In an open letter released in July 2015, more than 3,000 artificial intelligence (AI) and robotics experts and an additional 17,000 other endorsers agreed that proliferation of fully autonomous weapons was both likely and dangerous. The letter explained that fully autonomous weapons would be inexpensive to produce because they would “require no costly or hard-to-obtain raw materials.” Illegal channels of trade would then allow them to “appear on the black market.” Ultimately, the experts predicted, fully autonomous weapons would become the “Kalashnikovs of tomorrow.”
Proliferation of fully autonomous weapons, especially to repressive regimes and non-state armed groups that flout international humanitarian law, would endanger civilians. Due to their lack of emotion, the weapons could be perfect tools for abusive leaders, who could direct them against civilians free of the fear of mutiny from human troops resisting orders to commit war crimes. The AI experts’ letter described these weapons as “ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group.” Professor Christof Heyns, UN special rapporteur on extrajudicial, summary or arbitrary executions, similarly observed that fully autonomous weapons could be used “to terrorize the population at large, suppress demonstrations and fight ‘wars’ against drugs.”
The likelihood of proliferation also emerged as a key concern during the discussions surrounding blinding lasers. Participants at the ICRC experts’ meetings in 1990 and 1991 predicted that if development of blinding lasers was permitted, their proliferation would be “inevitable,” in part because they would be light and easy to transport. At the First Review Conference, the ICRC explained that blinding laser weapons would “cost no more than an ordinary rifle.” It cautioned that blinding lasers were “about to be manufactured and exported on a large scale.”
The risk of proliferation was especially problematic because there was a strong chance blinding lasers would spread to parties with little regard for international humanitarian law. Experts taking part in the blinding lasers discussions expected that “repressive regimes, terrorists or criminals” would obtain the weapons. The report from the ICRC’s 1989 experts’ roundtable emphasized that “once weapons exist, especially if they are cheap and widely available, there is a high likelihood, judging from past experiences, that they will be used in all kinds of undesirable and illegal ways.” Participants at that and later meetings expressed concern that blinding lasers would be used to target civilians during armed conflict or create “terror outside armed conflict,” or be abused in domestic policing to suppress peaceful demonstrations.
The serious risk of proliferation and the concern about its humanitarian consequences helped motivate states to adopt a preemptive ban on blinding lasers. Making the case for such a ban, Boutros Boutros-Ghali, secretary-general of the United Nations at the time, called on the First Review Conference to agree to prohibit blinding lasers “before they became a reality, since their proliferation could have terrible consequences, particularly in the hands of terrorists.”
Proponents of the ban also noted that proliferation reduced the incentive of developed countries with higher-tech militaries to allow use of blinding lasers. As the ICRC report on its 1991 roundtable explained, “[t]he advantage of the developed countries in their exclusive ownership of these laser weapons would … be shortlived, at most up to 20 or 30 years, which in fact is a very short time period when looking at the overall social costs.” This reasoning helped convince militaries that they would be better off if no one had blinding lasers than if everyone had them.
The arguments about proliferation that helped lead to the blinding lasers ban parallel those made about fully autonomous weapons. Banning those weapons absolutely and preemptively is essential to preventing their spread to parties that would ignore less restrictive regulations.
The Need to Clarify the Legal Landscape
As was the case with blinding lasers, there are questions about whether fully autonomous weapons would comply with existing international humanitarian law. Supplementing current law with a weapon-specific instrument like Protocol IV can clarify and strengthen the rules for such contentious weapons.
Fully Autonomous Weapons
The ability of fully autonomous weapons to abide by the basic rules of international humanitarian law has been the subject of debate, particularly because of uncertainties about the future of technology. Some commentators have faith that technology could be developed to allow these weapons to comply with the law. Others, however, have emphasized that there would be significant obstacles to a robot being able to distinguish between combatants and civilians or to judge the proportionality of an attack. Drawing on the precautionary principle, these experts argue that given the significant likelihood that fully autonomous weapons would cause serious harm, scientific uncertainty should not stand in the way of a preemptive ban.
Discussions about fully autonomous weapons have featured additional disputes about the adequacy of existing international humanitarian law. Those supportive of the status quo contend it would sufficiently regulate fully autonomous weapons. Advocates for a ban argue that a new legally binding instrument is needed to provide clarity about the legality of specific weapons and attacks, facilitate enforcement, and increase stigmatization of the weapons.
During discussions about blinding lasers, states also disagreed about whether the emerging weapons would violate existing international humanitarian law. Regardless of their stance on the issue, however, states ultimately concurred that new law in the form of a CCW protocol was desirable because of the clarity it would bring.
At the First Review Conference, India, which believed blinding laser weapons would contravene existing law, argued that their unlawfulness justified the adoption of Protocol IV. India contended that “[i]ntentional blinding violated the rules of international humanitarian law and should be prohibited as a method of warfare. India therefore saw merit in adding a protocol on blinding weapons to the Convention.”
Sweden, too, believed blinding lasers would violate international law, but it supported a new law in order to eliminate the doubts of others. In a 1994 working paper, Sweden explained that “a general principle does not suffice when there is a need to draw a definite conclusion with regard to the legality or illegality of new weapons or weapons systems. In most cases, a general principle of law has to be supplemented with more specific regulations and laid down in an explicit provision in an international treaty.”
Argentina called for a new protocol because of the lack of existing law dedicated to blinding lasers; such a protocol would fill a gap in the law “since no international legal instrument existed to regulate the development, manufacture, use and marketing of laser weapons, and protection from their effects was virtually impossible.”
Protocol IV specified that use of blinding lasers in armed conflict was unlawful. The clarity of the new law reduced the need for case-by-case determinations and minimized questions about legality by standardizing rules across countries. The stigma generated by a ban on blinding lasers may have contributed to the fact that the weapons have never been produced or used in armed conflict, even by states not party to the protocol. A comparable ban on fully autonomous weapons would make the debate about their legality moot and pressure even states that did not join the ban to abide by it.
Protection of Legitimate Technology
Proposals for banning emerging weapons, including fully autonomous weapons and blinding lasers, have sparked allegations that the prohibition would stifle technological advancement. The history of Protocol IV shows that such fears are unfounded. The potential for legitimate uses of a general class of technology need not present an obstacle to a preemptive ban on a specific subset that is weaponized.
Fully Autonomous Weapons
A common critique of the preemptive ban on fully autonomous weapons is that it would constrain legitimate civilian and military applications of autonomous technology. A ban, however, would apply only to the development and use of fully autonomous weapons. A prohibition would thus in no way impede the development or use of fully autonomous robotics technology, such as self-driving cars; such technology can have many positive, non-military applications. The prohibition would also not encompass the development and use of semi-autonomous weapons, including existing remote-controlled armed drones.
Research and development activities would be prohibited only if they are directed at technology that could be used exclusively for fully autonomous weapons or that is explicitly intended for use in such weapons. Even though the prohibition would be narrow, as a matter of principle countries would not, and should not, be permitted to contract specifically for the development of fully autonomous weapons.
The negotiators of Protocol IV expressed comparable concerns about a ban because lasers, like autonomous systems, can have lawful military and civilian applications. Their concerns ultimately did not interfere with the adoption of a preemptive ban, however, because states recognized that the prohibition of a specific system, i.e., blinding lasers, would not spill over to related technology. History has proven that their faith was well founded.
During the negotiations of Protocol IV at the First Review Conference, states made clear that they did not want to impede the development and use of laser technology for legitimate purposes. The European Union “hoped that an additional protocol on blinding lasers would be adopted in response to the humanitarian concern to avoid unnecessary suffering,” but added that the protocol should not “limit the legitimate military use of laser weapons.” The Czech Republic similarly supported a protocol “on the understanding that such a protocol would not affect legitimate military use of such weapons.”
Some states described what military technology they sought to protect. Japan “supported prohibitions on the use of blinding laser weapons,” but emphasized that the ban should not cover other military uses of lasers and that in its view “the use of laser beams for other purposes such as guidance and measurement should not be restricted as a result of the introduction of such provisions.” The United States, which had already invested in blinding laser technology, pledged to adhere to an emerging consensus on the humanitarian concerns stemming from blinding laser weapon technology so long as the protocol would not compel it “to accept restrictions on the use of lasers designed for other purposes such as targeting, range-finding or countering optical or electro-optical devices.”
Other speakers highlighted the need to protect civilian as well as military laser technology. Italy, for example, said that it supported the new protocol, but noted that it “should not … hamper the legitimate use of laser beams for military and civilian purposes.”
Proponents of the ban on blinding lasers recognized the importance of allowing not only use but also development of legitimate laser technology. In its 1994 working paper submitted to the CCW Group of Governmental Experts, Sweden emphasized that it had “no intention to deny access to or development of laser technology, be it in the military or in the civilian context. Such an approach would not be acceptable to any modern military force, including Sweden’s armed forces.” Poland seemed confident that the protocol would not be a problem for the development of legitimate technology and said “[s]uch a new instrument would not hamper technological progress in laser targeting techniques, the humanitarian aspect of which seemed to be self-evident.”
Although many states were worried about limiting legitimate technology, their concerns ultimately did not prevent them from drafting and adopting a preemptive ban on blinding lasers. The effectiveness of Protocol IV and the evolution of laser technology over the 20 years since its adoption have proven that states were correct in believing that a preemptive ban can keep problematic weapons out of arsenals while protecting related lawful technology.
States drafted the protocol to ensure it applied only to lasers with blinding “as one of their combat functions.” As a result, militaries have continued to develop and use sophisticated laser systems for guiding weapons, a lawful application under Protocol IV. In the civilian world, laser technology has advanced to the point of being routinely used for corrective eye surgery. This precedent should reassure those who worry a preemptive ban on fully autonomous weapons would adversely affect development of autonomous technology.
While fully autonomous weapons and blinding lasers have raised many of the same concerns, there are notable differences in their specific legal problems and the character of their technology. These points of divergence do not undercut the value of the blinding lasers protocol as precedent for a new instrument on fully autonomous weapons. On the contrary, the distinguishing features of fully autonomous weapons only increase the need for a ban.
Different Legal Problems
Both emerging weapons discussed in this paper threaten to contravene international humanitarian law, but the legal problems they present are not identical.
Fully autonomous weapons and blinding lasers implicate the Martens Clause in different ways. Fully autonomous weapons could undermine the principles of humanity and dictates of public conscience because the machines would determine when to take human life without meaningful human control. Blinding lasers could contravene these principles because of the “abhorrent” nature of the injuries they cause. While the effects of these weapons range from infringement of the human right to dignity to infliction of inhumane physical injury, in both cases, the concerns under the Martens Clause center around the effects of the weapons on individual victims.
These weapons also have the potential to run afoul of different legal restraints on the conduct of war. Fully autonomous weapons would face obstacles in complying with the obligation to distinguish between soldier and civilian. Blinding lasers could violate the prohibition on inflicting superfluous injury and unnecessary suffering on combatants and civilians alike. Nevertheless, some overlaps exist. Fully autonomous weapons could undercut protections for soldiers as well as civilians because they could have trouble distinguishing active combatants from wounded ones who are hors de combat. Blinding lasers were condemned in part for the threats they posed to civilians.
Although the legal problems of fully autonomous weapons and blinding lasers differ in specifics, the general parallels mean that Protocol IV is still valuable precedent for banning problematic new weapons before their first deployment.
Differences in the Character of the Weapons
Fully autonomous weapons and blinding lasers also differ in the character of their technology. “Blinding lasers” refers to a specific type of weapon, while the term “fully autonomous weapon” encompasses a broad class.
Of even greater import than the scope of these definitions is the technology’s impact on the dynamics of armed conflict. Blinding lasers, though novel, would not have been a groundbreaking addition to warfare. Fully autonomous weapons, by contrast, have the potential to revolutionize it. Due to their unprecedented lack of human control, fully autonomous weapons have been described as “the third revolution in warfare, after gunpowder and nuclear arms.”
This radical shift in military technology could have serious security, humanitarian, and legal implications. The introduction of fully autonomous weapons could lead to an arms race because most states would want to acquire the weapons to keep pace with their enemies. Such a race could in turn create significant power imbalances and challenge global peace and security. Because of the scale of the damage they could cause, fully autonomous weapons would likely inflict far more widespread humanitarian harm than lasers designed to blind individuals. Finally, fully autonomous weapons could require a reassessment of international humanitarian law, which was designed to restrain human conduct and is ill equipped to deal with the prospect of machines operating outside human control. Instead of undermining the calls for a ban, therefore, the unique qualities of fully autonomous weapons make a preemptive prohibition even more pressing.
 Future of Life Institute, “Autonomous Weapons: An Open Letter from AI & Robotics Researchers,” July 28, 2015, http://futureoflife.org/AI/open_letter_autonomous_weapons (accessed November 3, 2015).
 CCW Protocol IV on Blinding Laser Weapons, adopted October 13, 1995, entered into force July 30, 1998, art. 1.
 For example, a September 1995 report by Human Rights Watch laid out the case for the need to ban blinding lasers, which it called “a cruel and inhumane weapon.” Human Rights Watch, Blinding Laser Weapons: The Need to Ban a Cruel and Inhumane Weapon (New York: Human Rights Watch, 1995), https://www.hrw.org/reports/1995/General1.htm. See also the chronology of actions contained in Cristina Grisewood, “Limits of Lasers,” Magazine of the International Red Cross and Red Crescent Movement, 1996, http://www.redcross.int/EN/mag/magazine1996_2/18-19.html (accessed November 3, 2015).
 International Committee of the Red Cross (ICRC), Blinding Weapons: Reports of the Meetings of Experts Convened by the International Committee of the Red Cross on Battlefield Laser Weapons, 1989-1991 (Geneva: ICRC, 1993).
 Mines Action Canada has also examined how the blinding lasers protocol can inform the proposed ban on autonomous weapons. In a paper released in April 2015, Mines Action Canada identified five lessons from the blinding lasers protocol: “Pre-emptive bans of weapons are not new”; “Weapons can be banned due to the public revulsion to their use”; “Pre-emptive bans can work”; “Pre-emptive bans do not prevent the development of technology for civilian and related military application”; and “The ICRC and non-governmental organizations are valuable to the process.” Mines Action Canada, “Lessons from Protocol IV on Blinding Laser Weapons for the Current Discussions about Autonomous Weapons Systems,” April 2015, https://bankillerrobotscanada.files.wordpress.com/2015/04/international-... (accessed November 3, 2015).
 Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), adopted June 8, 1977, 1125 U.N.T.S. 3, entered into force December 7, 1978, art. 1(2).
 In re Krupp, US Military Tribunal Nuremberg, judgment of July 31, 1948, in Trials of War Criminals Before the Nuremberg Military Tribunals, vol. IX, p. 1340 (emphasis added).
 ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977 (2006), http://www.icrc.org/eng/resources/documents/publication/p0902.htm (accessed November 3, 2015), p. 17 (stating that “[c]onsideration should be given to whether the weapon accords with the principles of humanity and the dictates of public conscience,” and that “[a] weapon which is not covered by existing rules of international humanitarian law would be considered contrary to the Martens [C]lause if it is determined per se to contravene the principles of humanity or the dictates of public conscience.”).
 International Court of Justice, Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons, July 8, 1996, http://www.icj-cij.org/docket/files/95/7495.pdf (accessed November 3, 2015), para. 78.
 On this basis, any weapon conflicting with either of these standards is therefore arguably unlawful. See, for example, In re Krupp, US Military Tribunal Nuremberg, p. 1340 (asserting that the Martens Clause “is much more than a pious declaration”).
 According to this position, public conscience and the principles of humanity “serve as fundamental guidance in the interpretation of international customary or treaty rules.” Antonio Cassese, “The Martens Clause: Half a Loaf or Simply Pie in the Sky?” European Journal of International Law, vol. 11, no. 1 (2000), p. 212.
 Ibid. (“In case of doubt, international rules, in particular rules belonging to humanitarian law, must be construed so as to be consonant with general standards of humanity and the demands of public conscience.”).
 Some critics of a preemptive ban argue that existing international humanitarian law would adequately cover fully autonomous weapons, but the most relevant rules are general ones, such as those of distinction and proportionality.
 ICRC, “The Fundamental Principles of the Red Cross and Red Crescent,” ICRC Publication ref. 0513 (1996), http://www.icrc.org/eng/assets/files/other/icrc_002_0513.pdf (accessed November 3, 2015), p. 2.
 See, for example, Lt. Col. Dave Grossman, On Killing: The Psychological Cost of Learning to Kill in War and Society (New York: Little, Brown and Company, 1995), p. 4. According to Lt. Col. Grossman, “there is within man an intense resistance to killing their fellow man. A resistance so strong that, in many circumstances, soldiers on the battlefield will die before they can overcome it.” Another expert writes, “Taking away the inhibition to kill by using robots for the job could weaken the most powerful psychological and ethical restraint in war. War would be inhumanely efficient and would no longer be constrained by the natural urge of soldiers not to kill.” Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Farnham: Ashgate Publishing Limited, 2009), p. 130.
 See Theodor Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, vol. 94 (2000), pp. 83, 85, 88. According to Meron, “the Martens clause originated as supplementary or residual protection, based on the sources of morality and of law.” Ibid., p. 79 (emphasis added).
 See, for example, Human Rights Watch and Harvard Law School’s International Human Rights Clinic (IHRC), The Need for New Law to Ban Fully Autonomous Weapons: Memorandum to Convention on Conventional Weapons Delegates, November 2013, https://www.hrw.org/news/2013/11/13/need-new-law-ban-fully-autonomous-we..., p. 8.
 In its submission to the International Court of Justice for the Nuclear Weapons case, Australia, for example, argued that “[i]nternational standards of human rights must shape conceptions of humanity and have an impact on the dictates of public conscience.” Judge Weeramantry, dissenting in the same case, also highlighted the role of the human rights movement in shaping the dictates of public conscience: “The enormous developments in the field of human rights in the post-war years … must necessarily make their impact on assessments of such concepts as ‘considerations of humanity’ and ‘dictates of public conscience’.” Meron, “The Martens Clause, Principles of Humanity, and Dictates of Public Conscience,” American Journal of International Law, p. 84.
 For a more detailed analysis of the human rights implications of fully autonomous weapons, see Human Rights Watch and IHRC, Shaking the Foundations: The Human Rights Implications of Killer Robots, May 2014, https://www.hrw.org/report/2014/05/12/shaking-foundations/human-rights-i....
 ICRC, Blinding Weapons, p. 342 (emphasis in original removed).
 Ibid., p. 341.
 European Parliament, Resolution on the Failure of the International Conference on Anti-Personnel Mines and Laser Weapons, December 4, 1995, http://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:51995IP136... (accessed November 3, 2015).
 ICRC, Blinding Weapons, p. 85.
 Summary of Statement by Human Rights Watch, CCW First Review Conference, “Summary Record of the 6th Meeting,” CCW/CONF.I/SR.6, September 28, 1995, para. 60.
 Summary of Statement by the UN Development Programme, CCW First Review Conference, “Summary Record of the 5th Meeting,” CCW/CONF.I/SR.5, September 27, 1995, para. 50.
 Summary of Statement by Christoffel Blindenmission, CCW First Review Conference, “Summary Record of the 6th Meeting,” CCW/CONF.I/SR.6, September 28, 1995, para. 51.
 European Parliament, Resolution on the Failure of the International Conference on Anti-Personnel Mines and Laser Weapons, December 4, 1995.
 Summary of Statement by Chile, CCW First Review Conference, “Summary Record of the 3rd Meeting,” CCW/CONF.I/SR.3, September 26, 1995, para. 67. See also Summary of Statement by Christoffel Blindenmission, CCW First Review Conference, “Summary Record of the 6th Meeting,” CCW/CONF.I/SR.6, September 28, 1995, para. 50 (hailing the “proposed additional protocol [that] would prevent the further development, production and distribution, or at least the use, of an inhumane weapons system before it created victims in international or internal conflicts.”).
 Summary of Statement by China, CCW First Review Conference, “Summary Record of the 14th Meeting,” CCW/CONF.I/SR.13, May 3, 1996, para. 69.
 See Protocol I, arts. 48, 51(4) and (5)(b). See also ICRC, “Rule 1: The Principle of Distinction between Civilians and Combatants,” Customary International Humanitarian Law Database, http://www.icrc.org/customary-ihl/eng/docs/v1_rul_rule1 (accessed November 3, 2015).
 For more information on the challenges of distinction, see Human Rights Watch and IHRC, Losing Humanity: The Case Against Killer Robots, November 2012, https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-kille..., pp. 30-32; Human Rights Watch and IHRC, Advancing the Debate on Killer Robots: 12 Key Arguments for a Preemptive Ban on Fully Autonomous Weapons, May 2014, https://www.hrw.org/news/2014/05/13/advancing-debate-killer-robots, p. 5.
 For more information on the challenges of proportionality, see Human Rights Watch and IHRC, Losing Humanity, pp. 32-34; Human Rights Watch and IHRC, Advancing the Debate on Killer Robots, pp. 5-8.
 Human Rights Watch and IHRC, Advancing the Debate on Killer Robots, pp. 10-12.
 See, for example, Summary of Statement by Mexico, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 67 (referring to both blinding lasers and antipersonnel landmines); Summary of Statement by Austria, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 79.
 Summary of Statement by the ICRC, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 89.
 Summary of Statement by Switzerland, CCW First Review Conference, “Summary Record of the 4th Meeting,” CCW/CONF.I/SR.4, October 3, 1995, para. 73.
 Summary of Statement by Italy, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 100.
 Summary of Statement by the Czech Republic, CCW First Review Conference, “Summary Record of the 4th Meeting,” CCW/CONF.I/SR.4, September 27, 1995, para. 85.
 Summary of Statement by Austria, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 79.
 Summary of Statement by Denmark, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 73.
 Summary of Statement by Mexico, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 67.
 Summary of Statement by China, CCW First Review Conference, “Summary Record of the 8th Meeting,” CCW/CONF.I/SR.8, October 13, 1995, para. 30.
 Future of Life Institute, “Autonomous Weapons: An Open Letter from AI & Robotics Researchers,” July 28, 2015.
 UN Human Rights Council, Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, A/HRC/23/47, April 9, 2013, http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session... (accessed November 3, 2015), para. 84.
 ICRC, Blinding Weapons, pp. 327, 346.
 Summary of Statement by the ICRC, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 89. See also ICRC, “The Rationale for Considering Other Proposals Relating to the Convention and to its Existing or Future Protocols,” CCW/CONF.I/GE/9, July 1994, p. 14 (“Once mass produced, these lasers would be cheap … costing about as much as normal rifles, and the clip-on systems even cheaper.”).
 Summary of Statement by the ICRC, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 89. In 1995, Human Rights Watch reported that several countries had research and development programs devoted to laser weapons, and China and the United States were developing, in particular, laser weapons designed to injure human eyesight. Human Rights Watch, Blinding Laser Weapons.
 ICRC, Blinding Weapons, p. 327. See also Summary of Statement by the ICRC, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 89 (predicting blinding lasers “would be disseminated rapidly not only to national armies but also to terrorists and criminals”).
 ICRC, Blinding Weapons, p. 86.
 Ibid., pp. 87, 327. See also ICRC, “The Rationale for Considering Other Proposals Relating to the Convention and Its Existing or Future Protocols,” CCW/CONF.I/GE/9, July 1994, p. 13 (“The ICRC believes that the widespread introduction and use of blinding weapons on the battlefield would dramatically and unnecessarily increase both the level of long term suffering from warfare and the costs of treating casualties. The proliferation of such weapons could also have grave consequences for domestic law enforcement and anti-terrorist efforts.”).
 According to Louise Doswald-Beck, “the [ICRC] meetings of experts had given the ICRC sufficient information as to the horrific effects of blinding laser weapons both on their victims and on society and the fact that such systems, being small arms, would be likely to proliferate widely; it accordingly felt that it could mobilize sufficient international support for a legal regulation.” Louise Doswald-Beck, “New Protocol on Blinding Laser Weapons,” International Review of the Red Cross, no. 312, June 30, 1996, http://www.icrc.org/eng/resources/documents/misc/57jn4y.htm (accessed November 3, 2015).
 Summary of Statement by UN Secretary-General Boutros Boutros-Ghali, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 29, 1995, para. 5 (emphasis added).
 ICRC, Blinding Weapons, p. 346.
 Human Rights Watch and IHRC, The Need for New Law to Ban Fully Autonomous Weapons, pp. 14-15.
 Summary of Statement by India, CCW First Review Conference, “Summary Record of the 3rd Meeting,” CCW/CONF.I/SR.3, September 26, 1995, para. 24.
 Sweden, “Blinding Weapons: Explanatory Memorandum to the Proposal for a Prohibition,” CCW/CONF.I/GE/14, August 11, 1994, p. 2.
 Summary of Statement by Argentina, CCW First Review Conference, “Summary Record of the 5th Meeting,” CCW/CONF.I/SR.5, September 27, 1995, para. 48.
 Experience with the Chemical Weapons Convention and Biological Weapons Convention bolsters this conclusion.
 Human Rights Watch and IHRC, Advancing the Debate on Killer Robots, p. 26.
 Summary of Statement by Spain, speaking on behalf of the European Union, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 26, 1995, para. 16.
 Summary of Statement by the Czech Republic, CCW First Review Conference, “Summary Record of the 4th Meeting,” CCW/CONF.I/SR.4, September 27, 1995, para. 84.
 Summary of Statement by Japan, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 26, 1995, para. 59.
 Summary of Statement by the United States, CCW First Review Conference, “Summary Record of the 4th Meeting,” CCW/CONF.I/SR.4, September 27, 1995, para. 71. This statement was made shortly after the US Department of Defense had announced a new policy directive prohibiting “the use of lasers specifically designed to cause permanent blindness of unenhanced vision.” This policy reversed the US position opposing the development of a new CCW protocol to ban blinding lasers. US Secretary of Defense William Perry reportedly found little support among senior military officers for either the utility or appropriateness of blinding as a method of warfare. The policy change followed public questions and media attention generated in part by a Human Rights Watch report listing ten specific laser systems capable of antipersonnel use that it said were in various stages of development in the United States. See US Department of Defense Office of Assistant Secretary of Defense, “News Release, DoD Announces Policy on Blinding Lasers,” September 1, 1995; Ann Peters, “Blinding Laser Weapons: New Limits on the Technology of Warfare,” Loyola of Los Angeles International and Comparative Law Review, vol. 18, January 9, 1996, http://digitalcommons.lmu.edu/cgi/viewcontent.cgi?article=1397&context=ilr (accessed November 3, 2015), pp. 729-740; Human Rights Watch, U.S. Blinding Laser Weapons, vol. 7, no. 5, May 1995, https://www.hrw.org/reports/1995/Us2.htm.
 Summary of Statement by Italy, CCW First Review Conference, “Summary Record of the 2nd Meeting,” CCW/CONF.I/SR.2, September 26, 1995, para. 100.
 Sweden, “Blinding Weapons: Explanatory Memorandum to the Proposal for a Prohibition,” CCW/CONF.I/GE/14, August 11, 1994, p. 4.
 Summary of Statement by Poland, CCW First Review Conference, “Summary Record of the 4th Meeting,” CCW/CONF.I/SR.4 September 27, 1995, para. 11.
 CCW Protocol IV, art. 1.
 See, for example, Raytheon, “Paveway Laser Guided Bomb,” 2015, http://www.raytheon.com/capabilities/products/paveway/ (accessed November 4, 2015) (stating that “[t]he Paveway™ family of laser guided bombs has revolutionized tactical air-to-ground warfare by converting ‘dumb’ bombs into precision guided munitions”); Lockheed Martin, “Paveway II Plus Laser Guided Bomb (LGB),” 2015, http://www.lockheedmartin.com/us/products/pavewayIIpluslaserguidedbomb.html (accessed November 4, 2015).
 US National Library of Medicine, “Laser Eye Surgery,” undated, https://www.nlm.nih.gov/medlineplus/lasereyesurgery.html (accessed November 3, 2015).
 Future of Life Institute, “Autonomous Weapons: An Open Letter from AI & Robotics Researchers,” July 28, 2015.