The Covid-19 pandemic has presented new challenges to advancing Convention on Conventional Weapons (CCW) discussions on lethal autonomous weapons systems (LAWS), also known as fully autonomous weapons or “killer robots.” Before the global lockdown, states parties agreed to hold 20 days of Group of Governmental Experts (GGE) meetings in 2020-2021. The GGE was tasked with developing recommendations regarding a “normative and operational framework” on LAWS for the CCW’s milestone Sixth Review Conference in December 2021.[1] The Review Conference is widely regarded as the deadline for action on this urgent issue: it should adopt a mandate to negotiate a legally binding instrument on autonomous weapons systems, or states should choose another forum.
Delegates from 56 of CCW’s 125 states parties participated in the convention’s ninth meeting on LAWS from September 21-25, 2020. Many attended in person at the United Nations in Geneva, but the pandemic compelled others to join the meeting remotely. The meeting chair, Ljupcho Gjorgjinski of the Republic of North Macedonia, produced a Chairperson’s Summary, which includes his personal observations of the exchange of views and of the 30 pre-meeting written submissions from states parties and the International Committee of the Red Cross (ICRC).[2] In this report, Human Rights Watch and the Harvard Law School International Human Rights Clinic (IHRC) examine the numerous proposals for a normative and operational framework on autonomous weapons systems made at the September 2020 meeting and identify areas of convergence. States parties that favored new law generally agreed that humans must play a role in the use of force and the use of autonomous weapons systems, and they called for a combination of prohibitions and regulations to make that happen. They often recommended prohibiting weapons systems that do not allow for meaningful human control and expressed particular concern about machines that make life-and-death decisions. Numerous states called for positive obligations to ensure meaningful human control over the use of autonomous weapons systems. States parties also identified many of the same elements of human control, such as explainability, predictability and reliability, and temporal and geographic constraints. These elements closely align with those proposed by the Campaign to Stop Killer Robots in a publication prepared by Human Rights Watch and IHRC.[3]
In the months following the September 2020 meeting, Russia, which had not participated, objected to the hybrid nature of the proceedings and insisted that the meeting had “no official status.”[4] Disagreement over that assertion contributed to a last-minute postponement of the November 2020 CCW annual meeting and discussion of lethal autonomous weapons systems. A CCW program of work, 2021 meeting dates, and office holders were finally agreed on in writing in April 2021. The GGE met informally in June 2021.[5] The next—tenth—CCW meeting on killer robots is currently scheduled for August 2021 under chair Ambassador Marc Pecsteen de Buytswerve of Belgium. This report, which can inform the discussions at the upcoming GGE meeting, shows how a majority of CCW states parties agree on the need for a new treaty to address the moral, legal, technical, and security concerns raised by autonomous weapons systems. To ensure that a new treaty becomes a reality, states parties should:
- Agree at the CCW’s Sixth Review Conference to a mandate to negotiate and swiftly adopt a new legally binding instrument on autonomous weapons systems;
- If the Review Conference fails to approve such a mandate, then pursue a legally binding instrument at a forum outside of the CCW; and
- Prohibit weapons systems that select and engage targets without meaningful human control or that target humans, and adopt positive obligations to ensure all other autonomous weapons systems are used with meaningful human control.
A Legally Binding Instrument
At the September 2020 meeting, CCW states parties from Africa, Asia, Europe, the Middle East, and Latin America argued that a legally binding instrument would be the most appropriate outcome of international discussions on lethal autonomous weapons systems.[6] They described a legally binding instrument as “necessary” and a “priority”[7] and highlighted its benefits over a political declaration or a set of best practices.[8] Several states specifically noted that the CCW Guiding Principles on Lethal Autonomous Weapons Systems, adopted by states parties in 2018 and 2019,[9] were insufficient on their own.[10] These 11 broad principles restate existing international humanitarian law (IHL) and offer guidance for future discussions on LAWS, but they do not identify a clear plan for implementing policy measures. Sri Lanka said, “The Guiding Principles are not meant to be nor [are] sufficient enough to be the regulative framework that we seek to put in place to address the complex systems relating to LAWS.”[11] Algeria agreed that the Guiding Principles are “not an end in themselves” and called for a legal foundation for the principles.[12] A legally binding instrument, by contrast, would be the “strongest and most favored option” for addressing the threats posed by autonomous weapons systems.[13]
States’ Positions on Treaty Obligations
The discussions at the 2020 meeting reflected not only widespread support for a legal instrument on autonomous weapons systems but also a growing convergence on its general structure. States parties proposed a combination of prohibitions and regulations that echoed the elements of the treaty proposed by the Campaign to Stop Killer Robots as well as Human Rights Watch and IHRC.
General Obligation
CCW states parties largely agreed that human control over the use of force and weapons systems should be preserved. This position parallels the proposal for a general obligation to “maintain meaningful human control over the use of force.”[14]
The vast majority of the delegations that spoke at the September 2020 meeting recognized that humans should have some role in the use of autonomous weapons systems. The chair of the meeting summarized this part of the discussion, saying: “the use of force must reflect human agency and human intention and … the judgements required to authorize the use of armed force must be made by humans.”[15] At least 30 of the 46 states that spoke, plus the Arab Group, used the term “human control,”[16] and at least 16 of those specifically referred to “meaningful human control” as a term they or others embraced.[17] Some of these states explicitly proposed a legal obligation of human control. Austria, for example, “call[ed] for the early start of negotiations on a legally binding instrument ensuring meaningful human control over critical functions.”[18] Many others implicitly supported such an obligation by stressing the importance of human control and calling for a legally binding instrument in the same intervention. States of the Arab Group, in a statement delivered by Iraq, both “underscore[d] the importance of maintaining human control for the critical functioning of these weapons” and expressed support for a legally binding instrument.[19] Pakistan, another treaty proponent, emphasized that the only answer to ensuring compliance with international humanitarian law lies “in having meaningful human control over such weapons at all stages and times.”[20]
States parties cited several reasons for the need to maintain human control. Like Pakistan, many delegations argued that human control is essential to ensuring compliance with international law, particularly international humanitarian law. Italy, for example, explained that “human control is fundamental to ensure that all weapon systems are developed, deployed, and used in compliance with IHL.”[21] Its delegation added that “only human judgment can perform the necessary assessment relating to the application of the IHL principles in a specific environment.”[22] The Chairperson’s Summary also noted the “widely held view that IHL requires human control, involvement or judgement over weapons and the use of force.”[23] States such as Colombia and Ireland described human control as a means to ensure accountability under international law.[24] The Arab Group noted security concerns, including the likely proliferation of autonomous weapon systems and their potential use by non-state actors.[25] In addition to highlighting legal and security considerations, delegations also raised ethical and moral concerns as reasons to maintain human control. Citing a paper it submitted jointly with Austria, Belgium, Brazil, Chile, Germany, Luxembourg, Mexico, and New Zealand, Ireland explained that human control is “intrinsically linked to the important ethical and moral considerations that form part of the GGE’s work.”[26]
Delegations differed somewhat on the object of human control. Some states parties, including Argentina, Costa Rica, and South Africa, described a need for human control over both the use of force and weapons systems.[27] Others, such as Mexico and Sweden, focused on control over the use of force.[28] In Sweden’s words, “[p]reserving human control over the use of force is a key objective.”[29] The majority of delegations that addressed the issue referenced maintaining control over weapons systems. These states often articulated the need to maintain human control across the entire life cycle of weapons systems, especially the systems’ critical functions, including the selection and engagement of targets.[30]
The strong state support for a legally binding instrument that ensures meaningful human control is consistent with one of the core elements of the treaty proposed by the Campaign to Stop Killer Robots, Human Rights Watch, and IHRC: a general obligation for states parties to “maintain meaningful human control over the use of force.”[31] This obligation establishes an overarching principle that can guide interpretation of the treaty’s prohibitions and positive obligations and close any unexpected loopholes. The focus on control over conduct (“use of force”) rather than control over a specific system helps future-proof the treaty by obviating the need to foresee all possible technologies in a rapidly developing field. In addition, because the term “use of force” is used in both international humanitarian law (the laws of war) and international human rights law, the general obligation ensures that the treaty applies to situations of armed conflict and law enforcement operations.[32] Finally, regulating conduct allows the obligation to cover algorithmic decision-making throughout the targeting process, and thus reflects modern targeting practices, which are characterized by distributed decision-making across actors and technologies. While an obligation to maintain control over the use of force has the above advantages, a general obligation that guarantees meaningful human control over weapon systems would also help address the dangers posed by autonomous weapons systems.
Prohibitions
At the September 2020 meeting, states parties expressed particular consternation about systems that operate without meaningful human control or that use sensor data to target humans. Their concerns paralleled those of civil society organizations and can be addressed by prohibitions on weapons systems that pose fundamental legal or moral problems.
States parties implicitly or explicitly endorsed prohibitions on weapons systems that operate without meaningful human control. As discussed above, a number of states parties emphasized the need to maintain meaningful human control, and requiring such control is effectively equivalent to prohibiting systems that lack it. In addition, some of the 30 states that had already called for a ban on fully autonomous weapons reiterated their support.[33] An oft-quoted proposal from Chile, for example, included a prohibition on “the design, development, or deployment of weapons or weapons systems that cannot be controlled by humans.”[34] In its call for prohibitions, Sri Lanka also stressed the importance of human control. It argued “that considered and carefully-calculated decisions on distinction, proportionality, and precautions in attack—which a human mind is capable of making in a specific conflict environment—cannot be expected to be replicated by a machine.”[35] Although the Chairperson’s Summary did not take a stand on a prohibition, it similarly recognized “an emerging consensus that fully autonomous weapon systems beyond human control cannot be used in accordance with IHL.”[36]
Several delegations expressed opposition to machines making life-and-death determinations. Sri Lanka stated that even if an autonomous weapon system could make the necessary international humanitarian law judgments, its use would nonetheless “raise serious ethical and moral considerations,” challenge “fundamental principles of humanity,” and implicate aspects of international human rights law such as the right to life and the right to human dignity.[37] Sri Lanka continued: “The real issue, however, is irrespective of how precise the target may be, whether it is appropriate to leave a machine to decide on the life and death of a human being. The ethical and moral element of the debate is one of the fundamental, if not the most important aspects of this discussion.”[38] At least 13 states parties, including Sri Lanka, voiced objections to the use of autonomous weapons systems against people.[39] Taking these concerns to the logical conclusion, Chile called for a prohibition on “the design, development, or deployment of weapons or weapons systems that make life-or-death decisions.”[40] Austria declared that “humans must remain in control over decisions related to life and death.”[41]
The prohibitions that civil society organizations have proposed for the new treaty encompass the two major categories of systems that states parties expressed discomfort with. First, the treaty should ban weapons systems that “by their nature select and engage targets without meaningful human control.”[42] The prohibition should cover, for example, complex systems that, due to their machine-learning algorithms, would produce unpredictable or inexplicable effects. Second, the treaty should prohibit systems that rely on target profiles, i.e., certain types of data, such as weight, heat, or sound, that represent people or categories of people.[43] In killing or injuring people based on such data, these systems would violate human dignity and dehumanize violence. The proposed treaty elements ban both types of systems because they are by their nature legally or morally unacceptable.
Positive Obligations
In statements at the September 2020 meeting, states parties further converged around the idea of a treaty with prohibitions and regulations. The proposed treaty elements similarly complement the general obligation and prohibitions with regulations, or positive obligations, to “ensure that meaningful human control is maintained” over systems not covered by the prohibitions.[44]
Many states articulated support for restricting the development and use of autonomous weapons systems. The states of the Arab Group and the Non-Aligned Movement, for example, proposed a legally binding instrument containing both prohibitions and regulations.[45] China called for a legally binding instrument containing regulations and suggested emulating the CCW protocol that preemptively bans blinding lasers.[46] Even the United States, which did not support a legally binding instrument, envisioned some limits on how LAWS are used.[47] States cited the rapid pace of artificial intelligence development as a justification for new international regulations.[48]
States parties addressed their general visions for the positive obligations in their statements at the 2020 meeting. Austria and Costa Rica both generally called for limits on autonomy,[49] and the Arab Group stated that a legally binding instrument “should cover … restrictions on the use of LAWS.”[50] Cuba proposed a sliding-scale approach: “The greater autonomy and lethality that these machines may have, the stricter should be the regulations that we create.”[51] Argentina argued for restrictions on weapons systems’ capacities for self-learning.[52] The Chairperson’s Summary provided more details. It stated: “The ability to constrain a system through setting boundaries on, among other things, its duration of operation, range of operation and the functions that can operate autonomously, and hence determine whether the weapon-system’s use could be lawful, was considered as relevant by several delegations.” The summary also highlighted the importance of requiring an understanding of how a machine operates and of the operational environment, allowing for the ability to intervene, and keeping the application of human control temporally proximate to the attack.[53]
Positive obligations are a core part of the proposed elements for a treaty on weapons systems that select and engage targets based on sensor processing rather than human input. These obligations “ensure that meaningful human control is maintained in the use of all … systems” covered but not prohibited by the treaty.[54] They outline affirmative steps states parties would need to take for systems that are not inherently unacceptable but could still be used to select and engage targets without meaningful human control. Specific positive requirements would strengthen the treaty by regulating the use of emerging technologies in weapons not explicitly captured by its prohibitions and by being adaptable enough to address future technological developments.
States' Positions on Meaningful Human Control
The concept of meaningful human control cuts across the proposed general obligation, prohibitions, and positive obligations, and there was significant support for maintaining such control at the September 2020 meeting. Although some states parties used other terms, such as human judgment or human intervention, to refer to the human role, the figures cited above show that about two-thirds of the states that spoke at the 2020 meeting specifically referenced the importance of “human control” or “meaningful human control.”
The characteristics of human control that states identified at the 2020 meeting align with those laid out in the proposed treaty elements. They can be distilled into decision-making, technical, and operational components.[55] Although none of these components is by itself sufficient to make human control meaningful, each promotes and contributes to human control.
Decision-Making Components
The decision-making components of meaningful human control that states parties highlighted include explainability and situational awareness. Explainability requires human understanding of how the system functions, including what it might identify as a target. At the 2020 meeting, at least 12 states as well as the ICRC and the UN Institute for Disarmament Research (UNIDIR) noted the importance of this component.[56] Argentina, for example, emphasized that personnel responsible for activating and monitoring LAWS should have precise knowledge of the characteristics of the systems. It added that humans should be able to identify and explain a weapon system’s possible errors or arbitrary decisions during retrospective analysis.[57] The Chairperson’s Summary noted that “[h]uman operators, particularly in the chain of command and control, must have sufficient knowledge and understanding of a system to be confident that it will function as intended in a particular attack.”[58]
Human control also requires situational awareness, i.e., an understanding of the environment in which the use of force may take place. It cannot be pre-programmed into robots because the complexity and rapid changes of conflict make specific situations unforeseeable. Chile explained that the proportionality test requires human assessment of a specific situation and expressed concern about the inability of LAWS to make legal judgments in specific and highly dynamic contexts.[59] Spain emphasized that “situations can arise [in which] humans will be forced to take decisions in unforeseen circumstances, and we cannot leave these situations in the hands of LAWS.”[60]
Technical Components
States parties identified several technical components of human control, including: predictability and reliability; the ability of the system to relay relevant information to the human operator; and the ability of a human to intervene after the activation of the system.[61]
A large number of states parties emphasized the importance of predictability and reliability in weapons systems.[62] Austria, for example, asserted that “predictability and reliability of the weapons used are crucial for IHL compliance, as both contribute to the ability to estimate the expected effects and results of a particular use of force.”[63] Costa Rica warned about the lack of predictability in LAWS and how the limited nature of LAWS programming “leads to great uncertainty [in] the precision and functioning of these systems.”[64] The Chairperson’s Summary also repeatedly mentioned predictability as a relevant characteristic of human control and as a requirement for compliance with international humanitarian law.[65]
Some states argued that a weapon system should be able to relay relevant information to the human operator. For example, France listed maintaining sufficient communication links throughout deployment of an autonomous weapon system as one of six measures to ensure human-machine interaction.[66] Finland noted that such links help ensure weapons systems operate as intended.[67] Furthermore, in a joint paper published shortly before the 2020 meeting, the ICRC and Stockholm International Peace Research Institute stated that “a communication link sufficient to transfer raw sensor data” to human operators was one of the technical components needed to enable remote human control of autonomous weapons systems.[68]
States parties that spoke at the 2020 meeting widely viewed the ability of humans to intervene after the activation of the system as a crucial technical component of meaningful human control. They called for incorporating mechanisms that allow humans to modify mission objectives, cancel missions, or deactivate systems after deployment. Citing its joint submission with eight other states, Ireland argued that one of the objectively evaluated criteria to ensure full conformity with international law should be “whether the degree of human control allows for human supervision and intervention in order to prevent redefinition of the weapon system’s mission without human validation, and to interrupt or deactivate the carrying out of autonomous functions.”[69] Argentina emphasized that the interface between humans and machines should allow a human to intervene in or abort an operation at any point in the process.[70] The Chairperson’s Summary similarly listed “the ability of a human to deactivate or override the operation of a weapon system” as a characteristic of human-machine interaction.[71]
Although the technical components mentioned above are important elements of meaningful human control, Article 36, a disarmament organization based in the United Kingdom, underscored that states should not assume that a simple technological fix can satisfy the requirements of meaningful human control.[72] Technical components are necessary but not sufficient for meaningful human control and should work in tandem with the decision-making and operational components.
Operational Components
Finally, many CCW states parties proposed imposing operational constraints to ensure meaningful human control. Operational components of meaningful human control include time, space, and target constraints.
A large number of CCW states parties that spoke at the 2020 meeting supported temporal and geographic limits on the use of autonomous weapons systems. The list included Austria, Belgium, Costa Rica, France, Germany, Ireland, Sweden, and Switzerland.[73] Ireland, for example, noted that its submission with eight other states called for establishing “adequate environmental limits, including spatial and temporal limits” to ensure that “the decisions made at the planning stage, including legal assessments, are respected throughout the execution stage.”[74]
Numerous states called for restrictions on types of targets, specifically suggesting that autonomous weapons systems should not be allowed to select and engage humans. As discussed above, at least 13 states, including Argentina, Austria, Chile, Costa Rica, Cuba, France, Mexico, Pakistan, Spain, Sri Lanka, Sweden, Switzerland, and Turkey, expressed serious concerns over delegating life-and-death decisions to autonomous weapons systems.[75] Relying on algorithms to target people dehumanizes warfare and presents challenges for compliance with international humanitarian law’s principle of distinction. It also raises concerns about data bias. The Chairperson’s Summary, for example, explained that “data bias may have potentially negative implications for compliance with IHL … may diminish, perpetuate or amplify social biases, including gender and racial biases.”[76]
Conclusion
The September 2020 meeting on lethal autonomous weapons systems provided an important opportunity for proponents of a new treaty to articulate their support for specific components of the instrument and to identify points of convergence. The effectiveness of the process is evident in the number of written submissions that groups of states sent to the chair in June 2021 as well as the joint statements presented at the informal consultations later that month.[77] While these groups will need to reconcile the nuances of their positions, the basic elements of their proposals to prohibit and regulate autonomous weapons systems are the same and can form a solid basis for a new treaty. Identifying such areas of commonality is key to the next step in the process: adopting a negotiating mandate at the Review Conference, or, if that fails, going outside the CCW to adopt a legally binding instrument.
[1] Convention on Conventional Weapons (CCW) Meeting of High Contracting Parties, “Final Report,” CCW/MSP/2019/CRP.2/Rev.1, https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2019/hcp-meeting/documents/final-report.pdf (accessed June 30, 2021), para. 31.
[2] “Chairperson’s Summary,” CCW Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (GGE), CCW/GGE.1/2020/WP.7, April 19, 2021, https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2020/gge/documents/chair-summary.pdf (accessed June 30, 2021) (Chairperson’s Summary). The following states made submissions before the September 2020 meeting: Australia, Austria, Brazil, China, Colombia, Costa Rica, Cuba, Ecuador, Finland, France, Germany, Guatemala, Israel, Italy, Japan, Mauritius, the Netherlands, Panama, Poland, Portugal, Russia, South Africa, Spain, Sweden, Switzerland, the United Kingdom, the United States, Venezuela, and Venezuela on behalf of the Non-Aligned Movement. There was also a joint submission from Austria, Belgium, Brazil, Chile, Ireland, Germany, Luxembourg, Mexico, and New Zealand, as well as a submission from the International Committee of the Red Cross (ICRC). Ibid., p. 2. Ambassador Janis Karklins of Latvia chaired the GGE for the first half of 2020.
[3] Campaign to Stop Killer Robots, “Key Elements of a Treaty on Fully Autonomous Weapons,” November 2019, https://www.stopkillerrobots.org/wp-content/uploads/2020/04/Key-Elements-of-a-Treaty-on-Fully-Autonomous-WeaponsvAccessible.pdf (accessed July 24, 2021). See also Human Rights Watch and Harvard Law School International Human Rights Clinic (IHRC), New Weapons, Proven Precedent: Elements of and Models for a Treaty on Killer Robots, October 2020, https://www.hrw.org/report/2020/10/20/new-weapons-proven-precedent/elements-and-models-treaty-killer-robots; Bonnie Docherty, “The Need for and Elements of a New Treaty on Fully Autonomous Weapons,” in Rio Seminar on Autonomous Weapons (Brasília: Fundação Alexandre de Gusmão, 2020), http://funag.gov.br/biblioteca/download/laws_digital.pdf (accessed July 24, 2021), pp. 223-234.
[4] Russian Federation, “The Position on the Status of the Meetings in 2020,” CCW/2020/2, April 13, 2021, https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2021/gge/documents/position-meetings-russia.pdf (accessed June 30, 2021).
[5] In written submissions and statements made at the informal meeting in June 2021, several states made proposals for a new treaty that were consistent with, but even stronger and more specific than, those made in September 2020. See, for example, Joint Working Paper submitted by Costa Rica, Panama, Peru, the Philippines, Sierra Leone, and Uruguay, June 2021, https://documents.unoda.org/wp-content/uploads/2021/06/Costa-Rica-Panama-Peru-the-Philippines-Sierra-Leone-and-Uruguay.pdf (accessed June 30, 2021); Joint Working Paper submitted by Brazil, Chile, and Mexico, “Elements for a Future Normative Framework Conducive to a Legally Binding Instrument to Address the Ethical Humanitarian and Legal Concerns Posed by Emerging Technologies in the Area of (Lethal) Autonomous Weapons (LAWS),” June 2021, https://documents.unoda.org/wp-content/uploads/2021/06/Brazil-Chile-Mexico.pdf (accessed June 30, 2021).
[6] See, for example, statement of Austria, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lOs145-36t150-35.mp3 (accessed June 30, 2021); statements of Algeria, Brazil, Pakistan, Peru, the Philippines, and Sri Lanka, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statement of Iraq, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E5FE0F1C-9F68-4E0A-9809-B1D049DF26CE_10h01/chunks/snippet_lEs60-26t61-59.mp3 (accessed June 30, 2021).
[7] Statements of Cuba and Peru, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC).
[8] Statements of Chile and Pakistan, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statement of Venezuela, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC).
[9] Chairperson’s Summary, “Guiding Principles,” Annex I, p. 13.
[10] See, for example, statements of Algeria, Pakistan, and Sri Lanka, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statement of Venezuela, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC).
[11] Statement of Sri Lanka, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020, https://conf.unog.ch/dr/public/61.0500/BC0F8FB7-5F42-4E59-8A47-F201A18A87D2_10h18/chunks/snippet_lOs114-53t119-46.mp3 (accessed June 30, 2021).
[12] Statement of Algeria, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/A8249AAC-C92C-41A8-BC92-6CA307B3A798_10h15/chunks/snippet_lEs108-12t112-25.mp3 (accessed June 30, 2021).
[13] Statement of Pakistan, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs17-38t23-38.mp3 (accessed June 30, 2021).
[14] Campaign to Stop Killer Robots, “Key Elements of a Treaty on Fully Autonomous Weapons,” November 2019, p. 2.
[15] Chairperson’s Summary, p. 8, para. 27.
[16] Statements of Argentina, Costa Rica, Cuba, Egypt, Ecuador, France, Germany, Jordan, Mexico, the Netherlands, Peru, Spain, Turkey, and the Arab Group (delivered by Iraq), CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statements of Austria, Belgium, Colombia, Italy, New Zealand, Norway, Pakistan, and Venezuela, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020 (notes by Human Rights Watch and IHRC); statements of Algeria, Brazil, Chile, India, Ireland, Japan, Sweden, Switzerland, and South Africa, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC).
[17] Statements of Argentina, Costa Rica, Mexico, the Netherlands, Peru, and Spain, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statements of Austria, Belgium, New Zealand, Norway, and Pakistan, CCW GGE meeting on lethal autonomous weapons systems, Geneva, September 22, 2020 (notes by Human Rights Watch and IHRC); statements of Brazil, India, Ireland, Japan, and South Africa, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC).
[18] Statement of Austria, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs145-36t150-35.mp3 (accessed June 30, 2021). See also statement of Peru, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs140-31t145-25.mp3 (accessed June 30, 2021) (“Peru reiterates the importance to start negotiations on a legally binding international instrument that will ban and regulate the development and deployment and use of these weapons to guarantee meaningful human control.…”); statement of Costa Rica, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020, https://conf.unog.ch/dr/public/61.0500/3F0FA712-8D86-48A0-8A65-C8822662F685_15h07/chunks/snippet_lEs37-51t43-27.mp3 (accessed June 30, 2021) (“Those options cannot substitute an internationally legally binding agreement that would stipulate prohibitions and regulations on autonomous weapon systems, limiting the autonomy and maintaining significant human control.”).
[19] Statement of the Arab Group, delivered by Iraq, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs94-57t98-37.mp3 (accessed June 30, 2021).
[20] Statement of Pakistan, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E5FE0F1C-9F68-4E0A-9809-B1D049DF26CE_10h01/chunks/snippet_lEs66-09t71-55.mp3 (accessed June 30, 2021).
[21] Statement of Italy, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E5FE0F1C-9F68-4E0A-9809-B1D049DF26CE_10h01/chunks/snippet_lEs42-05t45-23.mp3 (accessed June 30, 2021).
[22] Ibid.
[23] Chairperson’s Summary, p. 4, para. 7.
[24] Statement of Colombia, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E5FE0F1C-9F68-4E0A-9809-B1D049DF26CE_10h01/chunks/snippet_lEs12-11t14-41.mp3 (accessed June 30, 2021); statement of Ireland, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E2E48F3A-D5D7-4C67-9D83-DFF8B4933E2A_15h10/chunks/snippet_lEs64-39t70-26.mp3 (accessed June 30, 2021).
[25] Statement of the Arab Group, delivered by Iraq, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020.
[26] Statement of Ireland, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs102-14t107-51.mp3 (accessed June 30, 2021).
[27] Statements of Argentina, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs129-47t132-52.mp3 (accessed June 30, 2021); statement of Argentina, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs7-49t12-41.mp3 (accessed June 30, 2021); statement of Costa Rica, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs116-52t122-36.mp3 (accessed June 30, 2021); statement of South Africa, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020, https://conf.unog.ch/dr/public/61.0500/3F0FA712-8D86-48A0-8A65-C8822662F685_15h07/chunks/snippet_lEs21-54t27-55.mp3 (accessed June 30, 2021).
[28] Statement of Mexico, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs162-28t167-24.mp3 (accessed June 30, 2021); statement of Sweden, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs1-42t5-47.mp3 (accessed June 30, 2021).
[29] Statement of Sweden, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020.
[30] Statements of Australia, Belgium, Jordan, and Spain, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statement of Venezuela, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020 (notes by Human Rights Watch and IHRC); statement of Spain, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC).
[31] Campaign to Stop Killer Robots, “Key Elements of a Treaty on Fully Autonomous Weapons,” November 2019, p. 2.
[32] Although international humanitarian law and international human rights law govern the use of force in somewhat different ways, the new treaty can take such differences into account.
[33] Campaign to Stop Killer Robots, “Country Positions on Negotiating a Treaty to Ban and Restrict Killer Robots,” September 2020, https://www.stopkillerrobots.org/wp-content/uploads/2020/05/KRC_CountryViews_25Sep2020.pdf (accessed June 29, 2021).
[34] Statement of Chile, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/17F25398-AD13-4F73-865A-905D901B7737_15h04/chunks/snippet_lEs39-48t45-35.mp3 (accessed June 30, 2021).
[35] Statement of Sri Lanka, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs170-59t174-48.mp3 (accessed June 30, 2021).
[36] Chairperson’s Summary, p. 4, para. 7.
[37] Statement of Sri Lanka, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020. Sri Lanka also noted a further reason to pay attention to international human rights law: the “use of autonomous technology in civil operations outside the conflict environment is a possibility that cannot be ruled out.”
[38] Ibid.
[39] Statements of Cuba, Mexico, and Turkey, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statements of Austria, France, Pakistan, Sweden, and Switzerland, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC); statements of Argentina, Costa Rica, and Spain, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC). See also statement of ICRC, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC).
[40] Statement of Chile, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/17F25398-AD13-4F73-865A-905D901B7737_15h04/chunks/snippet_lEs39-48t45-35.mp3 (accessed June 30, 2021).
[41] Statement of Austria, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs47-56t55-45.mp3 (accessed June 30, 2021).
[42] Campaign to Stop Killer Robots, “Key Elements of a Treaty on Fully Autonomous Weapons,” November 2019, p. 6.
[43] Richard Moyes, Article 36, “Target Profiles,” August 2019, http://www.article36.org/wp-content/uploads/2019/08/Target-profiles.pdf (accessed September 8, 2020), p. 3.
[44] Campaign to Stop Killer Robots, “Key Elements of a Treaty on Fully Autonomous Weapons,” November 2019, p. 2.
[45] Statement of the Arab Group, delivered by Iraq, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020; “Working paper by the Bolivarian Republic of Venezuela on behalf of the Non-Aligned Movement (NAM) and Other States Parties to the Convention on Certain Conventional Weapons (CCW),” CCW/GGE.1/2020/WP.5, September 14, 2020, https://undocs.org/CCW/GGE.1/2020/WP.5 (accessed June 30, 2021), pp. 2-3.
[46] China stated that preventative measures like regulations were necessary to be prepared for the humanitarian, legal, and ethical repercussions of LAWS. Statement of China, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs87-41t91-34.mp3 (accessed June 30, 2021).
[47] Statement of the United States, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E5FE0F1C-9F68-4E0A-9809-B1D049DF26CE_10h01/chunks/snippet_lEs29-15t32-44.mp3 (accessed June 30, 2021).
[48] Pakistan described lethal autonomous weapons as “a unique and novel class of weapons” and said, “Rapid advances in the field of artificial intelligence need to be appropriately regulated in all [their] dimensions with respect to LAWS. They should not outpace the evolution of regulations governing them.” Statement of Pakistan, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020. Similarly concerned with the pace of technology development, Colombia said, “[W]e are convinced that regulation is essential to be able to move forward with peace of mind…. [Autonomous weapons systems] must have a legally binding framework before that type of weapon can be rolled out.” Statement of Colombia, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020.
[49] Statement of Austria, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs47-56t55-45.mp3 (accessed June 30, 2021); statement of Costa Rica, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020, https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2020/gge/statements/24Sept_Costa-Rica.pdf (accessed June 30, 2021).
[50] Statement of the Arab Group, delivered by Iraq, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020.
[51] Statement of Cuba, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020, https://conf.unog.ch/dr/public/61.0500/0A90EB8D-23C3-47F2-8E27-FD27E45BF17F_15h13/chunks/snippet_lEs23-59t29-57.mp3 (accessed June 30, 2021).
[52] Statement of Argentina, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020, https://conf.unog.ch/dr/public/61.0500/BC0F8FB7-5F42-4E59-8A47-F201A18A87D2_10h18/chunks/snippet_lEs110-15t114-46.mp3 (accessed June 30, 2021).
[53] Chairperson’s Summary, p. 9, para. 30.
[54] Campaign to Stop Killer Robots, “Key Elements of a Treaty on Fully Autonomous Weapons,” November 2019, p. 2.
[55] Ibid., p. 4.
[56] Statements of Germany and Spain, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statements of Finland, Mexico, and the Republic of Korea, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020 (notes by Human Rights Watch and IHRC); statements of Argentina, Cuba, France, ICRC, India, South Africa, Sweden, and UNIDIR, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC); statement of Pakistan, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC).
[57] Statement of Argentina, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs7-49t12-41.mp3 (accessed June 30, 2021).
[58] Chairperson’s Summary, p. 8, para. 27.
[59] Statement of Chile, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E5FE0F1C-9F68-4E0A-9809-B1D049DF26CE_10h01/chunks/snippet_lEs14-49t14-59.mp3 (accessed June 30, 2021).
[60] Statement of Spain, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020, https://conf.unog.ch/dr/public/61.0500/BC0F8FB7-5F42-4E59-8A47-F201A18A87D2_10h18/chunks/snippet_lEs44-52t49-18.mp3 (accessed June 30, 2021).
[61] Campaign to Stop Killer Robots, “Key Elements of a Treaty on Fully Autonomous Weapons,” November 2019, p. 4.
[62] See, for example, statement of Belgium, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statements of Austria, Germany, Norway, and the Republic of Korea, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020 (notes by Human Rights Watch and IHRC); statement of Ireland, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC); statements of Brazil and Costa Rica, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC).
[63] Statement of Austria, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E5FE0F1C-9F68-4E0A-9809-B1D049DF26CE_10h01/chunks/snippet_lEs105-20t112-18.mp3 (accessed June 30, 2021).
[64] Statement of Costa Rica, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020, https://conf.unog.ch/dr/public/61.0500/BC0F8FB7-5F42-4E59-8A47-F201A18A87D2_10h18/chunks/snippet_lEs15-39t15-43.mp3 (accessed June 30, 2021).
[65] Chairperson’s Summary, paras. 1, 20, 22, 25(a) and 29.
[66] Statement of France, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs84-24t93-44.mp3 (accessed June 30, 2021).
[67] Statement of Finland, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020, https://conf.unog.ch/dr/public/61.0500/E2E48F3A-D5D7-4C67-9D83-DFF8B4933E2A_15h10/chunks/snippet_lEs133-05t140-55.mp3 (accessed June 30, 2021).
[68] Vincent Boulanin, Neil Davison, Netta Goussac, and Moa Peldán Carlsson, “Limits on Autonomy in Weapon Systems: Identifying Practical Elements of Human Control,” ICRC and SIPRI, June 2, 2020, https://www.icrc.org/en/document/limits-autonomous-weapons (accessed June 30, 2021).
[69] Statement of Ireland, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs102-14t107-51.mp3 (accessed June 30, 2021).
[70] Statement of Argentina, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs7-49t12-41.mp3 (accessed June 30, 2021).
[71] Chairperson’s Summary, p. 8, para. 29. See also statement of ICRC, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/881D2B3C-1407-49BA-AE82-1914CA10D467_10h10/chunks/snippet_lEs55-58t66-11.mp3 (accessed June 30, 2021).
[72] Statement of Article 36, CCW GGE meeting on lethal autonomous weapons systems, September 25, 2020, https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2020/gge/statements/25Sept_Article36.pdf (accessed June 30, 2021).
[73] Statements of Belgium, Costa Rica, France, and Germany, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statement of Austria, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020 (notes by Human Rights Watch and IHRC); statements of Ireland, Sweden, and Switzerland, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC).
[74] Statement of Ireland, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020, https://conf.unog.ch/dr/public/61.0500/17F25398-AD13-4F73-865A-905D901B7737_15h04/chunks/snippet_lEs50-30t54-46.mp3 (accessed June 30, 2021). See also Chairperson’s Summary, p. 9, para. 30.
[75] Statements of Austria and Chile, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC); statements of Cuba, Mexico, Sri Lanka, and Turkey, CCW GGE meeting on lethal autonomous weapons systems, September 21, 2020 (notes by Human Rights Watch and IHRC); statement of Chile, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020 (notes by Human Rights Watch and IHRC); statements of France, Pakistan, Sweden, and Switzerland, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC); statements of Argentina, Costa Rica, and Spain, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC). See also statement of ICRC, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC).
[76] Chairperson’s Summary, p. 21, para. 27. See also statement of Ireland, CCW GGE meeting on lethal autonomous weapons systems, September 22, 2020 (notes by Human Rights Watch and IHRC); statements of Belgium and South Africa, CCW GGE meeting on lethal autonomous weapons systems, September 23, 2020 (notes by Human Rights Watch and IHRC); statements of Costa Rica and Pakistan, CCW GGE meeting on lethal autonomous weapons systems, September 24, 2020 (notes by Human Rights Watch and IHRC) (raising concerns about gender bias resulting from algorithms).
[77] Examples of group statements include: Joint Working Paper submitted by Costa Rica, Panama, Peru, the Philippines, Sierra Leone, and Uruguay, June 2021, https://documents.unoda.org/wp-content/uploads/2021/06/Costa-Rica-Panama-Peru-the-Philippines-Sierra-Leone-and-Uruguay.pdf (accessed June 30, 2021); Joint Working Paper submitted by Brazil, Chile, and Mexico, “Elements for a Future Normative Framework Conducive to a Legally Binding Instrument to Address the Ethical Humanitarian and Legal Concerns Posed by Emerging Technologies in the Area of (Lethal) Autonomous Weapons (LAWS),” June 2021, https://documents.unoda.org/wp-content/uploads/2021/06/Brazil-Chile-Mexico.pdf (accessed June 30, 2021); Joint Working Paper submitted by France and Germany, “Outline for a normative and operational framework on emerging technologies in the area of LAWS,” June 2021, https://documents.unoda.org/wp-content/uploads/2021/06/France-and-Germany.pdf (accessed June 30, 2021); Joint Working Paper submitted by Australia, Canada, Japan, the United Kingdom, and the United States, June 2021, https://documents.unoda.org/wp-content/uploads/2021/06/Australia-Canada-Japan-United-Kingdom-United-States.pdf (accessed June 30, 2021); Joint Working Paper submitted by Austria, Brazil, Chile, Ireland, Luxembourg, Mexico, and New Zealand, “Joint Submission on possible consensus recommendations in relation to the clarification, consideration and development of aspects of the normative and operational framework on emerging technologies in the area of lethal autonomous weapons systems,” June 2021, https://documents.unoda.org/wp-content/uploads/2021/06/Austria-Brazil-Chile-Ireland-Luxembourg-Mexico-and-New-Zealand.pdf (accessed June 30, 2021).