Thank you, Mr. Chairman.

It is appropriate and useful that this issue made big headlines last week, just before this meeting.

First, following the announcement that KAIST, a major and highly respected Korean research institute, would collaborate on artificial intelligence projects with a major Korean arms manufacturer, a large number of leading AI experts worldwide told the institute they would boycott all interaction with it if it engaged in the development of lethal autonomous weapons systems. KAIST immediately clarified that it would not work toward lethal autonomous weapons systems, and that it rejected them on ethical and other grounds.

Second, thousands of Google employees joined together to object to a new AI project with the US military. Google has been asked to clarify whether this project could lead to lethal autonomous weapons systems, also known as fully autonomous weapons.

These are the latest examples of the revulsion and repugnance that fully autonomous weapons engender. We ask that you keep this basic revulsion at the front of your mind as you engage in sometimes complicated technical discussions this week.

There are many reasons to reject lethal autonomous weapons systems (including legal, accountability, technical, operational, proliferation, and international security concerns), but ethical and moral concerns—which generate the sense of revulsion—trump all.

These ethical concerns should compel High Contracting Parties of the Convention on Conventional Weapons to take into account the Martens Clause in international humanitarian law, under which weapons that run counter to the principles of humanity and the dictates of the public conscience should not be developed.

High Contracting Parties of the Convention on Conventional Weapons should codify that with respect to lethal autonomous weapons systems.

We are pleased to hear today so much support for a legally binding instrument on lethal autonomous weapons systems, for a preemptive prohibition, and for the necessity of meaningful human control over lethal autonomous weapons systems. For the Campaign to Stop Killer Robots, a requirement for meaningful human control of lethal autonomous weapons systems is the same as a prohibition on their development, production, and use; the two are sides of the same coin, provided the human control is truly meaningful.

There is clearly growing support for a prohibition. In fact, more states have spoken in favor of a legally binding instrument and a ban today than for any other concrete outcome. We warmly welcome the statements earlier today by the Africa Group on Disarmament and by Austria that they support a ban, adding them to the ever longer list of supporters.

The need for a ban grows more urgent by the day as technological developments race ahead. This is the fifth year of Convention on Conventional Weapons discussions, and the pace has not been impressive.

It is notable that some of those preaching patience, prudence, and caution are those who are rushing to develop and field weapons with ever-greater levels of autonomy.

We believe that based on the work of the two Group of Governmental Experts sessions this year, as well as other initiatives, High Contracting Parties of the Convention on Conventional Weapons should agree to a mandate in November to begin formal negotiations on a new protocol, and should conclude the negotiations by the end of 2019, with a new Protocol VI preemptively banning fully autonomous weapons.

Thank you.