26.08.24
No reason to delay negotiations: statement to the CCW GGE on autonomous weapons
By Elizabeth Minor
At the August session of the Convention on Conventional Weapons’ Group of Governmental Experts on ‘Lethal Autonomous Weapons Systems’, whose current mandate is to formulate “by consensus, a set of elements of an instrument, without prejudging its nature” on autonomous weapons systems, Article 36 gave the following general statement on 26 August 2024, delivered by our Advisor, Elizabeth Minor:
Article 36 thanks the Chair for the rolling text, which brings together points that many states have agreed on for some time now.
This text provides a good summary of the key obligations needed in order to preserve meaningful human control. Many key rules for regulating autonomous weapons systems in armed conflict could be developed from the common understandings presented in this document. These legal rules need not be lengthy, nor far from the content already here.
We appreciate that the text is succinct.
There are nevertheless fundamental elements that are missing from it, which states must consider in negotiating a legally binding instrument. For example:
- The need for users to have a functional understanding of systems and their context of operation is essential for ensuring human control – this requirement would inform and shape processes of design and training.
- Autonomous weapons systems targeting people are a central concern. They raise the most fundamental ethical issues as well as severe legal challenges. Antipersonnel autonomous weapons systems should be prohibited. This issue is completely missing from the text circulated here.
- Likewise, there is no engagement with human rights law considerations in this text. Ensuring that rules agreed are sufficient from a human rights perspective, including in circumstances outside of armed conflict, is crucial.
But – there is now broad support for action amongst the international community, with growing momentum from recent regional and international conferences, wide engagement with the UN Secretary-General’s report on autonomous weapons systems, and a majority of states supporting new law.
And the urgency to act is greater than ever, with advancing weapons development; reported use of autonomous weapons systems in armed conflicts such as in Ukraine; and reports of wider AI systems, such as the targeting tools used in Gaza, showing how the erosion of meaningful human control and the reduction of people to data points already contribute to civilian harm.
Negotiations on a treaty on autonomous weapons systems must be the next step for states – and there is no reason to delay any further.
States that have recognised the value of legal regulation must start to negotiate now in a forum that can consider all aspects of the issue; where all states can participate on an equal basis; and where progress cannot be blocked. Unfortunately, given our long engagement with this forum and its working procedures, we expect that many of the useful parts of this rolling text will not survive the week, and that agreement on an instrument will not be reached in the CCW.
For all these reasons, we urge states to use all available forums in order to negotiate this legally binding instrument, noting the opportunity provided by considerable interest in this issue at the UN General Assembly, and that forum’s inclusivity of all states.
Thank you, Chair.
Article 36 also intervened on 27 August with the following intervention on discussions on the ‘working characterisation’ of autonomous weapon systems:
Firstly, we welcome the working characterisation, which we believe reflects the broad convergence building on how autonomous weapons systems should be understood, and the key features from which concerns arise.
As part of this common understanding, we agree with many delegations that “lethal” is an unnecessary qualifier, and we also agree with including a clarification regarding input after the activation of a weapons system, to avoid bad faith interpretations.
We recognise the text’s clarification that systems which are not weapons systems are not autonomous weapons systems for the purposes of our common understanding – some delegations have, for instance, given the example of ‘decision support systems’ as a set of tools that fall outside the boundary of autonomous weapons systems.
As many of us noted yesterday, reports indicate that the use of these wider military AI systems, which do not fall within the scope of autonomous weapons systems, is already contributing to serious civilian harm, through the erosion of meaningful human decision-making and control in the use of force, and the reduction of people to data points.
I therefore have a question and challenge for those states that have said they are looking to develop norms around the “responsible use” of military AI in other forums. In ongoing and upcoming discussions, including those on the US-led political declaration and the REAIM conference to be hosted by the Republic of Korea, will states tell us what the criteria are for responsible use, and can they answer, in those forums’ discussions, whether the current direction of practice – which is not limited to one state – is indeed responsible and acceptable? If these forums cannot adequately address these questions, further responses from the international community on these wider tools will clearly be necessary.
Thank you, Chair.
Featured image: UN Web TV