Building towards a common solution: Article 36 statement to talks on autonomous weapons
By Elizabeth Minor
Though disagreements over the format and the possibilities for participating in the meeting looked like they might stop it going ahead, states gathered this week in Geneva (and online) for five days of meetings on autonomous weapons under the Convention on Certain Conventional Weapons. There is no consensus within this forum to negotiate a treaty on autonomous weapons – but we are seeing significant positive developments in the substantive contributions being made by many states, which can provide the building blocks towards international regulation.
Article 36 was participating in the conference remotely from London via the UN’s “Interprefy” online platform. We made a statement under the final agenda item of the conference – on recommendations for developing a normative and operational framework on “emerging technologies in the area of lethal autonomous weapons systems” – setting out how we see the current landscape and ways forward.
It can be downloaded here or read below:
Article 36 statement to item f, CCW GGE meeting on LAWS 21-25 September 2020
25 September 2020
Delivered by Elizabeth Minor via Interprefy
Thank you chair and thank you for the circulation of the non-paper.
Across statements this week, as well as in states’ commentary papers, we have seen many substantive contributions that reflect different elements of the structure for legally binding international regulation we think would be most effective for addressing autonomous weapons. There is an opportunity to develop and bring together this content from states towards a strong, effective framework that prohibits certain system configurations under the scope of our discussion, and obligates states to ensure meaningful human control over the rest:
Firstly, a number of countries have expressed opposition to human life and death “decisions” being carried out by machines, or have suggested restrictions could be made on the types of targets systems could apply force to. Such positions should be further explored towards developing a prohibition on the targeting of people using sensor-based systems, which we see as a key boundary line.
Secondly, towards another key boundary: several countries have emphasised the need for the users of weapon systems to understand how these will function in practice, with some linking this explicitly to legal compliance, and others raising concerns at systems that might “evolve”. We see here the need to prohibit systems whose complexity of functioning means that their effects cannot be sufficiently predicted, foreseen or understood by their operators: we must prohibit systems that cannot be sufficiently controlled.
Finally, for all other systems within the scope of this discussion, we must ensure meaningful human control. We appreciate the recognition – also referenced in the chair’s non-paper – that the core area for states’ work is to determine the elements of human control necessary to ensure compliance with the law and ethical principles. As some countries have observed this week, there are many commonalities between states’ approaches in elaborating the elements of human control that should be in place – whatever their approach to regulation. Many states have raised applying temporal and spatial limits to the use of systems, controls on the contexts in which systems could be used, or what target profiles they can use, for example. Positive obligations to ensure that these factors are controlled should sit alongside the prohibitions we have outlined.
A key issue for us is to build recognition amongst the group that regulation should be applied to the broad scope of weapons technologies that are under discussion here – those that apply force through matching sensor inputs to a “target profile” of characteristics following a system’s activation, emplacement, or deployment. We believe the scope for regulation should not be narrowed at this point. We would also like to emphasise that in building a concept of control we must focus on human action, rather than the sometimes merely imagined technical fixes that might enhance it.
The content presented by states up until now provides useful building blocks towards an effective international framework for regulation to address the systems and concerns under discussion at the GGE. It is now up to all involved to take this substance forward – to outline a common solution.
Whilst the group as a whole may not share the commonalities or common positions many have identified this week – and notwithstanding the agenda and structure of work adopted by the group as a whole – we believe there is space for states that do find commonalities in each other’s positions to work together to build these shared understandings across all interconnected aspects of the legal, technical, operational and ethical landscape – for example through further collectively endorsed submissions to the conference.
There is an opportunity, now, for us to work collectively to reject the automation of killing and resist the erosion of human dignity and responsibility that developments in this area risk – and instead, to ensure meaningful human control over the technological tools that human beings create. We also believe that the inclusion of diverse perspectives and diverse representation will be crucial to this goal.
Thank you chair.
Image: Article 36 gives a statement to the CCW in September via a remote connection. Still from UN Web TV