On 29–30 April 2024, government officials and representatives of the UN, the ICRC, think tanks, international and regional organisations and civil society gathered in Vienna, Austria for a conference to discuss the challenges posed by autonomous weapons systems (AWS) and the development of a legally binding instrument to limit their use. The conference was organised by the Austrian Federal Ministry for European and International Affairs.

During the conference, states and civil society had the opportunity to deliver statements on the use of AWS and the need to develop legally binding rules to regulate their use. Article 36’s Elizabeth Minor gave remarks to the conference during this session, which are reproduced below.

Humanity at the Crossroads: Autonomous Weapons Systems and the Challenge of Regulation

Vienna Conference 2024

Remarks by Elizabeth Minor, 30 April

We are now at a point in the international policy conversation on autonomy in weapons systems where there is a broad shared understanding of what autonomous weapons systems are – we know what the scope for regulation should be.

We also have broad agreement around what that regulation needs to look like – a structure of specific prohibitions and positive obligations.

And, we know there is broad support for action in the international community – a majority of the world’s countries have spoken in favour of a legally binding instrument.

A treaty to prohibit and regulate autonomy in weapons systems is urgent, and it is possible – we know what we need to do.

States that are willing must now take concrete steps towards starting negotiations, in a forum that is inclusive, and where progress cannot be blocked by a highly militarised minority. In the current, polarised international environment, it is vital that countries come together across regions and alignments and build partnerships to address this issue, where there is much agreement, and where the way forward is clear – this conference has been a crucial step.

If we do not act together to set legal rules on autonomous weapons systems, norms of behaviour will still be set – by the practice of countries using these systems. We must not leave norm-setting to these states alone, or to their veto. If we do, we risk a race to the bottom, with devastating consequences for peace and security, and for civilians. We must assert the value of developing international law as a means to shape and constrain behaviour.

Reports from Gaza on Israel’s use of AI-powered target suggestion systems are already showing us how the quest for speed, the erosion of meaningful human control, and the reduction of people to data points can contribute to devastation for civilians. These developments in military AI more broadly, along with developments in other conflicts, highlight the urgency of enshrining meaningful human control in international law, and of regulating autonomous weapons systems in particular, now.

Lastly, we would like to invite states that have voiced general concerns on ethics, as well as on key issues such as bias, to recognise that it is autonomous weapons systems targeting people that raise the most fundamental ethical issues. By including a specific prohibition on systems targeting people, we can draw a key line against dehumanisation and for civilian protection – a line that many states already recognise in their policies and practices on weapons systems.

Thank you.