
Machines should not decide on life and death

This week, the United Nations in Geneva is discussing autonomous weapon systems. These are weapons that can select and attack a target without human control.

Image: GrantTurnbull/Shutterstock

We have long warned against digital dehumanisation in warfare and are present in Geneva to call on states to always require meaningful human intervention in the deployment of weapon systems.

The growing use of artificial intelligence and increasingly autonomous weapon systems are extremely worrying developments, the consequences of which we are seeing in Gaza and Ukraine.

Decisions about life and death should not be left to machines. After more than ten years of discussions about autonomous weapons systems, states must start negotiations on a treaty that requires human control over the use of force.

On 3 September, PAX employee Roos Boer issued a statement on behalf of the Stop Killer Robots campaign. You can view and read the statement below.

"[Human] control as a principle in international law is crucial for drawing red lines to protect our humanity, and for establishing clearly what we find fundamentally ethically unacceptable as an int'l community." – @rboer.bsky.social delivers #StopKillerRobots statement at #CCWUN @paxvoorvrede.nl


— Stop Killer Robots (@stopkillerrobots.bsky.social) 2 September 2025 at 16:48

Statement on behalf of the Campaign to Stop Killer Robots to the CCW GGE LAWS on Box III, 3 September 2025.

Delivered by Roos Boer (PAX).

Thank you Chair.

We welcome the progress states have made through the rolling text in their common understanding of some of the basic components of meaningful human control (or context-appropriate human control and judgement). Many of the elements in box III, paragraph 5, align with some of the basic parameters that we believe must now be fully developed into rules in a legal instrument, in order to uphold basic legal and ethical principles and ensure responsibility and accountability.

In box III, these elements include the consideration of predictability, reliability, traceability and explainability; the importance of ethical as well as legal assessments; and the placing of limitations on the types of targets, duration, geographical scope and scale of operation of weapons systems.

We recommend that all the elements in box III be retained in the text at this point, as a basis for states' subsequent and more detailed negotiations on a legally binding instrument. This particularly applies to the elements in paragraphs 5 and 6, with one exception that we will raise shortly.

We welcome the recognition by states in the rolling text of the fact that ‘context-appropriate human control and judgment’ is needed to uphold international law: the linkage between such control and fulfilling basic legal principles is important, and must be preserved in states’ discussions and negotiations.

However, we do not believe that ensuring compliance with existing law is the sole purpose of establishing international rules to ensure meaningful human control in the use of force.

Rather, enshrining such control as a principle in international law is crucial for drawing red lines to protect our humanity, and for establishing clearly what we find fundamentally ethically unacceptable as an international community when it comes to states and companies pursuing increasing autonomy in weapons systems. The pursuit of increasing autonomy represents a fundamental shift in the use of force: it raises much bigger questions than IHL compliance alone.

For this reason, we were disappointed to see the changes to paragraphs 5 and 6 in box III that remove the concept of a general prohibition on the use of systems without meaningful human control. We also regret that the current text more explicitly restricts the purpose of box III’s elements to the goal of upholding compliance with IHL.

We were also deeply concerned to see the addition of paragraph 6.D.v.

Merely limiting "real-time machine learning with regard to target selection and engagement functions" will not be sufficient to prevent a loss of adequate understanding of, and meaningful control over, such systems.

We continue to urge states to fully consider the issues raised by anti-personnel autonomous weapons systems and how these must be dealt with. We believe that a specific prohibition on systems that target people, within a wider legal instrument on autonomous weapons systems, is the only ethically and legally viable response.

Thank you Chair.
