A new generation of weapons systems with increasing levels of autonomy is being developed and deployed. This raises various legal, ethical and security concerns. To ensure meaningful human control over the use of force, an international treaty is urgently needed. It is crucial for states to take action on this matter, not only to safeguard legal and ethical norms, but also in the interest of international peace and security. Fully autonomous weapons (killer robots) are fundamentally unacceptable and should be prohibited. Other autonomous weapons should be regulated to ensure they are used in line with legal and ethical norms.
PAX first wrote on the topic of autonomous weapons in 2011. In 2013, PAX joined forces with a group of international NGOs to set up the Campaign to Stop Killer Robots. This coalition now consists of over 200 NGOs in over 60 countries. PAX has also been involved in talks with the UN in Geneva on this subject since 2014. We do research, advocacy and campaigning to work towards an international treaty that guarantees meaningful human control over the use of force.
What are autonomous weapons?
Autonomous weapons are weapons systems that detect and apply force to a target based on sensor inputs, rather than direct human inputs. After the human user activates the weapons system, there is a period of time during which the weapon can attack a target without direct human approval. This means the user does not know exactly where and when an attack will take place. This development will have an enormous effect on the way war is conducted and has been called the third revolution in warfare, after gunpowder and the atomic bomb.
An example of autonomous weapons that received a lot of international media attention was the use of the Kargu in Libya. According to a UN report, these weapons systems "were programmed to attack targets without requiring data connectivity between the operator and the munition". Loitering munitions with increasing levels of autonomy are also being used in the war in Ukraine. It is deeply concerning that these weapons are being developed and used without a clear regulatory framework. Such a framework should make clear which autonomous weapons are fundamentally unacceptable and must be prohibited, and how other autonomous weapons should be used in line with legal and ethical norms. There is therefore an urgent need for a treaty that addresses this.
What are the concerns?
Autonomous weapons without meaningful human control raise several ethical, legal, and security concerns:
- Ethical: It is unacceptable to delegate life and death decisions to a machine.
- Legal: Experts agree that the use of autonomous weapons without meaningful human control does not meet the requirements of the law of war. In addition, it is unclear who is responsible when violations of the law of war are committed.
- Security: If autonomous weapons are deployed without regulation, it will lead to more conflict and instability in the world. It will lower the threshold for warfare and lead to a new arms race. Proliferation will make these weapons readily available to a wide range of actors. There are also risks of unintended initiation and escalation of conflict by autonomous weapons ("accidental war").
Who is concerned?
UN Secretary-General António Guterres has repeatedly stated that lethal autonomous weapons are politically unacceptable and morally reprehensible, and that they should be prohibited internationally. The International Committee of the Red Cross calls on states to adopt new legally binding rules on autonomous weapon systems. Thousands of artificial intelligence scientists from around the world have warned about this development and called for a treaty. In 2018 and 2021, the European Parliament passed resolutions calling for an international treaty. Others calling for a treaty include the German Federation of Industries (BDI). An increasing number of states are calling for an international treaty.
Debate within the UN
Since 2014, autonomous weapon systems have been discussed within the UN Convention on Certain Conventional Weapons (CCW). It is now clear that a majority of states want to guarantee human control over the use of force and see the need for new regulation to safeguard this. States also agree that certain autonomous weapons should be prohibited, while the use of others must be regulated. Unfortunately, this shared ambition has not translated into significant political steps, as the CCW decides by consensus and a small minority of states has obstructed progress. There is therefore increasing attention in other international fora, such as the Human Rights Council and the UN General Assembly, to address this topic.
Technology is developing at a rapid pace, so there is an urgent need for states to work towards an international treaty.
More information
on the process at the UN Convention on Certain Conventional Weapons
on the international Campaign to Stop Killer Robots
Contact information
Daan Kayser, project leader autonomous weapons, kayser@paxforpeace.nl
- Increasing Autonomy in Weapons Systems: Ten examples of weapon systems with increasing autonomy (2021)
- Crunch Time (2018)
- Killer Robots - What are they and what are the concerns?
- Les Robots Tueurs: De quoi s'agit-il et quelles sont les préoccupations?
- Robots Asesinos: ¿Qué son y por qué resultan preocupantes?
- State of AI (2019)
- Don't be Evil - A survey of the tech sector’s stance on lethal autonomous weapons (2019)
- Convergence? (Update 2019)
- Slippery Slope: The arms industry and increasingly autonomous weapons (2019)
- Conflicted intelligence (2020)
- Save Your University from Killer Robots (2020)