Lethal autonomous weapon systems and the tech sector: some examples of best practices

August 19, 2019

The new PAX report ‘Don’t be Evil?’ highlights advances in the tech sector relevant to the potential development of lethal autonomous weapons (commonly known as ‘killer robots’), as well as links between the tech sector and the military. The report is part of the Reprogramming War project, which includes a series of four reports, each looking at which actors could be involved in the development of such weapon systems. The aim of the project is to raise awareness amongst the private sector and to work towards developing policies to ensure that any work undertaken by the private sector does not contribute to the development of these weapons.

‘Don’t be Evil?’ emphasises some concerning ties between the private sector and defence, as some companies develop technology highly relevant to killer robots and have no qualms about collaborating with the military. As part of the report, PAX surveyed 50 companies and classified 21 of them as being of ‘high concern’.

Best practice

However, some companies demonstrated high levels of awareness and are already in the process of implementing norms to ensure their technology will not be used in future killer robots. Some examples are described below.

  • Google

In 2018 Google published its AI Principles, which include a specific reference to weapon systems. They state that Google will not design or deploy AI in certain application areas, including “weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people”. The company reiterated that position in response to our survey request, further outlining that “since announcing our AI principles, we’ve established a formal review structure to assess new projects, products and deals. We’ve conducted more than 100 reviews so far, assessing the scale, severity, and likelihood of best- and worst-case scenarios for each product and deal”.

  • SoftBank

In response to our survey, Japanese tech giant SoftBank (owner of Boston Dynamics, amongst others) stated that it will not develop lethal autonomous weapons: “Our philosophy at SoftBank Corp. is to use the Information Revolution to contribute to the well-being of people and society”. The company added that they “do not have a weapons business and have no intention to develop technologies that could be used for military purposes”.

  • Animal Dynamics

The CEO of British company Animal Dynamics, Alex Caccia, stated in response to our survey that “under our company charter, and our relationship with Oxford University, we will not weaponize or provide ‘kinetic’ functionality to the products we make”. Caccia added that “halting the technical development of autonomy is futile, and would prevent the many beneficial outcomes of autonomy; however, legislating against harmful uses for autonomy is an urgent and necessary matter for government and the legislative framework to come to terms with”.

  • VisionLabs

In response to our survey, VisionLabs answered that they “do not develop or sell lethal autonomous weapons systems”. The Russian company added that they “explicitly prohibit the use of VisionLabs technology for military applications. This is a part of our contracts. We also monitor the results/final solution developed by our partners”.

The examples above demonstrate that companies can, and must, take such measures to prevent any contribution towards the development of killer robots. Indeed, we have outlined several steps that tech companies can take.

First, tech companies can publicly commit not to contribute to the development of lethal autonomous weapons. They can also establish a clear policy to that effect, which should include implementation measures such as ensuring each project is assessed by an ethics committee, assessing all technology the company develops along with its potential uses and implications, and adding a clause to contracts stating that the technology developed may not be used in lethal autonomous weapons. Tech companies should also ensure that employees are well informed about what they are working on, and allow open discussion of any related concerns.

Finally, tech workers themselves can take steps. They can voice their concerns and start internal discussions on critical issues. A useful resource for this is the website of the Campaign to Stop Killer Robots.

For more information, please visit our Reprogramming War page or the Killer Robots page on paxforpeace.nl.

Get involved with our peace work.
Subscribe to the PAX Action Alert.