Next week, nation states from around the world will meet at the United Nations in Geneva to discuss the problems raised by Lethal Autonomous Weapons Systems (LAWS): weapons that, once activated, will select targets and attack them with violent force without human control. The Convention on Certain Conventional Weapons (CCW) will host its second expert meeting on the topic from Monday 13 to Friday 17 April.
Delegates from the International Committee for Robot Arms Control (ICRAC) will take an active role in the discussions. ICRAC is an international not-for-profit association of scientists, technologists, lawyers, and policy experts committed to the peaceful use of robotics in the service of humanity and the regulation of robot weapons.
As a founding member of the Campaign to Stop Killer Robots, ICRAC is pushing for a legally binding international treaty to prohibit the development, production, and use of LAWS. Among its members' many concerns, ICRAC is worried about the destabilising impact that LAWS will have on global security.
As Professor Lucy Suchman from Lancaster University puts it, "LAWS take the automation of weapon systems a step too far, undermining the conditions necessary for meaningful human control."
A new ICRAC information leaflet, LAWS – 10 Problems for Global Security, will be issued to delegates at the CCW to raise awareness of ten of the most serious problems for global security that the use of LAWS would create.
It concludes:
We are at a critical juncture in the evolution of weapons. The end point of increasing weapons’ automation is full autonomy, where human beings have little control over the course of conflicts and events in battle. At this point in time, it is still within our power to stop the automation of the kill decision, by ensuring that every weapon remains meaningfully controlled by humans.
Both humans and computer systems have their strengths and weaknesses, and the aim of designing effective supervisory systems for weapons control must be to exploit the strengths of both. This way, it is possible not only to achieve better legal compliance, but also to ensure that the partnership between human and machine best protects civilians, their human dignity and our wider global security.
Dr Heather Roff from the University of Denver, an invited speaker at the CCW meeting, said: "Without careful consideration of the second- and third-order effects of developing and deploying LAWS, we risk destabilizing regional and global peace and security."
These concerns were echoed by Dr Denise Garcia from Northeastern University, who said: "The dangers to global security from the future of LAWS are too frightening to contemplate."
"We are just witnessing the beginning of an automated arms race that will take battlefield decisions out of human hands," said Professor Noel Sharkey, chair of ICRAC. "There is a limited time window to stop these weapons before they proliferate widely."