As delivered by Dr. Elke Schwarz
Thank you, Mr Chairperson,
The International Committee for Robot Arms Control is pleased to see states move away from the use of broad, brush-stroke terms such as in-the-loop, on-the-loop, the wider loop, human oversight, and appropriate human judgement. We agree with the working paper from Estonia and Finland that complex definitions of autonomy and autonomous weapons systems are moving us in the wrong direction. As scientists we believe, following Einstein, that definitions should be as simple as possible but no simpler. In this spirit, we applaud the approach of the ICRC, which places the focus on the critical functions of target selection and the application of violent force. This focus counters concerns that a prohibition of autonomous weapons systems (AWS) would impact innovation in other civilian and non-lethal military applications.
ICRAC holds that the way forward is to focus on the meaningful human control of weapons systems. For human control to be meaningful, we need to examine how humans interact with machines and understand the types of human-machine biases that can occur in the selection of legitimate targets. Lessons should be learned from 30 years of research on human supervisory control of machinery and more than 100 years of research on the psychology of human reasoning. Together, this body of work can help us to design human-machine interfaces that allow weapons to be controlled in a manner that is fully compliant with international law and the principles of humanity.
First, there should be a focus on what the human operator MUST do in the targeting cycle. This is control in use which is governed by targeting rules under International Humanitarian Law and International Human Rights Law. Further, international law rules that apply after the use of weapons – such as those that relate to human responsibility – must be satisfied.
Second, the design of weapon systems must render them INCAPABLE of operating without meaningful human control. This is control by design, which is governed by international weapons law. Under that body of law, a weapon system that, by its design, is incapable of being sufficiently controlled is illegal per se. Systems MUST be designed to ensure human responsibility and accountability.
Ideally, the following three conditions should be met for the control of weapons systems:
- a human commander (or operator) will have full contextual and situational awareness of the target area for each and every attack, and will be able to perceive and react to any change or unanticipated situation that may have arisen since planning the attack;
- there will be active cognitive participation in every attack, with sufficient time for deliberation on the nature of any target, its significance in terms of the necessity and appropriateness of attack, and the likely incidental and possible accidental effects of the attack; and
- there will be a means for the rapid suspension or abortion of every attack.
For further details please see our guidelines for the human control of weapons systems from the April meeting this year.
While systems must be designed to ensure safety and responsibility, we should not mistake the review of weapons and good design as itself a form of human control. The responsibility to make decisions of life and death cannot be delegated to machines, nor to the review or design process of those machines.
Thank you, Mr Chairperson.