UN to hear call for ban on ‘killer robots’
Machines have potential to attack without a human pulling the trigger
Killer robots are seen as the next step on from drones, unmanned aerial vehicles intended initially for surveillance but adapted for carrying out targeted killings. Pictured is an X-47B pilot-less drone combat aircraft flying over the aircraft carrier, the USS George H. W. Bush. Photograph: Jason Reed/Reuters.
“Killer robots” that could attack targets autonomously without a human pulling the trigger pose a threat to international stability and should be banned before they come into existence, the United Nations is expected to hear today.
Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, will address the UN Human Rights Council in Geneva today and call for a worldwide moratorium on what he calls “lethal autonomous robotics” - weapons systems that, once activated, can lock on and kill targets without further involvement of human handlers.
“Machines lack morality and mortality, and as a result should not have life and death powers over humans,” Mr Heyns will say.
Mr Heyns’s call for a moratorium draws the UN into the realms of sci-fi: fully autonomous weapons have not yet been developed, and exist only in the imaginations of military planners. However, experts in warfare technologies warn that the world’s leading military powers are moving so rapidly in this direction that a pre-emptive ban is essential.
“States are working towards greater and greater autonomy in weapons, and the potential is there for such technologies to be developed in the next 10 or 20 years,” said Bonnie Docherty of Harvard law school’s International Human Rights Clinic, who co-authored a report on the subject with Human Rights Watch.
In his submission to the UN, Mr Heyns points to the experience of drones. Unmanned aerial vehicles were intended initially only for surveillance, and their use for offensive purposes was prohibited, yet once strategists realised their perceived advantages as a means of carrying out targeted killings, all objections were swept aside.
Drone technology has already moved a step closer to a fully autonomous state in the form of the X-47B, a super-charged UAV developed by the US Navy that can fly itself, and which last week completed the first takeoff from an aircraft carrier. The drone is billed as a non-combat craft, yet its design includes two weapons bays capable of carrying more than 4,000lbs.
Britain is developing its own next generation of drone, known as Taranis, that can be sent to tackle targets at long range and can defend itself from enemy aircraft. Like the X-47B, it has two in-built weapons bays, though it is currently unarmed.
Apart from drones, several states are known to be actively exploring the possibility of autonomous weapons operating on the ground. South Korea has set up sentry robots known as SGR-1 along the Demilitarised Zone with North Korea that can detect people entering the zone through heat and motion sensors; though the sentry is currently configured so that it has to be operated by a human, it is reported to have an automatic mode, which, if deployed, would allow it to fire independently on intruders.
Steve Goose, Human Rights Watch’s arms director, said it was undeniable that “modern militaries are looking to develop autonomous weapons. The question is how far that push for autonomy will go.”
Given its dominance as the world’s leading military power, the US is likely to set the pace. According to Human Rights Watch, the Pentagon is spending about $6 billion a year on research and development of unmanned systems, though in a directive adopted last November it said that fully autonomous weapons could only be used “to apply non-lethal, non-kinetic force, such as some forms of electronic attack”.
The key issue identified by Mr Heyns in his UN submission is whether future weapons systems will be allowed to make the decision to kill autonomously, without human intervention.