“If there is not a pre-emptive ban on the high-level autonomous weapons then once the genie is out of the bottle it will be extremely difficult to get it back in.”
By Nadia Prupis / Common Dreams
The U.S. and UK are undermining United Nations efforts to negotiate the future of autonomous weapons—or “killer robots”—talks that, if delayed further, could come too late to prevent so-called “robot wars.”
Technology and human rights experts have been pushing for the UN to preemptively ban machines that can kill on the battlefield without human operators, citing a greater risk to civilian life and a broader lack of accountability for military officials. But Christof Heyns, UN special rapporteur on extrajudicial, summary, or arbitrary executions, said Tuesday that the negotiation process is in danger of getting “stuck.”
“A lot of money is going into development and people will want a return on their investment,” Heyns told the Guardian. “If there is not a pre-emptive ban on the high-level autonomous weapons then once the genie is out of the bottle it will be extremely difficult to get it back in.”
As the UN General Assembly negotiates an agreement between nations on autonomous weapons, U.S. and UK representatives are reportedly pushing for weaker rules that would only prohibit future technology, but not killer robots developed during the protracted negotiating period. Such delays would also mean that existing semi-autonomous prototypes—like the Phalanx close-in weapons system (CIWS) in the U.S., the Iron Dome in Israel, and the SGR-1 sentry robot in South Korea—would not be subject to the ban.
Proponents of killer robots say they will help reduce military casualties in war. But as a report published earlier this year by Human Rights Watch and Harvard Law School’s International Human Rights Clinic argues, such tools bring too many moral and legal risks to justify their continued development. Those risks include higher potential for violation of international law and a lack of accountability for war crimes committed by robots.
What’s more, proliferation of autonomous weapons would make a global arms race “inevitable,” experts—including physicist Stephen Hawking, Apple co-founder Steve Wozniak, and Tesla CEO Elon Musk—said in July.
Noel Sharkey, a professor of artificial intelligence and co-founder of the International Committee for Robot Arms Control, says he is deeply concerned about where the technology is headed.
“Governments,” he explained to the Guardian, “are continuing to test autonomous weapons systems, for example with the X-47B, which is a fighter jet that can fly on its own, and there are contracts already out for swarms of autonomous gunships. So if we are tied up [discussing a ban] for a long time then the word ‘emerging’ is worrying.”
“The concern that exercises me most is that people like the U.S. government keep talking about gaining a military edge,” Sharkey said. “So the talk is of using large numbers—swarms—of robots.”
If the UN is unable to close a deal on the future of autonomous weapons, countries would still have the option of crafting their own agreements, which is how the Convention on Cluster Munitions came about. But experts say it’s unlikely that major weapon-producing nations would agree to such a treaty.
As of now, only five countries—Cuba, Pakistan, Egypt, Ecuador, and the Vatican—have backed a ban on killer robots.
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 License
John Lawrence says
Humans love technology so much, and the imperative seems to be: if it’s possible, it must be developed. Will we ever voluntarily forgo a technology just because it’s in the interest of humanity in general not to develop it? About the same chance that we will go back to horses and buggies to forestall global warming.