Experts warn of killer robots arms race

Tech leaders, scientists call for ban on autonomous weapons

Tech leaders and scientists including Elon Musk, Stephen Hawking and Steve Wozniak warn that the deployment of robots capable of killing while untethered to human operators is “feasible within years, not decades.”

Elon Musk and Stephen Hawking, along with hundreds of artificial intelligence researchers and experts, are calling for a worldwide ban on so-called autonomous weapons, warning that they could set off a revolution in weaponry comparable to gunpowder and nuclear arms.

In a letter unveiled as researchers gathered at the International Joint Conference on Artificial Intelligence in Buenos Aires, Argentina, on Monday, the signatories argued that the deployment of robots capable of killing while untethered to human operators is “feasible within years, not decades.” If development is not cut off, it is only a matter of time before the weapons end up in the hands of terrorists and warlords, they said.

Unlike drones, which require a person to remotely pilot the craft and make targeting decisions, the autonomous weapons would search for and engage targets on their own. Unlike nuclear weapons, they could be made with raw materials that all significant military powers could afford and obtain, making them easier to mass-produce, the authors argued.

The weapons could reduce military casualties by keeping human soldiers off battlefields, but they could also lower the threshold for going to battle, the letter said. “If any major military power pushes ahead with A.I. weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow,” it said.

Musk, the head of SpaceX, has raised warnings about artificial intelligence before, calling it probably humanity’s “biggest existential threat.” Hawking, the physicist, has written that while development of artificial intelligence could be the biggest event in human history, “Unfortunately, it might also be the last.”

The letter said artificial intelligence “has great potential to benefit humanity in many ways.” Proponents have predicted applications in fighting disease, mitigating poverty and carrying out rescues. An association with weaponry, though, could set off a backlash that curtails its advancement, the authors said.

Other notable signatories to the letter included Steve Wozniak, the co-founder of Apple; Noam Chomsky, the linguist and political philosopher; and Demis Hassabis, the chief executive of the artificial intelligence company Google DeepMind.

- The New York Times News Service
