Research into artificial intelligence should not be left to military, says Google exec

Eric Schmidt tells RSA conference of need to ‘demilitarise the internet’

Eric Schmidt, executive chairman of Google’s parent Alphabet. Photograph: Tony Avelar/Bloomberg.

Research on machine learning and artificial intelligence needs to be done in the open and not by military research labs, Eric Schmidt, executive chairman of Google’s parent Alphabet, has said.

Mr Schmidt was speaking at the RSA Conference in San Francisco on Wednesday. Some 40,000 security professionals have been attending the event this week. Security threats discussed at hundreds of sessions include the Russian hacking of the Democratic National Committee in advance of the US presidential election last year, as well as threats to business, government and internet-connected devices.

In conversation with journalist Gideon Lewis-Kraus, Mr Schmidt said work on machine learning, which allows computers to learn without being explicitly programmed, needed to be done in the open and not by military research labs. He said that under a different scenario, “we’d spend all day worrying if they [the military] were in our networks; we wouldn’t know what they were doing.

“The internet would begin to start getting shut down because of fear of the Russians or the Chinese or bilaterally.”

He said one of the questions the industry should be asking was whether it could find a way for countries to agree not to use machine-learning technologies in a way that militarised the internet – “in other words a demilitarisation plan”.

“Maybe I’m an optimist, but I’d like to see some discussions about that,” he said.

Reiterating a fear he has written about in the past, Mr Schmidt said he was “extremely worried” about the likelihood that countries would begin to block the openness and interconnectedness of the internet. Globalisation and economic growth were dependent on such openness, he added.


Mr Schmidt noted there had been a meeting last year between then US president Barack Obama and the Chinese president Xi Jinping in which they had agreed to reduce the number of attacks from China on the US.

“I think getting the governments to agree that we are better off collectively by having a more open internet with less attacking – especially at nation state level – would be a clear improvement for everybody.”

Google on Wednesday announced it was making available a new open-source library called TensorFlow 1.0, which the company uses internally for machine learning.

“We are doing this because we want to stimulate the [artificial intelligence] industry. We want you to build these solutions and identify these new opportunities,” Mr Schmidt said. He remarked on early machine-learning techniques over a decade ago which had “discovered” the concept of cat videos on YouTube.

“I would have liked them to discover something more profound. I was very upset,” he said.


In a keynote address to the conference on Wednesday, Dr Hugh Thompson, chief technology officer of Symantec, said thinking differently about analytics could transform the field of security. He said most discussion around the Internet of Things (IoT) currently concerned things that had been hacked, including cars and botnets. Most of the energy was being directed into exposing weaknesses in devices, and many of those working on security were “still at the stage where we’re trying to comprehend how bad could this be”, he said.

Dr Thompson proposed a standard for IoT devices resembling a food label, attached physically to every device or transmitted digitally, that would list properties such as whether the device had Bluetooth connectivity. He suggested every device might also come with a behaviour graph to monitor what it should be doing.

The conference concludes on Friday.