Moore’s Law, still proving itself true after 50 years

Interview: Intel co-founder Gordon Moore, now 86, has amazed even himself with the accuracy of his predictions in 1965 about the exponential growth of computing power

Intel chief executive Brian Krzanich pays a video tribute to the company’s co-founder Gordon Moore at the 2015 International CES in Las Vegas in January. PHOTOGRAPH: ETHAN MILLER/GETTY

On April 19th, 1965, just over 50 years ago, Gordon Moore, then the head of research at Fairchild Semiconductor and later one of the co-founders of Intel, was asked by Electronics magazine to submit an article predicting what was going to happen to integrated circuits, the heart of computing, in the next 10 years.

Studying the trend he’d seen in the previous few years, Moore predicted that every year we’d double the number of transistors that could be fitted on a single chip of silicon, so you’d get twice as much computing power for only slightly more money. When that came true, in 1975, he modified his prediction to a doubling roughly every two years.
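The doubling Moore described is easy to see in numbers. As a back-of-envelope sketch (not Intel data beyond the 4004's widely documented transistor count of about 2,300), here is what a strict two-year doubling projects:

```python
# Back-of-envelope illustration of Moore's Law: transistor counts
# doubling every two years, starting from the 1971 Intel 4004,
# which had roughly 2,300 transistors.
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2015):
    print(year, f"{transistors(year):,.0f}")
```

Run forward 44 years, the projection lands in the billions of transistors per chip, which is the right order of magnitude for chips of the 2015 era, though real products have tracked the curve only roughly.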

“Moore’s Law” has essentially held up ever since and, despite the sceptics, keeps chugging along, making it probably the most remarkable example ever seen of sustained exponential growth of a technology.

For the 50th anniversary of Moore's Law, I interviewed Moore, now 86, at the Exploratorium in San Francisco, at a celebration in his honour co-hosted by the Gordon and Betty Moore Foundation and Intel. I asked him what he had learned from Moore's Law having held for so long.

“I guess one thing I’ve learned is once you’ve made a successful prediction, avoid making another one,” Moore said. “I’ve avoided opportunities to predict the next 10 or 50 years.”

But was he surprised by how long it has been proved basically correct? “Oh, I’m amazed,” he said. “The original prediction was to look at 10 years, which I thought was a stretch. This was going from about 60 elements on an integrated circuit to 60,000 – a thousandfold extrapolation over 10 years. I thought that was pretty wild. The fact that something similar is going on for 50 years is truly amazing. You know, there were all kinds of barriers we could always see that were going to prevent taking the next step, and somehow or other, as we got closer, the engineers had figured out ways around these. But some day it has to stop. No exponential like this goes on forever.”

But what an exponential it's been. Introducing the evening, Intel's chief executive, Brian Krzanich, summarised where Moore's Law has taken us. If you took Intel's first-generation microchip from 1971, the 4004, and the latest chip it has on the market today, the fifth-generation Core i5 processor, he said, you could see the power of Moore's Law at work: Intel's latest chip offers 3,500 times more performance, is 90,000 times more energy-efficient and about 60,000 times more cost-effective.

To put that another way, Krzanich said that Intel engineers did a rough calculation of what would have happened if a 1971 Volkswagen Beetle had improved at the same rate as microchips under Moore’s Law.

“Here are the numbers: [today] you would be able to go with that car 300,000 miles per hour. You would get two million miles per gallon of gas, and all that for the mere cost of 4 cents!”
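Krzanich's analogy can be roughly reproduced by applying the chip improvement factors he quoted to a 1971 Beetle. The article gives no baseline figures for the car, so the ones below (top speed, fuel economy, sticker price) are illustrative assumptions, not sourced data:

```python
# Rough reproduction of the Beetle analogy. The improvement factors
# come from the chip comparison quoted in the article; the 1971
# Beetle baselines are ASSUMED here for illustration only.
perf_factor = 3_500         # performance improvement, 4004 -> Core i5
efficiency_factor = 90_000  # energy-efficiency improvement
cost_factor = 60_000        # cost-effectiveness improvement

top_speed_mph = 80          # assumed 1971 Beetle top speed
fuel_economy_mpg = 25       # assumed 1971 Beetle fuel economy
price_usd = 2_400           # assumed 1971 Beetle sticker price

print(f"Top speed: {top_speed_mph * perf_factor:,} mph")
print(f"Fuel economy: {fuel_economy_mpg * efficiency_factor:,} mpg")
print(f"Price: ${price_usd / cost_factor:.2f}")
```

With those assumed baselines the arithmetic lands close to the figures Krzanich quoted: a few hundred thousand miles per hour, a couple of million miles per gallon, and a price of a few cents.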

What is most striking in Moore’s 1965 article is how many predictions he got right about what these steadily improving microchips would enable. The article, headed “Cramming more components onto integrated circuits”, argued that “integrated circuits will lead to such wonders as home computers – or at least terminals connected to a central computer – automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today . . . In telephone communications, integrated circuits in digital filters will separate channels on multiplex equipment. [They] will also switch telephone circuits and perform data processing.”

Moore pretty much anticipated the personal computer, the cellphone, self-driving cars, the iPad, Big Data and the Apple Watch. How did he do that? (The only thing he missed, I jokingly told him, was microwave popcorn.)

“Well, I had been looking at integrated circuits,” he said. “[They] were really new at that time, only a few years old, and they were very expensive. There was a lot of argument as to why they would never be cheap, and I was beginning to see, from my position as head of a laboratory, that the technology was going to go in the direction where we would get more and more stuff on a chip and it would make electronics less expensive . . . I had no idea it was going to turn out to be a relatively precise prediction, but I knew the general trend was in that direction and had to give some kind of a reason why it was important to lower the cost of electronics.”

Can it continue? Every year someone predicts the demise of Moore’s Law, and they’re proved wrong. With enough good engineers working on it, he hoped, “we won’t hit a dead end . . . It’s unique technology. I can’t see anything really comparable that has gone on for this long a period of time with exponential growth.”

But let’s remember that it was enabled by a group of remarkable scientists and engineers, in a US that did not just brag about being exceptional but invested in infrastructure and basic scientific research, and set audacious goals, to make it so. If we want Moore’s Law-like technologies, we need to invest in the building blocks.

Alas, recent US governments have not been investing in basic research the way they did when the likes of Moore and Robert Noyce, the co-inventor of the integrated circuit and the other co-founder of Intel, were coming of age.

“I’m disappointed that the [US] federal government seems to be decreasing its support of basic research,” said Moore. “That’s really where these ideas get started. They take a long time to germinate, but eventually they lead to some marvellous advances. Certainly, our whole industry came out of some of the early understanding of the quantum mechanics of some of the materials . . .

“Our position in the world of fundamental science has deteriorated pretty badly. There are several other countries that are spending a significantly higher percentage of their GNP than we are on basic science or on science, and ours is becoming less and less basic.”

Chemistry set

How did Moore first get interested in science? “My neighbour got a chemistry set, and we could make explosives,” he said. “In those days, chemistry sets had some really neat things in them, and I decided about then that I wanted to be a chemist, not knowing quite what they did, and I continued my work in a home laboratory for some period of time.

“Got to the point where I was turning out nitroglycerin in small production quantities and turning it into dynamite . . . A couple ounces of dynamite makes a marvellous firecracker. That really got my early interest in it.

“You couldn’t duplicate that today, but there are other opportunities. You know, I look at what some of my grandkids are doing – for example, those robotics and the like. These are spectacular. They’re really making a lot of progress.”

Looking back on Moore’s Law and the power of computing that it has driven, I asked Moore what he thought was its most important contribution.

“Wow,” he said. “You know, just the proliferation of computing power. We’ve just seen the beginning of what computers are going to do for us.”

How so? “Oh, I think, incrementally, we see them taking over opportunities that we tried to do without them before and were not successful. It’s kind of the evolution into the machine intelligence, if you wish, and this is not happening in one step.

“To me, it’s happening in a whole bunch of increments. I never thought I’d see autonomous automobiles driving on the freeways. It wasn’t many years ago [they] put out a request to see who could build a car that could go across the Mojave Desert to Las Vegas from a place in southern California, and several engineering teams across the country set out to do this. Nobody got more than about 300 yards before there was a problem. Two years later, they made the full 25-mile trip across this desert track, which I thought was a huge achievement, and from that it was just a blink before they were driving on the freeways. I think we’re going to see incremental advances like that in a variety of other areas.”

Did he worry, I asked Moore, whose own microprocessors seemed as sharp as ever, that machines would truly start to replace both white-collar and blue-collar labour on a scale that could mean the end of work for a lot of people?

“Don’t blame me,” he said. “I think it’s likely we’re going to continue to see that. You know, for several years, I have said we’re a two-class society, separated by education. I think we’re seeing the proof of some of that now.”

When was the moment he came home and said to his wife, Betty, “Honey, they’ve named a law after me”?

“For the first 20 years, I couldn’t utter the term Moore’s Law,” he said. “It was embarrassing. It wasn’t a law. Finally, I got accustomed to it, where now I could say it with a straight face.”

Given that, is there something that he wishes he had predicted, like Moore’s Law, but did not? “The importance of the internet surprised me,” he said. “It looked like it was going to be just another minor communications network that solved certain problems. I didn’t realise it was going to open up a whole universe of new opportunities, and it certainly has. I wish I had predicted that.”

© Copyright New York Times