The mathematics of Moore’s Law has long fascinated observers, even as it underlies much of the technological revolution that has transformed the world over the past 50 years. But as chips get smaller, there is renewed speculation that the law is finally running out of room.
In 1965, Intel cofounder Gordon Moore observed that the number of transistors on a single microchip doubled roughly every two years. The trend has held ever since: computers that once filled entire rooms now rest in the palm of your hand, at a fraction of the cost.
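That doubling rate compounds dramatically. A back-of-the-envelope sketch (the two-year doubling period is the article's figure; the function name and 50-year window are illustrative assumptions):

```python
def moores_law_factor(years, doubling_period=2):
    """Growth factor in transistor count under Moore's Law:
    one doubling per `doubling_period` years."""
    return 2 ** (years / doubling_period)

# 50 years at one doubling every two years is 25 doublings,
# i.e. a growth factor of 2**25, roughly 33.5 million.
factor = moores_law_factor(50)
print(f"Growth factor over 50 years: {factor:,.0f}x")
```

Twenty-five doublings multiply transistor counts by about 33.5 million, which is why room-sized machines fit in a pocket.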
But with the undergirding technology approaching the size of a single atom, many fear the heyday of the digital revolution is coming to a close, forcing technologists around the world to rethink their business strategies and their notions of computing altogether.
We have faced the end of Moore’s Law before; in fact, Brian Krzanich, Intel’s chief executive, jokes that he has seen the doomsday prediction made no fewer than four times in his life. But the coming barrier is different: whether we have another five or even ten years of improving the silicon semiconductors at the core of modern computing, we are going to hit a physical wall sooner rather than later.
If Moore’s Law is to survive, it will require radical innovation rather than the predictable progress that has sustained chip makers over recent decades.
And most technology companies in the world are beginning to acknowledge the changing forecast for digital hardware. The semiconductor industry associations of the United States, Europe, Japan, South Korea, and Taiwan will issue only one more report forecasting chip technology growth. Intel’s chief executive has cast these gloomy predictions as premature and declined to participate in the final report. Krzanich insists Intel has the technical capability to keep improving chips while keeping costs low for manufacturers, though few in the industry believe the faltering company will maintain its quixotic course for long.
The rest of the industry is looking ahead to new opportunities. Technologies like graphene (an atom-thick honeycomb lattice of carbon atoms) and quantum computing offer ways around the physical limitations imposed by silicon semiconductors. Graphene has recently enthralled chipmakers: its affordable carbon base and its structure make it an ideal candidate for faster, though still largely conventional, digital processing.
“As you look at Intel saying the PC industry is slowing and seeing the first signs of slowing in mobile computing, people are starting to look for new places to put semiconductors,” David Kanter, a semiconductor industry analyst at Real World Technologies in San Francisco, told The New York Times.
Quantum computing, on the other hand, would tap the ambiguity inherent in the universe to change computing forever. The prospect has long intrigued tech companies, and the recent debut of some radical early-stage designs has reignited the fervor of quantum’s advocates.
For many years, the end of Moore’s Law was viewed as a kind of apocalypse scenario for the technology industry: What would we do when there was no more room on the chip? Much of what has been forecast about the future of the digital world has been predicated on the notion that we will continue to make the incredible improvements of the past half century.
It’s perhaps a good sign that technology companies are soberly looking to the future and getting excited about new, promising developments that may yet yield entirely new frontiers.