Is the Age of Silicon Computing Coming to an End? Physicist Michio Kaku Says “Yes”



Traditional computing, with its ever more microscopic circuitry etched in silicon, will soon reach a final barrier: Moore's law, which dictates that the amount of computing power you can squeeze into the same space will double every 18 months, is on course to run smack into a silicon wall, thanks to overheating caused by electrical charges running through ever more tightly packed circuits.
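The doubling claim is easy to quantify. A minimal sketch of the compounding, assuming one doubling every 18 months as the article states:

```python
# Compound growth under Moore's law: one doubling every 18 months (1.5 years).
def moores_law_factor(years, doubling_period_years=1.5):
    """Growth factor in computing power after `years`."""
    return 2 ** (years / doubling_period_years)

print(round(moores_law_factor(10)))  # ~102x in a decade
print(round(moores_law_factor(18)))  # 4096x in eighteen years
```

Even a decade of this compounding yields roughly a hundredfold increase, which is why any slowdown of the doubling period is so consequential.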

"In about ten years or so, we will see the collapse of Moore’s Law. In fact, already, already we see a slowing down of Moore’s Law," says world-renowned physicist, Michio Kaku. "Computer power simply cannot maintain its rapid exponential rise using standard silicon technology."

According to Kaku, at the International Supercomputing Conference 2011 last June, Intel architecture group VP Kirk Skaugen said that Moore’s Law by itself would not be sufficient for the company to ramp up to exascale performance by 2018. But he went on to tout Intel’s tri-gate technology (the company’s so-called “3D” processors) as the solution, which, Skaugen announced, means “no more end of life for Moore’s Law.”

Despite Intel’s recent advances with tri-gate processors, Kaku argues in a video interview with Big Think that the company has merely delayed the inevitable: the law’s collapse due to heat and leakage issues.

“So there is an ultimate limit set by the laws of thermodynamics and set by the laws of quantum mechanics as to how much computing power you can do with silicon,” says Kaku, noting, “That’s the reason why the age of silicon will eventually come to a close,” and arguing that Moore’s Law could “flatten out completely” by 2022.

Kaku sees several alternatives for the post-silicon era after the demise of Moore's Law: protein computers, DNA computers, optical computers, quantum computers and molecular computers.

"If I were to put money on the table I would say that in the next ten years as Moore’s Law slows down, we will tweak it. We will tweak it with three-dimensional chips, maybe optical chips, tweak it with known technology pushing the limits, squeezing what we can. Sooner or later even three-dimensional chips, even parallel processing, will be exhausted and we’ll have to go to the post-silicon era,” says Kaku.

Kaku concludes that when Moore’s Law finally collapses by the end of the next decade, we’ll “simply tweak it a bit with chip-like computers in three dimensions. We may have to go to molecular computers and perhaps late in the 21st century quantum computers.”

We'll place our bets on quantum computing.

"Quantum computers can efficiently render every physically possible quantum environment, even when vast numbers of universes are interacting. Quantum computation is a qualitatively new way of harnessing nature,"  according to David Deutch, an Israeli-British physicist at the University of Oxford who pioneered the field of quantum computation and is a proponent of the many-worlds interpretation of quantum mechanics. Quantum computers, says Deutch, have the potential to solve problems that would take a classical computer longer than the age of the universe.

Astrophysicist Paul Davies at Arizona State University proposes that information, not mathematics, is the foundation on which physical reality and the laws of nature are constructed. Meanwhile at MIT, computer scientist Seth Lloyd develops Davies's assumption by treating quantum events as "quantum bits," or qubits: the means by which the universe "registers itself."

Lloyd proposes that information is a quantifiable physical value, as much as mass or motion, and that any physical system (a river, you, the universe) is a quantum mechanical computer. Lloyd has calculated that "a computer made up of all the energy in the entire known universe (that is, within the visible “horizon” of forty-two billion light-years) can store about 10^92 bits of information and can perform 10^105 computations/second."

The universe itself is a quantum computer, Lloyd says, and it has performed a mind-boggling 10^122 computations since the Big Bang (for that part of the universe within the “horizon”).
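Lloyd's two figures can be cross-checked in a few lines. The 13.8-billion-year age of the universe used below is my own input, not a number from the article:

```python
import math

# Sanity check: 10^105 operations per second, sustained since the Big Bang,
# should land near Lloyd's quoted total of ~10^122 operations.
OPS_PER_SECOND = 1e105
AGE_OF_UNIVERSE_S = 13.8e9 * 365.25 * 24 * 3600  # ~4.35e17 seconds (assumed age)

# Work in orders of magnitude to avoid astronomically large intermediates.
total_ops_magnitude = math.log10(OPS_PER_SECOND) + math.log10(AGE_OF_UNIVERSE_S)
print(f"10^{total_ops_magnitude:.1f}")  # ~10^122.6, consistent with Lloyd's 10^122
```

The two quoted numbers are consistent with each other to within an order of magnitude, which is the level of precision such estimates aim for.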

In Year Million: Science at the Far Edge of Knowledge, leading and up-and-coming scientists and science writers cast their minds one million years into the future to imagine the fate of the human and/or extraterrestrial galaxy. The exercise, first attempted by H. G. Wells in his 1893 essay “The Man of the Year Million,” is an exploration of a barely conceivable distant future, in which the authors confront the possibilities facing future generations of Homo sapiens. How would the galaxy look if it were redesigned for optimal energy use and maximized intelligence? What would a universe bereft of stars be like?

Lloyd has proposed that a black hole could serve as a quantum computer and data storage bank. In black holes, he says, Hawking radiation, which escapes the black hole, unintentionally carries information about material inside the black hole. This is because the matter falling into the black hole becomes entangled with the radiation leaving its vicinity, and this radiation captures information on nearly all the matter that falls into the black hole. 

“We might be able to figure out a way to essentially program the black hole by putting in the right collection of matter,” he suggests.

There is a supermassive black hole in the center of our galaxy, perhaps the remnant of an ancient quasar. Could the Milky Way's supermassive black hole become the mainframe and central file-sharing system for galaxy hackers of the Year Million? A swarm of ten thousand or more smaller black holes may be orbiting it. Might they be able to act as distributed computing nodes and a storage network?

Toward the year 1,000,000 AD, an archival network between stars and between galaxies could develop an Encyclopedia Universica, storing critical information about the universe at multiple redundant locations in those and many other black holes.

Quantum computing, per MIT's Lloyd, sounds like science fiction, as satellites, moon shots, and the original microprocessor once were. But the age of computing is not even at the end of the beginning.

To leapfrog the silicon wall, we have to figure out how to manipulate the brain-bending rules of the quantum realm, an Alice-in-Wonderland world of subatomic particles that can be in two places at once. Where a classical computer obeys the well-understood laws of classical physics, a quantum computer is a device that harnesses physical phenomena unique to quantum mechanics (especially quantum interference) to realize a fundamentally new mode of information processing.

The fundamental unit of information in quantum computing, called a quantum bit or qubit, is not binary, and this is where it departs radically from classical physics. A qubit can exist not only in a state corresponding to the logical state 0 or 1, as in a classical bit, but also in states corresponding to a blend or superposition of these classical states.

In other words, a qubit can exist as a zero, a one, or simultaneously as both 0 and 1, with a numerical coefficient (amplitude) whose squared magnitude gives the probability of each state. This may seem counter-intuitive because everyday phenomena are governed by classical Newtonian physics, not quantum mechanics, which takes over at the atomic level.
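As a rough illustration, a single qubit can be modeled as a pair of amplitudes whose squared magnitudes give the measurement probabilities. The `Qubit` class below is a hypothetical toy, not a real quantum-computing API:

```python
import math
import random

class Qubit:
    """Toy model of a qubit: amplitudes (alpha, beta) for states |0> and |1>."""

    def __init__(self, alpha, beta):
        # Normalize so the probabilities sum to 1.
        norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
        self.alpha, self.beta = alpha / norm, beta / norm

    def probabilities(self):
        """Squared magnitudes of the amplitudes = probabilities of 0 and 1."""
        return abs(self.alpha) ** 2, abs(self.beta) ** 2

    def measure(self):
        """Measurement collapses the superposition to a definite 0 or 1."""
        p0, _ = self.probabilities()
        return 0 if random.random() < p0 else 1

# An equal superposition: both 0 and 1 "at once", each with probability 1/2.
q = Qubit(1, 1)
p0, p1 = q.probabilities()
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Until it is measured, the qubit genuinely carries both possibilities; the measurement step is what forces the classical either/or answer.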

What makes this exciting is the massive quantum parallelism achieved through superposition: a single operation on a register of about 500 qubits acts on all ~2^500 ≈ 10^150 superposed states at once, the equivalent of performing the same operation on a classical supercomputer with ~10^150 separate processors, which is impossible.

The idea of a computational device based on quantum mechanics was first explored in the 1970s and early 1980s by physicists and computer scientists pondering the fundamental limits of computation, among them Charles H. Bennett of the IBM Thomas J. Watson Research Center, Paul A. Benioff of Argonne National Laboratory in Illinois, David Deutsch of Oxford, and the late Richard P. Feynman, Nobel laureate at the California Institute of Technology.

They understood that if technology continued to abide by Moore's Law, then the continually shrinking size of circuitry packed onto silicon chips would eventually reach a point where individual elements would be no larger than a few atoms.  Here a problem arose because at the atomic scale the physical laws that govern the behavior and properties of the circuit are inherently quantum mechanical in nature, not classical. 

This then raised the question of whether a new kind of computer could be devised based on the principles of quantum physics.

Feynman was among the first to attempt to provide an answer to this question by producing an abstract model in 1982 that showed how a quantum system could be used to do computations.  He also explained how such a machine would be able to act as a simulator for quantum physics.  In other words, a physicist would have the ability to carry out experiments in quantum physics inside a quantum mechanical computer.

In 1985, Deutsch realized that Feynman's assertion could eventually lead to a general purpose quantum computer and published a crucial theoretical paper showing that any physical process, in principle, could be modeled perfectly by a quantum computer.  Thus, a quantum computer would have capabilities far beyond those of any traditional classical computer.  After Deutsch published this paper, the search began to find interesting applications for such a machine.

The breakthrough occurred in 1994, when Peter Shor circulated a preprint of a paper in which he set out a method for using quantum computers to crack an important problem in number theory: factorization. He showed how an ensemble of mathematical operations, designed specifically for a quantum computer, could be organized to enable such a machine to factor huge numbers extremely rapidly, much faster than is possible on conventional computers.
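The overall structure of Shor's method can be sketched classically. His quantum speedup applies only to the period-finding step, which the toy code below performs by slow brute force, fine for tiny numbers like 15 but hopeless at cryptographic sizes:

```python
import math
import random

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n): the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    """Classical skeleton of Shor's algorithm; returns a nontrivial factor of n."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g  # lucky guess already shares a factor with n
        r = find_order(a, n)
        # An even order with a^(r/2) != -1 (mod n) yields a factor via gcd.
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            return math.gcd(pow(a, r // 2, n) + 1, n)

f = shor_factor(15)
print(f, 15 // f)  # a nontrivial factorization of 15: 3 and 5, in either order
```

The point of the sketch is where the cost lives: everything except `find_order` is cheap classical number theory, and Shor's insight was that a quantum computer can find the period exponentially faster than the brute-force loop shown here.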

With Shor's breakthrough, quantum computing was transformed from a mere academic curiosity into a topic of national and world interest. Quantum hardware, on the other hand, remains an emerging field, but the work done thus far suggests that it will only be a matter of time before we have devices large enough to test Shor's and other quantum algorithms.

Beyond the actual creation of a quantum computer, our chief limitations are the imaginations of software engineers. This will be the major challenge of the Google whiz kids of tomorrow: to take computing and networking power that is effectively infinite and create interfaces that are simple enough for mere humans to understand.

Recent breakthroughs pioneered by Stuart Wolf of the University of Virginia allow us to take the flow of electric charge out of the equation and get rid of the overheating problem that is undercutting Moore's law. Single electrons have been made to adjust their spin. Subatomic circuitry is within our grasp.

The Daily Galaxy

Additional sources:

Information and the Nature of Reality, Paul Davies and Niels Henrik Gregersen (2011)

