After Quantum Computers, Quark-Scale Computers

Beyond the microchip lies quantum computing. Beyond that lies quark-scale computing, built from components a billion billion billion times smaller than the current computational scale.

What’s the Latest Development?


Seth Lloyd, professor of mechanical engineering at MIT, has devoted his professional career to making smaller microchips that can process more information in less time. The result, thanks to his work and that of others, is consistently more powerful computers, tablets and smartphones (and the veritable obligation to keep buying newer versions). But Lloyd has no illusions about the staying power of the microchip. Though its power has doubled roughly every year and a half, the laws of physics that have permitted the shrinking of semiconductors have definite limits. According to Lloyd, the computing industry is approaching those limits. In other words, computers may soon stop getting better.
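
As a rough illustration of the doubling Lloyd describes, the sketch below projects relative computing power forward under a simple Moore's-law-style assumption of one doubling every 18 months. The baseline value and the year horizons are illustrative placeholders, not figures from Lloyd or the article.

```python
# Minimal sketch: project relative computing power assuming it doubles
# roughly every 18 months. The starting baseline (1.0) and the horizons
# below are illustrative assumptions, not figures from the article.

DOUBLING_PERIOD_MONTHS = 18

def relative_power(months_elapsed: float) -> float:
    """Relative computing power after `months_elapsed`,
    assuming one doubling every DOUBLING_PERIOD_MONTHS."""
    return 2 ** (months_elapsed / DOUBLING_PERIOD_MONTHS)

if __name__ == "__main__":
    for years in (0, 3, 6, 9, 12, 15):
        factor = relative_power(years * 12)
        print(f"After {years:2d} years: ~{factor:,.0f}x the starting power")
```

The point of the exercise is only to show how quickly exponential growth compounds; the physical limits Lloyd points to are exactly what would eventually break such a projection.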

What’s the Big Idea?

An entirely new kind of computer is on the horizon: a quantum computer, whose internal mechanisms obey the counterintuitive laws of quantum physics. “Though tiny and computationally puny when compared with conventional chips, these quantum computers show that it is possible to represent and process information at scales far beyond what can be done in a semiconductor circuit.” Looking ahead to the ultimate limits of processing, Lloyd says quantum computing may be a stepping stone to quark-scale computing: “The ultimate level of miniaturization allowed by physical law is apparently the Planck scale, a billion billion billion times smaller than the current computational scale.”
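
To put Lloyd's “billion billion billion” figure in perspective, the sketch below compares the Planck length with a present-day transistor feature size. The 10-nanometer feature size is an assumed round number for illustration, not a value taken from the article.

```python
# Minimal sketch: compare the Planck length with an assumed present-day
# transistor feature size to see where "a billion billion billion times
# smaller" comes from. The 10 nm feature size is an illustrative assumption.

PLANCK_LENGTH_M = 1.616e-35   # Planck length, in meters
FEATURE_SIZE_M = 10e-9        # assumed ~10 nm transistor feature, in meters

ratio = FEATURE_SIZE_M / PLANCK_LENGTH_M
print(f"Current scale / Planck scale ≈ {ratio:.1e}")        # on the order of 1e27
print(f"A billion billion billion   = {1e9 * 1e9 * 1e9:.1e}")  # 1e27
```

Under that assumption the ratio works out to roughly 10^27, which is the “billion billion billion” factor Lloyd cites.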

Read it at Slate
