Inside Advanced Scale Challenges|Wednesday, December 12, 2018

Cold Blanket Cast on Quantum Computing 


Amid the gush of money and enthusiastic predictions being thrown at quantum computing comes a proposed cold shower in the form of an essay by physicist Mikhail Dyakonov published in IEEE Spectrum this month – "The Case Against Quantum Computing." Whatever your view of QC’s prospects, Dyakonov’s commentary is worth a read. Error correction – or more accurately the inability to monitor variables and correct errors at the scale required – is the big stumbling block, he writes, but there is a good deal more to his piece.

Sometimes it’s best to start with conclusions first.

“To my mind, quantum computing researchers should still heed an admonition that IBM physicist Rolf Landauer made decades ago when the field heated up for the first time," Dyakonov writes. "He urged proponents of quantum computing to include in their publications a disclaimer along these lines: 'This scheme, like all other schemes for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work.'”

Love it. Don’t necessarily agree with it. But it’s a great reminder that when an idea catches fire it can sometimes flare into “irrational exuberance,” as former Fed Chairman Alan Greenspan put it in his (in)famous characterization of investor attitudes leading up to the first dot-com bubble.

Here’s a lightly edited extract from Dyakonov’s commentary:

“Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2 to the 1,000th, which is to say about 10 to the 300th. That’s a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.

“To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe…At this point in a description of a possible future technology, a hardheaded engineer loses interest. But let’s continue…Could we ever learn to control (all of these) continuously variable parameters defining the quantum state of such a system?
My answer is simple. No, never.”
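
The arithmetic in the quoted passage is easy to verify: an N-qubit state is described by roughly 2^N amplitudes, and at the low end of the cited range (1,000 qubits) that already exceeds any physical count. A back-of-the-envelope check in Python (the ~10^80 figure for subatomic particles in the observable universe is a commonly cited estimate, not from the essay):

```python
# Sanity-check Dyakonov's numbers: 2^1000 continuous parameters
# versus a commonly cited ~10^80 particles in the observable universe.

n_qubits = 1000                  # low end of the "useful" range he cites
amplitudes = 2 ** n_qubits       # parameters describing the state vector
magnitude = len(str(amplitudes)) - 1   # decimal order of magnitude

print(magnitude)        # 301 -> 2^1000 is about 10^301, i.e. "10 to the 300th"
print(magnitude > 80)   # True -> far more than ~10^80 subatomic particles
```

Python's arbitrary-precision integers make the exact comparison trivial; the point of the exercise is simply that the quoted "10 to the 300th" is not hyperbole but straightforward exponentiation.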

The IEEE Spectrum piece is well worth reading if only as a reminder that very large hurdles remain in quantum computing development. Dyakonov is a theoretical physicist at the Charles Coulomb Laboratory at the University of Montpellier, France. He builds a reasonably detailed technical argument in his essay, and he also contends the current wave of exuberance around QC will soon subside:

“I believe that, appearances to the contrary, the quantum computing fervor is nearing its end," Dyakonov writes. "That’s because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises have been made, and anyone who has been following the topic starts to get annoyed by further announcements of impending breakthroughs. What’s more, by that time all the tenured faculty positions in the field are already occupied. The proponents have grown older and less zealous, while the younger generation seeks something completely new and more likely to succeed.”
