
Intel Bets Big on 2-Track Quantum Strategy 


Quantum computing has lived so long in the future it’s taken on a futuristic life of its own, with a Gartner-style hype cycle that includes triggers of innovation, inflated expectations and – though a useful quantum system is still years away – anticipatory troughs of disillusionment.

To wit, there’s Mikhail Dyakonov in the November IEEE Spectrum, who, even as investors, companies and countries pour billions into quantum, says it will never work. The theoretical physicist at the Université de Montpellier, France, contends that error correction (monitoring variables and correcting errors) isn’t possible at quantum scale. “A useful quantum computer needs to process a set of continuous parameters,” Dyakonov wrote, “…larger than the number of subatomic particles in the observable universe.”

Maybe. But betting against quantum flies in the face of centuries of technological breakthroughs reinforcing the maxim: what we can conceive we can achieve. At a technical level, countering Dyakonov is a post-doctoral researcher at QuTech, part of the Delft University of Technology in the Netherlands (see his rebuttal in sister publication HPCwire), who writes, “No computer, classical or quantum, ever has to process even a single continuous parameter. In classical computers, we can use floating-point arithmetic to approximate continuous parameters using a finite number of bits.”
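
The rebuttal’s point is easy to make concrete. Here is a minimal Python sketch (ours, for illustration; it is not from the QuTech researcher) of what approximating a “continuous parameter” with a finite number of bits looks like in practice: a standard 64-bit float pins down pi to roughly one part in 10^16.

```python
# A minimal sketch (illustrative, not from the rebuttal): a classical machine
# never stores a truly continuous value. A 64-bit IEEE 754 double approximates
# the "continuous" parameter pi with finitely many bits, and the resulting
# error is tiny and well understood.
from decimal import Decimal, getcontext
import math

getcontext().prec = 40                                  # work with 40 significant digits
true_pi = Decimal("3.141592653589793238462643383279502884197")
stored  = Decimal(math.pi)                              # exact decimal value of the float64

print("float64 stores pi as:", stored)
print("error of the finite-bit approximation:", abs(stored - true_pi))
```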

Intel's Jim Clarke

Also at odds with Dyakonov is Jim Clarke, director of quantum hardware at Intel Labs, who told us during a recent interview that, daunting as quantum error correction may be, there are tougher quantum challenges to overcome, beginning with scale – which in the quantum world is otherworldly.

This is because, as described in a recent MIT Technology Review article, “The fundamental units of computation…are qubits, which — unlike bits — can occupy a quantum state of 1 and 0 simultaneously. By linking qubits through an almost mystical phenomenon known as entanglement, quantum computers can generate exponential increases in processing power.”

Leaving aside the head-splitting notion that qubits can be both 1 and 0 (University of Nottingham professor Phil Moriarty: "As a quantum physicist, it's not that you understand it, you just get used to it."), in Clarke’s explanation, a qubit is like a spinning coin – it’s both heads and tails. This is the principle of “quantum superposition.”

“So if I get two coins spinning at the same time, I’d simultaneously have four states; with three coins, eight states,” Clarke said. “With 300 spinning coins, how many states can I have? That’s two to the 300th, which is more states than there are in the universe. At 50 you can represent more states than any supercomputer could do.”
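
Clarke’s arithmetic is easy to verify with a quick sketch (illustrative only; the 1e80 figure below is the commonly cited rough estimate of atoms in the observable universe, not a number from the interview).

```python
# A quick check (illustrative sketch, not Intel code) of the coin analogy:
# n two-state systems in superposition span 2**n basis states.
ATOMS_IN_OBSERVABLE_UNIVERSE = 1e80   # commonly cited rough estimate

for n in (2, 3, 50, 300):
    print(f"{n:>3} spinning coins -> 2^{n} = {float(2 ** n):.3e} states")

print("2^300 exceeds the ~1e80 atoms in the observable universe:",
      2 ** 300 > ATOMS_IN_OBSERVABLE_UNIVERSE)
```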

This brings us to Clarke’s biggest quantum worry: interconnection. Holding a superconducting qubit processing unit (QPU), Clarke said, “If you take this chip, I’ve got 49 qubits and 108 coaxial connectors to the outside world. What would it look like if I had a million qubits? I can’t have 2 million coax cables to the outside world. Maybe that’s what an ENIAC system looked like in the 1940s, but that’s not what a conventional system looks like. So what worries me most is the wiring, your interconnects.”

By comparison, Clarke said, an Intel Xeon server chip has 7 billion transistors and only 2000 connectors, mostly for power and ground.
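
The gap Clarke is describing is stark when put side by side. A back-of-the-envelope sketch using the figures he cites (the linear extrapolation to a million qubits is illustrative, not an Intel projection):

```python
# Back-of-the-envelope comparison of connector counts (figures from Clarke's
# remarks above; the extrapolation is a naive linear one for illustration).
qubits, coax_lines        = 49, 108        # Intel's superconducting test chip
transistors, package_pins = 7e9, 2000      # Xeon numbers Clarke cites

print(f"coax lines per qubit:     {coax_lines / qubits:.2f}")
print(f"pins per transistor:      {package_pins / transistors:.1e}")

# At today's ratio, a million-qubit chip would need millions of coax lines:
print(f"coax lines for 1M qubits: {1_000_000 * coax_lines / qubits:,.0f}")
```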

The wiring problem is one factor behind Intel’s two-track quantum strategy. One track is development of a “mainstream” (in the quantum world) superconducting qubit, which most companies (IBM, Google, Rigetti and others) are attempting to perfect; the other track is the silicon spin qubit (SSQ), which Intel is pursuing along with Delft University of Technology, the University of New South Wales and Princeton, and which looks like a transistor.

While developmentally less mature than superconducting qubits (which have reached about 50 entangled qubits), the SSQ (now at about 23 entangled qubits) may tap more readily into Intel’s chip heritage.

“When you think of Intel, you probably think of it as a transistor company,” Clarke said, “and you’d be right. (SSQs) … look a lot like a transistor. I’d describe it as a single-electron transistor.”

Intel is the only large company investigating SSQs, according to Clarke.

“Our thought is to look at similar (superconducting qubit) technologies as some of our competitors, but we’re also looking at a novel technology that resembles our transistors in the infrastructure we have,” Clarke said. “We can build on our multi-billion dollar infrastructure to make these devices. So that’s a big bet we’re making…. We’re doing both, we’re basically hedging our bet.”

A potential advantage of the SSQ is that it’s “a million times smaller than the superconducting qubit,” said Clarke. “… Just from a real estate perspective, if this has 49 (coax cables), then I have to wonder what a million would look like. It would be huge. But with silicon spin, there’s no reason we can’t have a density similar to our advanced logic or advanced memory, so there’s no reason we couldn’t get into the millions easily… We’re hoping to accelerate that technology and make it competitive with superconducting qubits and then, hopefully, it will be the technology that will scale.”

Intel’s two-track strategy poses another challenge for Clarke: how to accelerate the development of one versus the other. “But it’s hard for me to imagine that we’d give up early on the technology that looks like transistors, being a transistor company,” he said.

Clarke concedes that IBM, Google and others have been working on quantum longer than Intel, but he argues that Intel has advantages the others can’t match, notably Intel’s architecture and process expertise.

“We can tap into state-of-the-art fabrication facilities,” Clarke said, citing research the quantum group has done jointly with Intel’s packaging group in Arizona aimed at improving performance and reliability.

“We have our fabrication engineers working on the chip, we have folks working on control electronics for the QPU, we have people developing architectures for the QPU,” Clarke said. “They’re all working based on the background of Intel’s experience. We’re trying to put together a complete system.”

Companies without advanced fabrication, Clarke said, probably rely on “something that resembles a university lab. You can imagine that a university professor can make one good transistor. But can he build 7 billion of them all the same, and put them on a chip that you could buy for a server?”

He cited Intel’s 500-acre Ronler Acres campus in Hillsboro, OR, at which, though Intel’s quantum work is experimental, “we’re still running our material through the same factories that our advanced technologies are running on, and that has to be considered an advantage…. We get to use state-of-the-art tools with process controls, we take great care with our material deposition.”

With practical quantum computing still years away, quantum developers necessarily adopt long time horizons. Clarke estimates quantum is half a decade from a significant step in system performance.

“I think there’s a race to quantum supremacy,” he said. “At some number, 50 or 60 (qubits), someone will say we’ve contrived a problem that can’t be solved with a classical computer. That would be a milestone, but it’s not a practical milestone. I think being able to do an optimization problem or characterize the configuration of a molecule that can’t be done with a classical computer, that would be a first milestone. We’re probably five or six years away from that.”

Waiting that long for results runs counter to our impatient culture, but Clarke suggests quantum progress be put in historical perspective.

“In a field where computer advances are measured on the span of a year or two, then when you say something (quantum) is 10 years away, some will say it might as well be forever,” he said. “But if you look at the history of electronics, the first transistor was 1947, the first silicon transistor was 1954, the integrated circuit was 1958, and the first microprocessor was 1970. So these things don’t happen overnight. If you take a look at where we are, we’ve surpassed the equivalent of the first integrated circuit, and now we’re trying to get to a large enough size to do something useful. So…if we say we’re 10 years away from having a few thousand qubits to do something that cannot be done otherwise, it’s actually not so far of a stretch.”

 
