IBM built the biggest, coolest quantum computer. Now comes the hard part

The world’s largest quantum computer system is quietly humming away, doing god knows what, in the middle of a low-slung conference room, just off the lobby of a building in Westchester County, New York.

This is the Thomas J. Watson Research Lab, which gave birth to the modern laser, DRAM, the Mandelbrot set, the chess-champion Deep Blue, the Jeopardy-winning Watson, and a host of technologies that helped astronauts land on the moon. Perched atop a hill in Yorktown Heights, about an hour’s drive from Manhattan, and shaped like a crescent from above, the building was designed in 1961 for IBM Research by Eero Saarinen, who fit its curves to the surrounding topography much as his now-shuttered TWA terminal at JFK echoed the landscape around its airport; that topography can be seen from the sunlit hallway that encircles a constellation of windowless labs. The building still houses the company’s most cutting-edge research—things like AI and semiconductor design—and it might be the last of America’s major historical research labs that’s still going. Hanging above the staircase, carved in wood, is the old motto of the building’s namesake: Think, also the name of the company’s premier annual conference.

After all that, the machine, called the Quantum System Two, was introduced last month not as a new chapter in that history of computing so much as a whole new book. The 22-foot-wide, 15-foot-high hexagonal platform of glass and polished aluminum, featuring three Quantum Heron processors, was designed to expand in modular fashion. (And designed is the operative word: A winner in Fast Company’s 2023 Innovation by Design Awards, the Quantum System Two is the work of an in-house team along with Map, Universal Design Studios, and Milan-based Goppion, which makes the glass for the Mona Lisa.) Its looks and ambition evoke the world-changing room-size mainframes of IBM in the ’60s and the sexy supercomputers of Cray in the ’80s. And if you squint: HAL and Skynet too.

At the same time, because it is sitting right next to the lobby, and because it looks so cool, and because what it’s trying to do is so outrageous, you might think this is a work of corporate art—perhaps an extravagant trade show display, say, an industrial fridge in the style of the Cybertruck.

Jay Gambetta, who led the team that built the System Two, would like me to see past that facade. Inside—at the coldest temperatures in the universe, at a fraction above absolute zero, and at the tiniest of scales—is what he calls the building block for a working, fault-tolerant quantum computer: a chip that lays the foundation for a room full of System Twos, a quantum supercomputer, another era.

“We’re really proud of it,” says Gambetta.

The fridge at the center of System Two contains three 133-qubit chips [Photo: Alex Pasternack]

For Gambetta, 44, a soft-spoken Australian and the vice president of IBM Quantum, that pride is fueled by nearly two decades of painstaking, mind-boggling work. Quantum computing, first imagined in the early ’80s, is still incredibly hard to do and hard to scale, often requiring massive cryogenic fridges and complex electronics. The machines are still plagued by noise and errors that can destroy reliable calculations; to really work, they’ll need to be bigger, and will need systematic error correction. Some doubt a fully fledged, reliable quantum computer will happen anytime soon.

But according to IBM’s latest roadmap, an error-corrected machine is coming within a decade. About the problems that remain, Gambetta says, “I think of them as an engineering challenge, but I don’t see any blockers.”

Taking flight

The idea of a quantum computer is to replace the simple 1s and 0s of bits—the building block of everything digital, from your websites to your 90 Day Fiancé to the simulations behind new kinds of batteries and lifesaving drugs—with something far more powerful. By exploiting the weirdnesses of quantum mechanics, like superposition and entanglement, qubits allow for a mix of 0 and 1 . . . a cat that’s both alive and dead, or a coin mid-toss.

It’s all hard to make heads or tails of, not unlike Einstein’s famous quip vis-à-vis quantum mechanics, that God “does not play dice with the universe.” But the weirdnesses are real, and for some problems, the resulting speedups from quantum computations could make classical computers look like abacuses.

Whereas adding a bit to your classical computer simply grows your computing power in a linear fashion, each new qubit in a quantum system doubles your computation space, growing it exponentially: 2 qubits represent 4 possible values; 3 represent 8; 10 represent 1,024; and so on. With enough qubits, an error-corrected quantum computer could handle calculations out of reach for the best classical computers. In April, Google (which is also spending billions on quantum computing) reported that its 70-qubit machine solved in seconds a problem that would’ve taken a supercomputer nearly five decades.
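That exponential growth is easy to verify with a few lines of arithmetic: a classical machine simulating n qubits has to track 2^n possible values, so every added qubit doubles the bookkeeping.

```python
# A classical simulation of n qubits must track 2**n possible values,
# so each added qubit doubles the size of the computation space.
for n in (2, 3, 10, 50):
    print(f"{n} qubits -> {2**n:,} values")
```

By 50 qubits the count has passed a quadrillion, which is roughly where classical simulation runs out of road.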

The neatest application may be for what theoretical physicist Richard Feynman imagined when he first proposed these computers in 1982: simulating the behavior of stuff at the quantum level. Classical computers can simulate a quantum system but struggle to do so above 50 particles, or qubits; above 100, says Gambetta, “a lot of interesting [science] problems happen at that range.”

Running those simulations on a quantum computer could help scientists discover new drugs and fuels and batteries, or help unravel some of the universe’s thorniest mysteries. Their math could supercharge AI and crack the hardest problems—including the prime factoring one that protects all of our digital secrets.

Those and other promises have fueled a gold rush of quantum companies, which raised $2.35 billion last year, according to McKinsey, just slightly above the previous year’s total. Alongside billions in government funding—led by the U.S. and China—Google, Amazon, and Microsoft are also investing in research. Deep-pocketed companies have been exploring uses in oil and gas, chemicals, aviation, pharmaceuticals, and finance. Governments are pushing out new standards to secure data from encryption-breaking quantum machines.

Since IBM first put a quantum computer on the cloud in 2016 for anyone to start experimenting with, it has maintained a leading spot in the business and the science, and has laid out the clearest, most concrete technology roadmap in the field. Remarkably, it has been keeping many of its promises too.

Using a new benchmark that looks at the quality rather than the quantity of qubits, IBM says the new chip, called Heron, is its “most performant” quantum processor yet, three times better than its previous record-breaking chip, the 127-qubit Eagle, and even better than last year’s whopper, the 433-qubit Osprey. Alongside Heron, Gambetta’s team also released Condor, which, with 1,121 qubits, is the world’s largest quantum chip. “I like birds,” he says.

But after years of going bigger and bigger, Gambetta’s team is now focusing its efforts on smaller, higher-quality birds, and developing ways of linking them together into larger, parallel systems, like classical supercomputers. This will require new ways of connecting adjacent and distant qubits using both classical and quantum networking. (For comparison, the classical system that trained OpenAI’s GPT used 285,000 CPU cores, 10,000 GPUs and 400 gigabits per second of network connectivity for each GPU server.)

[Image: IBM/Flickr]

By 2033, according to its revised roadmap, IBM will link together multiple System Twos, forming a system capable of executing 1 billion gates across as many as 16,632 qubits. A parade of ever-larger processor designs—Kookaburra, Cockatoo, and Starling—will culminate that year in a 2,000-qubit chip that Gambetta’s team calls Blue Jay. (Funny!)

“I didn’t propose it, but I’m gonna accept that name,” he says with a fading grin. “Let’s hope the device works.”

In the meantime, even while the machines are still “noisy,” they’re starting to prove more useful too, beating classical computers on certain physics simulations. In 2019, researchers at Google’s quantum lab claimed their computer could outperform classical machines, but only on a niche calculation without any practical use. But in a paper in Nature last June, a team of physicists at IBM and Berkeley used a 127-qubit Eagle processor to beat a classical computer at approximating a property called the average magnetization, using a simulation that naturally maps to a quantum computer.

Gambetta, who’s been working on building quantum computers since 2004, says the paper, along with other recent research that has used Eagle for problems with no a priori answers, shows that things are different now: “We’ve entered the era of utility.”

Noise machines

As crucial as the calculation was, Gambetta, who is an avid surfer, is really stoked about how the researchers got it: by waging a temporary battle on the errors that are the bane of all quantum engineers. The principles that make each qubit incredibly powerful also make them incredibly fragile, sensitive to the slightest noise—from the environment, the control electronics, even each other.

System Two’s cooling system is designed to be fully automated, relieving a major headache for engineers. [Photo: Alex Pasternack]

In this case, IBM used a set of new techniques for “error mitigation,” which seeks to better understand the noise and thus better reduce it—a bit like the process in noise-canceling headphones. With the help of the error mitigation method, IBM achieved a record-size quantum circuit, at a scale of 124 qubits with 2,600 entangling gates.
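One of the mitigation techniques behind that result, zero-noise extrapolation, can be sketched in a few lines: run the circuit at deliberately amplified noise levels, measure the observable at each level, and extrapolate the trend back to a hypothetical zero-noise limit. (The measured values below are invented for illustration; a real experiment would get them from repeated runs on hardware.)

```python
# Zero-noise extrapolation: fit the measured expectation values at
# amplified noise levels and evaluate the fit at zero noise.
def extrapolate_to_zero(scales, values):
    # Lagrange interpolation through the points, evaluated at x = 0.
    total = 0.0
    for i, (xi, yi) in enumerate(zip(scales, values)):
        weight = 1.0
        for j, xj in enumerate(scales):
            if i != j:
                weight *= (0.0 - xj) / (xi - xj)
        total += yi * weight
    return total

noise_scales = [1.0, 2.0, 3.0]  # 1x, 2x, 3x amplified noise
measured = [0.72, 0.55, 0.42]   # hypothetical noisy measurements
print(extrapolate_to_zero(noise_scales, measured))
```

The estimate at zero noise lands above any of the measured values, which is the point: the fit recovers a signal the noise has been steadily eating away.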

The performance of Eagle on the calculation represented “a very big mind shift for a community or even the larger public, that perhaps only perceived that useful quantum computation can only happen when you have this big error-corrected thing,” said Abhinav Kandala, an experimental physicist at IBM Quantum, and the paper’s lead author.

The goal now is to build more high-quality entangling gates between qubits, which are the logical operations run by the computer. The number of gates, or the length of the circuit, determines the complexity and accuracy of the algorithms the machine can run. After years of adding more qubits, says Gambetta, “it’s time, now that we’re in this utility phase, to focus on how we add more gates.”

Gambetta says IBM will soon have a system capable of running 100 qubits and 3,000 gates, which should open up more utility. By the end of 2024, IBM is aiming for 5,000 gates. That number increases steadily each year until 2029, when it reaches 15,000 gates using 200 qubits. “That’ll be another era,” Gambetta says, “when we implement error correction.”

But error mitigation is only a prelude to a more difficult technique: full-on quantum error correction, or QEC. Even the best-made qubit in the world will need to be error corrected, which will require building many high-quality redundant physical qubits—possibly thousands or more—to make a single “logical” qubit.

Building qubits that are good enough simply to do QEC has long been a goal for quantum computer engineers. So far, superconducting qubit chips like IBM’s and Google’s make an error roughly once every 100 to 1,000 operations; the goal is to shrink that rate to more like one in a million, at which point error-correction techniques start to become feasible.
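A rough calculation shows why that gap matters. If each gate fails independently with probability p (a simplifying assumption), a circuit of n gates finishes cleanly with probability (1 − p)^n:

```python
# Chance a circuit runs with zero errors, assuming each of its gates
# fails independently with probability p (a back-of-the-envelope model).
def success_probability(p, n_gates):
    return (1 - p) ** n_gates

# At today's ~1-in-1,000 error rates, a 3,000-gate circuit nearly
# always hits an error; at 1-in-a-million, it nearly always doesn't.
print(f"{success_probability(1e-3, 3000):.2f}")
print(f"{success_probability(1e-6, 3000):.2f}")
```

The first figure comes out around 5%, the second above 99%—which is why the one-in-a-million threshold, not just more qubits, is the number engineers chase.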

As IBM tries to squeeze more usefulness out of the current set of noisy processors and build better qubits, it’s also been working on ways to reduce the number of extra qubits it needs to do error correction. A scheme known as the surface code describes how the redundant physical qubits on a chip are arranged on a grid to create one working logical qubit, and IBM has been testing variations adapted to the intricate hexagonal geometries of its chip designs.

In August, IBM shared research describing a new approach it calls the Gross code, named for the mathematical unit equal to 12 dozen. Instead of a single grid layer, it couples together qubits that aren’t direct neighbors across two parallel grids, which means far fewer physical qubits are needed to encode each logical qubit. Surface codes in use today can require up to 4,000 physical qubits to host 12 logical qubits; the Gross code, says IBM, could encode the same dozen using only 288.
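For scale: the Gross code IBM published is a [[144, 12, 12]] code, meaning 144 data qubits encode 12 logical qubits at code distance 12, and measuring its checks takes another 144 ancilla qubits, for 288 physical qubits in all. The arithmetic below simply restates those published parameters:

```python
# Gross code parameters from IBM's paper: [[n, k, d]] = [[144, 12, 12]].
n_data, k_logical, distance = 144, 12, 12
n_ancilla = 144                      # check-measurement qubits
total_physical = n_data + n_ancilla  # 288 physical qubits in all
per_logical = total_physical // k_logical

print(total_physical, per_logical)   # 288 total, 24 physical per logical
```

Against the up-to-4,000-physical-qubit surface-code figure for a dozen logical qubits, that works out to roughly a fourteenfold saving.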

Still, error correction is only part of a symphony of tricks needed to run and measure a reliable quantum computation in the face of a cascade of errors. All of it—control, readout, decoding, mitigation, correction—must be performed at rapid speed, before the qubits decohere or the whole system is overwhelmed by errors. Think of it like a complex caper at the tiniest of scales in deep-space temperatures: a stunt-filled Mission: Impossible heist in supercomputer form.

All the challenges involved, especially as the machines get larger, have fueled persistent doubts. But with Heron and other ongoing research, Gambetta now sees more light at the end of the tunnel, a way to get past the era of “noisy” error-prone machines.

“We’re saying pretty, pretty, pretty strongly that we have a path to error correction now,” says Gambetta. And with Heron, IBM has found a basic design and process that will take it all the way. “The understanding of materials and the design of the qubits, I’m not gonna say it’s a solved problem—but it’s basically solved.”

How to make a qubit

The physics of these chips is complicated as hell, and the engineering reflects that. To manage and read each qubit, IBM’s machines use microwave pulses. In System Two, this means a number of wires, connections, and classical devices for each qubit; and that means thousands of gold-plated microwave cables snaking down from the top of the fridge through a series of concentric plates until they reach the processor at the very bottom. (There are three Herons inside System Two.)

The result of all this is the “quantum chandelier”: the giant, teetering upside-down steampunk ziggurat, which has become a staple of quantum computer design. The chandelier is kept in a vacuum inside a giant fridge at some of the coldest temperatures in the universe.

Inside the fridge of a System One unit, a chandelier carries a single Eagle chip with 127 qubits. [Image: IBM]

IBM’s and Google’s chips use what’s known as a transmon qubit, a tiny circuit of superconducting metal that can be made using existing microchip technology. Superconductors operate with no electrical resistance, provided they are kept at temperatures of roughly 20 millikelvin, a hair above absolute zero at about -273 degrees Celsius. (This is one reason why a room-temperature superconductor would be such a big deal.)

Others are making progress with different approaches. Shortly after IBM announced Heron and System Two, a lab at Harvard, working with Boston-based QuEra, MIT, and a joint program between the National Institute of Standards and Technology and the University of Maryland, announced a quantum computing breakthrough: They managed to turn 280 qubits into as many as 48 logical qubits, more than 10 times the number of logical qubits ever created before. Scott Aaronson, director of the University of Texas at Austin’s Quantum Information Center, wrote on his blog last month that if it stands, “it’s plausibly the top experimental quantum computing advance of 2023.”

Unlike IBM’s, QuEra’s qubits are made of rubidium atoms.

Published January 8, 2024

