A Quantum Leap Forward

Quantum computing’s promise is inching closer to reality. In the coming years, these systems will likely lead to breakthroughs in areas as diverse as drug discovery, materials science, financial modeling, and macroeconomics. They could tackle tasks in minutes or hours that would require years or even centuries on classical computers.

One of the keys to achieving commercial viability is error correction. Unlike classical computers that rely on 0s and 1s to process information, quantum computers tap qubits, which exploit a quantum phenomenon known as superposition to exist in both states simultaneously. When two or more qubits become entangled and evolve coherently (in a predictable and controlled manner), it is possible to operate the qubits across a spectrum of behaviors, rather than only two basic states.
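
To make those ideas concrete, here is a minimal Python sketch (a classical simulation using NumPy, not a model of real hardware) showing how a superposition assigns probabilities to both states at once, and how an entangled pair behaves as a single unit:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # |0>, the quantum analog of bit 0
ket1 = np.array([0, 1], dtype=complex)  # |1>, the quantum analog of bit 1

# Superposition: equal amplitudes on |0> and |1> give 50/50 measurement odds.
plus = (ket0 + ket1) / np.sqrt(2)
print(np.abs(plus) ** 2)  # [0.5 0.5]

# Entanglement: the Bell state (|00> + |11>) / sqrt(2) cannot be described
# one qubit at a time; measurement outcomes are correlated (00 or 11 only).
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
print(np.abs(bell) ** 2)  # [0.5 0.  0.  0.5]
```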

There’s a problem, however. Errors occur because qubits are incredibly fragile and difficult to maintain. Any electromagnetic interference—a.k.a. “noise”—can destroy qubit coherence and disrupt calculations. “Error correction is at the core of building systems that deliver meaningful results,” said William Oliver, Henry Ellis Warren Professor of Electrical Engineering and Computer Science and professor of physics at the Massachusetts Institute of Technology.

Scientists may have just turned the corner on this colossal challenge. In December 2024, Google announced that its Willow quantum chip suppresses errors exponentially as more qubits are added, addressing a problem that has plagued the field since the idea of quantum computing emerged in the 1980s. Then in February 2025, Microsoft and Amazon reported quantum chips of their own.

“We have advanced far beyond a theoretical understanding of quantum computing to the point where, hopefully, it will soon be possible to get a clear quantum advantage for a problem of practical interest,” said Scott Aaronson, David J. Bruton Jr. Centennial Professor of Computer Science at the University of Texas, Austin and a pioneer in the field.

Making Corrections Count

What makes quantum computing so alluring—and yet so challenging—is the very nature by which these systems work. While qubits can exist in both the 0 and 1 state simultaneously, they can do so only as long as they remain quantum coherent, meaning their fragile quantum state has not been disturbed by errors. A quantum computer measures outcomes by collapsing the qubits and making probabilistic determinations based on changes from the initial state. As the system measures results across large numbers of qubits, measurable patterns emerge.
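
A toy example illustrates the probabilistic nature of measurement. The sketch below samples repeated “shots” from an assumed two-qubit state that yields 00 or 11 with equal odds; no single shot reveals the underlying state, but a clear pattern emerges across many runs:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
# Assumed example state: an entangled pair measuring 00 or 11 with equal odds.
probs = [0.5, 0.0, 0.0, 0.5]  # probabilities for outcomes 00, 01, 10, 11

shots = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
outcomes, counts = np.unique(shots, return_counts=True)
print(dict(zip(outcomes, counts)))  # e.g. {'00': 496, '11': 504}
```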

There are several ways to approach quantum computing. Google, IBM, AWS, and Rigetti rely on superconducting methods that incorporate cryogenic chambers cooled to near absolute zero (-273 degrees Celsius, or 0 kelvin). These systems minimize electrical resistance and energy loss in qubits. Companies like IonQ, Quantinuum, Pasqal, and Xanadu use other techniques, including trapped ions, neutral atoms, and photonics, which typically rely on lasers and light to control qubits.

Regardless of the approach, all quantum systems face a fundamental hurdle. Because qubits are incredibly fragile, error correction is critical. Environmental noise, thermal fluctuations, and imperfect operations degrade performance—and results. Another problem? When engineers chain gate operations together into circuits, each gate can introduce an error. Even at 99.8% accuracy, errors pile up to the point where quantum computers fail to yield useful results.
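
A back-of-the-envelope calculation shows how fast this happens. Assuming independent gate errors, the probability that a circuit runs error-free is simply the per-gate accuracy raised to the number of gates:

```python
# Chance a circuit of n gates runs error-free at 99.8% per-gate accuracy,
# assuming errors strike independently.
for n_gates in (100, 500, 1000, 5000):
    print(n_gates, 0.998 ** n_gates)
# roughly 0.82, 0.37, 0.14, and 0.00004 -- deep circuits almost surely fail
```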

That’s the bad news. On the other hand, quantum error correction theory dating to the 1990s argued that, so long as errors in a quantum system occur independently and rarely enough, they can be kept in check. “The result from back then was that, if this assumption is satisfied, then quantum error correction can ultimately correct errors faster than they happen and enable an arbitrarily long quantum computation,” Aaronson said.

Starting in 2019 and continuing with the December Willow announcement, researchers have uncovered experimental evidence that supports the idea that quantum error correction works. If the theory holds, errors are suppressed exponentially as qubits are added to the error-correcting code, allowing the system to detect and correct them faster than they pile up.
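
A common rule-of-thumb model captures the idea (the constants below are illustrative assumptions, not measured values): when the physical error rate p sits below a threshold p_th, the logical error rate shrinks exponentially with the code distance d; above the threshold, adding qubits only makes things worse.

```python
# Rule-of-thumb threshold model for surface codes; A and p_th are
# illustrative assumptions, not measured values.
def logical_error(p, d, p_th=0.01, A=0.1):
    # p: physical error rate, d: code distance (a d x d grid of qubits)
    return A * (p / p_th) ** ((d + 1) / 2)

for d in (3, 5, 7):
    below = logical_error(p=0.005, d=d)  # p < p_th: error halves per step
    above = logical_error(p=0.02, d=d)   # p > p_th: bigger codes do worse
    print(d, below, above)
```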

Physics Matters

To be sure, Google’s announcement pushed quantum computing into new territory—and placed superconducting systems in the spotlight. “For the first time, it was clear that there is a viable path,” said Gokul Subramanian Ravi, an assistant professor in the Computer Science and Engineering Department at the University of Michigan. Although error correction isn’t the only way to do quantum computing—some tasks do not require it—it’s needed for most complex computations.

Willow accomplished a couple of important things. First, it demonstrated quantum error correction scaling below the critical threshold. This means that error rates after error correction decreased as researchers added more qubits. The team first used the Willow chip to construct a logical qubit from a 3×3 grid of physical qubits, later expanded it to a 5×5 grid, and eventually a 7×7 grid. Each time they increased the size of the grid, the error rate declined by a factor of 2. “Decreasing the error rates while adding more qubits is exactly what needs to happen to realize larger quantum computers,” Oliver said.
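
Extrapolating that factor-of-2 suppression hints at the scale required for truly reliable machines. The sketch below is a rough illustration under assumed starting values, not a projection from the Nature paper:

```python
# Starting logical error rate at distance 3 is an assumed illustrative value.
error, d = 3e-3, 3
while error > 1e-9:  # target: roughly one error in a billion
    d += 2           # surface-code distances step 3 -> 5 -> 7 -> ...
    error /= 2       # each step halves the logical error rate (factor ~2)
print(d, error, d * d)  # final distance, error rate, data qubits per grid
```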

Second, Willow performed a benchmark computation in under five minutes that would have required approximately 10 septillion (10²⁵) years using today’s fastest supercomputers. Since there would be no way to verify the entire result, researchers used classical computers to validate small, separate pieces of the overall computation.

“The ability to increase the number of qubits and consistently watch error rates drop by more than a factor of two demonstrates that error correction scaling works. In fact, at both the 5×5 and 7×7 grid sizes, the logical qubit encoding outperformed even the best individual qubits in the grid. In other words, the sum was better than its parts,” said Oliver, who co-authored the scientific paper about the error correction demonstration with Willow in the journal Nature.

“This result, along with other promising demonstrations related to logical qubits over the past couple of years from groups at Harvard, QuEra, Quantinuum, and Microsoft, has considerably brightened the future of quantum computing,” Ravi added.

A Question of Scale

Although Google’s achievement demonstrated that quantum scaling is feasible, there’s still plenty of work to do. Willow supports only 105 superconducting physical qubits. Engineers will likely have to scale to millions of qubits to build a quantum computer that is as practical as today’s classical computers. For that, they will likely need to achieve an error rate (with error correction) of one in a billion, or less.

For now, achieving the sheer number of qubits needed to perform a commercially viable application remains a significant obstacle. Part of the problem is rooted in the physical size of superconducting systems, which rely on super-cooled chambers that can reach 2 to 3 meters in height and weigh several hundred kilograms. Engineers would have to interconnect many such devices to scale to the millions of qubits needed to build applications with the required error correction.

The solution lies in designing quantum chips with more circuits, along with finding innovative ways to interconnect them. This involves everything from materials science and fabrication methods to developing entirely new circuit designs. One company taking aim at the challenge is IBM, which has developed sophisticated 3D packaging technology that accommodates a greater number of qubits in a smaller space while maintaining signal integrity. Its Eagle processor supports 127 qubits.

IBM has also embraced a tunable coupling architecture that achieves 99.9% two-qubit gate fidelity in its 156-qubit Heron processors. And it continues to make progress on error correction using low-density parity check (LDPC) codes, including a “gross code” that can encode 12 logical qubits using only 288 physical qubits. This represents approximately a 10x improvement over previous scaling approaches.
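
The arithmetic behind that comparison is straightforward; the distance-11 surface code used as the baseline below is an assumption chosen for illustration:

```python
gross_per_logical = 288 / 12  # gross code: 24 physical qubits per logical qubit

# Assumed comparison point: a distance-11 surface code, which needs roughly
# 2 * d**2 physical qubits (data plus ancilla) per logical qubit.
d = 11
surface_per_logical = 2 * d**2  # 242 physical qubits per logical qubit

print(round(surface_per_logical / gross_per_logical, 1))  # ~10x more efficient
```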

The company’s quantum development timeline is ambitious. IBM aims to deliver a 200-logical-qubit system by 2029 (it will require approximately 10,000 physical qubits) while scaling up to a 2,000-logical-qubit supercomputer (approximately 100,000 physical qubits) around 2033. To achieve this scale, the company is developing far more advanced couplers and cables that reduce noise on chips and across chips. This coupling allows qubits to share information that fuels quantum calculations.
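
Taken at face value, both roadmap milestones imply the same physical-to-logical overhead:

```python
# Implied overhead from IBM's roadmap figures as quoted above:
print(10_000 / 200)     # 2029: ~50 physical qubits per logical qubit
print(100_000 / 2_000)  # 2033: the same ~50:1 ratio at 10x the scale
```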

“The essence of quantum computing is that you want to store each piece of logical information in more than one physical qubit, and that way, if there’s a physical error, you have a chance to correct that error before it has a chance to ripple through your entire computation,” explained Oliver Dial, CTO for IBM Quantum. “Quantum mechanics makes this complicated, because you can’t examine individual qubits without destroying the calculation.”

Researchers typically address fault tolerance through a series of parity checks that rely on extra qubits to measure activity between so-called code qubits. If one qubit flips to a different state, the parity changes, revealing the error without directly measuring (and thereby destroying) the encoded data. But to scale systems to a level needed for critical error correction and practical outcomes, it’s imperative to greatly expand the number of qubits involved in any given calculation. “Scaling is critical and modularity is at the center of things,” Dial said.
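
A classical toy version conveys the intuition. In the sketch below, a three-bit repetition code uses two parity checks, standing in for ancilla qubits, to locate a single flipped bit without reading the encoded value directly; real stabilizer codes perform analogous checks on quantum states:

```python
# Toy three-bit repetition code: two parity checks play the role of ancilla
# qubits comparing neighboring code qubits. A single flip yields a unique
# syndrome that locates the error without revealing the encoded value.
def syndrome(bits):
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])  # parity between neighbors

def correct(bits):
    location = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if location is not None:
        bits[location] ^= 1  # flip the faulty bit back
    return bits

print(correct([0, 1, 0]))  # syndrome (1, 1) pinpoints the middle bit -> [0, 0, 0]
```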

Conquering Qubits

Others are addressing scaling issues in different ways. In October 2024, researchers in Australia demonstrated the viability of fabricating silicon spin qubits on 300-millimeter wafers—the same size used for CMOS semiconductor manufacturing. This would allow chip makers to apply proven semiconductor manufacturing expertise to fabricate quantum semiconductors. The method achieved 99% fidelity in all qubit operations. “If this technology pans out, it means that it could be possible to scale up quantum chip production using well-established fabrication processes and existing CMOS foundries,” Ravi explained.

Time and ongoing improvements in chips and architectures will continue to push quantum computing toward viability, Oliver said. “Many questions remain, including how to connect chips and systems optimally, how to connect systems to classical computers needed to decipher and manage quantum algorithms, and whether to handle any of these classical operations in a cryogenic environment.”

All of this leads to further questions: “Do I need an extremely large refrigerator for all my chips? Or can I have chips distributed among existing fridges?” Oliver asked. “There are still a lot of decision points that need to be worked out.”

Not surprisingly, AI and machine learning are rising stars in the quantum space. These tools can aid in everything from configuring denser quantum circuits and discovering ways to minimize noise to fine-tuning control parameters for qubits and identifying ideal frequencies and voltages for quantum operations. “Calibrating them all is extremely difficult, and classical AI and ML have an important role to play here,” Oliver said.

When quantum computing will go mainstream remains unclear, however. In January 2025, Nvidia CEO Jensen Huang caused a stir when he predicted that a functional device could be 15 to 30 years away. Oliver said practical use cases today are few and far between, but quantum computers could make a significant impact in 10 to 15 years.

However, the announcements from Microsoft and Amazon could accelerate the timeline. Microsoft’s Majorana 1 chip, the result of two decades of research, delivers eight topological qubits (the company describes the new state of matter it created as a “topological state”). Microsoft hopes to eventually dial the figure up to one million qubits on a single chip and to have the service publicly available by 2030. Amazon says that its prototype Ocelot quantum chip could slash error correction costs by up to 90%. It reported the breakthrough in Nature.

“My sense is that people and companies are experimenting with quantum computers, but they currently aren’t inventing new drugs and solving actual business problems,” Oliver said. “Like classical computers in the last century, quantum computing technology will evolve over time.”

Further Reading