Quantum Milestone: We Can Now Detect and Correct Quantum Errors in Real Time

November 27, 2021

Researchers at Honeywell Quantum Solutions applied multiple rounds of quantum error correction to a single logical qubit, an industry first.

Researchers at Honeywell Quantum Solutions have taken a significant step toward demonstrating the viability of large-scale quantum computing on the company's trapped-ion quantum computing technology. 

The Honeywell team can now perform quantum error correction (QEC), the set of protocols needed to detect and correct errors in real time on a quantum computer. They demonstrated the ability to “protect” quantum information (that is, to prevent a quantum computation from being quickly corrupted by imperfections and noise) on the System Model H1. This is a first for the quantum computing industry. Currently, most demonstrations of quantum error correction correct errors or “noise” only after the procedure has finished running, a technique known as post-processing. 

In a paper published this week on arXiv, researchers detailed how they created a single logical qubit (a series of entangled physical qubits) and applied multiple rounds of quantum error correction. This logical qubit is protected from two main types of errors that occur in a quantum computer: bit flips and phase flips.  
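
As a rough illustration (not drawn from the paper itself), a bit flip corresponds to a Pauli-X error and a phase flip to a Pauli-Z error acting on a qubit's state. The short NumPy sketch below shows how each error corrupts an arbitrary single-qubit state vector; the amplitudes and variable names are purely illustrative.

```python
import numpy as np

# Pauli operators: X models a bit flip, Z models a phase flip.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# An arbitrary single-qubit state |psi> = a|0> + b|1> (illustrative amplitudes).
a, b = 0.6, 0.8
psi = np.array([a, b], dtype=complex)

print("original    :", psi)         # [0.6, 0.8]
print("bit flip    :", X @ psi)     # amplitudes swapped: [0.8, 0.6]
print("phase flip  :", Z @ psi)     # sign of |1> amplitude flipped: [0.6, -0.8]
print("both errors :", X @ Z @ psi) # combined bit and phase flip
```

A code that protects a logical qubit must be able to detect and undo both kinds of flips without ever directly measuring the encoded state.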

Previously, groups have demonstrated codes capable of correcting only a single type of error (bit flips or phase flips, but not both) (Google, IBM/Raytheon, IBM/Basel). Others have demonstrated quantum error detecting codes, which can detect both types of errors but not correct them (ETH, Google, Delft). Still others have demonstrated individual pieces of the quantum error correction process (Blatt, Monroe). 

“All of today’s quantum technologies are at an early stage where they must combat errors that accumulate during computations,” said Tony Uttley, president of Honeywell Quantum Solutions. “What the Honeywell team accomplished is groundbreaking. It proves what was once only theoretical, that quantum computers will be able to correct errors in real time, paving the way for precise quantum computations.”  

Though the achievement represents progress toward large-scale quantum computing, Honeywell researchers are still working to cross the break-even point at which the logical error rate is less than the physical error rate. 

The need for logical qubits

To appreciate this achievement, it is important to understand how difficult it is to detect and then correct a quantum error.

Quantum bits, or qubits, are fragile and finicky. They pick up interference or “noise” from their environment. This noise causes errors to accumulate and corrupts information stored in and between physical qubits. (Scientists call this decoherence.) 

Attempts to directly detect and correct errors on a physical qubit also corrupt its “quantumness.” And cloning the data, a method used in classical computing that involves making multiple exact copies of the information, does not work for quantum information (it is prohibited by the no-cloning theorem).

Several scientists, most notably Peter Shor, Robert Calderbank, and Andrew Steane, found a way around these obstacles, at least in theory, after studying how quickly qubits experience decoherence.  

They demonstrated that by storing information in a collection of entangled qubits, it was possible to detect and correct errors without disrupting quantum information.  They called this assortment of entangled qubits a logical qubit. 
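
The simplest textbook example of this idea is the three-qubit repetition code, shown in the sketch below. It is not the code used in the Honeywell demonstration (the paper uses a more capable code that handles both bit and phase flips); it is only a classical-flavoured simulation of the core trick: store one logical bit redundantly, measure parities (the "syndrome") rather than the data itself, and use the syndrome to locate and undo a single error.

```python
import random

# Illustration only: a 3-qubit bit-flip repetition code, not the code from the
# Honeywell paper. A logical 0 or 1 is stored redundantly as 000 or 111; parity
# checks locate a single bit-flip error without reading out the encoded value.

def encode(logical_bit):
    return [logical_bit] * 3

def apply_noise(qubits, p):
    # Each physical qubit flips independently with probability p.
    return [q ^ (random.random() < p) for q in qubits]

def syndrome(qubits):
    # Parities of neighbouring pairs; (0, 0) means "no error detected".
    return (qubits[0] ^ qubits[1], qubits[1] ^ qubits[2])

def correct(qubits):
    # Map each syndrome pattern to the single qubit it implicates, then flip it back.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(qubits))
    if flip is not None:
        qubits[flip] ^= 1
    return qubits

def decode(qubits):
    # Majority vote recovers the logical bit.
    return int(sum(qubits) >= 2)

random.seed(0)
block = apply_noise(encode(1), p=0.1)
print("syndrome:", syndrome(block), "-> decoded:", decode(correct(block)))
```

In a real device the syndrome is extracted with extra "ancilla" qubits and entangling gates, and the correction must be applied while the computation is still running, which is the real-time capability demonstrated here.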

Scientists have spent years developing codes and methods that could be applied to logical qubits to protect quantum information from errors.  

What’s next

The next step is to reach break-even, the point at which the logical qubit error rate drops below the error rate of the underlying physical qubits. (Creating logical qubits and applying quantum error correction codes can itself inject noise into a system.)  

The Honeywell team is closing in on that mark.  To definitively demonstrate passing the break-even point, the error rate per QEC cycle needs to be lower than the largest physical error rate associated with the QEC protocol. 
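
A toy calculation makes the break-even idea concrete. For the three-qubit repetition code sketched above (again, not the error model or code from the paper), the encoded bit is lost when two or more physical qubits flip in a cycle, so the logical failure probability is roughly 3p²(1−p) + p³ for physical error rate p. Encoding only pays off where that quantity falls below p:

```python
# Toy break-even estimate for a 3-qubit repetition code (illustration only).
# The code fails when two or more of the three physical qubits flip in a cycle:
#   p_logical = 3 * p^2 * (1 - p) + p^3

def p_logical(p):
    return 3 * p**2 * (1 - p) + p**3

for p in (0.001, 0.01, 0.1, 0.5):
    pl = p_logical(p)
    status = "below" if pl < p else "at/above"
    print(f"physical p = {p:<6} logical p = {pl:.6f}  ({status} physical rate)")
```

For small physical error rates the logical rate is far smaller, but as p grows the overhead of encoding stops helping, which is why driving down physical error rates and QEC-cycle errors together is what determines whether a real device crosses break-even.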

“In the technical paper, we point to key improvements we need to make to reach the break-even point,” said Dr. Ciaran Ryan-Anderson, an advanced physicist and lead author of the paper. “We believe these improvements are feasible and are pushing to accomplish this next step.”

From there, the goal is to create multiple logical qubits, which, depending on the quantum technology, requires better fidelities, more physical qubits, better connectivity between qubits, and other improvements.

An increase in logical qubits will usher in a new era of fault-tolerant quantum computers that can continue to function even when some operations fail.   (Fault tolerance is a design principle that prevents errors from cascading throughout a system and corrupting circuits.)

“The big, enterprise-level problems we want to solve with quantum computers require precision and we need error-corrected logical qubits to scale successfully,” Uttley said.
