Quantinuum and Microsoft achieve breakthrough that unlocks a new era of reliable quantum computing

April 3, 2024

By Ilyas Khan, Chief Product Officer, and Jenni Strabley, Senior Director of Offering Management


Quantinuum and Microsoft have announced a vital breakthrough in quantum computing that Microsoft described as “a major achievement for the entire quantum ecosystem.”

By combining Microsoft’s innovative qubit-virtualization system with the unique architectural features and fidelity of Quantinuum’s System Model H2 quantum computer, our teams have demonstrated the most reliable logical qubits on record with logical circuit error rates 800 times lower than the corresponding physical circuit error rates. 


This achievement is not just monumental for Quantinuum and Microsoft, but it is a major advancement for the entire quantum ecosystem. It is a crucial milestone on the path to building a hybrid supercomputing system that can truly transform research and innovation across many industries for decades to come. It also further bolsters H2’s title as the highest performing quantum computer in the world.

Entering a new era of quantum computing

Historically, there have been widely held assumptions about the number of physical qubits needed for large-scale fault-tolerant quantum computing and about the timeline for quantum computers to deliver real-world value. It was previously thought that an achievement like this one was still years away from realization – but together, Quantinuum and Microsoft have shown that fault-tolerant quantum computing is in fact a reality.

In enabling today’s announcement, Quantinuum’s System Model H2 becomes the first quantum computer to advance to Microsoft’s Level 2 – Resilient phase of quantum computing – an incredible milestone. Until now, no other computer had been capable of producing reliable logical qubits. 

Using Microsoft’s qubit-virtualization system, our teams used reliable logical qubits to perform 14,000 individual instances of a quantum circuit with no errors, an overall result that is unprecedented. Microsoft also demonstrated multiple rounds of active syndrome extraction – an essential error correction capability for measuring and detecting the occurrence of errors without destroying the quantum information encoded in the logical qubit. 

As we prepare to bring today’s logical quantum computing breakthrough to commercial users, there is palpable anticipation about what this new era means for our partners, customers, and the global quantum computing ecosystem that has grown up around our hardware, middleware, and software. 

Collaborating to reach a new era

To understand this achievement, it is helpful to shed some light on the joint work that went into it. Our breakthrough would not have been possible without the close collaboration of the two exceptional teams at Quantinuum and Microsoft over many years.

Building on a relationship that stretches back five years, we collaborated with Microsoft Azure Quantum at a very deep level to best execute their innovative qubit-virtualization system, including error diagnostics and correction. The Microsoft team was able to optimize their error correction innovation, reducing an original estimate of 300 required physical qubits 10-fold, to create four logical qubits with only 30 physical qubits, bringing it into scope for the 32-qubit H2 quantum computer.

This massive compression of the code and efficient virtualization challenges a consensus view about the resources needed to do fault-tolerant quantum computing, where it has been routinely stated that a logical qubit will require hundreds, even thousands of physical qubits. Through our collaboration, Microsoft’s far more efficient encoding was made possible by architectural features unique to the System Model H2, including our market-leading 99.8% two-qubit gate fidelity, 32 fully-connected qubits, and compatibility with Quantum Intermediate Representation (QIR).

Thanks to this powerful combination of collaboration, engineering excellence, and resource efficiency, quantum computing has taken a major step into a new era, introducing reliable logical qubits which will soon be available to industrial and research users.

Understanding today’s error correction breakthrough

It is widely recognized that for a quantum computer to be useful, it must be able to compute correctly even when errors (or faults) occur – this is what scientists and engineers describe as fault-tolerance. 

In classical computing, fault-tolerance is well understood, and we have come to take it for granted: we simply assume our computers will be reliable and fault-free. Decades of advances have led to this state of affairs, including incredibly robust hardware with very low error rates, and classical error correction schemes based on copying information across multiple bits to create redundancy.

Getting to the same point in quantum computing is more challenging, although the solution to this problem has been known for some time. Qubits are incredibly delicate: one must precisely control the quantum states of single atoms, which are prone to errors. Additionally, a fundamental law of quantum physics known as the no-cloning theorem says that you cannot simply copy a qubit – meaning some of the techniques used in classical error correction are unavailable in quantum machines.

The solution involves entangling groups of physical qubits (thereby creating a logical qubit), storing the relevant quantum information in the entangled state, and performing computations with error correction via carefully designed operations. The sole purpose of this process is to make logical error rates lower than the underlying physical error rates.
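The flavor of this idea can be seen in a classical caricature of the three-qubit bit-flip code: a "logical" bit is stored redundantly in three physical bits, and two parity checks (the syndrome) locate a single error without reading the data directly. This is a simplified illustration for intuition only, not the code used in the Quantinuum–Microsoft demonstration.

```python
def encode(logical_bit):
    """Encode one logical bit redundantly into three physical bits."""
    return [logical_bit] * 3

def syndrome(bits):
    """Two parity checks, on pairs (0,1) and (1,2) -- the analogue of
    syndrome extraction: they reveal where an error is, not the data."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Use the syndrome to locate and flip a single faulty bit."""
    location = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if location is not None:
        bits[location] ^= 1
    return bits

def decode(bits):
    """Majority vote recovers the logical bit."""
    return int(sum(bits) >= 2)

encoded = encode(1)
encoded[0] ^= 1                       # a single physical error occurs
print(syndrome(encoded))              # -> (1, 0): error located on bit 0
print(decode(correct(encoded)))       # -> 1: the logical bit survives
```

A genuine quantum code must do this without cloning or directly measuring the encoded state, which is why entanglement and non-destructive syndrome measurements are essential.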

However, implementing quantum error correction requires a significant number of qubit operations. No matter how cleverly a code is implemented, if the underlying physical fidelity is poor, the error correcting code will add more noise to a circuit than it removes. But once the physical fidelity is good enough (that is, once the physical error rate is "below threshold"), the error correcting code starts to actually help, producing logical errors below the physical errors.
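Threshold behavior can be sketched with the simplest possible model, a distance-d repetition code under independent bit flips: the encoded bit fails only when a majority of physical bits flip. Below threshold, the logical error rate falls well under the physical rate (and keeps improving with distance); above it, encoding makes things worse. This toy model ignores the noisy syndrome-extraction circuitry that makes real thresholds much lower, so treat it purely as an illustration of the concept.

```python
from math import comb

def logical_error_rate(p, d):
    """Probability that a majority of d physical bits flip,
    each independently with probability p."""
    return sum(comb(d, k) * p**k * (1 - p)**(d - k)
               for k in range(d // 2 + 1, d + 1))

p_good = 0.01   # below threshold: encoding helps
p_bad = 0.6     # above threshold: encoding adds noise

print(logical_error_rate(p_good, 5))   # far below 0.01
print(logical_error_rate(p_bad, 5))    # worse than 0.6
```

The same qualitative picture holds for real codes: below threshold, adding distance suppresses logical errors exponentially, which is what makes reliable logical qubits possible at all.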

System Model H2 ion-trap quantum computer chip showing the “racetrack” trap design
Quantinuum’s fault-tolerance roadmap

Today’s results are an exciting marker on the path to fault-tolerant quantum computing. The focus must and will now shift from quantum computing companies simply stating the number of qubits they have to explaining their connectivity, the underlying quality of the qubits with reference to gate fidelities, and their approach to fault-tolerance.

Our H-Series hardware roadmap has focused not only on scaling qubit counts but also on developing usable quantum computers that are part of a vertically integrated stack. Our work across the full stack includes major advances at every level; just last month, for instance, we proved that our qubits could scale when we announced solutions to the wiring problem and the sorting problem. By maintaining higher qubit counts and world-class fidelity, we enable our customers and partners to advance further and faster in fields such as materials science, drug discovery, AI, and finance.

In 2025, we will introduce a new H-Series quantum computer, Helios, that takes the very best the H-Series has to offer, improving both physical qubit count and physical fidelity. This will take us and our users below threshold for a wider set of error correcting codes and make that device capable of supporting at least 10 highly reliable logical qubits. 

A path to real-world impact

As we build upon today’s milestone and lead the field on the path to fault-tolerance, we are committed to continuing to make significant strides in the research that enables the rapid advance of our technologies. We were the first to demonstrate real-time quantum error correction (a fully fault-tolerant QEC protocol), which made us the first to show repeated real-time error correction, quantum "loops" (repeat-until-success protocols), and real-time decoding to determine corrections during the computation. We were also the first to create non-Abelian topological quantum matter and braid its anyons, a route toward topological qubits.

The native flexibility of our QCCD architecture has allowed us to efficiently investigate a large variety of fault-tolerant methods, and our best-in-class fidelity means we expect to lead the way in achieving reduced error rates with additional error correcting codes – and supporting our partners to do the same. We are already working on making reliable quantum computing a commercial reality so that our customers and partners can unlock the enormous real-world economic value that is waiting to be unleashed by the development of these systems. 

In the short term, with a hybrid supercomputer powered by a hundred reliable logical qubits, we believe organizations will begin to see scientific advantages and accelerate valuable progress on some of the most important problems mankind faces, such as modelling the materials used in batteries and hydrogen fuel cells or accelerating the development of meaning-aware AI language models. Over the long term, if we can scale closer to ~1,000 reliable logical qubits, we will unlock the commercial advantages that can ultimately transform the commercial world.

Quantinuum customers have always been able to operate the most cutting-edge quantum computing, and we look forward to seeing how they, and our own world-leading teams, drive ahead developing new solutions based on the state-of-the-art tools we continue to put into their hands. We were the early leaders in quantum computing and now we are thrilled to be positioned at the forefront of fault-tolerant quantum computing. We are excited to see what today’s milestone unlocks for our customers in the days ahead.

For more information
About Quantinuum

Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents. 

Blog
August 28, 2025
Quantum Computing Joins the Next Frontier in Genomics
  • The Sanger Institute illustrates the value of quantum computing to genomics research
  • Quantinuum supports developments in a field that promises to deliver a profound and positive societal impact

Twenty-five years ago, scientists accomplished a task likened to a biological moonshot: the sequencing of the entire human genome.

The Human Genome Project revealed a complete human blueprint comprising around 3 billion base pairs, the chemical building blocks of DNA. It led to breakthrough medical treatments, scientific discoveries, and a new understanding of the biological functions of our body.

Thanks to technological advances in the quarter-century since, what took 13 years and cost $2.7 billion then can now be done in under 12 minutes for a few hundred dollars. Improved instruments such as next-generation sequencers and a better understanding of the human genome – including the availability of a “reference genome” – have aided progress, alongside enormous advances in algorithms and computing power.

But even today, some genomic challenges remain so complex that they stretch beyond the capabilities of the most powerful classical computers operating in isolation. This has sparked a bold search for new computational paradigms, and in particular, quantum computing.

Quantum Challenge: Accepted

The Wellcome Leap Quantum for Bio (Q4Bio) challenge is pioneering this new frontier. The program funds research to develop quantum algorithms that can overcome current computational bottlenecks. It aims to test the classical boundaries of computational genetics in the next 3-5 years.

One consortium – led by the University of Oxford and supported by prestigious partners including the Wellcome Sanger Institute, the Universities of Cambridge and Melbourne, and Kyiv Academic University – is taking a leading role.

“The overall goal of the team’s project is to perform a range of genomic processing tasks for the most complex and variable genomes and sequences – a task that can go beyond the capabilities of current classical computers” – Wellcome Sanger Institute press release, July 2025
Selecting Quantinuum

Earlier this year, the Sanger Institute selected Quantinuum as a technology partner in their bid to succeed in the Q4Bio challenge.

Our flagship quantum computer, the System Model H2, has for many years led the field of commercially available systems in qubit fidelity and consistently holds the global record for Quantum Volume, currently benchmarked at 8,388,608 (2^23).

In this collaboration, the scientific research team can take advantage of Quantinuum’s full stack approach to technology development, including hardware, software, and deep expertise in quantum algorithm development.

“We were honored to be selected by the Sanger Institute to partner in tackling some of the most complex challenges in genomics. By bringing the world’s highest performing quantum computers to this collaboration, we will help the team push the limits of genomics research with quantum algorithms and open new possibilities for health and medical science.” – Rajeeb Hazra, President and CEO of Quantinuum
Quantum for Biology

At the heart of this endeavor, the consortium has announced a bold central mission for the coming year: to encode and process an entire genome using a quantum computer. This achievement would be a potential world-first and provide evidence for quantum computing’s readiness for tackling real-world use cases.

Their chosen genome, the bacteriophage PhiX174, carries symbolic weight, as its sequencing earned Fred Sanger his second Nobel Prize for Chemistry in 1980. Successfully encoding this genome quantum mechanically would represent a significant milestone for both genomics and quantum computing.

Bacteriophage PhiX174, published under a Creative Commons License https://commons.wikimedia.org/wiki/File:Phi_X_174.png

Sooner than many expect, quantum computing may play an essential role in tackling genomic challenges at the very frontier of human health. The Sanger Institute and Quantinuum’s partnership reminds us that we may soon reach an important step forward in human health research – one that could change medicine and computational biology as dramatically as the original Human Genome Project did a quarter-century ago.

“Quantum computational biology has long inspired us at Quantinuum, as it has the potential to transform global health and empower people everywhere to lead longer, healthier, and more dignified lives.” – Ilyas Khan, Founder and Chief Product Officer of Quantinuum

Glossary of terms: Understanding how quantum computing supports complex genomic research


Algorithms: A set of rules or processes for performing calculations or solving computational problems.
Classical Computing: Computing technology based on binary information storage (bits represented as 0 or 1).
DNA Sequence: The exact order of nucleotides (A, T, C, G) within a DNA molecule.
Genome: The complete set of genetic material (DNA) present in an organism.
Graph-based Genome (Sequence Graph): A non-linear network representation of genomic sequences capturing the diversity and relationships among multiple genomes.
High Performance Compute (HPC): Advanced classical computing systems designed for handling computationally intensive tasks, simulations, and data processing.
Pangenome: A collection of multiple genome sequences representing genetic diversity within a population or species.
Precision Medicine: Tailored medical treatments based on individual genetic, environmental, and lifestyle factors.
Quantinuum: The world’s largest integrated quantum computing company. Quantinuum systems lead the world on the rigorous Quantum Volume benchmark and were the first to offer commercial access to highly reliable “Level 2 – Resilient” quantum computing.
Quantum Bit (Qubit): The basic unit of quantum information, which, unlike a classical bit, can exist in multiple states simultaneously (superposition).
Quantum Computing: A computing approach using quantum-mechanical phenomena (e.g., superposition, entanglement, interference) for enhanced problem-solving capabilities.
Quantum Pangenomics: An interdisciplinary field combining quantum computing with genomics to address computational challenges in analyzing genetic data and pangenomes.
Quantum Volume: A specific test of a quantum computer’s performance on complex circuits; the higher the Quantum Volume, the more powerful the system. Quantinuum’s 56-qubit System Model H2 achieved a record Quantum Volume of 8,388,608 (2^23) in May 2025.
Quantum Superposition: A fundamental quantum phenomenon in which particles can exist in multiple states simultaneously, enabling complex computational tasks.
Sequence Mapping: Determining how sequences align or correspond within a larger genomic reference or graph.
Wellcome Leap Quantum for Bio (Q4Bio): An initiative funding research that combines quantum computing and biological sciences to address computational challenges.
Wellcome Sanger Institute: A genomics research institute that tackles some of the most difficult challenges in genomic research.
Blog
August 26, 2025
IEEE Quantum Week 2025

Every year, The IEEE International Conference on Quantum Computing and Engineering – or IEEE Quantum Week – brings together engineers, scientists, researchers, students, and others to learn about advancements in quantum computing.

This year’s conference, held August 31st – September 5th in Albuquerque, New Mexico, takes place in a burgeoning epicenter for quantum technology innovation and the home of our new facility, which will support ongoing collaborative efforts to advance the photonics technologies critical to our product development.

Throughout IEEE Quantum Week, our quantum experts will be on-site to share insights on upgrades to our hardware, enhancements to our software stack, our path to error correction, and more.

Meet our team at Booth #507 and join the below sessions to discover how Quantinuum is forging the path to fault-tolerant quantum computing with our integrated full-stack.

September 2nd

Quantum Software Workshop
Quantum Software 2.1: Open Problems, New Ideas, and Paths to Scale
1:15 – 2:10pm MDT | Mesilla

We recently shared the details of our new software stack for our next-generation systems, including Helios (launching in 2025). Quantinuum’s Agustín Borgna will deliver a lightning talk introducing Guppy, our new open-source programming language based on Python, one of the most popular general-purpose programming languages for classical computing.

September 3rd

PAN08: Progress and Platforms in the Era of Reliable Quantum Computing
1:00 – 2:30pm MDT | Apache

We are entering the era of reliable quantum computing. Across the industry, quantum hardware and software innovators are enabling this transformation by creating reliable logical qubits and building integrated technology stacks that span the application layer, middleware and hardware. Attendees will hear about current and near-term developments from Microsoft, Quantinuum and Atom Computing. They will also gain insights into challenges and potential solutions from across the ecosystem, learn about Microsoft’s qubit-virtualization system, and get a peek into future developments from Quantinuum and Microsoft.

BOF03: Exploring Distributed Quantum Simulators on Exa-scale HPC Systems
3:00 – 4:30pm MDT | Apache

The core agenda of the session is dedicated to addressing key technical and collaborative challenges in this rapidly evolving field. Discussions will concentrate on innovative algorithm design tailored for HPC environments, the development of sophisticated hybrid frameworks that seamlessly combine classical and quantum computational resources, and the crucial task of establishing robust performance benchmarks on large-scale CPU/GPU HPC infrastructures.

September 4th

PAN11: Real-time Quantum Error Correction: Achievements and Challenges
1:00 – 2:30pm MDT | La Cienega

This panel will explore the current state of real-time quantum error correction, identifying key challenges and opportunities as we move toward large-scale, fault-tolerant systems. Real-time decoding is a multi-layered challenge involving algorithms, software, compilation, and computational hardware that must work in tandem to meet the speed, accuracy, and scalability demands of FTQC. We will examine how these challenges manifest for multi-logical qubit operations, and discuss steps needed to extend the decoding infrastructure from intermediate-scale systems to full-scale quantum processors.

September 5th

Keynote by NVIDIA
8:00 – 9:30am MDT | Kiva Auditorium

During his keynote talk, NVIDIA’s Head of Quantum Computing Product, Sam Stanwyck, will detail our partnership to fast-track commercially scalable quantum supercomputers. Discover how Quantinuum and NVIDIA are pushing the boundaries to deliver on the power of hybrid quantum and classical compute – from integrating NVIDIA’s CUDA-Q platform with access to Quantinuum’s industry-leading hardware to the recently announced NVIDIA Accelerated Quantum Research Center (NVAQC).

Featured Research at the IEEE Poster Session:

Visible Photonic Component Development for Trapped-Ion Quantum Computing
September 2nd from 6:30 - 8:00pm MDT | September 3rd from 9:30 - 10:00am MDT | September 4th from 11:30 - 12:30pm MDT
Authors: Elliot Lehman, Molly Krogstad, Molly P. Andersen, Sara Campbell, Kirk Cook, Bryan DeBono, Christopher Ertsgaard, Azure Hansen, Duc Nguyen, Adam Ollanik, Daniel Ouellette, Michael Plascak, Justin T. Schultz, Johanna Zultak, Nicholas Boynton, Christopher DeRose, Michael Gehl, and Nicholas Karl

Scaling Up Trapped-Ion Quantum Processors with Integrated Photonics
September 2nd from 6:30 - 8:00pm MDT and 2:30 - 3:00pm MDT | September 4th from 9:30 - 10:00am MDT

Authors: Molly Andersen, Bryan DeBono, Sara Campbell, Kirk Cook, David Gaudiosi, Christopher Ertsgaard, Azure Hansen, Todd Klein, Molly Krogstad, Elliot Lehman, Gregory MacCabe, Duc Nguyen, Nhung Nguyen, Adam Ollanik, Daniel Ouellette, Brendan Paver, Michael Plascak, Justin Schultz and Johanna Zultak

Research Collaborations with the Local Ecosystem

In a partnership that is part of a long-standing relationship with Los Alamos National Laboratory, we have been working on new methods to make quantum computing operations more efficient, and ultimately, scalable.

Learn more in our Research Paper: Classical shadows with symmetries

Our teams collaborated with Sandia National Laboratories, demonstrating our leadership in benchmarking. In this paper, we implemented a technique devised by researchers at Sandia to measure errors in mid-circuit measurement and reset. Understanding these errors helps us reduce them while helping our customers understand what to expect when using our hardware.

Learn more in our Research Paper: Measuring error rates of mid-circuit measurements

Blog
August 25, 2025
We’re not just catching up to classical computing, we’re evolving from it

From machine learning to quantum physics, tensor networks have been quietly powering the breakthroughs that will reshape our society. Originally developed by the legendary Nobel laureate Roger Penrose, they were first used to tackle esoteric problems in physics that were previously unsolvable.

Today, tensor networks have become indispensable in a huge number of fields, including both classical and quantum computing, where they are used everywhere from quantum error correction (QEC) decoding to quantum machine learning.

In this latest paper, we teamed up with luminaries from the University of British Columbia, California Institute of Technology, University of Jyväskylä, KBR Inc, NASA, Google Quantum AI, NVIDIA, JPMorgan Chase, the University of Sherbrooke, and Terra Quantum AG to provide a comprehensive overview of the use of tensor networks in quantum computing.

Standing on the shoulders of giants

Part of what drives our leadership in quantum computing is our commitment to building the best scientific team in the world. This is precisely why we hired Dr. Reza Haghshenas, one of the world’s leading experts in tensor networks, and a co-author on the paper.

Dr. Haghshenas has been researching tensor networks for over a decade across both academia and industry. Dr. Haghshenas did postdoctoral work under Professor Garnet Chan at Caltech, a leading figure in the use of tensor networks for quantum physics and chemistry.

“Working with Dr. Garnet Chan at Caltech was a formative experience for me”, remarked Dr. Haghshenas. “While there, I contributed to the development of quantum simulation algorithms and advanced classical methods like tensor networks to help interpret and simulate many-body physics.”

Since joining Quantinuum, Dr. Haghshenas has led projects that bring tensor network methods into direct collaboration with experimental hardware teams — exploring quantum magnetism on real quantum devices and helping demonstrate early signs of quantum advantage. He also contributes to widely used simulation tools like QUIMB, helping the broader research community access these methods.

Dr. Haghshenas’ work sits in a broad and vibrant ecosystem exploring novel uses of tensor networks. Collaborations with researchers like Dr. Chan at Caltech, and NVIDIA have brought GPU-accelerated tools to bear on the forefront of applying tensor networks to quantum chemistry, quantum physics, and quantum computing.

A powerful simulation tool

Of particular interest to those of us in quantum computing, the best methods (that we know of) for simulating quantum computers with classical computers rely on tensor networks. Tensor networks provide a nice way of representing the entanglement in a quantum algorithm and how it spreads, which is crucial but generally quite difficult for classical algorithms. In fact, it’s partly tensor networks’ ability to represent entanglement that makes them so powerful for quantum simulation. Importantly, it is our in-house expertise with tensor networks that makes us confident we are indeed moving past classical capabilities.
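In miniature, simulating a circuit with tensors means contracting small gate tensors against a state tensor. The sketch below builds a two-qubit Bell state by contracting a Hadamard and a CNOT against |00> using plain Python lists; a real tensor-network simulator would instead keep the state factored (e.g. as a matrix product state) and use optimized contraction libraries, but the core operation is the same.

```python
from math import sqrt

# Hadamard gate as a 2x2 tensor
H = [[1 / sqrt(2),  1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]

def apply_1q(gate, state, qubit):
    """Contract a 2x2 gate tensor into a 2-qubit statevector
    (4 amplitudes, indexed by bits: qubit 0 is the high bit)."""
    new = [0.0] * 4
    for i in range(4):
        bits = [(i >> 1) & 1, i & 1]
        for b in range(2):            # sum over the contracted index
            j_bits = bits[:]
            j_bits[qubit] = b
            j = (j_bits[0] << 1) | j_bits[1]
            new[i] += gate[bits[qubit]][b] * state[j]
    return new

def apply_cnot(state):
    """CNOT with qubit 0 as control: swaps amplitudes |10> <-> |11>."""
    state = state[:]
    state[2], state[3] = state[3], state[2]
    return state

state = [1.0, 0.0, 0.0, 0.0]          # |00>
state = apply_1q(H, state, 0)         # Hadamard on qubit 0
state = apply_cnot(state)             # entangle the two qubits
print([round(a, 3) for a in state])   # [0.707, 0.0, 0.0, 0.707], a Bell state
```

The cost of contracting a tensor network grows with the entanglement it must represent, which is exactly why highly entangled quantum circuits are where classical simulation breaks down and quantum hardware pulls ahead.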

A theory of evolution

Tensor networks are not only crucial to cutting-edge simulation techniques. At Quantinuum, we're working on understanding and implementing quantum versions of classical tensor network algorithms, from quantum matrix product states to holographic simulation methods. In doing this, we are leveraging decades of classical algorithm development to advance quantum computing.

A topic of growing interest is the role of tensor networks in QEC, particularly in a process known as decoding. QEC works by encoding information into an entangled state of multiple qubits and using syndrome measurements to detect errors. These measurements must then be decoded to identify the specific error and determine the appropriate correction. This decoding step is challenging—it must be both fast (within the qubit’s coherence time) and accurate (correctly identifying and fixing errors). Tensor networks are emerging as one of the most effective tools for tackling this task.

Looking forward (and backwards, and sideways...)

Tensor networks are more than just a powerful computational tool — they are a bridge between classical and quantum thinking. As this new paper shows, the community’s understanding of tensor networks has matured into a robust foundation for advancing quantum computing, touching everything from simulation and machine learning to error correction and circuit design.

At Quantinuum, we see this as an evolutionary step, not just in theory, but in practice. By collaborating with top minds across academia and industry, we're charting a path forward that builds on decades of classical progress while embracing the full potential of quantum mechanics. This transition is not only conceptual but algorithmic, advancing how we formulate and implement methods that make efficient use of both classical and quantum computing. Tensor networks aren’t just helping us keep pace with classical computing; they’re helping us to transcend it.
