Quantinuum researchers have set a record for how reliably qubits can be placed into a quantum state and then measured, beating the previously stated best in class many times over.
The team led by Alex An, Tony Ransford, Andrew Schaffer, Lucas Sletten, John Gaebler, James Hostetter, and Grahame Vittorini achieved a state preparation and measurement, or SPAM, fidelity of 99.9904 percent — the highest of any quantum technology to date — using qubits formed from non-radioactive barium-137. The results have been submitted to arXiv.
This work has major implications for the quantum industry and trapped-ion technologies.
Improving SPAM fidelity helps reduce errors that accumulate in today’s “noisy” quantum machines, which is critical for moving to “fault-tolerant” systems that prevent errors from cascading through a system and corrupting circuits.
In addition, being able to form qubits from barium-137 and place them into a quantum state with high fidelity is advantageous for scaling trapped-ion hardware systems. Researchers can use lasers in the visible spectrum, a more mature and readily available technology, to initialize and manipulate qubits.
“This is a major step forward for the Quantinuum team and our high-performing trapped-ion quantum hardware,” said Tony Uttley, Quantinuum president and chief operating officer. “The advancement of the quantum computing industry as a whole is going to come from lots of individual technological achievements like this one, paving the way for future fault-tolerant systems.”
For most people, the word “spam” conjures images of unwanted emails flooding an inbox or of chopped pork in a can.
In quantum computing, SPAM stands for state preparation and measurement, two of the five conditions identified by theoretical physicist David DiVincenzo as necessary for the operation of a quantum computer. It refers to initializing qubits (placing them in a quantum state) and then measuring the output. SPAM is quantified by fidelity: the rate at which these tasks are completed successfully. The higher the fidelity, the better, because it means a quantum computer is performing these critical tasks with fewer errors.
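To make the fidelity figure concrete, here is a minimal sketch of how a SPAM fidelity is typically estimated: prepare a known state many times, measure it, and count how often the outcome matches. This is an illustration only, not Quantinuum's protocol; the `error_prob` value is an arbitrary stand-in for a hardware error rate.

```python
import random

def estimate_spam_fidelity(shots=100_000, error_prob=0.0001, seed=7):
    """Toy SPAM-fidelity estimate: 'prepare' |0>, 'measure', count matches.

    error_prob is a hypothetical combined preparation-plus-measurement
    error rate; a real experiment gets outcomes from hardware, not a
    random-number generator.
    """
    rng = random.Random(seed)
    correct = sum(1 for _ in range(shots) if rng.random() >= error_prob)
    return correct / shots

print(f"estimated SPAM fidelity: {estimate_spam_fidelity():.4%}")
```

At the record fidelity of 99.9904 percent, this corresponds to roughly one SPAM error in every ten thousand preparation-and-measurement cycles.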
Researchers at Quantinuum believe SPAM fidelity will need to reach 99.97 to 99.99 percent before the logical error rate can beat the leading-order physical error rate.
Neutral ytterbium atoms have long been a source of ions in trapped-ion quantum computers. Ionized by lasers, ytterbium atoms become the charged ions that serve as qubits. But using ytterbium presents challenges: expensive ultraviolet lasers are needed to manipulate ytterbium ions, and the results can be difficult to measure.
Barium ions, however, are easier to measure and can be manipulated with less expensive and more stable lasers in the green range. But until this work with non-radioactive barium-137, researchers had only been able to achieve low SPAM errors with barium-133 atoms, which are radioactive and require special handling.
“Nobody thought you could do quick, robust SPAM with non-radioactive barium-137,” said Dr. Anthony Ransford, a Quantinuum physicist and technical lead. “We were able to devise a scheme that enabled us to initialize the qubits and measure them better than any other qubits. We are the first to do it.”
Being able to initialize non-radioactive barium-137 ions is just the first step. The goal is to incorporate these ions into future Quantinuum hardware technologies.
“We believe using non-radioactive barium-137 ions as qubits is an attractive path to increasingly robust, scalable, quantum hardware,” Uttley said.
Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.
As organizations assess the impact of quantum computing on cryptography, many focus on algorithm migration and timelines. But preparing for PQC requires a broader view—one that includes not just new algorithms, but also the quality of the inputs that support them, including randomness.
That’s why Quantinuum joined with partners Thales, Keyfactor, and IBM Consulting to form the QSafe 360 Alliance, a collaboration focused on helping organizations build crypto-agile security architectures that are ready for the quantum era. Together, we’ve released a whitepaper—Digital Trust & Cybersecurity After Quantum Computing—to offer practical guidance on post-quantum readiness, from discovery and planning to deployment.
The history of cryptography offers clear examples of what happens when randomness fails, and how long those issues can go unnoticed. The Polynonce attack, first disclosed in 2023, exploited weak randomness in Bitcoin transaction signatures and enabled the theft of at least $25 million across 773 wallets. The vulnerability persisted undetected for nine years. The Randstorm disclosure revealed that biased key generation in widely used Bitcoin wallet libraries exposed millions of wallets across a window of more than a decade (2011–2022). In both cases, cryptographic algorithms functioned as designed; it was the randomness beneath them that silently failed, leaving companies vulnerable for many years.
Post-quantum cryptography (PQC) algorithms are being designed to resist attacks from quantum computers. But they still depend on random values to generate key material. That means any implementation of PQC inherits the same reliance on randomness—but without a way to prove its quality, that layer remains a potential vulnerability.
As security teams run cryptographic inventories, develop crypto-agility plans, or build software bill-of-materials (SBOMs) for PQC migration, it’s important to include randomness in that scope. No matter how strong the algorithm, poor randomness can undermine its security from the start.
Quantum Origin takes a fundamentally different approach to randomness quality, delivering proven randomness that improves key generation, algorithms, and the entire security stack. It leverages strong seeded randomness extractors—mathematical algorithms that transform even weak local entropy into provably secure output. These extractors are uniquely powered by a Quantum Seed, which is generated once by Quantinuum's quantum computers using quantum processes verified through Bell tests.
This one-time quantum generation enables Quantum Origin as a software-only solution designed for maximum flexibility. It works with existing infrastructure—on cloud systems, on-premises environments, air-gapped networks, and embedded platforms—without requiring special hardware or a network connection. It's also validated to NIST SP 800-90B standards (Entropy Source Validation #E214). This approach strengthens today’s deployments of AES, RSA, ECC, and other algorithms, and lays a secure foundation for implementing the NIST PQC algorithms.
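For readers curious what a seeded randomness extractor looks like in principle, here is a minimal sketch of Toeplitz hashing, a classic 2-universal construction of the kind the term "strong seeded extractor" refers to. This is a generic textbook illustration with toy-sized bit lengths, not Quantum Origin's implementation.

```python
def toeplitz_extract(input_bits, seed_bits, m):
    """Toeplitz-hash seeded extractor (illustrative sketch).

    A Toeplitz matrix over GF(2) is fixed by its first row and first
    column, supplied here by the seed: n + m - 1 seed bits turn n weak
    input bits into m output bits. By the leftover hash lemma, this
    2-universal family yields near-uniform output from any input source
    with sufficient min-entropy.
    """
    n = len(input_bits)
    assert len(seed_bits) == n + m - 1, "seed must hold n + m - 1 bits"
    out = []
    for i in range(m):
        # Output bit i is the GF(2) inner product of matrix row i
        # (a diagonal slice of the seed) with the input bits.
        acc = 0
        for j in range(n):
            acc ^= seed_bits[i - j + n - 1] & input_bits[j]
        out.append(acc)
    return out

# Toy sizes: 4 weak input bits plus 5 seed bits -> 2 extracted bits.
print(toeplitz_extract([1, 0, 1, 1], [1, 0, 1, 1, 0], 2))
```

In production, the input would be a large block of locally harvested entropy and the seed would be the one-time Quantum Seed; the mathematics of the extractor is what carries the quality guarantee forward.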
The QSafe 360 Alliance whitepaper outlines the path to post-quantum readiness, emphasizing crypto-agility as a guiding principle: the ability to adapt cryptographic systems without major disruption, from randomness to key generation to algorithmic strength.
For security architects, CISOs, and cryptographic engineering teams building their post-quantum transition strategies, randomness is not a peripheral concern. It is a starting point.
The QSafe 360 Alliance whitepaper offers valuable guidance on structuring a comprehensive PQC journey. As you explore that framework, consider how proven randomness—available today—will help strengthen your security posture from the ground up.
Our quantum algorithms team has been hard at work exploring solutions to continually optimize our system’s performance. Recently, they’ve invented a novel technique, called the Quantum Paldus Transform (QPT), that can offer significant resource savings in future applications.
The transform takes complex representations and makes them simple by re-expressing them in a different “basis”. This is like looking at a cube from one angle, then rotating it and seeing just a square instead. Transformations like this save resources because the more complex your problem looks, the more expensive it is to represent and manipulate on qubits.
While it might sound like magic, transforms are a commonly used tool in science and engineering. Transforms simplify problems by reshaping them into something that is easier to deal with, or that provides a new perspective on the situation. For example, sound engineers use Fourier transforms every day to look at complex musical pieces in terms of their frequency components. Electrical engineers use Laplace transforms; people who work in image processing use the Abel transform; physicists use the Legendre transform, and so on.
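The Fourier case is easy to see in a few lines of Python. This is a generic NumPy illustration, unrelated to the QPT itself: a waveform that looks complicated in the time basis becomes two clean peaks in the frequency basis.

```python
import numpy as np

# One second of a 440 Hz tone plus a quieter 880 Hz overtone, at 8 kHz.
fs = 8000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

# The Fourier transform re-expresses the waveform in the frequency
# basis, where the two tones stand out as isolated peaks.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
peaks = sorted(freqs[np.argsort(spectrum)[-2:]])
print(peaks)  # the two dominant frequencies, in Hz
```

The waveform itself is thousands of oscillating samples; in the frequency basis the same information is just two numbers and their amplitudes.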
In a new paper outlining the necessary tools to implement the QPT, Dr. Nathan Fitzpatrick and Mr. Jędrzej Burkat explain how the QPT will be widely applicable in quantum computing simulations, spanning areas like molecular chemistry, materials science, and semiconductor physics. The paper also describes how the algorithm can lead to significant resource savings by offering quantum programmers a more efficient way of representing problems on qubits.
The efficiency of the QPT stems from its use of one of the most profound findings in the field of physics: that symmetries drive the properties of a system.
While the average person can “appreciate” symmetry, for example in design or aesthetics, physicists understand symmetry as a much more profound element present in the fabric of reality. Symmetries are like the universe’s DNA; they lead to conservation laws, which are the most immutable truths we know.
Back in the 1920s, when women were largely prohibited from practicing physics, one of the great mathematicians of the century, Emmy Noether, turned her attention to the field when she was tasked with helping Einstein with his work. In her attempt to solve a problem Einstein had encountered, Dr. Noether realized that all the most powerful and fundamental laws of physics, such as “energy can neither be created nor destroyed”, are in fact the consequence of a deep simplicity – symmetry – hiding behind the curtains of reality. Dr. Noether’s theorem would have a profound effect on the trajectory of physics.
In addition to the many direct consequences of Noether’s theorem, there is a longstanding tradition amongst physicists of treating symmetry thoughtfully. Because of its role in the fabric of our universe, carefully considering the symmetries of a system often leads to invaluable insights.
Many of the systems we are interested in simulating with quantum computers are, at their heart, systems of electrons. Whether we are looking at how electrons move in a paired dance inside superconductors, or how they form orbitals and bonds in a chemical system, the motion of electrons is at the core.
Seven years after Noether published her blockbuster results, Wolfgang Pauli made waves when he published the work describing his Pauli exclusion principle, which relies heavily on symmetry to explain basic tenets of quantum theory. Pauli’s principle has enormous consequences: for starters, it explains why the objects we interact with every day are solid even though atoms are mostly empty space, and it underpins the rules of bonds, orbitals, and all of chemistry, among other things.
It is Pauli's symmetry, coupled with a deep respect for the impact of symmetry, that led our team at Quantinuum to the discovery published today.
In their work, they considered the act of designing quantum algorithms, and how one’s design choices may lead to efficiency or inefficiency.
When you design quantum algorithms, there are many choices that affect the final result. Extensive work goes into optimizing each individual step, requiring a cyclical process of improving subroutines and, finally, bringing it all together. The significant cost and time required is a limiting factor in optimizing many algorithms of interest.
This is again where symmetry comes into play. The authors realized that by better exploiting the deepest symmetries of the problem, they could make the entire edifice more efficient, from state preparation to readout. Over the course of a few years, a team led by Dr. Fitzpatrick and his colleague Jędrzej Burkat slowly polished their approach into a full algorithm for performing the QPT.
The QPT functions by using Pauli’s symmetry to discard unimportant details and strip the problem down to its bare essentials. Starting with a Paldus transform allows the algorithm designer to enjoy knock-on effects throughout the entire structure, making it overall more efficient to run.
“It’s amazing to think how something we discovered one hundred years ago is making quantum computing easier and more efficient,” said Dr. Nathan Fitzpatrick.
Ultimately, this innovation will lead to more efficient quantum simulation. Projects we believed were still many years away can now be realized in the near term.
The discovery of the Quantum Paldus Transform is a powerful reminder that enduring ideas—like symmetry—continue to shape the frontiers of science. By reaching back into the fundamental principles laid down by pioneers like Noether and Pauli, and combining them with modern quantum algorithm design, Dr. Fitzpatrick and Mr. Burkat have uncovered a tool with the potential to reshape how we approach quantum computation.
As quantum technologies continue their crossover from theoretical promise to practical implementation, innovations like this will be key in unlocking their full potential.
In a new paper in Nature Physics, we've made a major breakthrough in one of quantum computing’s most elusive promises: simulating the physics of superconductors. A deeper understanding of superconductivity would have an enormous impact: greater insight could pave the way to real-world advances, like phone batteries that last for months, “lossless” power grids that drastically reduce your bills, or MRI machines that are widely available and cheap to use. The development of room-temperature superconductors would transform the global economy.
A key promise of quantum computing is that it has a natural advantage when studying inherently quantum systems, like superconductors. In many ways, it is precisely the deeply ‘quantum’ nature of superconductivity that makes it both so transformative and so notoriously difficult to study.
Now, we are pleased to report that we just got a lot closer to that ultimate dream.
To study something like a superconductor with a quantum computer, you need to first “encode” the elements of the system you want to study onto the qubits – in other words, you want to translate the essential features of your material onto the states and gates you will run on the computer.
For superconductors in particular, you want to encode the behavior of particles known as “fermions” (like the familiar electron). Naively simulating fermions with qubits produces garbage data, because qubits alone lack the antisymmetric exchange statistics that make fermions unique.
Until recently, scientists used something called the “Jordan-Wigner” encoding to properly map fermions onto qubits. People have argued that the Jordan-Wigner encoding is one of the main reasons fermionic simulations have not progressed beyond simple one-dimensional chain geometries: it requires too many gates as the system size grows.
Even worse, the Jordan-Wigner encoding has the nasty property that it is, in a sense, maximally non-fault-tolerant: one error occurring anywhere in the system affects the whole state, which generally leads to an exponential overhead in the number of shots required. Due to this, until now, simulating relevant systems at scale – one of the big promises of quantum computing – has remained a daunting challenge.
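The gate-count problem is easy to see by writing the encoding out. Under Jordan-Wigner, the annihilation operator for fermionic mode j becomes a Pauli string carrying a Z operator on every earlier qubit. The sketch below prints that string; it shows the standard textbook construction, not the alternative encoding developed in the paper.

```python
def jordan_wigner_annihilation(j, n_modes):
    """Pauli-string form of the fermionic annihilation operator a_j under
    the Jordan-Wigner encoding: a_j -> Z_0 ... Z_{j-1} (X_j + iY_j)/2.

    Returned as the two Pauli strings (X part, Y part). The chain of Z's
    enforces fermionic anticommutation, and its length grows with j,
    which is why both circuit cost and error sensitivity blow up as the
    system gets larger.
    """
    z_chain = ['Z'] * j
    tail = ['I'] * (n_modes - j - 1)
    x_term = ''.join(z_chain + ['X'] + tail)
    y_term = ''.join(z_chain + ['Y'] + tail)
    return x_term, y_term

# The operator on mode 5 of a 6-mode system touches all six qubits:
print(jordan_wigner_annihilation(5, 6))  # ('ZZZZZX', 'ZZZZZY')
```

A single hopping term between modes 0 and j therefore acts on j + 1 qubits at once, which is exactly the growth that confines Jordan-Wigner simulations to small, chain-like geometries.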
Theorists have addressed the issues of the Jordan-Wigner encoding and have suggested alternative fermionic encodings. In practice, however, the circuits created from these alternative encodings come with large overheads and have so far not been practically useful.
We are happy to report that our team developed a new way to compile one of these alternative encodings that dramatically improves both efficiency and accuracy, overcoming the limitations of older approaches. The new compilation scheme is the most efficient yet, slashing the cost of simulating fermionic hopping by an impressive 42%. On top of that, the team also introduced new, targeted error mitigation techniques that ensure even larger systems can be simulated with far fewer computational "shots"—a critical advantage in quantum computing.
Using their innovative methods, the team was able to simulate the Fermi-Hubbard model—a cornerstone of condensed matter physics—at a previously unattainable scale. By encoding 36 fermionic modes into 48 physical qubits on System Model H2, they achieved the largest quantum simulation of this model to date.
This marks an important milestone in quantum computing: it demonstrates that large-scale simulations of complex quantum systems, like superconductors, are now within reach.
This breakthrough doesn’t just show how we can push the boundaries of what quantum computers can do; it brings one of the most exciting use cases of quantum computing much closer to reality. With this new approach, scientists can soon begin to simulate materials and systems that were once thought too complex for the most powerful classical computers alone. And in doing so, they’ve unlocked a path to potentially solving one of the most exciting and valuable problems in science and technology: understanding and harnessing the power of superconductivity.
The future of quantum computing—and with it, the future of energy, electronics, and beyond—just got a lot more exciting.