In a meaningful advance in an area of broad industrial and real-world relevance, Quantinuum researchers have demonstrated a quantum algorithm capable of solving complex combinatorial optimization problems while making the most of available quantum resources.
Results on the new H2 quantum computer demonstrated a remarkable ability to solve combinatorial optimization problems with as few quantum resources as those employed by just one layer of the quantum approximate optimization algorithm (QAOA), the traditional workhorse of quantum heuristic algorithms.
Optimization problems are common in industry in contexts such as route planning, scheduling, cost optimization and logistics. However, as the number of variables increases and optimization problems grow larger and more complex, finding satisfactory solutions using classical algorithms becomes increasingly difficult.
Recent research suggests that certain quantum algorithms might be capable of solving combinatorial optimization problems better than classical algorithms. The realization of such quantum algorithms can therefore potentially increase the efficiency of industrial processes.
However, the effectiveness of these algorithms on near-term quantum devices and even on future generations of more capable quantum computers presents a technical challenge: quantum resources will need to be reduced as much as possible in order to protect the quantum algorithm from the unavoidable effects of quantum noise.
Sebastian Leontica and Dr. David Amaro, a senior research scientist at Quantinuum, explain their advances in a new paper, “Exploring the neighborhood of 1-layer QAOA with Instantaneous Quantum Polynomial circuits,” published on arXiv. This is one of several papers released at the launch of Quantinuum’s H2 that highlight the unparalleled power of the newest generation of the H-Series, Powered by Honeywell.
“We should strive to use as few quantum resources as possible no matter how good a quantum computer we are operating on, which means using the smallest possible number of qubits that fit within the problem size and a circuit that is as shallow as possible,” Dr. Amaro said. “Our algorithm uses the fewest possible resources and still achieves good performance.”
The researchers use a parameterized instantaneous quantum polynomial (IQP) circuit of the same depth as 1-layer QAOA to incorporate corrections that would otherwise require multiple layers. Another differentiating feature of the algorithm is that the parameters in the IQP circuit can be efficiently trained on a classical computer, avoiding some of the training issues that affect other algorithms such as QAOA. Critically, the circuit takes full advantage of features available on Quantinuum’s devices, including parameterized two-qubit gates, all-to-all connectivity, and high-fidelity operations.
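To make the structure concrete, here is a rough sketch of a fully connected IQP-style ansatz in pytket (Quantinuum's Python SDK): a layer of Hadamards, a diagonal layer of single- and two-qubit phase gates, and a final layer of Hadamards before measurement. This is only an illustration of the circuit family; the function name and placeholder angles are ours, not the parameter values or warm-starting procedure from the paper.

```python
# Illustrative IQP-style ansatz with the same depth profile as 1-layer QAOA.
# Placeholder angles only; pytket angles are given in half-turns.
from pytket import Circuit

def iqp_ansatz(n_qubits, single_angles, pair_angles):
    """H layer -> diagonal phase layer -> H layer -> measurement."""
    circ = Circuit(n_qubits)
    for q in range(n_qubits):
        circ.H(q)                      # prepare |+> on every qubit
    for q, theta in single_angles.items():
        circ.Rz(theta, q)              # single-qubit diagonal phases
    for (qi, qj), theta in pair_angles.items():
        circ.ZZPhase(theta, qi, qj)    # parameterized two-qubit diagonal phases
    for q in range(n_qubits):
        circ.H(q)                      # rotate back before measuring
    circ.measure_all()
    return circ

# Example: 4 qubits with all-to-all pair couplings, up to n(n-1)/2 two-qubit gates.
n = 4
singles = {q: 0.1 for q in range(n)}
pairs = {(i, j): 0.05 for i in range(n) for j in range(i + 1, n)}
circuit = iqp_ansatz(n, singles, pairs)
```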
“Our numerical simulations and experiments on the new H2 quantum computer at small scale indicate that this heuristic algorithm, compared to 1-layer QAOA, is expected to amplify the probability of sampling good or even optimal solutions of large optimization problems,” Dr. Amaro said. “We now want to understand how the solution quality and runtime of our algorithm compares to the best classical algorithms.”
This algorithm will be useful for current quantum computers as well as larger machines farther along the Quantinuum hardware roadmap.
The goal of this project was to provide a quantum heuristic algorithm for combinatorial optimization that returns better solutions than state-of-the-art quantum heuristics while using fewer quantum resources. The researchers used a fully connected parameterized IQP circuit, warm-started from 1-layer QAOA. For a problem with n binary variables, the circuit contained up to n(n-1)/2 two-qubit gates, and the researchers employed only 2^(0.32n) shots.
The algorithm showed improved performance on the Sherrington-Kirkpatrick (SK) optimization problem compared to 1-layer QAOA. When looking for the optimal solution, numerical simulations showed an average time to solution scaling as 2^(0.31n), a speed-up over the 2^(0.5n) scaling of 1-layer QAOA.
Experimental results on our new H2 quantum computer and emulator confirmed that the new optimization algorithm outperforms 1-layer QAOA and reliably solves complex optimization problems. The optimal solution was found for 136 out of 312 instances, four of which were for the maximum size of 32 qubits. A 30-qubit instance was solved optimally on the H2 device, which means, remarkably, that at least one of the 776 shots measured after performing 432 two-qubit gates corresponds to the unique optimal solution in the huge set of 2^30 > 10^9 candidate solutions.
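Those figures are consistent with the scalings quoted above; a quick, purely illustrative sanity check:

```python
# Back-of-the-envelope check of the 30-qubit figures quoted above.
shots = 2 ** (0.32 * 30)           # shot budget scaling 2^(0.32n) at n = 30
solution_space = 2 ** 30           # candidate bit strings for 30 binary variables

print(round(shots))                # ~776 shots
print(f"{solution_space:,}")       # 1,073,741,824 candidate solutions (> 10^9)
```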
These results indicate that the algorithm, in combination with H2 hardware, is capable of solving hard optimization problems using minimal quantum resources in the presence of real hardware noise.
Quantinuum researchers expect that these promising results at small scale will encourage the further study of new quantum heuristic algorithms at the relevant scale for real-world optimization problems, which requires a better understanding of their performance under realistic conditions.
Figure: Numerical simulations of 256 SK random instances for each problem size from 4 to 29 qubits. Graph A shows the probability of sampling the optimal solution in the IQP circuit, for which the average is 2^(-0.31n). Graph B shows the enhancement factor compared to 1-layer QAOA, for which the average is 2^(0.23n). These results indicate that Quantinuum’s algorithm has significantly better runtime than 1-layer QAOA.
Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.
Wherever you’re sitting right now, you’re probably surrounded by the fruits of modern semiconductor technology. Chips aren't only in your laptops and cell phones – they're in your car, your doorbell, your thermostat, and even your toaster. Importantly, semiconductor-based chips are also in the heart of most quantum computers.
While quantum computing holds transformative potential, it faces two major challenges: first, achieving low error operations (say one in a billion), and second, scaling systems to enough qubits to address complex, real-world problems (say, on the order of a million). Quantinuum is proud to lead the industry in providing the lowest error rates in the business, but some continue to question whether our chosen modality, trapped-ion technology, can scale to meet these ambitious goals.
Why the doubt? Well, early demonstrations of trapped-ion quantum computers relied on bulky, expensive laser sources, large glass optics, and sizeable ion traps assembled by hand. By comparison, other modalities, such as semiconductor and superconductor qubits, resemble conventional computer chips. However, our quantum-charge-coupled device (QCCD) architecture shares the same path to scaling: at their core, our quantum computers are also chip-based. By leveraging modern microfabrication techniques, we can scale effectively while maintaining the advantage of low error rates that trapped ions provide.
Fortunately, we are at a point in history where QCCD quantum computing is already more compact compared to the early days. Traditional oversized laser sources have already been replaced by tiny diode lasers based on semiconductor chips, and our ion traps have already evolved from bulky, hand-assembled objects to traps fabricated on silicon wafers. The biggest remaining challenge lies in the control and manipulation of laser light.
For this next stage in our journey, we have turned to Infineon. Infineon not only builds some of the world’s leading classical computer chips, but they also bring in-house expertise in ion-trap quantum computing. Together, we are developing a chip with integrated photonics, bringing the control and manipulation of light fully onto our chips. This innovation drastically reduces system complexity and paves the way for serious scaling.
Since beginning work with Infineon, our pace of innovation has accelerated. Their expertise in fabricating waveguides, building grating couplers, and optimizing deposition processes for ultra-low optical loss gives us a significant advantage. In fact, Infineon has already developed deposition processes with the lowest optical losses in the world—a critical capability for building high-performance photonic systems.
Their impressive suite of failure-analysis tools, such as electron microscopes, SIMS, FIB, AFMs, and Kelvin probes, allows us to diagnose and correct failures in days rather than weeks. Some of these tools are in-line, meaning analysis can be performed without removing devices from the cleanroom environment, minimizing contamination risk and further accelerating development.
Together, we are demonstrating that QCCD quantum computing is fundamentally a semiconductor technology, just like conventional computing. While it may seem a world away, quantum computing is now closer to home than ever.
As organizations assess the impact of quantum computing on cryptography, many focus on algorithm migration and timelines. But preparing for post-quantum cryptography (PQC) requires a broader view—one that includes not just new algorithms, but also the quality of the inputs that support them, including randomness.
That’s why Quantinuum joined with partners Thales, Keyfactor, and IBM Consulting to form the QSafe 360 Alliance, a collaboration focused on helping organizations build crypto-agile security architectures that are ready for the quantum era. Together, we’ve released a whitepaper—Digital Trust & Cybersecurity After Quantum Computing—to offer practical guidance on post-quantum readiness, from discovery and planning to deployment.
The history of cryptography offers clear examples of what happens when randomness fails, and how long those failures can go unnoticed. The Polynonce attack, first disclosed in 2023, exploited weak randomness in Bitcoin transaction signatures and enabled the theft of at least $25 million across 773 wallets. The vulnerability persisted undetected for nine years. The Randstorm disclosure, published in 2022, revealed that biased key generation in widely used Bitcoin wallet libraries exposed millions of wallets—across a window of more than a decade (2011–2022). In both cases, the cryptographic algorithms functioned as designed; it was the randomness beneath them that silently failed, leaving organizations vulnerable for many years.
PQC algorithms are being designed to resist attacks from quantum computers, but they still depend on random values to generate key material. That means any implementation of PQC inherits the same reliance on randomness—and without a way to prove its quality, that layer remains a potential vulnerability.
As security teams run cryptographic inventories, develop crypto-agility plans, or build software bills of materials (SBOMs) for PQC migration, it’s important to include randomness in that scope. No matter how strong the algorithm, poor randomness can undermine its security from the start.
Quantum Origin takes a fundamentally different approach to randomness quality to deliver proven randomness which improves key generation, algorithms, and the entire security stack. It leverages strong seeded randomness extractors—mathematical algorithms that transform even weak local entropy into provably secure output. These extractors are uniquely powered by a Quantum Seed, which is generated once by Quantinuum's quantum computers using quantum processes verified through Bell tests.
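The specific construction behind Quantum Origin is not detailed here, but the general shape of a seeded randomness extractor is straightforward to illustrate. The sketch below uses Toeplitz hashing, a standard two-universal hash construction, to compress weak input bits against an independent seed; it is a toy example of the concept, not Quantum Origin's production algorithm.

```python
# Toy seeded randomness extractor based on Toeplitz hashing (two-universal hashing).
# Weak input bits + an independent, high-quality seed -> shorter, near-uniform output.
import numpy as np

def toeplitz_extract(weak_bits, seed_bits, out_len):
    """Multiply the weak input by a seed-defined Toeplitz matrix over GF(2)."""
    n = len(weak_bits)
    assert len(seed_bits) == n + out_len - 1, "seed must define every Toeplitz diagonal"
    # Row i of the Toeplitz matrix is seed_bits[i : i + n], read in reverse order.
    rows = np.stack([seed_bits[i:i + n][::-1] for i in range(out_len)])
    return rows.dot(weak_bits) % 2     # matrix-vector product mod 2

rng = np.random.default_rng(0)
weak = rng.integers(0, 2, size=256)             # stand-in for weak local entropy
seed = rng.integers(0, 2, size=256 + 128 - 1)   # stand-in for the high-quality seed
extracted = toeplitz_extract(weak, seed, 128)   # 128 extracted bits
```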
This one-time quantum generation enables Quantum Origin as a software-only solution designed for maximum flexibility. It works with existing infrastructure—on cloud systems, on-premises environments, air-gapped networks, and embedded platforms—without requiring special hardware or a network connection. It's also validated to NIST SP 800-90B standards (Entropy Source Validation #E214). This approach strengthens today’s deployments of AES, RSA, ECC, and other algorithms, and lays a secure foundation for implementing the NIST PQC algorithms.
The QSafe 360 Alliance whitepaper outlines the path to post-quantum readiness, emphasizing crypto-agility as a guiding principle: the ability to adapt cryptographic systems without major disruption, from randomness to key generation to algorithmic strength.
For security architects, CISOs, and cryptographic engineering teams building their post-quantum transition strategies, randomness is not a peripheral concern. It is a starting point.
The QSafe 360 Alliance whitepaper offers valuable guidance on structuring a comprehensive PQC journey. As you explore that framework, consider how proven randomness—available today—will help strengthen your security posture from the ground up.
Our quantum algorithms team has been hard at work exploring solutions to continually optimize our system’s performance. Recently, they’ve invented a novel technique, called the Quantum Paldus Transform (QPT), that can offer significant resource savings in future applications.
The transform takes complex representations and makes them simple by transforming them into a different “basis”. This is like looking at a cube from one angle, then rotating it and seeing just a square instead. Transformations like this save resources because the more complex your problem looks, the more expensive it is to represent and manipulate on qubits.
While it might sound like magic, transforms are a commonly used tool in science and engineering. Transforms simplify problems by reshaping them into something that is easier to deal with, or that provides a new perspective on the situation. For example, sound engineers use Fourier transforms every day to look at complex musical pieces in terms of their frequency components. Electrical engineers use Laplace transforms; people who work in image processing use the Abel transform; physicists use the Legendre transform, and so on.
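As a tiny illustration of the Fourier example, the snippet below (with arbitrary example frequencies) shows a signal that looks complicated in the time domain collapsing to two clean peaks once viewed in the frequency basis:

```python
# A messy-looking time-domain signal is just two spikes in the frequency basis.
import numpy as np

t = np.linspace(0.0, 1.0, 1000, endpoint=False)                           # 1 s at 1 kHz
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)   # 50 Hz + 120 Hz

spectrum = np.fft.rfft(signal)                    # change of basis: time -> frequency
freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])

# The two dominant components are immediately visible in the new basis.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))   # [50.0, 120.0]
```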
In a new paper outlining the necessary tools to implement the QPT, Dr. Nathan Fitzpatrick and Mr. Jędrzej Burkat explain how the QPT will be widely applicable in quantum computing simulations, spanning areas like molecular chemistry, materials science, and semiconductor physics. The paper also describes how the algorithm can lead to significant resource savings by offering quantum programmers a more efficient way of representing problems on qubits.
The efficiency of the QPT stems from its use of one of the most profound findings in the field of physics: that symmetries drive the properties of a system.
While the average person can “appreciate” symmetry, for example in design or aesthetics, physicists understand symmetry as a much more profound element present in the fabric of reality. Symmetries are like the universe’s DNA; they lead to conservation laws, which are the most immutable truths we know.
Back in the 1920s, when women were largely prohibited from practicing physics, one of the great mathematicians of the century, Emmy Noether, turned her attention to the field when she was tasked with helping Einstein with his work. In her attempt to solve a problem Einstein had encountered, Dr. Noether realized that all the most powerful and fundamental laws of physics, such as “energy can neither be created nor destroyed,” are in fact the consequence of a deep simplicity – symmetry – hiding behind the curtains of reality. Dr. Noether’s theorem would have a profound effect on the trajectory of physics.
Beyond its many direct consequences, Noether’s theorem inspired a longstanding tradition among physicists of treating symmetry thoughtfully. Because of its role in the fabric of our universe, carefully considering the symmetries of a system often leads to invaluable insights.
Many of the systems we are interested in simulating with quantum computers are, at their heart, systems of electrons. Whether we are looking at how electrons move in a paired dance inside superconductors, or how they form orbitals and bonds in a chemical system, the motion of electrons is at the core.
Seven years after Noether published her blockbuster results, Wolfgang Pauli made waves when he published the work describing his exclusion principle, which relies heavily on symmetry to explain basic tenets of quantum theory. Pauli’s principle has enormous consequences: among other things, it explains why the objects we interact with every day are solid even though atoms are mostly empty space, and it underpins the rules of bonds, orbitals, and all of chemistry.
It is Pauli's symmetry, coupled with a deep respect for the impact of symmetry, that led our team at Quantinuum to the discovery published today.
In their work, they considered the act of designing quantum algorithms, and how one’s design choices may lead to efficiency or inefficiency.
When you design quantum algorithms, there are many choices you can make that affect the final result. Extensive work goes into optimizing each individual step in an algorithm: a cyclical process of identifying improvements to subroutines, testing them, and finally bringing everything back together. The significant cost and time this requires is a limiting factor in optimizing many algorithms of interest.
This is again where symmetry comes into play. The authors realized that by better exploiting the deepest symmetries of the problem, they could make the entire edifice more efficient, from state preparation to readout. Over the course of a few years, a team led by Dr. Fitzpatrick, together with his colleague Jędrzej Burkat, slowly polished their approach into a full algorithm for performing the QPT.
The QPT functions by using Pauli’s symmetry to discard unimportant details and strip the problem down to its bare essentials. Starting with a Paldus transform allows the algorithm designer to enjoy knock-on effects throughout the entire structure, making it overall more efficient to run.
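As a loose illustration (this is not the Paldus transform itself), the following hypothetical snippet shows the general payoff of respecting a symmetry: fixing a conserved quantity such as particle number means a simulation only has to represent a symmetry sector, which is far smaller than the full qubit Hilbert space.

```python
# Toy example: a conserved particle number shrinks the state space a simulation needs.
from math import comb

n_orbitals = 20        # hypothetical problem size
n_electrons = 10       # conserved particle number

full_dim = 2 ** n_orbitals                  # full qubit Hilbert space: 1,048,576 states
sector_dim = comb(n_orbitals, n_electrons)  # fixed-particle-number sector: 184,756 states

print(f"reduction factor: {full_dim / sector_dim:.1f}x")   # ~5.7x for this toy case
```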
“It’s amazing to think how something we discovered one hundred years ago is making quantum computing easier and more efficient,” said Dr. Nathan Fitzpatrick.
Ultimately, this innovation will lead to more efficient quantum simulation. Projects we believed to still be many years out can now be realized in the near term.
The discovery of the Quantum Paldus Transform is a powerful reminder that enduring ideas—like symmetry—continue to shape the frontiers of science. By reaching back into the fundamental principles laid down by pioneers like Noether and Pauli, and combining them with modern quantum algorithm design, Dr. Fitzpatrick and Mr. Burkat have uncovered a tool with the potential to reshape how we approach quantum computation.
As quantum technologies continue their crossover from theoretical promise to practical implementation, innovations like this will be key in unlocking their full potential.