By Duncan Jones
In September, nearly 200 senior cybersecurity leaders from around the world convened at the 2022 Billington Cybersecurity Summit to discuss the state of U.S. cybersecurity. Topics were varied, ranging from the moral asymmetry of today's global threat actors to lessons learned from Ukraine, along with general discussion of all the things that "keep us up at night" concerning cyber threats.
As a speaker at the Summit, I wanted to take a moment to share my takeaways from an important discussion that took place during our breakout session, "Future of Encryption: Moving to a Quantum Resistant World." My esteemed fellow panelists from NSA, NIST, CMU and AWS exchanged insights on where U.S. government agencies stand in their preparation for current and future threats to encryption, the hurdles they are likely to face, and the resources that exist to assist in the transition. Those responsible for moving their agency to a quantum-resistant world should find the following insights worth considering.
With powerful quantum computers capable of breaking known encryption methods on the horizon, and with federal mandate NSM-10 now in place, the good news is that quantum-proof encryption is finally being discussed. The not-so-good news is that it isn't clear to cybersecurity practitioners what they need to do first. Understanding the threat is not nearly as difficult as understanding the timing, which seems to have left agency personnel at the starting gate of a planning process fraught with challenges, and with urgency.
Why is the timeline so difficult to establish? Because there is no way of knowing when a quantum-based attack will take place. The Quantum-safe Security Working Group of the Cloud Security Alliance (CSA) chose April 14, 2030 to represent "Y2Q," also known as "Q-Day": the moment secure IT infrastructure becomes vulnerable to the threat of a fault-tolerant quantum computer running Shor's algorithm. The Biden Administration based its implementation timeline on the day NIST announced the four winning algorithms for standardization. Then there is the "hack now, decrypt later" timeline, which suggests that quantum-related attacks may already be underway.
Regardless of the final timeline or potential drivers, one thing that was clear to the panel attendees was that they need to start the transition now.
I get this question often and was not disappointed when one attendee asked, “How can I convince my agency leadership that migrating to quantum-proof encryption is a priority when they are still trying to tackle basic cyber threats?”
The panelists agreed that the U.S. government's data storage requirements are unique, in that classification periods typically run 20 years. This means that systems in development today, which will typically be fielded over the next 10 years, will have a storage shelf life of 30 years at minimum. Those systems need to be "future-proofed" today, a framing that should prove effective when trying to convince agency leaders of the priority.
The need to future-proof is driven by a variety of scenarios, such as equipment and software upgrades. Upgrading or replacing equipment and software takes a long time in general, and perhaps even longer for government entities; updating all of the software that has cryptography in place will take an extremely long time.
The panelists also agreed that given the extensive supply chain supporting federal systems, vendors are a critical component to the overall success of an agency’s future-proofing for the quantum age. In 10-15 years, there will be some government partner/vendor somewhere who will not have transitioned to quantum-proof encryption. For leaders who have not yet prioritized their agency’s cryptography migration, let them ponder that thought — and start to focus on the need to prepare.
The panel shared several past technology migrations that, in their minds, resembled the coming migration to quantum-resistant cryptography.
Y2K resembled the looming quantum threat in both the urgency and the scale of the government's need to migrate systems. However, without a deadline assigned to the encryption migration, Y2K is really only similar in scale.
The panelists also recalled when every company had to replace the SHA-1 hash function, but concluded that the time, effort, and energy required to replace current encryption will be far greater, because the cryptography being replaced this time is far more ubiquitous.
While previous technology migrations help to establish lessons learned for the government's quantum-proof cryptography migration, the panel concluded that this go-round will bring a unique set of challenges, the likes of which organizations have never had to tackle before.
The consensus among panelists was that agencies need to first understand what data they have today and how vulnerable it is to attack. Data that is particularly sensitive, and vulnerable to "hack now, decrypt later" attacks, should be prioritized above less sensitive data. For some organizations, this is a challenging endeavor they have never embarked upon before. Now is an opportune time to build a data inventory and keep it up to date. From a planning and migration perspective, this is an agency's chance to do it once and do it well.
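As a concrete starting point, discovery can begin with something as simple as scanning certificate stores for quantum-vulnerable public keys. The sketch below is illustrative only: it assumes a directory of PEM-encoded certificates and uses the open-source Python cryptography library, whereas real discovery tools cover far more ground (protocols, libraries, hardware).

```python
# Minimal sketch of a cryptographic discovery pass: walk a directory of
# PEM-encoded certificates and flag quantum-vulnerable public-key algorithms.
# The directory path and output format are illustrative, not prescribed.
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

# RSA and elliptic-curve keys are the ones Shor's algorithm threatens.
QUANTUM_VULNERABLE = (rsa.RSAPublicKey, ec.EllipticCurvePublicKey)

def inventory(cert_dir: str) -> list[dict]:
    findings = []
    for path in Path(cert_dir).glob("**/*.pem"):
        cert = x509.load_pem_x509_certificate(path.read_bytes())
        key = cert.public_key()
        findings.append({
            "file": str(path),
            "subject": cert.subject.rfc4514_string(),
            "algorithm": type(key).__name__,
            "quantum_vulnerable": isinstance(key, QUANTUM_VULNERABLE),
            "not_valid_after": cert.not_valid_after.isoformat(),
        })
    return findings

if __name__ == "__main__":
    for row in inventory("./certs"):
        print(row)
```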
It is important to assume from the start that the vast majority of organizations will need to migrate multiple times. Panelists emphasized the need for “crypto agility” that will enable future replacement of algorithms to be made easily. Crypto agility is about how easy it is to transition from one algorithm (or choice of parameters) to another. Organizations that prioritize long-term thinking should already be looking at this.
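It is worth making concrete what crypto agility can look like in code. The sketch below shows one common pattern, offered as an illustration rather than a prescription: callers depend on an abstract signer interface, so replacing RSA with a post-quantum scheme later becomes a registry or configuration change rather than a rewrite. The class and registry names are ours.

```python
# Illustrative crypto-agility pattern: callers depend on an abstract Signer,
# so swapping algorithms (RSA today, a PQC scheme after migration) is a
# one-line registry change rather than a code rewrite.
from abc import ABC, abstractmethod
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

class RsaPssSigner(Signer):
    def __init__(self) -> None:
        self._key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

    def sign(self, message: bytes) -> bytes:
        return self._key.sign(
            message,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )

# Registry keyed by a config value; adding a PQC signer later means adding
# one entry here, with no changes at the call sites.
SIGNERS: dict[str, type[Signer]] = {"rsa-pss-3072": RsaPssSigner}

signer = SIGNERS["rsa-pss-3072"]()
signature = signer.sign(b"agency record")
```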
The panelists added that communicating with vendors early on in the planning process is vital. As one panelist explained, “A lot of our service providers, vendors, etc. will be flipping switches for us, but a lot won’t. Understanding what your priorities are for flipping the switch and communicating it to your vendors is important.”
Matt Scholl of NIST described the work the NCCoE (National Cybersecurity Center of Excellence) is doing to provide guidance and tips and to answer questions such as "What are discovery tools?" and "How do I budget?" The Migration to Post-Quantum Cryptography project, announced in July 2022, is working to develop white papers, playbooks, demonstrations, and tools that can help organizations implement their conversions to post-quantum cryptography. Other resources that offer good guidance, according to Scholl, include recent CISA guidance, DHS's roadmap, and the Canadian Centre for Cyber Security.
One additional resource that has been extremely helpful for our CISO customers is Quantinuum’s CISO’s Guide to Post-Quantum Standardization. The guide outlines what CISOs from any organization should be doing now and provides a basic transition roadmap to follow.
The discussion wrapped up with the acknowledgement that quantum has finally become part of the mainstream cybersecurity discussion and that the future benefit of quantum computing far outweighs the challenges of transitioning to new cryptography. As a parting thought, I emphasized the wonderful opportunity that agencies have to rethink how they do things and encouraged attendees to secure management commitment and funding for this much-needed modernization.
I want to give a special thanks to my fellow panelists for the engaging discussion: Margaret Salter, Director, Applied Cryptography, AWS; Dr. Mark Sherman, Director, Cybersecurity Foundations, CMU; Matthew Scholl, Chief of the Computer Security Division, ITL, NIST; and Dr. Adrian Stanger, Senior Cryptographic Authority, Cybersecurity Directorate, NSA.
Wherever you’re sitting right now, you’re probably surrounded by the fruits of modern semiconductor technology. Chips aren't only in your laptops and cell phones – they're in your car, your doorbell, your thermostat, and even your toaster. Importantly, semiconductor-based chips are also in the heart of most quantum computers.
While quantum computing holds transformative potential, it faces two major challenges: first, achieving low-error operations (say, one error in a billion), and second, scaling systems to enough qubits to address complex, real-world problems (say, on the order of a million). Quantinuum is proud to offer the lowest error rates in the industry, but some continue to question whether our chosen modality, trapped-ion technology, can scale to meet these ambitious goals.
Why the doubt? Well, early demonstrations of trapped-ion quantum computers relied on bulky, expensive laser sources, large glass optics, and sizeable ion traps assembled by hand. By comparison, other modalities, such as semiconductor and superconductor qubits, resemble conventional computer chips. However, our quantum-charge-coupled device (QCCD) architecture shares the same path to scaling: at their core, our quantum computers are also chip-based. By leveraging modern microfabrication techniques, we can scale effectively while maintaining the advantage of low error rates that trapped ions provide.
Fortunately, we are at a point in history where QCCD quantum computing is already more compact compared to the early days. Traditional oversized laser sources have already been replaced by tiny diode lasers based on semiconductor chips, and our ion traps have already evolved from bulky, hand-assembled objects to traps fabricated on silicon wafers. The biggest remaining challenge lies in the control and manipulation of laser light.
For this next stage in our journey, we have turned to Infineon. Infineon not only builds some of the world’s leading classical computer chips, but they also bring in-house expertise in ion-trap quantum computing. Together, we are developing a chip with integrated photonics, bringing the control and manipulation of light fully onto our chips. This innovation drastically reduces system complexity and paves the way for serious scaling.
Since beginning work with Infineon, our pace of innovation has accelerated. Their expertise in fabricating waveguides, building grating couplers, and optimizing deposition processes for ultra-low optical loss gives us a significant advantage. In fact, Infineon has already developed deposition processes with the lowest optical losses in the world—a critical capability for building high-performance photonic systems.
Their impressive suite of failure-analysis tools, such as electron microscopes, secondary-ion mass spectrometry (SIMS), focused ion beam (FIB) systems, atomic force microscopes (AFMs), and Kelvin probes, allows us to diagnose and correct failures in days rather than weeks. Some of these tools are in-line, meaning analysis can be performed without removing devices from the cleanroom environment, minimizing contamination risk and further accelerating development.
Together, we are demonstrating that QCCD quantum computing is fundamentally a semiconductor technology, just like conventional computing. While it may seem a world away, quantum computing is now closer to home than ever.
As organizations assess the impact of quantum computing on cryptography, many focus on algorithm migration and timelines. But preparing for PQC requires a broader view—one that includes not just new algorithms, but also the quality of the inputs that support them, including randomness.
That’s why Quantinuum joined with partners Thales, Keyfactor, and IBM Consulting to form the QSafe 360 Alliance, a collaboration focused on helping organizations build crypto-agile security architectures that are ready for the quantum era. Together, we’ve released a whitepaper—Digital Trust & Cybersecurity After Quantum Computing—to offer practical guidance on post-quantum readiness, from discovery and planning to deployment.
The history of cryptography offers clear examples of what happens when randomness fails, and how long those issues can go unnoticed. The Polynonce attack, first disclosed in 2023, exploited weak randomness in Bitcoin transaction signatures and enabled the theft of at least $25 million across 773 wallets. The vulnerability persisted undetected for nine years. The Randstorm disclosure, published in 2023, revealed that biased key generation in widely used Bitcoin wallet libraries exposed millions of wallets across a window of more than a decade (2011–2022). In both cases, the cryptographic algorithms functioned as designed; it was the randomness beneath them that silently failed, leaving companies vulnerable for many years.
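To make the failure mode concrete: Polynonce exploited ECDSA nonces that were mathematically related to one another, and the simplest member of that attack family, a fully repeated nonce, can be demonstrated in a few lines of arithmetic. The toy sketch below uses made-up numbers and skips the elliptic curve itself, since the leak lives entirely in the signing equation.

```python
# Toy demonstration of the classic failure mode behind attacks like Polynonce:
# if two ECDSA signatures reuse the same nonce k, the private key d falls out
# with two lines of modular arithmetic. We work only with the signing equation
# s = k^-1 * (z + r*d) mod n, which is where the leak lives. n is the real
# secp256k1 group order; d, k, z1, z2 and r are made-up stand-ins (in a real
# signature, r is the x-coordinate of k*G, but the recovery algebra only
# requires that r be the same in both signatures).
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

d = 0x1CEB00DA                  # "secret" signing key
k = 0x0DDBA11                   # nonce, wrongly reused for both signatures
z1, z2 = 0xAAA111, 0xBBB222     # hashes of two different messages
r = pow(7, 5, n)                # stand-in for the shared x-coordinate of k*G

s1 = pow(k, -1, n) * (z1 + r * d) % n
s2 = pow(k, -1, n) * (z2 + r * d) % n

# The attacker sees (r, s1, z1) and (r, s2, z2); the identical r betrays the
# nonce reuse, and the rest is algebra.
k_rec = (z1 - z2) * pow(s1 - s2, -1, n) % n
d_rec = (s1 * k_rec - z1) * pow(r, -1, n) % n
assert (k_rec, d_rec) == (k, d)
print(hex(d_rec))  # the private key, fully recovered
```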
Post-quantum cryptography (PQC) algorithms are being designed to resist attacks from quantum computers. But they still depend on random values to generate key material. That means any implementation of PQC inherits the same reliance on randomness—but without a way to prove its quality, that layer remains a potential vulnerability.
As security teams run cryptographic inventories, develop crypto-agility plans, or build software bills of materials (SBOMs) for PQC migration, it's important to include randomness in that scope. No matter how strong the algorithm, poor randomness can undermine its security from the start.
Quantum Origin takes a fundamentally different approach to randomness quality to deliver proven randomness which improves key generation, algorithms, and the entire security stack. It leverages strong seeded randomness extractors—mathematical algorithms that transform even weak local entropy into provably secure output. These extractors are uniquely powered by a Quantum Seed, which is generated once by Quantinuum's quantum computers using quantum processes verified through Bell tests.
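For readers who want to see the shape of such a construction, the sketch below implements Toeplitz hashing, a textbook seeded extractor from the same general family. It is an illustration of the concept only, not Quantum Origin's implementation, and the seed here comes from the operating system rather than a verified quantum source.

```python
# Sketch of a standard seeded randomness extractor (Toeplitz hashing): a seed
# defines a random Toeplitz matrix, and multiplying a weak input bit-vector by
# that matrix over GF(2) yields shorter, near-uniform output. Illustrative of
# the construction family only; not Quantum Origin's implementation.
import secrets

def toeplitz_extract(weak_bits: list[int], seed_bits: list[int], m: int) -> list[int]:
    """Extract m bits from n weak bits using an m x n Toeplitz matrix, whose
    m + n - 1 independent entries (first column plus first row) come from
    the seed."""
    n = len(weak_bits)
    assert len(seed_bits) == m + n - 1
    out = []
    for i in range(m):
        # Toeplitz structure: entry (i, j) depends only on i - j,
        # mapped into the seed at index i - j + (n - 1).
        acc = 0
        for j in range(n):
            acc ^= seed_bits[i - j + n - 1] & weak_bits[j]
        out.append(acc)
    return out

# Usage: compress 256 weak input bits down to 128 near-uniform output bits.
n_bits, m_bits = 256, 128
weak = [secrets.randbits(1) for _ in range(n_bits)]   # stand-in weak source
seed = [secrets.randbits(1) for _ in range(m_bits + n_bits - 1)]
print("".join(map(str, toeplitz_extract(weak, seed, m_bits))))
```

The design point worth noticing is that the extractor's guarantee is only as good as the seed's unpredictability, which is exactly where a quantum-generated, Bell-test-verified seed changes the picture: its quality is provable rather than assumed.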
This one-time quantum generation enables Quantum Origin as a software-only solution designed for maximum flexibility. It works with existing infrastructure—on cloud systems, on-premises environments, air-gapped networks, and embedded platforms—without requiring special hardware or a network connection. It's also validated to NIST SP 800-90B standards (Entropy Source Validation #E214). This approach strengthens today’s deployments of AES, RSA, ECC, and other algorithms, and lays a secure foundation for implementing the NIST PQC algorithms.
The QSafe 360 Alliance whitepaper outlines the path to post-quantum readiness, emphasizing crypto-agility as a guiding principle: the ability to adapt cryptographic systems without major disruption, from randomness to key generation to algorithmic strength.
For security architects, CISOs, and cryptographic engineering teams building their post-quantum transition strategies, randomness is not a peripheral concern. It is a starting point.
The QSafe 360 Alliance whitepaper offers valuable guidance on structuring a comprehensive PQC journey. As you explore that framework, consider how proven randomness—available today—will help strengthen your security posture from the ground up.
Our quantum algorithms team has been hard at work exploring solutions to continually optimize our system’s performance. Recently, they’ve invented a novel technique, called the Quantum Paldus Transform (QPT), that can offer significant resource savings in future applications.
The transform takes complex representations and makes them simple by transforming them into a different "basis." This is like looking at a cube from one angle, then rotating it and seeing just a square instead. Transformations like this save resources, because the more complex your problem looks, the more expensive it is to represent and manipulate on qubits.
While it might sound like magic, transforms are a commonly used tool in science and engineering. Transforms simplify problems by reshaping them into something that is easier to deal with, or that provides a new perspective on the situation. For example, sound engineers use Fourier transforms every day to look at complex musical pieces in terms of their frequency components. Electrical engineers use Laplace transforms; people who work in image processing use the Abel transform; physicists use the Legendre transform, and so on.
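A few lines of NumPy make the point concrete: a signal that looks like a complicated wiggle in the time domain collapses to two clean spikes once viewed in the frequency basis.

```python
# A change of basis at work: a two-tone signal looks like a complicated
# wiggle sampled in time, but under a Fourier transform it collapses to two
# clean spikes at 5 Hz and 12 Hz.
import numpy as np

fs = 256                                  # samples per second
t = np.arange(fs) / fs                    # one second of samples
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

spectrum = np.abs(np.fft.rfft(signal)) / (fs / 2)   # normalized amplitudes
freqs = np.fft.rfftfreq(fs, d=1 / fs)

peaks = freqs[spectrum > 0.1]
print(peaks)  # -> [ 5. 12.]: the whole signal, described by two numbers
```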
In a new paper outlining the necessary tools to implement the QPT, Dr. Nathan Fitzpatrick and Mr. Jędrzej Burkat explain how the QPT will be widely applicable in quantum computing simulations, spanning areas like molecular chemistry, materials science, and semiconductor physics. The paper also describes how the algorithm can lead to significant resource savings by offering quantum programmers a more efficient way of representing problems on qubits.
The efficiency of the QPT stems from its use of one of the most profound findings in the field of physics: that symmetries drive the properties of a system.
While the average person can “appreciate” symmetry, for example in design or aesthetics, physicists understand symmetry as a much more profound element present in the fabric of reality. Symmetries are like the universe’s DNA; they lead to conservation laws, which are the most immutable truths we know.
Back in the 1910s, when women were largely prohibited from practicing physics, one of the great mathematicians of the century, Emmy Noether, turned her attention to the field when she was tasked with helping Einstein with his work. In her attempt to solve a problem Einstein had encountered, Dr. Noether realized that all the most powerful and fundamental laws of physics, such as "energy can neither be created nor destroyed," are in fact consequences of a deep simplicity hiding behind the curtains of reality: symmetry. Dr. Noether's theorem would have a profound effect on the trajectory of physics.
Among the many consequences of Noether's theorem is a longstanding tradition amongst physicists of treating symmetry thoughtfully. Because of its role in the fabric of our universe, carefully considering the symmetries of a system often leads to invaluable insights.
Many of the systems we are interested in simulating with quantum computers are, at their heart, systems of electrons. Whether we are looking at how electrons move in a paired dance inside superconductors, or how they form orbitals and bonds in a chemical system, the motion of electrons is at the core.
Seven years after Noether published her blockbuster results, Wolfgang Pauli made waves with the work describing his exclusion principle, which relies heavily on symmetry to explain basic tenets of quantum theory. Pauli's principle has enormous consequences: for starters, it explains how the objects we interact with every day can be solid even though atoms are mostly empty space, and it sets out the rules of bonds, orbitals, and all of chemistry, among other things.
It is Pauli's symmetry, coupled with a deep respect for the impact of symmetry, that led our team at Quantinuum to the discovery published today.
In their work, they considered the act of designing quantum algorithms, and how one’s design choices may lead to efficiency or inefficiency.
When you design quantum algorithms, there are many choices you can make that affect the final result. Extensive work goes into optimizing each individual step in an algorithm: a cyclical process of finding subroutine improvements and then bringing it all together. The significant cost and time this requires is a limiting factor in optimizing many algorithms of interest.
This is again where symmetry comes into play. The authors realized that by better exploiting the deepest symmetries of the problem, they could make the entire edifice more efficient, from state preparation to readout. Over the course of a few years, a team led by Dr. Fitzpatrick and his colleague Jędrzej Burkat slowly polished their approach into a full algorithm for performing the QPT.
The QPT functions by using Pauli’s symmetry to discard unimportant details and strip the problem down to its bare essentials. Starting with a Paldus transform allows the algorithm designer to enjoy knock-on effects throughout the entire structure, making it overall more efficient to run.
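The QPT itself is laid out in the paper, but a back-of-envelope calculation illustrates why exploiting symmetry pays off in general. If the simulated system conserves a quantity such as electron number, the physics occupies a single sector of the qubit state space, a fraction of the whole, and the savings compound as further symmetries (such as spin) are stacked on top.

```python
# Back-of-envelope illustration of symmetry savings (not the QPT itself):
# a generic register of n qubits spans 2^n states, but if the simulated
# system conserves particle number, the physics lives in one sector of
# dimension C(n, k).
from math import comb

n = 20          # spin-orbitals mapped to qubits
k = 10          # conserved number of electrons

full = 2 ** n
sector = comb(n, k)
print(f"full space:      {full:,} states")    # 1,048,576
print(f"symmetry sector:   {sector:,} states")  # 184,756
print(f"fraction kept:   {sector / full:.1%}")  # ~17.6%
```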
“It’s amazing to think how something we discovered one hundred years ago is making quantum computing easier and more efficient,” said Dr. Nathan Fitzpatrick.
Ultimately, this innovation will lead to more efficient quantum simulation. Projects we believed to be many years away can now be realized in the near term.
The discovery of the Quantum Paldus Transform is a powerful reminder that enduring ideas—like symmetry—continue to shape the frontiers of science. By reaching back into the fundamental principles laid down by pioneers like Noether and Pauli, and combining them with modern quantum algorithm design, Dr. Fitzpatrick and Mr. Burkat have uncovered a tool with the potential to reshape how we approach quantum computation.
As quantum technologies continue their crossover from theoretical promise to practical implementation, innovations like this will be key in unlocking their full potential.