By Duncan Jones
In September, nearly 200 senior cybersecurity leaders from around the world convened to discuss the state of U.S. cybersecurity at the 2022 Billington Cybersecurity Summit. Topics ranged from the moral asymmetry of today’s global threat actors to lessons learned from Ukraine, along with broader discussion of everything that “keeps us up at night” where cyber threats are concerned.
As a speaker at the Summit, I wanted to take a moment to share my takeaways from an important discussion that took place during our breakout session, “Future of Encryption: Moving to a Quantum Resistant World.” My esteemed fellow panelists from NSA, NIST, CMU, and AWS exchanged insights on where U.S. government agencies stand in preparing for current and future threats to encryption, the hurdles they are likely to face, and the resources available to assist in the transition. Those responsible for moving their agency to a quantum-resistant world should find the following insights worth considering.
With powerful quantum computers capable of breaking known encryption methods on the horizon, and with federal mandate NSM-10 now in place, the good news is that quantum-proof encryption is finally being discussed. The not-so-good news is that it isn’t clear to cybersecurity practitioners what they need to do first. Understanding the threat is not nearly as difficult as understanding the timing, which has left agency personnel at the starting gate of a planning process fraught with challenges and urgency.
Why is the timeline so difficult to establish? Because there is no way of knowing when a quantum-based attack will become possible. The Quantum-safe Security Working Group of the Cloud Security Alliance (CSA) chose April 14, 2030 to represent “Y2Q,” also known as “Q-Day”: the moment secure IT infrastructure becomes vulnerable to a fault-tolerant quantum computer running Shor’s algorithm. The Biden Administration based its implementation timeline on the day NIST announced the four winning algorithms for standardization. Then there is the “hack now, decrypt later” timeline, which suggests that quantum-related attacks may already be underway.
Regardless of the final timeline or potential drivers, one thing that was clear to the panel attendees was that they need to start the transition now.
I get this question often and was not disappointed when one attendee asked, “How can I convince my agency leadership that migrating to quantum-proof encryption is a priority when they are still trying to tackle basic cyber threats?”
The panelists agreed that the U.S. government’s data storage requirements are unique in that data typically remains classified for 20 years. This means that systems in development today, which will typically be fielded over the next 10 years, face a storage shelf life of at least 30 years. Those systems need to be “future-proofed” today, a framing that should resonate when making the case to agency leaders.
The need to future-proof is driven by a variety of scenarios, such as equipment and software upgrades. Upgrading or replacing equipment and software takes a long time in any organization, and often longer in government, and updating every piece of software that relies on cryptography will take longer still.
The panelists also agreed that given the extensive supply chain supporting federal systems, vendors are a critical component to the overall success of an agency’s future-proofing for the quantum age. In 10-15 years, there will be some government partner/vendor somewhere who will not have transitioned to quantum-proof encryption. For leaders who have not yet prioritized their agency’s cryptography migration, let them ponder that thought — and start to focus on the need to prepare.
The panel shared several past technology migrations that they saw as analogous to the coming shift to quantum-resistant cryptography.
Y2K resembled the looming quantum threat in both the urgency and the scale of the government’s migration effort. However, since the encryption migration has no fixed deadline, Y2K is really only comparable in scale.
The panelists also recalled when every company had to replace the SHA-1 hash function, but concluded that the time, effort, and energy required to replace current encryption will be far greater than SHA-1 demanded, and the affected systems far more ubiquitous.
While previous technology migrations offer lessons for the government’s quantum-proof cryptography migration, the panel concluded that this effort brings a unique set of challenges, the likes of which organizations have never had to tackle before.
The consensus among panelists was that agencies need to first understand what data they have today and how vulnerable it is to attack. Data that is particularly sensitive, and vulnerable to “hack now, decrypt later” attacks, should be prioritized above less sensitive data. For some organizations, this is a challenging endeavor they have never embarked upon before. Now is an opportune time to build a cryptographic inventory and keep it up to date; from a planning and migration perspective, this is an agency’s chance to do it once and do it well.
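To make “inventory” concrete, here is a minimal sketch of the kind of record an agency might track for each cryptographic asset, with a toy prioritization rule. The field names and scoring are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass

@dataclass
class CryptoAsset:
    """One entry in a cryptographic inventory (illustrative schema)."""
    system: str               # system or application name
    algorithm: str            # e.g. "RSA-2048", "ECDSA-P256", "AES-256"
    purpose: str              # e.g. "TLS", "data-at-rest", "code signing"
    data_sensitivity: int     # 1 (low) .. 5 (highly sensitive)
    retention_years: int      # how long the protected data must stay secret
    quantum_vulnerable: bool  # True for RSA/ECC public-key uses

    def migration_priority(self) -> int:
        """Higher score = migrate sooner (sensitive, long-lived, vulnerable)."""
        score = self.data_sensitivity * self.retention_years
        return score if self.quantum_vulnerable else 0

# Rank the inventory so "hack now, decrypt later" targets surface first.
inventory = [
    CryptoAsset("records-db", "RSA-2048", "TLS", 5, 30, True),
    CryptoAsset("backup-store", "AES-256", "data-at-rest", 4, 30, False),
    CryptoAsset("public-site", "ECDSA-P256", "TLS", 1, 1, True),
]
for asset in sorted(inventory, key=CryptoAsset.migration_priority, reverse=True):
    print(asset.system, asset.migration_priority())
```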
It is important to assume from the start that the vast majority of organizations will need to migrate multiple times. Panelists emphasized the need for “crypto agility” that will enable future replacement of algorithms to be made easily. Crypto agility is about how easy it is to transition from one algorithm (or choice of parameters) to another. Organizations that prioritize long-term thinking should already be looking at this.
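One way to read “crypto agility” in code: route every cryptographic operation through a named-algorithm registry and tag every output with its algorithm identifier, so a migration becomes a configuration change rather than a code rewrite. Below is a minimal standard-library sketch of that pattern; the HMAC variants stand in for real signature schemes (including future post-quantum ones), and the names are my own illustration, not any agency’s design:

```python
import hmac, hashlib

# Registry of algorithms; adding a new scheme is one entry, not a rewrite.
# A real deployment would register signature schemes (eventually a
# post-quantum algorithm) the same way.
ALGORITHMS = {
    "hmac-sha256": lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "hmac-sha3-256": lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
}

def protect(alg: str, key: bytes, msg: bytes) -> bytes:
    # Tag the output with the algorithm ID so records created before a
    # migration remain verifiable after it.
    return alg.encode() + b"|" + ALGORITHMS[alg](key, msg)

def verify(key: bytes, msg: bytes, tagged: bytes) -> bool:
    alg, _, mac = tagged.partition(b"|")
    expected = ALGORITHMS[alg.decode()](key, msg)
    return hmac.compare_digest(mac, expected)

key, msg = b"secret-key", b"payload"
tag = protect("hmac-sha256", key, msg)    # today's choice
assert verify(key, msg, tag)
tag2 = protect("hmac-sha3-256", key, msg) # migration: change one string
assert verify(key, msg, tag2)
```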
The panelists added that communicating with vendors early on in the planning process is vital. As one panelist explained, “A lot of our service providers, vendors, etc. will be flipping switches for us, but a lot won’t. Understanding what your priorities are for flipping the switch and communicating it to your vendors is important.”
Matt Scholl of NIST described the work the National Cybersecurity Center of Excellence (NCCoE) is doing to provide guidance and answer practical questions, such as which discovery tools exist and how to budget for the migration. The Migration to Post-Quantum Cryptography project, announced in July 2022, is developing white papers, playbooks, demonstrations, and tools to help organizations implement their conversion to post-quantum cryptography. Other resources offering good guidance, according to Scholl, include recent CISA guidance, DHS’s roadmap, and the Canadian Centre for Cyber Security.
One additional resource that has been extremely helpful for our CISO customers is Quantinuum’s CISO’s Guide to Post-Quantum Standardization. The guide outlines what CISOs from any organization should be doing now and provides a basic transition roadmap to follow.
The discussion wrapped up with the acknowledgement that quantum has finally become part of the mainstream cybersecurity discussion and that the future benefit of quantum computing far outweighs the challenges of transitioning to new cryptography. As a parting thought, I emphasized the wonderful opportunity that agencies have to rethink how they do things and encouraged attendees to secure management commitment and funding for this much-needed modernization.
I want to give special thanks to my fellow panelists for the engaging discussion: Margaret Salter, Director, Applied Cryptography, AWS; Dr. Mark Sherman, Director, Cybersecurity Foundations, CMU; Matthew Scholl, Chief of the Computer Security Division, ITL, NIST; and Dr. Adrian Stanger, Cybersecurity Directorate Senior Cryptographic Authority, NSA.
At the heart of quantum computing’s promise lies the ability to solve problems that are fundamentally out of reach for classical computers. One of the most powerful ways to unlock that promise is through a novel approach we call Generative Quantum AI, or GenQAI. A key element of this approach is the Generative Quantum Eigensolver (GQE).
GenQAI is based on a simple but powerful idea: combine the unique capabilities of quantum hardware with the flexibility and intelligence of AI. By using quantum systems to generate data, and then using AI to learn from and guide the generation of more data, we can create a powerful feedback loop that enables breakthroughs in diverse fields.
Unlike classical systems, our quantum processing unit (QPU) produces data that is extremely difficult, if not impossible, to generate classically. That gives us a unique edge: we’re not just feeding an AI more text from the internet; we’re giving it new and valuable data that can’t be obtained anywhere else.
One of the most compelling challenges in quantum chemistry and materials science is computing the properties of a molecule’s ground state. For any given molecule or material, the ground state is its lowest energy configuration. Understanding this state is essential for understanding molecular behavior and designing new drugs or materials.
The problem is that accurately computing this state for anything but the simplest systems is incredibly complicated. You cannot even do it by brute force, testing every possible state and measuring its energy, because the number of quantum states grows doubly exponentially with system size (the dimension of the underlying state space already grows as 2^n for n qubits), making exhaustive search hopeless. This illustrates the need for an intelligent way to search for the ground state energy and other molecular properties.
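A quick back-of-the-envelope illustration of that scaling (the qubit counts are arbitrary examples):

```python
# The state space of n qubits has dimension 2**n; even enumerating basis
# states is hopeless long before you try to search the far larger set of
# possible superpositions built on top of them.
for n in (10, 50, 100, 300):
    print(f"{n} qubits -> {2**n:.3e}-dimensional state space")
# 300 qubits already exceeds the ~1e80 atoms in the observable universe.
```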
That’s where GQE comes in. GQE is a methodology that uses data from our quantum computers to train a transformer. The transformer then proposes promising trial quantum circuits, ones likely to prepare low-energy states. You can think of it as an AI-guided search engine for ground states. The novelty is in how our transformer is trained from scratch using data generated on our hardware.
Here’s how it works: the transformer proposes candidate circuits; the QPU runs them and measures the energies of the states they prepare; and those measurements flow back as the training signal, teaching the transformer to propose better circuits on the next round. The loop repeats until it converges on low-energy states.
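The sketch below mocks that feedback loop classically: a toy softmax sampler stands in for the transformer, and a stub scoring function stands in for the QPU’s energy measurement. Everything here (the gate vocabulary, `toy_energy`, the update rule) is an illustrative assumption, not Quantinuum’s implementation:

```python
import math, random

VOCAB = ["rx0", "rx1", "cz01", "ry0", "ry1"]   # toy gate vocabulary
CIRCUIT_LEN = 6
logits = {g: 0.0 for g in VOCAB}               # stand-in for the transformer

def sample_circuit():
    weights = [math.exp(logits[g]) for g in VOCAB]
    return random.choices(VOCAB, weights=weights, k=CIRCUIT_LEN)

def toy_energy(circuit):
    # Stub for the QPU: pretend circuits rich in entangling "cz01" gates
    # prepare lower-energy states, plus a little measurement noise.
    return -circuit.count("cz01") + random.gauss(0, 0.1)

for step in range(200):
    batch = [sample_circuit() for _ in range(16)]   # "propose"
    scored = sorted(batch, key=toy_energy)          # "measure on QPU"
    # "Train": nudge the sampler toward gates seen in the lowest-energy
    # quarter of the batch (a cross-entropy-style update).
    for circuit in scored[:4]:
        for gate in circuit:
            logits[gate] += 0.05

print("favoured gates:", sorted(logits, key=logits.get, reverse=True))
print("sample low-energy circuit:", min(batch, key=toy_energy))
```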
To test our system, we tackled a benchmark problem: finding the ground state energy of the hydrogen molecule (H₂). This problem has a known solution, which allows us to verify that our setup works as intended. Our GQE system successfully found the ground state to within chemical accuracy (1 kcal/mol, roughly 1.6 millihartree).
To our knowledge, we’re the first to solve this problem using a combination of a QPU and a transformer, marking the beginning of a new era in computational chemistry.
The idea of using a generative model guided by quantum measurements can be extended to a whole class of problems—from combinatorial optimization to materials discovery, and potentially, even drug design.
Combining quantum computing and AI unlocks the full power of both. Our quantum processors can generate rich data that was previously unobtainable. Then, an AI can learn from that data. Together, they can tackle problems neither could solve alone.
This is just the beginning. We’re already looking at applying GQE to more complex molecules, ones that can’t currently be solved with existing methods, and we’re exploring how the methodology extends to real-world use cases. This opens many new doors in chemistry, and we are excited to see what comes next.
Last year, we joined forces with RIKEN, Japan’s largest comprehensive research institution, to install our hardware at RIKEN’s campus in Wako, Saitama. This deployment is part of RIKEN’s project to build a quantum-HPC hybrid platform that combines high-performance computing systems, such as the supercomputer Fugaku, with Quantinuum’s quantum systems.
Today, a paper published in Physical Review Research marks the first of many breakthroughs coming from this international supercomputing partnership. The team from RIKEN and Quantinuum joined up with researchers from Keio University to show that quantum information can be delocalized (scrambled) using a quantum circuit modeled after periodically driven systems.
"Scrambling" of quantum information happens in many quantum systems, from those found in complex materials to black holes. Understanding information scrambling will help researchers better understand things like thermalization and chaos, both of which have wide reaching implications.
To visualize scrambling, imagine a set of particles (say, qubits in a memory) where one qubit holds specific information that you want to know. As time marches on, the quantum information spreads across the other qubits, making it harder and harder to recover the original information from local (few-qubit) measurements.
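The toy model below makes that picture concrete in NumPy: we encode one classical bit in the first qubit of a small register, scramble with brickwork layers of Haar-random two-qubit gates, and watch the bit become unreadable from that qubit alone (the trace distance between the two local states decays toward zero). This is a self-contained illustration of scrambling, not the periodically driven circuit studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 6  # qubits

def haar_unitary(dim):
    """Haar-random unitary via QR of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_gate(psi, u, i):
    """Apply a 4x4 gate u to adjacent qubits (i, i+1)."""
    t = psi.reshape(2**i, 4, -1)
    return np.einsum("ab,lbr->lar", u, t).reshape(-1)

def local_state(psi):
    """Reduced density matrix of qubit 0."""
    m = psi.reshape(2, -1)
    return m @ m.conj().T

# Encode a bit in qubit 0: |00...0> versus |10...0>.
psi0 = np.zeros(2**n, complex); psi0[0] = 1.0
psi1 = np.zeros(2**n, complex); psi1[2**(n - 1)] = 1.0

for layer in range(6):
    # Brickwork layer of random two-qubit gates (same gates on both states).
    for i in range(layer % 2, n - 1, 2):
        u = haar_unitary(4)
        psi0, psi1 = apply_gate(psi0, u, i), apply_gate(psi1, u, i)
    # How distinguishable is the bit from qubit 0 alone?
    diff = np.linalg.eigvalsh(local_state(psi0) - local_state(psi1))
    print(f"layer {layer + 1}: trace distance = {0.5 * np.abs(diff).sum():.3f}")
```

Globally the two states stay perfectly distinguishable (the dynamics are unitary); it is only the local view that loses the information, which is exactly what scrambling means.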
While many classical techniques exist for studying complex scrambling dynamics, quantum computing has long been seen as a promising tool for these studies, thanks to its inherently quantum nature and its native access to resources like entanglement. The joint team proved that promise with their latest result, which shows not only that scrambling states can be generated on a quantum computer, but that they behave as expected and are ripe for further study.
Thanks to this new understanding, we now know that the preparation, verification, and application of a scrambling state, a key quantum information state, can be consistently realized using currently available quantum computers. Read the paper here, and read more about our partnership with RIKEN here.
In our increasingly connected, data-driven world, cybersecurity threats are more frequent and sophisticated than ever. To safeguard modern life, government and business leaders are turning to quantum randomness.
The term to know: quantum random number generators (QRNGs).
QRNGs exploit quantum mechanics to generate truly random numbers, providing the highest level of cryptographic security. Among many other things, this underpins the generation of encryption keys, nonces, and other security-critical values.
Quantum technologies, including QRNGs, could protect up to $1 trillion in digital assets annually, according to a recent report by the World Economic Forum and Accenture.
The World Economic Forum report identifies five industry groups where QRNGs offer high business value and clear commercialization potential within the next few years.
In line with these trends, recent research by The Quantum Insider projects the quantum security market will grow from approximately $0.7 billion today to $10 billion by 2030.
Quantum randomness is already being deployed commercially. Recognizing the value of QRNGs, the financial services sector in particular is accelerating its path to commercialization.
Building on this momentum, we aim to broaden our cybersecurity portfolio with the addition of a certified randomness product in 2025.
The National Institute of Standards and Technology (NIST) develops the cryptographic standards used in the U.S. and adopted by many other countries.
This week, we announced Quantum Origin received NIST SP 800-90B Entropy Source validation, marking the first software QRNG approved for use in regulated industries.
This means Quantum Origin is now available for high-security cryptographic systems and integrates seamlessly with NIST-approved solutions without requiring recertification.
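To illustrate where an SP 800-90B entropy source sits in a NIST-style stack, here is a minimal HMAC-DRBG (the SP 800-90A construction) seeded from an entropy callback. `fetch_quantum_entropy` is a hypothetical stand-in for a Quantum Origin call, and this stripped-down DRBG omits the reseed counters and health tests a compliant implementation requires:

```python
import hmac, hashlib, os

def fetch_quantum_entropy(n: int) -> bytes:
    # Hypothetical stand-in for an SP 800-90B entropy source such as
    # Quantum Origin; os.urandom keeps the sketch runnable.
    return os.urandom(n)

class HmacDrbg:
    """Stripped-down SP 800-90A HMAC-DRBG (SHA-256); no reseed logic."""
    def __init__(self, entropy: bytes, nonce: bytes = b""):
        self.k = b"\x00" * 32
        self.v = b"\x01" * 32
        self._update(entropy + nonce)

    def _hmac(self, key: bytes, data: bytes) -> bytes:
        return hmac.new(key, data, hashlib.sha256).digest()

    def _update(self, data: bytes = b"") -> None:
        self.k = self._hmac(self.k, self.v + b"\x00" + data)
        self.v = self._hmac(self.k, self.v)
        if data:
            self.k = self._hmac(self.k, self.v + b"\x01" + data)
            self.v = self._hmac(self.k, self.v)

    def generate(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            self.v = self._hmac(self.k, self.v)
            out += self.v
        self._update()
        return out[:n]

# Seed the approved DRBG from the (quantum) entropy source.
drbg = HmacDrbg(fetch_quantum_entropy(48), nonce=b"app-instance-1")
aes_key = drbg.generate(32)
print(aes_key.hex())
```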
The NIST validation, combined with our peer-reviewed papers, further establishes Quantum Origin as the leading QRNG on the market.
--
It is paramount for governments, commercial enterprises, and critical infrastructure to stay ahead of evolving cybersecurity threats to maintain societal and economic security.
Quantinuum delivers the highest quality quantum randomness, enabling our customers to confront the most advanced cybersecurity challenges present today.