It’s believed that quantum computing will transform the way we solve chemistry problems, and the Quantinuum scientific team continues to push the envelope towards making that a reality.
In their latest research paper published on the arXiv, Quantinuum scientists describe a new hybrid classical-quantum solver for chemistry. The method they developed can model complex molecules at a new level of efficiency and precision.
Dr. Michał Krompiec, Scientific Project Manager, and his colleague Dr. David Muñoz Ramo, Head of Quantum Chemistry, co-authored the paper, "Strongly Contracted N-Electron Valence State Perturbation Theory Using Reduced Density Matrices from a Quantum Computer".
The implications are significant as their innovation “tackles one of the biggest bottlenecks in modelling molecules on quantum computers,” according to Dr. Krompiec.
Quantum computers are a natural platform to solve chemistry problems. Chemical molecules are made of many interacting electrons, and quantum mechanics can describe the behavior and energies of these electrons.
As Dr. Krompiec explains, “nature is not classical, it is quantum. We want to map the quantum system of interacting electrons into a quantum system of interacting qubits, and then solve it.”
Solving the full picture of electron interactions is extremely difficult, but fortunately it is not always necessary. Scientists usually simplify the task by focusing on the active space of the molecule, a smaller subset of the problem which matters most.
Even with these simplifications, difficulties remain. One challenge is choosing this smaller subset well, since it contains the strongly correlated electrons and is therefore the hardest part to solve. Another is accurately accounting for the rest of the system, which can often be done with perturbation theory using so-called “multi-reference” methods.
In their work, the Quantinuum team came up with a new multi-reference technique. They maintain that only the strongly correlated part of the molecule should be calculated on a quantum computer. This is important, as the cost of solving this part exactly grows exponentially with its size, making it classically intractable.
The quantum algorithm they used on this part relied on measuring reduced density matrices and feeding them into a multi-reference perturbation theory calculation, a combination that had never been used in this context. Implementing the quantum electronic structure solver on the active space and using measured reduced density matrices makes the problem less computationally expensive and the solution more accurate.
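To make “measuring reduced density matrices” concrete, here is a minimal NumPy sketch of what a one-particle reduced density matrix (1-RDM) is for a toy two-orbital, one-electron wavefunction. In the paper’s workflow these matrix elements are estimated from measurements on the quantum device and passed to the classical perturbation-theory step; here a tiny statevector stands in for the hardware, and the Jordan-Wigner convention used is just one illustrative choice.

```python
import numpy as np

# Jordan-Wigner representation of two fermionic modes (toy convention):
# a_0 = sigma^- (x) I,  a_1 = Z (x) sigma^-
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # sigma^- (lowering operator)
Z = np.diag([1.0, -1.0]).astype(complex)
I2 = np.eye(2, dtype=complex)

a = [np.kron(sm, I2), np.kron(Z, sm)]            # annihilation operators a_0, a_1

# One electron spread over two orbitals: cos(t)|01> + sin(t)|10>
t = 0.3
psi = np.zeros(4, dtype=complex)
psi[0b01] = np.cos(t)                            # orbital 1 occupied
psi[0b10] = np.sin(t)                            # orbital 0 occupied

# 1-RDM: D_pq = <psi| a_p^dagger a_q |psi>.  On hardware, each entry is
# estimated from expectation values of the corresponding Pauli strings.
D = np.array([[psi.conj() @ (a[p].conj().T @ a[q]) @ psi for q in range(2)]
              for p in range(2)])

print(np.round(D.real, 4))
print("trace (particle number):", round(D.trace().real, 6))
```

The trace of the 1-RDM recovers the particle number, a useful sanity check that also applies to noisy hardware estimates.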
The team tested their workflow on two molecules, H2 and Li2, using Quantinuum’s hybrid solver implemented in the InQuanto quantum computational chemistry platform and IBM’s 27-qubit device. Quantinuum software is platform-agnostic and is routinely tested on its own H-Series ion-trap quantum systems as well as on other hardware.
The non-strongly correlated regions of the molecules were run classically, as they would not benefit from a quantum speedup. The team’s results showed excellent agreement with previous models, meaning their method worked. Beyond that, the method showed great promise for reaching new levels of speed and accuracy for larger molecules.
This work could create a new paradigm for performing quantum chemistry. The authors believe it may represent the best way of computing dynamic correlation corrections to active space-type quantum methods.
As Dr. Krompiec said, “Quantum chemistry can finally be solved with an application of a quantum solver. This can remove the factorial scaling which limits the applicability of this rigorous method to a very small subsystem.”
The idea to use a multi-reference method along with reduced density matrix measurement is quite novel and stems from the diverse backgrounds of the team at Quantinuum. It is a unique application of well-known quantum algorithms to a set of theoretical quantum chemistry problems.
The use cases are vast. Analysis of catalyst and material properties may be the first to benefit from this new method, which could have a tremendous impact in the automotive, aerospace, fine chemicals, semiconductor, and energy industries.
Implementing this method on real hardware is limited by the current noise levels. But as the quality of the qubits increases, the method will unleash its full potential. Quantinuum’s System Model H1 trapped-ion hardware, Powered by Honeywell, benefits from high fidelity qubits, and will be a valuable resource for quantum chemists wishing to follow this work.
This hybrid quantum-classical method promises a path to quantum advantage for important chemistry problems, as machines become more powerful.
As Dr. Krompiec summarizes, “we haven’t just created a toy model that works for near-term devices. This is a fundamental method that will still be relevant as quantum computers continue to mature.”
By Duncan Jones
In September, nearly 200 senior cybersecurity leaders from around the world convened at the 2022 Billington Cybersecurity Summit to discuss the state of U.S. cybersecurity. Topics were varied, ranging from the moral asymmetry of today’s global threat actors to lessons learned from Ukraine and the cyber threats that “keep us up at night.”
As a speaker at the Summit, I wanted to take a moment to share my take-aways from an important discussion that took place during our breakout session, “Future of Encryption: Moving to a Quantum Resistant World.” My esteemed fellow panelists from NSA, NIST, CMU and AWS exchanged insights as to where U.S. government agencies stand in their preparation for current and future threats to encryption, the likely hurdles they face, and the resources that exist to assist in the transition. Those responsible for moving their agency to a quantum-resistant world should find the following insights worth considering.
With the prospect of powerful quantum computers breaking known encryption methods on the horizon, and with federal mandate NSM-10 now in place, the good news is that quantum-proof encryption is finally being discussed. The not-so-good news is that it isn’t clear to cybersecurity practitioners what they need to do first. Understanding the threat is not nearly as difficult as understanding the timing, which seems to have left agency personnel at the starting gate of a planning process fraught with challenges – and urgency.
Why is the timeline so difficult to establish? Because there is no way of knowing when a quantum-based attack will take place. The Quantum-safe Security Working Group of the Cloud Security Alliance (CSA) chose April 14, 2030 to represent “Y2Q,” also known as “Q-Day”: the moment secure IT infrastructure becomes vulnerable to a fault-tolerant quantum computer running Shor’s algorithm. The Biden Administration based its implementation timeline on the day NIST announced the four winning algorithms for standardization. Then there is the “hack now, decrypt later” timeline, which suggests that quantum-related attacks may already be underway.
Regardless of the final timeline or potential drivers, one thing that was clear to the panel attendees was that they need to start the transition now.
I get this question often and was not disappointed when one attendee asked, “How can I convince my agency leadership that migrating to quantum-proof encryption is a priority when they are still trying to tackle basic cyber threats?”
The panelists agreed that the U.S. government’s data storage requirements are unique: data typically remains classified for 20 years. This means that systems in development today, which are typically fielded over the next 10 years, will hold data with a shelf life of at least 30 years. Those systems need to be “future-proofed” now, a framing that should be effective when trying to convince agency leaders of the priority.
The need to future-proof is driven by a variety of scenarios, such as equipment and software upgrades. Upgrading or replacing equipment and software takes a long time in general, and perhaps even longer for government entities; updating all of the software that relies on cryptography will take longer still.
The panelists also agreed that given the extensive supply chain supporting federal systems, vendors are a critical component to the overall success of an agency’s future-proofing for the quantum age. In 10-15 years, there will be some government partner/vendor somewhere who will not have transitioned to quantum-proof encryption. For leaders who have not yet prioritized their agency’s cryptography migration, let them ponder that thought — and start to focus on the need to prepare.
The panel discussed several past technology migrations that they considered similar to the coming migration to quantum-resistant cryptography.
Y2K resembled the looming quantum threat in both the urgency and the scale of the government’s need to migrate systems. However, without a deadline assigned to the encryption migration, Y2K is really only similar in scale.
The panelists also recalled the industry-wide replacement of the SHA-1 hash function, but concluded that replacing current encryption will demand far more time, effort, and energy than SHA-1 did, and will be far more ubiquitous.
While previous technology migrations help to establish lessons learned for the government’s quantum-proof cryptography migration, the panel concluded that this go-round will have a very unique set of challenges — the likes of which organizations have never had to tackle before.
The consensus among panelists was that agencies need to first understand what data they have today and how vulnerable it is to attack. Data that is particularly sensitive, and vulnerable to the “hack-now, decrypt-later” attacks, should be prioritized above less sensitive data. For some organizations, this is a very challenging endeavor that they’ve never embarked upon before. Now is an opportune time to build inventory data and keep it up to date. From a planning and migration perspective, this is an agency’s chance to do it once and do it well.
It is important to assume from the start that the vast majority of organizations will need to migrate multiple times. Panelists emphasized the need for “crypto agility” that will enable future replacement of algorithms to be made easily. Crypto agility is about how easy it is to transition from one algorithm (or choice of parameters) to another. Organizations that prioritize long-term thinking should already be looking at this.
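As a toy illustration of what crypto agility can mean in code (this is a sketch, not any agency’s actual design, and the algorithm names are hypothetical), the primitive can be hidden behind a named registry so that a migration is a configuration change rather than a rewrite. Here stdlib HMAC digests stand in for the signature or encryption algorithms that would really be swapped:

```python
import hmac
import hashlib

# Minimal crypto-agility sketch (illustrative only): callers name an
# algorithm, never a primitive, so swapping algorithms is a config change.
ALGORITHMS = {
    "mac-sha256": lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "mac-sha3":   lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
}

CURRENT_ALGORITHM = "mac-sha256"   # the one-line migration point

def protect(key, msg, alg=None):
    alg = alg or CURRENT_ALGORITHM
    # Tag every output with the algorithm used, so data protected under an
    # old algorithm remains verifiable after a migration.
    return alg, ALGORITHMS[alg](key, msg)

def verify(key, msg, tagged):
    alg, mac = tagged
    return hmac.compare_digest(ALGORITHMS[alg](key, msg), mac)

tagged = protect(b"key", b"hello")
print(tagged[0], verify(b"key", b"hello", tagged))
```

Tagging outputs with the algorithm identifier is what lets old and new algorithms coexist during the (likely multi-year) transition window.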
The panelists added that communicating with vendors early on in the planning process is vital. As one panelist explained, “A lot of our service providers, vendors, etc. will be flipping switches for us, but a lot won’t. Understanding what your priorities are for flipping the switch and communicating it to your vendors is important.”
Matt Scholl of NIST described the work that the NCCoE is doing to provide guidance, tips, and answers to questions such as “what are discovery tools?” and “how do I budget?” The Migration to Post-Quantum Cryptography project, announced in July 2022, is developing white papers, playbooks, demonstrations, and tools that can help other organizations implement their conversions to post-quantum cryptography. Other resources offering good guidance, according to Scholl, include recent CISA guidance, DHS’s roadmap, and the Canadian Centre for Cyber Security.
One additional resource that has been extremely helpful for our CISO customers is Quantinuum’s CISO’s Guide to Post-Quantum Standardization. The guide outlines what CISOs from any organization should be doing now and provides a basic transition roadmap to follow.
The discussion wrapped up with the acknowledgement that quantum has finally become part of the mainstream cybersecurity discussion and that the future benefit of quantum computing far outweighs the challenges of transitioning to new cryptography. As a parting thought, I emphasized the wonderful opportunity that agencies have to rethink how they do things and encouraged attendees to secure management commitment and funding for this much-needed modernization.
I want to give a special thanks to my fellow panelists for the engaging discussion: Margaret Salter, Director, Applied Cryptography, AWS, Dr. Mark Sherman, Director, Cybersecurity Foundations, CMU, Matthew Scholl, Chief of the Computer Security Division, ITL, NIST, and Dr. Adrian Stanger, Cybersecurity Directorate Senior Cryptographic Authority NSA.
Quantinuum President and COO Tony Uttley announced three major accomplishments during his keynote address at the IEEE Quantum Week event in Colorado last week.
The three milestones, representing actionable acceleration for the quantum computing ecosystem, are: (i) new arbitrary angle gate capabilities on the H-Series hardware, (ii) another quantum volume (QV) record for the System Model H1 hardware, and (iii) over 500,000 downloads of Quantinuum’s open-source TKET, a world-leading quantum software development kit (SDK).
The announcements were made during Uttley’s keynote address titled, “A Measured Approach to Quantum Computing.”
These advancements are the latest demonstration of the company’s leadership in the quantum computing community.
“Quantinuum is accelerating quantum computing’s impact to the world,” Uttley said. “We are making significant progress with both our hardware and software, in addition to building a community of developers who are using our TKET SDK.”
This latest quantum volume measurement of 8192 is particularly noteworthy and is the second time this year Quantinuum has published a new QV record on their trapped-ion quantum computing platform, the System Model H1, Powered by Honeywell.
A key to achieving this latest record is the new capability of directly implementing arbitrary angle two-qubit gates. For many quantum circuits, this new way of doing a two-qubit gate allows for more efficient circuit construction and leads to higher fidelity results.
Dr. Brian Neyenhuis, Director of Commercial Operations at Quantinuum, said, “This new capability allows for several user advantages. In many cases, this includes shorter interactions with the qubits, which lowers the error rate. This allows our customers to run long computations with less noise.”
These arbitrary angle gates build on the overall design strength of the trapped-ion architecture of the H1, Neyenhuis said.
“With the quantum charge-coupled device (QCCD) architecture, interactions between qubits are very simple and can be limited to a small number of qubits, which means we can precisely control the interaction and don’t have to worry about additional crosstalk,” he said.
This new gate design represents a third method for Quantinuum to improve the efficiency of the H1 generation, said Dr. Jenni Strabley, Senior Director of Offering Management at Quantinuum.
“Quantinuum’s goal is to accelerate quantum computing. We know we have to make the hardware better and we have to make the algorithms smarter, and we’re doing that,” she said. “Now we can also implement the algorithms more efficiently on our H1 with this new gate design.”
Currently, researchers can perform single-qubit gates – rotations on a single qubit – or a fully entangling two-qubit gate. Any quantum operation can be built from just those building blocks.
With arbitrary angle gates, instead of just having a two-qubit gate that's fully entangling, scientists can use a two-qubit gate that is partially entangling.
“There are many algorithms where you want to evolve the quantum state of the system one tiny step at a time. Previously, if you wanted a tiny bit of entanglement for some small time step, you had to entangle it all the way, rotate it a little bit, and then unentangle it almost all the way back,” Neyenhuis said. “Now we can just add this tiny little bit of entanglement natively and then go to the next step of the algorithm.”
There are other algorithms where this arbitrary angle two-qubit gate is the natural building block, according to Neyenhuis. One example is the quantum Fourier transform. Using arbitrary angle two-qubit gates cuts the number of two-qubit gates (and the overall error) in half, drastically improving the fidelity of the circuit. Researchers can use this new gate design to run harder problems that resulted in catastrophic errors in previous experiments.
“By going to an arbitrary angle gate, in addition to cutting the number of two-qubit gates in half, the error we get per gate is lower because it scales with the amplitude of that gate,” Neyenhuis said.
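One common native form of such a gate (the exact native gate set is hardware-specific, so treat this as an illustration) is the ZZ-phase gate RZZ(θ) = exp(−iθ/2·Z⊗Z). A short NumPy sketch shows the two properties the quotes above rely on: small-angle applications compose additively, so a tiny Trotter step costs one native gate instead of an entangle/rotate/unentangle sandwich, and θ = π/2 recovers a maximally entangling gate:

```python
import numpy as np

def rzz(theta):
    """exp(-i * theta/2 * Z(x)Z): a partially entangling two-qubit gate.
    The eigenvalues of Z(x)Z in the computational basis are [1, -1, -1, 1],
    so the gate is diagonal."""
    phases = np.exp(-1j * theta / 2 * np.array([1, -1, -1, 1]))
    return np.diag(phases)

# Small steps compose additively: two quarter-steps equal one half-step,
# so each tiny time step of a simulation costs a single native gate.
step = rzz(0.1)
assert np.allclose(step @ step, rzz(0.2))

# theta = pi/2 is (up to single-qubit rotations) as entangling as a CNOT;
# arbitrary theta interpolates continuously between identity and that point.
print(np.round(np.diag(rzz(np.pi / 2)), 4))
```

Because the error of the native gate scales with its angle, a small step also costs proportionally less error than a full entangling gate, which is the point Neyenhuis makes above.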
This is a powerful new capability, particularly for noisy intermediate-scale quantum (NISQ) algorithms. In another demonstration, the Quantinuum team used arbitrary angle two-qubit gates to study non-equilibrium phase transitions; the technical details are available on the arXiv.
“For the algorithms that we are going to want to run in this NISQ regime that we're in right now, this is a more efficient way to run your algorithm,” Neyenhuis said. “There are lots of different circuits you would want to run where this arbitrary angle gate gives you a fairly significant increase in the fidelity of your overall circuit. This capability also allows for a speed up in the circuit execution by removing unneeded gates, which ultimately reduces the time of executing a job on our machines.”
Researchers working with machine learning algorithms, variational algorithms, and time evolution algorithms would see the most benefit from these new gates. This advancement is particularly relevant for simulating the dynamics of other quantum systems.
“This just gave us a big win in fidelity because we can run the sort of interaction you're after natively, rather than constructing it out of some other Lego blocks,” Neyenhuis said.
Quantum volume tests require running arbitrary circuits. At each slice of the quantum volume circuit, the qubits are randomly paired up and a complex two-qubit operation is performed. This SU(4) gate can be constructed more efficiently using the arbitrary angle two-qubit gate, lowering the error at each step of the algorithm.
The H1-1’s quantum volume of 8192 is due in part to the implementation of arbitrary angle gates and the continued reduction in error rates. Quantinuum’s last quantum volume increase was in April when the System Model H1-2 doubled its performance to become the first commercial quantum computer to pass Quantum Volume 4096.
This new increase is the seventh time in two years that Quantinuum’s H-Series hardware has set an industry record for measured quantum volume as it continues to achieve its goal of 10X annual improvement.
Quantum volume, a benchmark introduced by IBM in 2019, is a way to measure the performance of a quantum computer using randomized circuits, and is a frequently used metric across the industry.
Quantinuum has also achieved another milestone: over 500,000 downloads of TKET.
TKET is an advanced software development kit for writing and running programs on gate-based quantum computers. TKET enables developers to optimize their quantum algorithms, reducing the computational resources required, which is important in the NISQ era.
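As a toy illustration of the kind of rewrite an optimizing quantum SDK performs (this is not TKET’s actual implementation, just a sketch of the idea), the simplest pass cancels adjacent self-inverse gates so that fewer operations reach the hardware:

```python
# Toy 'peephole' pass: cancel adjacent identical self-inverse gates,
# the simplest kind of rewrite a circuit optimizer applies to reduce
# the resources a program needs before it runs on a device.
SELF_INVERSE = {"H", "X", "Z", "CX"}

def cancel_adjacent(circuit):
    out = []
    for gate in circuit:                  # gate = (name, qubits)
        if out and gate == out[-1] and gate[0] in SELF_INVERSE:
            out.pop()                     # G followed by G -> identity
        else:
            out.append(gate)
    return out

circ = [("H", (0,)), ("CX", (0, 1)), ("CX", (0, 1)), ("H", (0,)), ("X", (1,))]
print(cancel_adjacent(circ))              # -> [('X', (1,))]
```

The stack-based loop also catches cascades: once an inner pair cancels, the newly adjacent outer pair is checked too. Real optimizers add many further rewrites (commutation rules, gate resynthesis, hardware-aware routing), but the resource-reduction goal is the same.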
TKET is open source and accessible through the PyTKET Python package. The SDK also integrates with major quantum software platforms including Qiskit, Cirq, and Q#. TKET has been available as open source for almost a year.
This universal availability and TKET’s portability across many quantum processors are critical for building a community of developers who can write quantum algorithms. The download count includes many companies and academic institutions, each of which accounts for multiple users.
Quantinuum CEO Ilyas Khan said, “Whilst we do not have the exact number of users of TKET, it is clear that we are growing towards a million people around the world who have taken advantage of a critical tool that integrates across multiple platforms and makes those platforms perform better. We continue to be thrilled by the way that TKET helps democratize as well as accelerate innovation in quantum computing.”
Arbitrary angle two-qubit gates and other recent Quantinuum advances are all built into TKET.
“TKET is an evolving platform and continues to take advantage of these new hardware capabilities,” said Dr. Ross Duncan, Quantinuum’s Head of Quantum Software. “We’re excited to put these new capabilities into the hands of the rapidly increasing number of TKET users around the world.”
The average single-qubit gate fidelity for this milestone was 99.9959(5)%, the average two-qubit gate fidelity was 99.71(3)% with fully connected qubits, and state preparation and measurement fidelity was 99.72(1)%. The Quantinuum team ran 220 circuits with 90 shots each, using standard QV optimization techniques to yield an average of 175.2 arbitrary angle two-qubit gates per circuit.
The System Model H1-1 successfully passed the quantum volume 8192 benchmark, outputting heavy outcomes 69.33% of the time, with a 95% confidence interval lower bound of 68.38% which is above the 2/3 threshold.
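The pass criterion can be sanity-checked with a simplified calculation. This sketch uses a naive normal-approximation confidence bound treating all shots as independent; the published analysis uses a more rigorous statistical treatment, which is why its 68.38% lower bound differs slightly from the estimate below. The point is the structure of the test: the lower confidence bound on the heavy-output fraction must exceed 2/3.

```python
import math

# Simplified quantum volume pass check (illustrative only; the published
# figure of 68.38% comes from a more careful statistical method).
circuits, shots = 220, 90
p_heavy = 0.6933                      # observed heavy-output fraction

n = circuits * shots                  # naively treat all shots as independent
sigma = math.sqrt(p_heavy * (1 - p_heavy) / n)
lower_bound = p_heavy - 2 * sigma     # ~2-sigma (95%) lower confidence bound

print(f"lower bound: {lower_bound:.4f}, pass: {lower_bound > 2 / 3}")
```

Even this crude bound clears the 2/3 threshold comfortably, consistent with the reported pass.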
The IEEE International Conference on Quantum Computing and Engineering – IEEE Quantum Week – begins this week, serendipitously located this year in Broomfield, Colorado, home to Quantinuum’s U.S. corporate headquarters.
At the conference, Quantinuum’s leadership in bridging the gap between the science of quantum computing and the development of a commercial industry will be on full display.
Quantinuum President and COO Tony Uttley will deliver a much-anticipated keynote address at IEEE Quantum Week titled, “A Measured Approach to Quantum Computing” on Thursday. An additional 17 company engineers, physicists and other scientists will participate in four panels, three workshops and a mentorship session as well as deliver a tutorial and technical paper presentation at the conference this week.
Quantinuum team members will be participating in a variety of sessions vital to the growth of the quantum ecosystem, from educating students about the field and mapping out careers in the industry to explaining the science behind trapped ion quantum computers and describing the architectures of logical qubits.
An important discussion about the UCSB NSF Quantum Foundry and its mission to develop materials and interfaces to power quantum-based electronics will be led by Dr. Bob Horning, Senior Technical Manager for Wafer Fabrication at Quantinuum.
In addition to hosting sessions and speaking at the event, Quantinuum researchers will present the following posters during the conference:
Quantinuum looks forward to connecting with the diverse community of quantum researchers, learners, and industry experts at IEEE Quantum Week who are all helping to pave the way forward in the field.
Please see the complete list of sessions featuring Quantinuum team members below.
Keynote: President and COO Tony Uttley, “A measured approach to quantum computing,” Thursday, Sept. 22, 5:30 pm.
Workshop: Principal Scientist Curtis Volin, “Careers in quantum computing: How to get started with quantum computing—A workshop for high schoolers,” Sunday, Sept. 18, 10:00 am.
Technical paper: Jacob Johansen, Atomic, Molecular, and Optical Physicist; Brian Estey, Physicist; Mary Rowe, Research Scientist; and Anthony Ransford, Research Scientist, “Quantum hardware-1—Fast loading of a trapped ion quantum computer using a 2D magneto-optical trap,” Monday, Sept. 19, 1:00 pm.
Mentorship program: R&D Manager Brian Mathewson, “Student mentorship breakfast,” Monday, Sept. 19, 9:30 am.
Workshop: Advanced Software Engineer Peter Campora, “Azure Quantum: A Platform for Quantum Computing Research, Education and Innovation,” Tuesday, Sept. 20, 10:00 am.
Workshop: Senior Director of Technology Development Steve Sanders, “Classical control systems for quantum computing,” Tuesday, Sept. 20, 10:00 am.
Panel: Senior Technical Manager for Wafer Fabrication Dr. Bob Horning, “The Quantum Foundry,” Tuesday, Sept. 20, 3:15 pm.
Panel: Senior Advanced Physicist Ciaran Ryan-Anderson, “Architectures for logical qubits,” Wednesday, Sept. 21, 10:00 am.
Tutorial: Daniel Mills, Research Scientist, and Cristina Cirstoiu, Research Scientist, “Developing and Executing Error-mitigated NISQ Algorithms across Devices and Simulators,” Thursday, Sept. 22, 10:00 am.
Workshop: Natalie Brown, Advanced Physicist, and Ciaran Ryan-Anderson, Senior Advanced Physicist, “Real-time decoding for fault-tolerant quantum computing,” Thursday, Sept. 22, 10:00 a.m.
Panel: Caroline Figgatt, Senior Atomic, Molecular and Optical Physicist; Liz Argueta, Software Engineer; and Tammie Borders, Senior Business Development Manager, “Being your authentic self: Promoting DEI in quantum computing,” Thursday, Sept. 22, 3:15 pm.
*All sessions are listed in Colorado time, Mountain Time Zone, or UTC-6
Alex Chernoguzov, Chief Engineer of Commercial Products at Quantinuum, is helping to bring NVIDIA’s QODA programming platform to Quantinuum’s world-class quantum hardware.
“The more languages that support quantum, the better, because that opens up an opportunity for different software specialists to start programming in quantum environments,” Chernoguzov said. “We need to develop a new workforce that's educated on quantum information science topics and capable of generating new algorithms that can run on quantum computers.”
Tony Uttley, president and chief operating officer at Quantinuum, said platforms such as QODA are important for the company and the quantum computing industry.
“At Quantinuum, our objective is to accelerate quantum computing’s utility to the world,” Uttley said. “By bringing forward additional tools like QODA, we expand the number of brilliant people aiming their talents at getting the most out of today’s quantum computers.”
Quantum computers speak a different language than classical machines, and the current landscape offers few effective quantum compilers to support interoperability between the two. The NVIDIA QODA platform aims to change that. Until recently, most quantum programming languages were based on Python because many scientists are familiar with it, Chernoguzov said.
“QODA adds quantum capabilities to C++ because this language is what's often used to program high performance computing machines,” he said. “Having a C++ dialect expands the possible languages that you can program quantum with.”
Chernoguzov said interoperability between classical and quantum systems was another core goal of this project.
“Let’s say you have a hybrid program that has some classical parts and some quantum parts,” he said. “You compile the program. There is a classical piece that you can run on a CPU or a GPU, and there is a quantum piece that you need to send to a quantum computer. In a sense, you could look at it as a quantum processor acting as a co-processor for the other classical processors you need for your program. After completion, you gather everything together and do some more classical computations and repeat the process.”
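The compile/dispatch/repeat loop Chernoguzov describes can be sketched in a few lines. QODA itself extends C++; the toy below only mirrors the control flow in Python, with the “quantum piece” replaced by a locally simulated one-qubit expectation value standing in for a job sent to a QPU such as H1 (the function names are hypothetical):

```python
import math

# Toy hybrid classical-quantum loop (illustrative only).
def quantum_part(theta):
    """Stand-in for a QPU call: <psi(theta)| Z |psi(theta)> with
    |psi> = Ry(theta)|0>.  In practice: submit circuit, await results."""
    return math.cos(theta)

def classical_part(theta, step=0.1):
    """Classical update on CPU: a parameter-shift gradient descent step."""
    grad = (quantum_part(theta + math.pi / 2) -
            quantum_part(theta - math.pi / 2)) / 2
    return theta - step * grad

theta = 0.5
for _ in range(200):                  # repeat: classical <-> quantum round trips
    theta = classical_part(theta)

print(f"theta ~ {theta:.3f}, energy ~ {quantum_part(theta):.3f}")  # minimum near pi
```

Each iteration is one round trip of the kind described in the quote: the classical piece decides the next parameters, the quantum co-processor evaluates them, and the results flow back for further classical computation.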
Quantinuum’s H1 quantum computer will act as a quantum processor working in conjunction with larger classical systems. If a computational task contains an element better suited to a quantum architecture, that element can be handed off to H1. For now, this works much like other cloud-based services, with programs submitted over the cloud to H1 for execution.
Quantinuum hardware and the NVIDIA QODA platform are bridging the gap between existing classical architectures and emerging quantum resources and using the strengths of each system to solve complex problems.
“Let’s say you want to model a complex chemical molecule. Atomic interactions are best handled by a quantum computer,” Chernoguzov said, “but directing the overall program flow to tell it what to model and how to model it is best done by the classical computers.” NVIDIA’s QODA platform helps reveal a world where these two ecosystems coexist and thrive together.
Chernoguzov also explained the benefits of the Quantum Intermediate Representation (QIR) Alliance: a group of people and organizations who are committed to improving interoperability for quantum machines. This group’s work forms the basis for the hybrid approach that uses both classical and quantum machines.
“Interoperability in the quantum world is possible and the QIR is a good fit for that,” he said. “Quantum computers cannot do everything themselves, but classical compute is also clearly limited. We need both, and they need to work closely together to solve difficult problems that neither technology can solve on its own.”