Artificial intelligence for quantum computing
Artificial intelligence (AI) advancements over the past few years have had an unprecedented and revolutionary impact across everyday application areas. This impact extends to technical challenges within science and engineering, including the nascent field of quantum computing (QC). The counterintuitive nature and high-dimensional mathematics of QC make it a prime candidate for AI's data-driven learning capabilities; indeed, many of QC's biggest scaling challenges may ultimately rest on developments in AI. However, bringing leading techniques from AI to QC requires drawing on disparate expertise from arguably two of the most advanced and esoteric areas of computer science. Here we aim to encourage this cross-pollination by reviewing how state-of-the-art AI techniques are already addressing challenges across the hardware and software stack needed to develop useful QC, from device design to applications. We then close by examining AI's future opportunities and obstacles in this space.
Quantum computing (QC) has the potential to impact every domain of science and industry, but it has become increasingly clear that delivering on this promise rests on tightly integrating fault-tolerant quantum hardware with accelerated classical supercomputers to build accelerated quantum supercomputers. Such large-scale quantum supercomputers form a heterogeneous architecture with the ability to solve certain otherwise intractable problems. Many of these problems, such as chemical simulation or optimization, are projected to have significant scientific, economic and societal impact1.
However, transitioning hardware from noisy intermediate-scale quantum (NISQ) devices to fault-tolerant quantum computing (FTQC) faces a number of challenges. Though recent quantum error correction (QEC) demonstrations have been performed2,3, most popular qubit modalities still suffer from hardware noise that prevents the below-threshold operation needed to perform fault-tolerant computations. But even qubits performing below threshold face scaling obstacles. FTQC is demanding and necessitates more resource-efficient QEC codes, faster decoding algorithms, and carefully designed qubit architectures. Both QC hardware research and current quantum algorithms also require further development, with explorations of more resource-efficient techniques holding the potential to dramatically shorten the roadmap to useful quantum applications.
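To make "below threshold" concrete: for surface codes, a widely used heuristic predicts a logical error rate of p_L ≈ A(p/p_th)^((d+1)/2), where p is the physical error rate, p_th the threshold, and d the code distance. The following minimal sketch is illustrative only; the constants A = 0.1 and p_th = 1% are assumptions chosen for demonstration, not values drawn from the cited experiments.

    # Hedged sketch of the standard surface-code scaling heuristic,
    # p_L ~ A * (p / p_th)^((d+1)/2), with assumed constants A and p_th.
    def logical_error_rate(p, d, A=0.1, p_th=1e-2):
        return A * (p / p_th) ** ((d + 1) / 2)

    for p in (5e-3, 1.5e-2):  # one physical error rate below, one above the assumed threshold
        summary = ", ".join(f"d={d}: {logical_error_rate(p, d):.2e}" for d in (3, 5, 7))
        print(f"p = {p}: {summary}")

Below threshold, increasing the code distance exponentially suppresses logical errors; above it, adding qubits only makes matters worse, which is why below-threshold operation is the gating requirement for FTQC.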
Though high-performance computing (HPC)4,5,6, and in particular accelerated GPU computing7,8, already drives QC research through circuit and hardware simulations, the application of generative artificial intelligence (AI) paradigms9 has only just begun. Foundation AI models10, characterized by their broad training data and ability to adapt to a wide array of applications, are emerging as an extremely effective way to leverage accelerated computing for QC. While the architecture landscape of these models is diverse, transformer models11 have proven particularly powerful, as popularized by OpenAI's generative pre-trained transformer (GPT) models12,13. There is already a strong precedent for these models being applied to technical yet pragmatic tasks in other fields, ranging from biomedical engineering14 to materials science15. Bringing the deep utility and broad applicability of such models to bear on the problems facing QC is a key goal of this review.
There is ample intuition to motivate exploring AI as a breakthrough tool for QC. The inherent nonlinear complexity of quantum mechanical systems16 makes them well-suited to the high-dimensional pattern recognition capabilities and inherent scalability of existing and emerging AI techniques17. Many AI-for-quantum applications are already being realized, both for the near-term development of quantum computers and for the long-term operation of scalable FTQC workflows18. This review examines applications of state-of-the-art AI techniques that are advancing QC, with the goal of fostering greater synergy between the two fields.
Despite the considerable promise of AI, it is critical to recognize its limitations when applied to QC. AI, as a fundamentally classical paradigm, cannot efficiently simulate quantum systems in the general case due to exponential scaling constraints imposed by the laws of quantum mechanics. Classical simulation of quantum circuits suffers from exponential growth in computational cost and memory consumption, which fundamentally limits the size of quantum systems that classical AI can simulate and impacts its generalizability to larger problems. For example, GroverGPT-219, which uses large language models (LLMs) to simulate Grover's algorithm, encounters these constraints: its ability to simulate full circuits is limited by the maximum context length of the LLM, making larger circuits infeasible, and its performance deteriorates for problem sizes significantly beyond the training data. This suggests that classical resource bottlenecks are effectively relocated rather than removed, contributing to scaling costs and deployment hurdles. In the context of simulating large-scale quantum systems, AI therefore serves as a complementary tool for interpreting, approximating, and reasoning about quantum processes, rather than a direct substitute for quantum hardware.
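To quantify this scaling, brute-force statevector simulation of n qubits must store 2^n complex amplitudes. The back-of-the-envelope sketch below is an illustration, not taken from the works cited, and assumes 16-byte complex128 amplitudes; it shows how quickly memory alone becomes prohibitive.

    # Illustrative only: memory footprint of a full n-qubit statevector,
    # assuming one complex128 amplitude (16 bytes) per basis state.
    for n in (20, 30, 40, 50):
        bytes_needed = (2 ** n) * 16  # 2^n amplitudes x 16 bytes each
        print(f"{n} qubits: {bytes_needed / 2**30:,.2f} GiB")

Going from 30 to 50 qubits inflates the requirement from 16 GiB to roughly 16 million GiB, which is why AI-based surrogates can shift, but not remove, the classical resource bottleneck.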
It is worth clarifying that this review focuses solely on the impact of AI techniques for developing and operating useful quantum computers (AI for quantum) and does not touch upon the longer-term and more speculative prospect of quantum computers one day enhancing AI techniques (often referred to as quantum for AI), which are surveyed in ref. 20.
The content of this review is organized according to the causal sequence of tasks undertaken in operating a quantum computer (Fig. 1). We immediately stretch this taxonomy by beginning in "AI for quantum computer development and design" with how AI techniques can accelerate fundamental research into designing and improving the quantum hardware needed to operate a useful device. Then, the sections entitled "AI for preprocessing", "AI for device control and optimization", "AI for quantum error correction", and "AI for postprocessing" step through AI's roles in the widely accepted QC workflow: preprocessing, tuning, control and optimization, QEC, and postprocessing. AI's use in algorithm development, that is, its impact on various algorithmic subroutines, spans many tasks across the workflow and is therefore covered throughout the manuscript where relevant. Each section also concludes with a discussion of the key limitations and challenges for applying AI to such use cases. The review concludes with an "Outlook" looking ahead to fruitful areas where AI might still be applied and speculating on developments that will further AI's ability to solve QC's remaining challenges.