As quantum mechanics marks its centennial, this issue of Nature Computational Science features a Focus that outlines the impact of quantum mechanics in advancing computing technologies, while discussing the challenges and opportunities that lie ahead.
Quantum mechanics emerged in the early twentieth century when scientists sought to explain phenomena that classical physics could not elucidate, such as the discrete energy levels of the hydrogen atom. In 1900, Max Planck introduced the concept of energy quantization to explain blackbody radiation1, which is considered the birth of quantum theory. Later, Niels Bohr’s atomic model2,3, Werner Heisenberg’s matrix mechanics4, and Erwin Schrödinger’s wave equation5 collectively established a comprehensive framework for quantum mechanics that explained why electrons occupy discrete energy levels and exhibit wave–particle duality, fundamentally changing our view of matter. These breakthroughs also paved the way for modern computing technologies.
This year marks the centennial of quantum mechanics, honoring the work of Heisenberg and his contemporaries in laying the foundations of modern quantum theory. To celebrate the anniversary, this issue of Nature Computational Science presents a Focus that explores the profound impact of quantum mechanics on advancing computational capabilities.
The first notable impact of quantum mechanics on computing was the theoretical framework it provided for understanding electron behavior in solids, which is essential for semiconductor design. By explaining how electrons move through crystal lattices and interact with impurities, quantum mechanics enabled precise doping strategies to control conductivity in materials. During the 1940s and 1950s, this understanding led to the creation of p–n junctions, the building blocks of the transistor: a tiny electronic switch that serves as the physical basis of modern digital computing and enables smaller, faster, and more reliable machines. This, in turn, fueled the exponential growth of computing power, exemplified by the rise of supercomputers.
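The transistor's role as the basis of digital computing comes from its behavior as a voltage-controlled switch: networks of such switches implement Boolean logic. As a minimal illustrative sketch (not drawn from any of the featured papers), each "transistor" below is modeled as a Python conditional, composed into a NAND gate, from which all other logic gates can be built:

```python
def nand(a: bool, b: bool) -> bool:
    """A NAND gate: in CMOS, the output is pulled low
    only when both input transistors conduct."""
    return not (a and b)

# Universality: every other gate can be assembled from NAND alone.
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

# Truth table for NAND: high unless both inputs are high.
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", nand(a, b))
```

Because NAND is functionally complete, any digital circuit, from an adder to a full processor, can in principle be assembled from this single switch pattern.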
A more modern application of quantum principles to computation is undoubtedly quantum computing. Instead of classical bits (0 or 1), quantum computers use qubits, which can exist in superpositions of states and leverage entanglement and interference. This gives quantum computers the potential to solve certain problems much faster than conventional digital computers, such as factoring large numbers with the well-known Shor’s algorithm6,7, simulating quantum systems with variational quantum eigensolvers8, and solving optimization problems with the quantum approximate optimization algorithm (QAOA)9. Despite this potential, many practical challenges, such as controlling quantum errors and the difficulty of scaling up, have ignited a central debate in the community: whether, or when, quantum computers will surpass their classical digital counterparts. As part of the Focus, an Article by Stefan Woerner and colleagues demonstrates that QAOA with a parameter transfer approach can efficiently approximate Pareto fronts for complex problems, outperforming classical approaches in certain cases for multi-objective optimization. Vishwanathan Akshay and Mile Gu highlight in an accompanying News & Views that this study sharpens the debate on practical quantum advantage, as many real-world problems can be formulated as multi-objective optimization tasks, such as city network design, which requires balancing multiple factors, including infrastructure costs and shifting demand. Another example of potential quantum advantage is quantum machine learning (QML), which leverages quantum computers to enhance learning and inference. In a Comment, Dong-Ling Deng and colleagues discuss the challenges and opportunities associated with implementing QML in more practical applications, including vanishing gradients during training as system size scales (known as barren plateaus), and the high cost of encoding widely available classical datasets for QML.
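The notions of superposition and entanglement mentioned above can be made concrete with a few lines of linear algebra. The following sketch, purely illustrative and not tied to any of the featured papers, represents qubit states as vectors and gates as matrices: a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles it with a second qubit into a Bell state:

```python
import numpy as np

# Single-qubit basis states |0> and |1> as column vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

plus = H @ ket0
probs = np.abs(plus) ** 2  # Born rule: measurement probabilities
print(probs)               # equal chance of 0 and 1

# Two qubits: Hadamard on the first, then CNOT, yields the Bell state
# (|00> + |11>)/sqrt(2) -- the measurement outcomes are perfectly correlated.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
bell = CNOT @ np.kron(plus, ket0)
print(np.abs(bell) ** 2)   # only 00 and 11 occur, each with probability 0.5
```

Such state-vector simulation is how classical computers emulate small quantum circuits; its cost grows exponentially with the number of qubits, which is precisely why genuinely quantum hardware is of interest.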
It is worth emphasizing that one of the most substantial challenges in quantum computing is its susceptibility to errors: qubits are extremely fragile and prone to decoherence, which introduces errors during computation. Consequently, developing effective quantum error correction and mitigation methods with a low qubit overhead remains critical to fully unlock the potential of quantum computing. In a Perspective, Hengyun Zhou and colleagues underscore the importance of full-stack fault-tolerant design strategies, aiming to reduce overhead through the co-optimization of algorithms, error-correcting codes, and hardware architecture. They also note that effective decoding, which infers actual errors from error signals (also known as syndromes), can become particularly challenging for high-rate error-correcting codes. Correspondingly, in an Article, Yiqing Zhou, Eun-Ah Kim and colleagues propose a machine-learning (ML)-assisted quantum error decoder that efficiently decodes correlated errors in logical circuits, achieving high accuracy and competitive speed compared to traditional methods. As highlighted in an accompanying News & Views by Xiu-Hao Deng and Yuan Xu, the proposed decoder can handle both single and multiple gates within a unified framework, distinguishing it from existing ML-based decoders. Together, these advances chart a path toward practical, scalable quantum computing, signaling a future where quantum systems could tackle problems far beyond the reach of classical machines.
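The idea of decoding, inferring which error occurred from syndrome measurements alone, is easiest to see in the simplest code of all. The sketch below (a classical toy model, not any decoder from the Focus) simulates the three-qubit bit-flip repetition code: two parity checks yield a syndrome that pinpoints a single flipped bit without ever reading the encoded data directly:

```python
# Three-bit repetition code: logical 0 -> 000, logical 1 -> 111.
# Parity checks on pairs (0,1) and (1,2) form the syndrome.

def syndrome(bits):
    """Measure the two parity checks; the data itself is never read out."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

# Lookup-table decoder: each syndrome points to the bit most likely flipped.
DECODE = {(0, 0): None,  # no error detected
          (1, 0): 0,     # first bit flipped
          (1, 1): 1,     # middle bit flipped
          (0, 1): 2}     # last bit flipped

def correct(bits):
    """Apply the correction suggested by the syndrome."""
    flip = DECODE[syndrome(bits)]
    fixed = list(bits)
    if flip is not None:
        fixed[flip] ^= 1
    return tuple(fixed)

# Any single bit-flip on an encoded 0 (i.e., 000) is recovered.
for i in range(3):
    noisy = [0, 0, 0]
    noisy[i] ^= 1
    print(noisy, "->", correct(noisy))
```

Real quantum codes must additionally handle phase errors and correlated noise, which is why decoding high-rate codes, as the Perspective notes, is far harder than this lookup table suggests, and why ML-assisted decoders are an active research direction.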
Another notable achievement of quantum mechanics is its application to answering questions in chemistry and materials science. Central to this progress is density functional theory (DFT), which provides a quantum-mechanical framework for predicting material properties with noteworthy accuracy at an affordable computational cost. However, for more complex problems, including large-scale systems or systems with strong correlation effects, DFT struggles to maintain a good balance between accuracy and efficiency. In a Review, Yong Xu and colleagues discuss deep-learning electronic structure methods, such as deep-learning quantum Monte Carlo and deep-learning DFT, which overcome the long-standing accuracy–efficiency trade-off and extend first-principles simulations to unprecedented scales. Similarly, in a Perspective, Olexandr Isayev and colleagues highlight ML interatomic potentials trained on DFT results as an essential tool, addressing challenges of chemical accuracy, computational speed, interpretability, and generalizability through physics-informed architectures and foundation models trained on vast datasets. Complementing these efforts, a recent Article by Philipp Grohs and co-authors introduces transferable neural wavefunctions for solids, where a single optimized neural network can model diverse crystalline systems, drastically reducing computational costs. As Yubing Qian and Ji Chen note in their News & Views, the use of a single neural network to learn and represent electronic wavefunctions across diverse physical configurations represents a conceptual shift in quantum system simulation. The integration of ML and quantum chemistry methods promises to expand our simulation capability into more complex and larger chemical systems.
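The core workflow behind ML interatomic potentials, fitting a cheap surrogate model to expensive reference energies, can be sketched in miniature. In the illustrative example below (not any of the methods featured in the Focus), a Lennard-Jones curve stands in for costly first-principles calculations, and a linear fit in inverse-power features plays the role of a descriptor-based ML potential:

```python
import numpy as np

def reference_energy(r, eps=1.0, sigma=1.0):
    """Stand-in for an expensive first-principles energy
    at interatomic distance r (here a Lennard-Jones curve)."""
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

rs = np.linspace(0.95, 2.5, 200)    # training geometries
energies = reference_energy(rs)     # reference ("DFT") labels

# Linear regression in a fixed basis of inverse-power features --
# a crude analogue of descriptor-based ML potentials.
features = np.stack([rs ** -p for p in (6, 8, 10, 12)], axis=1)
coeffs, *_ = np.linalg.lstsq(features, energies, rcond=None)

# The fitted surrogate reproduces the reference energies at a
# fraction of the cost of recomputing them from first principles.
pred = features @ coeffs
rmse = np.sqrt(np.mean((pred - energies) ** 2))
print(f"RMSE of surrogate fit: {rmse:.2e}")
```

Real ML potentials replace the hand-picked features with learned, symmetry-aware descriptors and neural networks, and the reference data with large DFT datasets, but the train-once, evaluate-cheaply structure is the same.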
The advances highlighted in this Focus are not exhaustive but illustrate both the enduring impact of quantum mechanics and its expanding frontier, where ML, quantum algorithms, and quantum computing technologies converge to tackle real-world challenges. These developments signal a future in which quantum technologies will become indispensable tools for our society, potentially unlocking capabilities far beyond the reach of classical approaches.