Course introduction
Before beginning, please complete this short pre-course survey, which is important to help improve our content offerings and user experience.
Click below to hear a course introduction from Olivia Lanes, or open the video in a separate window on YouTube.
About this course
Welcome to Quantum Computing in Practice — a course that focuses on today's quantum computers and how to use them to their full potential. It covers realistic potential use cases for quantum computing as well as best practices for running and experimenting with quantum processors with 100 or more qubits.
Quantum utility
It's an exciting time for quantum computing. After many years of theoretical and experimental research and development, quantum computers are approaching a point at which they can begin to compete with classical computers and demonstrate utility.
Utility is not the same thing as quantum advantage, which refers to quantum computers outperforming classical computers for meaningful tasks. Classical computers have incredible power and adaptability, and the fact of the matter is that quantum computers simply aren't able to beat them yet. We've seen decades of advancements in classical computation — not only in computing hardware but also in algorithms for classical computers — and we can observe with clarity that the technology of electronic digital computing has radically changed our world.
Quantum computing, on the other hand, is at a different stage in its development. Quantum computing places extreme demands on our control of quantum mechanical systems and pushes the boundaries of today's technology — and we cannot realistically expect to master this new technology and beat classical computing right out of the starting gate. But we are seeing signs that suggest that quantum computers are starting to become competitive with classical computing methods for selected tasks, which is a natural step in the technological evolution of quantum computing known as quantum utility.
As the technology advances and new methods for quantum computing are developed, we can reasonably expect that its advantages will become increasingly pronounced — but this will take time. As this happens we're likely to see a back-and-forth interaction with classical computing: quantum computing demonstrations will be performed and classical computing will respond, quantum computing will take another turn, and the pattern will repeat. And one day, when a quantum computer's performance can't be matched classically, we'll hypothesize that we've seen a quantum advantage — but even then we won't know for sure! Proving impossibility results for classical computers is itself an impossibly difficult problem as far as we know.
Simulating Nature
Classical simulators — meaning computer programs running on classical computers that simulate physical systems — can make predictions about quantum mechanical systems. But classical simulators are not quantum and cannot directly emulate quantum systems. Instead they use mathematical calculations to approximate quantum behavior. As the sizes of the systems being simulated grow, the overhead required to do this increases dramatically, placing limits on which quantum systems can be simulated classically, how long the simulations take, and the accuracy of the results.
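To get a feel for this overhead, consider the most direct classical approach: storing the full quantum state as a vector of complex amplitudes. An n-qubit state has 2^n amplitudes, so the memory required doubles with every added qubit. Here is a minimal sketch (the function name and the 16-bytes-per-amplitude figure, corresponding to double-precision complex numbers, are illustrative assumptions):

```python
# Sketch: memory needed to store a full n-qubit statevector classically.
# Assumes one complex128 amplitude (16 bytes) per basis state; the number
# of amplitudes is 2**n, so memory grows exponentially with qubit count.

def statevector_bytes(num_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Bytes required to hold all 2**num_qubits complex amplitudes."""
    return (2 ** num_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
```

At 30 qubits this is 16 GiB, within reach of a laptop; at 50 qubits it is already 16 million GiB, far beyond any existing machine. More sophisticated classical methods (such as tensor networks) can do much better for certain circuits, but some form of rapidly growing cost is the general pattern.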
Quantum computers, on the other hand, can emulate quantum systems more directly — and as a result the overhead they require scales significantly better as the system size grows. This, in fact, was Richard Feynman's idea in the 1980s that first motivated an investigation into the potential of quantum computers. We'll have more to say about this later!
IBM® researchers published a paper in 2023 that showed, for the first time, that a quantum computer can compete with state-of-the-art classical techniques for simulating a particular physical model. Its results can still be matched by advanced techniques running on classical computers — but it bested brute-force algorithms, and it also offers a new data point to which different simulation methods (which are not exact and don't all agree in their predictions) can be compared.