IBM today announced a new strategy for implementing several “error mitigation” techniques designed to bring about the era of fault-tolerant quantum computing.
Up front: Anyone who still clings to the idea that quantum circuits are too noisy for useful computing is about to be disabused of that notion.
A decade ago, the idea of a working quantum computer system seemed far-fetched to most of us. Today, researchers around the world are connecting to IBM’s cloud-based quantum systems with such frequency that, according to IBM’s director of quantum infrastructure, some three billion quantum circuits are completed every day.
IBM and other companies are already using quantum technology to do things that either could not be done by classical binary computers or would require too much time or energy. But there is still much work to be done.
The dream is to create a usable, fault-tolerant quantum computer capable of clear quantum advantage – the point at which quantum processors are able to do things that classical processors simply cannot.
Background: Here at Neural, we have identified quantum computing as the key technology of 2022, and that is unlikely to change as the field continues to advance.
The bottom line is that quantum computing promises to lift our current computational limits. Instead of replacing the CPU or GPU, it adds the QPU (quantum processing unit) to our tool belt.
What this means depends on individual usage. Most of us don’t need quantum computers because our daily problems are not that difficult.
But for sectors like banking, energy, and security, the arrival of technology capable of solving problems too complex for current systems could represent a paradigm shift of a kind not seen since the advent of steam power.
If you can imagine a magic machine capable of increasing efficiency across numerous high-impact domains – one that could save time, money, and energy at a scale that could eventually affect every human being on Earth – then you understand why IBM and others are so excited about building QPUs that demonstrate quantum advantage.
The problem: It is, as you can imagine, very difficult to build hardware that can exploit quantum mechanics to perform calculations.
IBM has spent the past decade figuring out how to solve the fundamental problems plaguing the field, including the basic infrastructure, cooling, and power requirements needed to get started in the lab.
Today, IBM’s quantum roadmap shows how far the industry has come.
But to get where it’s going, the industry needs to solve one of the few remaining fundamental problems in developing useful quantum processors: they’re extremely noisy.
The solution: Noisy qubits are the current bane of the quantum computing engineer. Essentially, the more processing power you try to squeeze out of a quantum computer, the noisier its qubits get (qubits are the quantum analogue of classical bits).
Until now, most of the work on suppressing this noise has involved scaling up qubits so that the signal scientists are trying to read is strong enough to get through.
In the experimental stages, dealing with noisy qubits was largely a game of whack-a-mole. As scientists came up with new techniques, many of which were pioneered in IBM labs, they passed them on to researchers to apply.
But the field has since progressed considerably. The art of error mitigation has evolved from targeted one-off fixes into a full suite of techniques.
Today’s quantum hardware is subject to several sources of noise, the most significant of which are qubit decoherence, individual gate errors, and measurement errors. These errors limit the depth of the quantum circuits we can implement. But even for shallow circuits, noise can lead to erroneous estimates. Fortunately, quantum error mitigation provides a collection of tools and methods that let us evaluate accurate expectation values from noisy, shallow quantum circuits even before the introduction of fault tolerance.
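To make one of those tools concrete, consider measurement (readout) errors, the last noise source listed above. A standard way to mitigate them is to calibrate how often each prepared basis state is misread, assemble those rates into a confusion matrix, and invert it to correct the raw statistics. The single-qubit sketch below uses invented calibration numbers purely for illustration; it is not taken from IBM’s hardware or software.

```python
import numpy as np

# Assumed toy calibration: P(read 0 | prepared 0) = 0.97,
# P(read 1 | prepared 1) = 0.95 on a single qubit.
# Columns correspond to the prepared state, rows to the readout result.
confusion = np.array([[0.97, 0.05],
                      [0.03, 0.95]])

# Raw measured distribution from a circuit that ideally yields
# 0 and 1 with equal probability (readout errors have skewed it).
raw = np.array([0.51, 0.49])

# Invert the confusion matrix to recover the underlying distribution.
mitigated = np.linalg.solve(confusion, raw)
print(mitigated)  # recovers the true [0.5, 0.5] distribution
```

On many qubits the confusion matrix grows exponentially, so practical systems rely on scalable approximations of this idea rather than a full matrix inversion.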
In recent years, we have developed and implemented two common error mitigation methods called zero noise extrapolation (ZNE) and probabilistic error cancellation (PEC).
Both techniques involve extremely complex applications of quantum mechanics, but they essentially come down to finding ways to suppress the noise coming from quantum systems and/or amplify the signal scientists are trying to extract from quantum computations.
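To give a flavor of ZNE: the counterintuitive trick is to run the same circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The sketch below is a toy model, not IBM’s implementation; the exponential decay stands in for a real noisy device, and the noise factors, decay rate, and ideal value are all assumptions.

```python
import numpy as np

def measured_expectation(noise_factor, ideal=1.0, decay=0.15):
    """Toy noise model: the measured expectation value decays
    exponentially with the noise level. A stand-in for running the
    same circuit with stretched (noise-amplified) gates."""
    return ideal * np.exp(-decay * noise_factor)

# "Run" the circuit at 1x, 2x, and 3x the base noise level.
factors = np.array([1.0, 2.0, 3.0])
values = np.array([measured_expectation(c) for c in factors])

# Fit a curve to the noisy results and extrapolate to noise factor 0.
coeffs = np.polyfit(factors, values, deg=2)
zne_estimate = np.polyval(coeffs, 0.0)

print(f"noisy value at 1x: {values[0]:.4f}")
print(f"ZNE estimate:      {zne_estimate:.4f}")  # much closer to the ideal 1.0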
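PEC takes a different route: it expresses the inverse of the noise channel as a quasi-probability mixture of operations the hardware can actually run, samples circuits from that mixture while tracking the signs, and reweights the results. The single-qubit depolarizing-noise Monte Carlo below is a toy illustration under assumed parameters (the error rate `p` and the simulated measurement outcomes are invented), not production code.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.1                # assumed depolarizing error rate of the noisy gate
f = 1 - 4 * p / 3      # shrink factor: the noisy <Z> equals f * ideal <Z>
eta = (1 / f - 1) / 4  # quasi-probability weight of the inverse channel

# Quasi-probability decomposition of the inverse depolarizing channel:
# inverse = (1 + 3*eta) * Identity - eta * (X + Y + Z conjugations).
coeffs = np.array([1 + 3 * eta, -eta, -eta, -eta])  # [I, X, Y, Z]
gamma = np.abs(coeffs).sum()                        # sampling overhead
probs = np.abs(coeffs) / gamma
signs = np.sign(coeffs)
# Effect of each inserted Pauli on <Z>: I and Z leave it, X and Y flip it.
pauli_effect = np.array([1.0, -1.0, -1.0, 1.0])

shots = 200_000
idx = rng.choice(4, size=shots, p=probs)
# Simulate +/-1 measurements whose mean is f * pauli_effect (ideal <Z> = 1).
mean_z = f * pauli_effect[idx]
outcomes = np.where(rng.random(shots) < (1 + mean_z) / 2, 1.0, -1.0)

# Reweight each shot by gamma and its sampled sign.
pec_estimate = gamma * np.mean(signs[idx] * outcomes)
print(f"noisy <Z>:    {f:.4f}")
print(f"PEC estimate: {pec_estimate:.4f}")  # ~1.0, the ideal value
```

The price of PEC is the overhead gamma > 1: the estimator’s variance grows with gamma squared, so deeper and noisier circuits need many more shots to reach the same precision.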
Neural’s take: We spoke with IBM’s director of quantum infrastructure, Jerry Chow, who seemed quite excited about the new paradigm.
He explained that the techniques touted in the new press release are already in production. IBM has already shown massive improvements in its ability to scale solutions, iterate on cutting-edge results, and accelerate classical processes using quantum hardware.
The bottom line is that quantum computers exist, and they work. Currently it’s a bit hit-and-miss whether they can solve a specific problem better than classical systems, and the last remaining hard hurdle is fault tolerance.
IBM’s new “error mitigation” strategy signals a shift from the discovery phase of fault-tolerance solutions to the implementation phase.
Hats off to the IBM quantum research team. You can learn more on IBM’s official blog.