NISQ Computers – Can We Escape the Noise?
We all hear the term NISQ computer a lot in the quantum computing space these days. NISQ stands for Noisy Intermediate-Scale Quantum. The term was coined by John Preskill in 2018 and is generally used to describe the current, first generation of quantum processors.
In today’s market, NISQ processors range from around 20 to slightly over 100 qubits, the quantum analogue of the classical bit.
By design, NISQ computers are neither accurate enough to deliver error-free results nor scalable enough to solve today’s real-world business problems, largely because they still lack the quantum error correction those workloads require.
We all know that these early quantum computers aren’t yet capable of producing the production-scale power and value that mature quantum computers will one day deliver. They suffer from significant scalability and fault-tolerance issues, both of which trace directly to the need for quantum error correction.
What is Quantum Error Correction?
As you can imagine, quantum processing is very different from classical computing. Classical computing enjoys the stability of discrete binary states: a bit is either on or off.
Quantum information processing adds the complexity of superposition, which produces non-discrete states, along with many additional sources of noise. Superposition states must be preserved, which means preventing quantum decoherence caused by noise in the environment.
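To make decoherence concrete, here is a minimal sketch in plain Python with NumPy (illustrative only, not tied to any particular quantum SDK or hardware; the per-step noise strength is an assumed value, not a calibrated one). It models a qubit in an equal superposition and shows how repeated dephasing noise erodes the coherence that makes the superposition useful:

```python
import numpy as np

# A qubit in equal superposition: |+> = (|0> + |1>) / sqrt(2)
plus = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())  # density matrix of the pure state

# Dephasing (phase-flip) channel: with probability p, apply Z.
# The off-diagonal terms of rho carry the superposition's coherence.
Z = np.array([[1, 0], [0, -1]])

def dephase(rho, p):
    """One step of environmental dephasing noise."""
    return (1 - p) * rho + p * (Z @ rho @ Z)

p = 0.05  # assumed noise strength, for illustration only
for step in range(1, 6):
    rho = dephase(rho, p)
    print(f"step {step}: coherence |rho01| = {abs(rho[0, 1]):.3f}")
```

The diagonal entries of the density matrix, the classical 0/1 probabilities, never change; only the off-diagonal coherence decays toward zero. That coherence is exactly the quantum information a NISQ machine must fight to keep.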
NISQ computers are designed on the premise that noise is the enemy and must be suppressed to prevent unwanted changes in the superposed quantum states. The goal of current quantum error correction is therefore to preserve the superposition states of the encoded qubits.
Quantum Error Correction and NISQ Computers
The term ‘noisy’ reflects the extreme sensitivity of these NISQ computers to their surrounding environment. Noise is part of any quantum system, even in nature, which makes it difficult to shield a quantum computer from external disturbances.
For example, a microwave oven operating across the street can disturb the quantum states, destroying them through decoherence. This is the primary source of errors, and it drives the need for error-correction methods in NISQ machines.
Quantum Error Correction (QEC) encompasses codes and algorithms specifically designed to detect and correct errors in quantum computers. Early QEC schemes distribute, or encode, a single logical qubit across multiple physical qubits.
This means the quantum information that would otherwise live in one qubit is shared across “supporting” qubits. The goal of QEC is to protect the original quantum information from errors while the quantum system processes it.
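As an illustration of that encoding idea, here is a minimal sketch of the textbook three-qubit bit-flip code, again in plain Python with NumPy rather than a quantum SDK. It is deliberately simplified: it corrects only a single bit-flip error, and it reads the syndrome directly from the simulated statevector, whereas real hardware extracts it with ancilla qubits and measurements:

```python
import numpy as np

# Encode one logical qubit a|0> + b|1> as a|000> + b|111>.
a, b = 0.6, 0.8  # arbitrary normalized amplitudes (a^2 + b^2 = 1)

state = np.zeros(8, dtype=complex)
state[0b000] = a  # amplitude of |000>
state[0b111] = b  # amplitude of |111>

def flip(state, qubit):
    """Apply a bit-flip (X) error to one of the three physical qubits."""
    out = np.zeros_like(state)
    for basis in range(8):
        out[basis ^ (1 << qubit)] = state[basis]
    return out

# A noise event flips physical qubit 1.
state = flip(state, 1)

def parity(state, q1, q2):
    """Expectation of Z_q1 Z_q2; -1 means the pair of qubits disagrees.
    (Real devices measure this via ancillas; we read the statevector.)"""
    signs = [(-1) ** (((basis >> q1) ^ (basis >> q2)) & 1) for basis in range(8)]
    return sum(s * abs(amp) ** 2 for s, amp in zip(signs, state))

# The two parity checks pinpoint which qubit flipped.
s01, s12 = parity(state, 0, 1), parity(state, 1, 2)
bad = {(-1, -1): 1, (-1, 1): 0, (1, -1): 2}.get((round(s01), round(s12)))
if bad is not None:
    state = flip(state, bad)  # undo the identified flip

print(abs(state[0b000]), abs(state[0b111]))  # 0.6, 0.8 recovered
```

The key point: the parity checks reveal which qubit flipped without ever measuring the encoded amplitudes themselves, so the original quantum information survives intact.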
The challenge is that QEC carries a significant cost in the number of qubits needed to implement it. And the more noise in the hardware, the more qubits you need.
The required number of error-correcting qubits depends on the specific hardware architecture and the type of algorithm or computation being run. Estimates vary, but current figures are around 1,000 physical error-correcting qubits for each single logical, computational qubit.
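A quick back-of-the-envelope calculation, taking the rough 1,000:1 figure above at face value (the true ratio depends heavily on hardware error rates and the chosen code), shows how fast the overhead grows:

```python
# Overhead sketch using the article's rough ~1,000:1 estimate.
physical_per_logical = 1_000

for logical_qubits in (1, 50, 100):
    total = logical_qubits * physical_per_logical
    print(f"{logical_qubits:>4} logical qubits -> ~{total:,} physical qubits")
# Even a modest 100-logical-qubit computation implies ~100,000 physical
# qubits, orders of magnitude beyond today's NISQ machines.
```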
Since today’s NISQ computers scale to no more than roughly 500 qubits in a quantum annealing machine and 127 in a gate-model machine, full quantum error correction is out of reach for current NISQ systems. We need far more sophisticated and scalable machines to deliver this critical capability.
A Glimpse of the Future
We know that noise, decoherence, and the errors they produce will limit the scale of the machines that can be built.
One piece of good news is that users and vendors have begun to identify new problems that can potentially be solved with noisy intermediate-scale quantum computers.
Another opportunity is the use of algorithms designed to minimize the size and complexity of quantum computations, which reduces the scalability requirements for NISQ computers.
Our QAmplify software is an example of such algorithms. QAmplify has demonstrated the ability to amplify the capabilities of gate-model machines by 5x, and of quantum annealing machines by up to 20x, on real-world optimization problems.
Another shift in the market is the advent of entirely new approaches to quantum computing.
QCi recently announced landmark results in scale and precision using an entropy quantum computer, solving an autonomous-vehicle problem with more than 3,800 variables and 500 constraints for the BMW Group.
This represents the largest problem solved to date on a quantum computer, and also shows the opportunity for innovative approaches beyond NISQ to fuel the acceleration of quantum computing adoption in the marketplace.
The Bottom Line
Quantum computing will change our world. The only question is when, not if.
Today’s NISQ computers represent the initial shift from classical to pure quantum computing. It’s obvious to many that additional innovation and discovery are needed to realize the full value of quantum, moving beyond science experiments to real-world value.
A major challenge for quantum computing is noise, the primary source of processing errors. NISQ computers are, by name and nature, noisy.
Will NISQ computers get that innovation from QEC? Or will new quantum approaches prove the most effective path to quantum value?
The subjective answers depend on who you talk to, and how their experience shapes their perceptions.
The real answer comes in the ability to solve problems that deliver real-world value, in the most effective way. That quantitative evidence can only come from real-world applications and computations, solved by a quantum computer with precision, speed, and a hefty dose of elegance.