IBM sees technology close to breakthrough
Four years ago, the Google Quantum AI team announced that their quantum computer could outperform classical computers – but only on a specific task with no practical applications. More recently, Google announced that it had made a breakthrough in error correction for quantum computers.
Now competitor IBM is going a step further and claims that in two years' time, quantum computers will be able to outperform ordinary computers in useful tasks such as calculating material properties or the interactions between elementary particles. In an experiment described in the journal Nature, the company says it has already succeeded.
In the proof-of-principle experiment, the researchers simulated the behavior of a magnetic material on the IBM Eagle quantum processor. In doing so, they managed to circumvent the biggest obstacle to the success of this technology, quantum noise, and obtain reliable results.
Simulating materials is a task that classical computers cannot handle efficiently, but in today's quantum systems some noise is unavoidable. This noise, caused by the sensitivity of quantum bits, or qubits, to environmental influences, introduces a variety of errors that degrade performance.
The researchers decided to reduce the errors instead of correcting them. These novel, error-reducing techniques allowed the team in the new experiment to perform quantum calculations "on a scale where classical computers struggle," says Katie Pizzolato, head of IBM's quantum theory group in Yorktown Heights, New York.
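The article does not spell out the technique, but a widely used error-mitigation strategy of this kind is zero-noise extrapolation: run the circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back to zero noise. The following is a minimal classical sketch of that idea, with a hypothetical exponential noise model standing in for real hardware.

```python
import numpy as np

# Toy model: the "true" expectation value an ideal, noiseless circuit would return.
TRUE_VALUE = 0.8

def noisy_expectation(noise_scale, rng):
    # Hypothetical noisy measurement: depolarizing-style noise damps the
    # signal exponentially with the noise level, plus a little shot noise.
    damped = TRUE_VALUE * np.exp(-0.5 * noise_scale)
    return damped + rng.normal(0.0, 0.005)

def zero_noise_extrapolate(scales, values, degree=2):
    # Fit a polynomial to measurements taken at amplified noise levels
    # and evaluate it at noise_scale = 0 to estimate the noiseless result.
    coeffs = np.polyfit(scales, values, degree)
    return np.polyval(coeffs, 0.0)

rng = np.random.default_rng(7)
scales = np.array([1.0, 1.5, 2.0, 2.5, 3.0])  # 1.0 = the device's native noise
values = [noisy_expectation(s, rng) for s in scales]
estimate = zero_noise_extrapolate(scales, values)
print(f"raw (scale 1): {values[0]:.3f}, mitigated: {estimate:.3f}")
```

The mitigated estimate lands noticeably closer to the true value than the raw measurement does, at the cost of running the circuit several extra times.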
The research team used IBM’s Eagle quantum processor, which contains 127 superconducting qubits on a chip, to create complex entangled states. These states were used to simulate the dynamics of spins in a material model and to predict precise properties such as magnetization.
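To give a feel for what "simulating the dynamics of spins" means, here is a minimal classical sketch of a transverse-field Ising chain, a standard model of magnetic materials. The system size, couplings, and initial state are illustrative choices, and at three qubits this is far below the 127-qubit scale where the classical approach breaks down.

```python
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(site_op, site, n):
    # Embed a single-site operator into the n-qubit Hilbert space.
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, site_op if i == site else I2)
    return out

n = 3                # toy size; IBM's Eagle processor has 127 qubits
J, h = 1.0, 0.6      # coupling and transverse-field strengths (arbitrary)

# Transverse-field Ising Hamiltonian: H = -J * sum Z_i Z_{i+1} - h * sum X_i
H = np.zeros((2**n, 2**n), dtype=complex)
for i in range(n - 1):
    H -= J * op_on(Z, i, n) @ op_on(Z, i + 1, n)
for i in range(n):
    H -= h * op_on(X, i, n)

# Evolve the all-spins-up state and track the average magnetization <Mz>.
psi0 = np.zeros(2**n, dtype=complex)
psi0[0] = 1.0
evals, evecs = np.linalg.eigh(H)
Mz = sum(op_on(Z, i, n) for i in range(n)) / n

for t in [0.0, 0.5, 1.0]:
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    psi = U @ psi0
    m = np.real(psi.conj() @ Mz @ psi)
    print(f"t = {t:.1f}: <Mz> = {m:+.3f}")
```

The state vector here has 2^3 = 8 complex amplitudes; at 127 qubits it would have 2^127, which is why exact classical simulation at that scale is out of reach.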
IBM's breakthrough shows for the first time that quantum computers with more than 100 qubits can deliver accurate results that outperform leading classical approaches. Classical computers based on silicon chips use bits that can only take on one of two values: 0 or 1. In contrast, quantum computers use quantum bits, or qubits, which can occupy superpositions of many states at the same time.
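The bit-versus-qubit distinction can be sketched in a few lines: a qubit state is a normalized pair of complex amplitudes, and the amplitudes squared give the measurement probabilities. The Hadamard gate used below is the standard way to put a qubit into an equal superposition.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit state is a normalized 2-vector
# of complex amplitudes; |a|^2 and |b|^2 are the measurement probabilities.
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: maps the basis state |0> to an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ zero
probs = np.abs(plus) ** 2
print(probs)  # [0.5, 0.5]: equal chance of measuring 0 or 1

# n qubits require 2^n amplitudes, so the classical description
# grows exponentially with every qubit added.
n = 20
print(f"amplitudes needed for {n} qubits: {2**n:,}")
```

At 127 qubits the state vector would need 2^127 amplitudes, more numbers than any classical memory can hold, which is the source of both the simulation difficulty and the potential advantage.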
“It’s optimistic that this can work in other systems and with more complex algorithms,” John Martinis, a physicist at the University of California, Santa Barbara who led the Google team to their 2019 milestone, told Nature News.