In the history of AI, two notable moments stand out: the success of AlphaGo and the arrival of ChatGPT. These two milestones serve as instructive events for analyzing how far quantum computing is from being truly useful.
AlphaGo’s 2016 triumph over world Go champion Lee Sedol marked a turning point in the development of AI. More than a victory in a complex board game, it demonstrated that AI could generate superior results in a specific, albeit complex, task. It gave us insight into truly useful, task-specific AI applications, such as the protein folding tool AlphaFold, which was introduced two years later.
The ChatGPT moment was different. Launched in 2022, ChatGPT showcased AI’s ability to understand context and generate coherent responses, making it immediately useful for a wide range of applications. Whereas AlphaGo was narrowly focused on a single task, ChatGPT has broad applicability.
AI took many years to develop. The birth of the technology is commonly traced to a conference at Dartmouth College in 1956, 60 years before AlphaGo. How many years will we have to wait for the AlphaGo and ChatGPT moments of quantum computing, which is itself an exciting technology that promises to solve many problems that are intractable for its classical counterpart?
What is quantum advantage?
The term quantum advantage, also called quantum supremacy, refers to a theoretical point at which a quantum computer is able to accurately solve a computational problem that no classical computer could solve in a reasonable period of time. Achieving this milestone will broadly indicate that quantum computing has surpassed its classical counterpart.
Where is quantum computing today?
Perhaps we should first ask ourselves how far we have come in recent years. Most quantum computer users access a computer through a public cloud, such as those from IBM, Amazon, or Microsoft. When looking at quantum specifications, the number of qubits (quantum bits) is an important parameter to compare different providers. Five years ago, the largest publicly accessible quantum computer had 20 qubits and only two or three vendors made their computers accessible.
Today, however, we have made significant progress. The largest accessible quantum computers have hundreds of qubits, and more than a dozen vendors offer public access. In the past five years, several quantum computing companies have gone public, and venture capital firms have invested approximately $5 billion in quantum computing companies. Quantum computing is progressing much faster than AI did in its infancy.
A major milestone was achieved in 2019, when Google published what became known as the “quantum supremacy” experiment. Its Sycamore computer completed a task in 200 seconds that Google said would have taken the most powerful classical computer 10,000 years. Does that qualify as quantum computing’s AlphaGo moment? Probably not. Google’s demo used an algorithm designed specifically to show off its computer’s capabilities, but the algorithm had no practical computational use.
When will quantum computing break through?
What would be the AlphaGo moment in quantum computing? It would likely be a breakthrough in which quantum computers solve a complex but specific problem that goes far beyond the capabilities of classical computers.
By these criteria, it seems that we are getting closer. The energy company Aramco recently revealed that it is putting a quantum algorithm into production, moving it out of the “sandbox” test environment where most quantum computing projects operate. The algorithm helps decode subsurface imaging signals, a form of so-called “Earth ultrasound” used to discover minerals.
Likewise, Deloitte Consulting recently reported that a quantum machine learning algorithm, using a technique called quantum reservoir computing, produced results superior to those of classical machine learning algorithms operating on the same data. IBM and UC Berkeley published recent experiments on the 127-qubit IBM Quantum Eagle processor that demonstrated accurate results in complex physics simulations, outperforming classical approximation methods in certain scenarios. Quantinuum reported first signs of quantum advantage for Monte Carlo simulations.
While these events likely don’t qualify as true quantum advantage, they are harbingers of things to come. Classical computers cannot simulate more than about 50 qubits, and with computers of more than 100 qubits increasingly available, achieving a demonstrable quantum advantage in a useful algorithm appears to be a question of when, not if.
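The roughly 50-qubit limit follows directly from memory: simulating an n-qubit system classically requires storing 2^n complex amplitudes. A minimal back-of-the-envelope sketch, assuming 16 bytes per amplitude (a double-precision complex number):

```python
def statevector_bytes(n_qubits: int) -> int:
    """Memory needed to hold a full n-qubit statevector classically."""
    # 2**n amplitudes, each a 16-byte double-precision complex number
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits fit in a laptop (16 GiB); 50 qubits need roughly
# 17 million GiB (~16 PiB), beyond any existing machine's memory.
```

Every additional qubit doubles the memory requirement, which is why the simulation frontier sits in the mid-50s even for the largest supercomputers.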
Where will quantum computing have an impact?
It is unclear which field would benefit first from such a quantum advantage. Some vendors are very optimistic about the value of quantum computing in machine learning and AI, while others focus on materials and pharmaceutical advances or on financial and supply chain optimizations. Microsoft published a comprehensive assessment of its view on which applications will achieve quantum advantage first, as well as the quantum resources required. This gives interested customers the ability to estimate when they might be able to use quantum computers to their advantage.
For example, a financial services company can experiment with quantum computers to optimize an asset portfolio made up of 10 assets. However, such a company might decide that quantum technology would be really useful once a portfolio of 500 assets can be efficiently optimized. Resource estimation work provides a roadmap for when this will be possible.
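The jump from 10 to 500 assets is not a 50x increase in difficulty, because the number of candidate portfolios grows combinatorially. A hypothetical illustration (the asset counts come from the example above; modeling the problem as choosing k assets out of n is an assumption made purely for illustration):

```python
import math

def candidate_portfolios(n_assets: int, k_selected: int) -> int:
    """Number of ways to pick k assets from n (binomial coefficient)."""
    return math.comb(n_assets, k_selected)

print(candidate_portfolios(10, 5))    # 252 candidates: trivial to enumerate
print(candidate_portfolios(500, 50))  # more than 10**60 candidates:
                                      # classically infeasible to enumerate
```

This kind of scaling is what makes resource-estimation roadmaps useful: they tell a customer at what problem size a quantum approach could plausibly beat exhaustive or heuristic classical search.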
What will quantum computing’s ChatGPT moment be?
Quantum computing’s ChatGPT moment, on the other hand, is further away. It could be a general-purpose quantum computer that can solve many problems beyond those that classical machines can simulate. It would not be designed for a single problem but could instead solve several classes of problems. For example, a quantum computer useful for general-purpose optimization could be used to optimize production schedules, package delivery routes, container loading, investment portfolios, or the placement of electric vehicle charging stations for optimal coverage.
The main challenge in reaching that moment is scaling quantum systems while minimizing errors. This problem has two dimensions. The first is increasing the number of qubits beyond the classical simulation limit; the second is creating conditions that allow long calculations to run without accumulating excessive errors or losing coherence, a critical part of the system’s quantum properties. Indeed, quantum error correction is a key focus of multiple industry and academic groups, all of which share the view that sustaining long calculations is critical to making quantum systems truly usable.
For context, classical computers have negligible error rates; perhaps only one operation in a trillion goes wrong. By contrast, current quantum computers have much worse error rates: today, even a 1 percent error rate (one error in every hundred operations) is considered quite good. But if a computer makes one error per hundred operations, an algorithm that requires even a hundred operations will more often than not produce an incorrect result. For quantum computers to be useful, error rates must therefore be dramatically reduced so that longer, more complex calculations can be performed with confidence.
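The arithmetic behind that comparison is simple. Assuming errors strike each operation independently, the probability that an n-operation algorithm completes without a single error is (1 − p)^n:

```python
def success_probability(error_rate: float, n_ops: int) -> float:
    # Probability that every operation succeeds, assuming independent errors
    return (1.0 - error_rate) ** n_ops

# At a 1% error rate, a 100-operation algorithm runs error-free
# only about 37% of the time
print(success_probability(0.01, 100))

# At a classical-style error rate, the same calculation is essentially exact
print(success_probability(1e-12, 100))
```

The exponent is the real problem: useful quantum algorithms need thousands or millions of operations, at which point even a 0.1% per-operation error rate drives the success probability toward zero, hence the focus on error correction.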
The future is quantum
Although current quantum computers have yet to demonstrate practical applications that significantly surpass classical ones, the pace of innovation and investment in quantum computing suggests that a breakthrough may not be far away. Collaborations between academia, industry, and government entities are driving the field forward, raising the possibility that we could witness the quantum computing equivalent of the AlphaGo or ChatGPT moment within the next decade.