Analysis In the early 1940s, the first vacuum tube computers were built to solve problems beyond the reach of their human counterparts. These huge machines were complex, specialized, and generally unreliable.
In many ways, today’s quantum systems bear striking similarities to early vacuum tube computers in that they are also incredibly expensive, specialized, and unintuitive.
Later computers like the UNIVAC I in 1951 or the IBM 701 presented the possibility of a competitive advantage for the few companies with the budgets and expertise to implement, program and maintain such beasts. According to Matthew Brisse, an analyst at Gartner, a similar phenomenon is occurring today with quantum systems, as companies seek to gain efficiencies by any means necessary.
The issue of quantum supremacy – that is, scenarios in which quantum systems outperform classical computers – is a topic of ongoing debate. But Brisse emphasizes that “there is not a single thing that quantum technology can do today that cannot be done classically.”
However, he notes that by combining classical and quantum computing, some early adopters (particularly in the financial and banking industry) have been able to achieve some sort of advantage over classical computing alone. It’s not always clear whether these advantages rise to the level of a competitive advantage, but they do contribute to the fear that those who don’t invest early risk missing out.
Quantum FOMO is real
If history repeats itself, as it often does, it will be the early adopters of quantum systems who gain the most, hence the FOMO. But is that fear well founded?
Governments, for example, have invested a significant amount in the possibility of quantum technology materializing as a true competitive threat without yet having defined the “killer application” for quantum technology. Earlier this year, the Defense Advanced Research Projects Agency, better known as DARPA, launched the Underexplored Systems for Utility-Scale Quantum Computing (US2QC) initiative to accelerate the development and application of quantum systems. The idea behind this is that if a quantum system becomes capable of cracking modern encryption like Colossus did with German ciphers all those years ago, Uncle Sam doesn’t want to be left trying to catch up.
It’s still open to debate whether quantum systems that crack encryption are something we really need to worry about, but the same logic applies to businesses, especially those looking to gain an advantage over their competitors in the medium to long term.
“It’s not about what you can achieve today. It’s about preparing for the innovations that will come next,” according to Brisse. “We are out of the laboratory and now we are thinking about commercialization.”
That’s why companies like Toyota, Hyundai, BBVA, BASF, ExxonMobil, and others have partnered with quantum computing providers in case the technology can help develop better batteries, optimize routes and logistics, and/or reduce investment risk.
But while the commercialization of quantum computing may be underway, recent developments around generative AI may end up hindering adoption of the technology, at least in the short term.
Brisse notes that most CIOs look to invest in technologies with relatively quick returns on investment. With GPUs and other accelerators used to power AI models, they can expect short-term results, while quantum computing remains a long-term bet.
Still, Brisse says he hasn’t seen companies abandon their quantum investments, though he has certainly seen a shift in priority toward generative AI.
Quantum comparisons are a bit complicated
To make matters worse, for those trying to comparison-shop quantum systems, the market can be something of a minefield.
There are dozens of providers claiming to offer quantum services on systems ranging from a few dozen qubits to thousands of them. While qubit count may seem like an obvious metric for judging the maturity and performance of a quantum system, real-world performance actually depends on a number of factors, including decoherence and the quality of the qubits themselves.
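As a toy illustration of why raw qubit count alone misleads, consider a simplified quantum-volume-style heuristic. The model below, its 0.5 success threshold, and the example error rates are illustrative assumptions, not any vendor's benchmark: it just shows how per-operation error limits the largest useful "square" circuit a machine can run.

```python
def effective_qubits(num_qubits: int, error_rate: float) -> int:
    """Return the largest n <= num_qubits such that an n-qubit, n-layer
    circuit still finishes with a roughly useful success probability.

    Toy model: success probability of an n x n circuit is approximated
    as (1 - error_rate) ** (n * n), i.e. every one of the n*n operations
    must avoid an error.
    """
    best = 0
    for n in range(1, num_qubits + 1):
        success = (1.0 - error_rate) ** (n * n)
        if success >= 0.5:  # arbitrary "usable" cutoff for this sketch
            best = n
    return best

# Many noisy qubits vs. fewer, cleaner ones (error rates are made up):
print(effective_qubits(1000, 0.01))   # 1,000 qubits at 1% error  -> 8
print(effective_qubits(50, 0.001))    # 50 qubits at 0.1% error   -> 26
```

Under this crude model, a 50-qubit machine with ten-times-better fidelity runs usefully deeper circuits than a 1,000-qubit machine with noisier qubits, which is the gist of why spec-sheet qubit counts alone tell you little.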
Compare this to the “core wars” in modern processors. The cores in an Intel CPU have very different performance characteristics from the CUDA cores in an Nvidia GPU. Depending on what you’re doing, a job that runs well on a handful of Intel cores might require thousands of CUDA cores, if it runs at all.
The same goes for quantum systems, which are often optimized for specific workloads. For example, Brisse argues that an IBM system might perform better for computational chemistry, while D-Wave systems might be better suited to optimization tasks like route planning. “Different quantum systems solve quantum problems differently,” he explained.
The high cost and often exotic operating conditions, such as temperatures close to absolute zero, mean that many quantum systems to this point have been rented in the cloud “as a service.” However, some vendors, such as IonQ, have recently teased rack-mounted quantum systems that can be deployed in enterprise datacenters. It remains to be seen when these machines will actually ship.
That said, Brisse still doesn’t see many benefits to on-premises deployments beyond latency-sensitive applications. He expects most on-prem deployments to focus on scientific research, likely alongside high-performance computing installations.
We’ve already seen this to some extent with the European supercomputer Lumi, which received a small quantum computing upgrade last fall.
For Brisse, this research remains important for taking quantum computing beyond merely classical problems.
“Today, we are only solving classical problems in a quantum way, but the real innovation… will come when we solve quantum problems in a quantum way with quantum algorithms,” he opined. “I think that’s the big ‘aha’ in quantum: not whether we can go faster, but whether we can actually solve new kinds of problems.” ®