AI needs quantum more than you think

As AI models grow larger and classical infrastructure reaches its limits, the spotlight is turning to quantum computing. With hybrid architectures emerging and new partnerships forming between the AI and quantum sectors, the question facing enterprises is no longer whether to invest but how quickly they can prepare. A new report from IQM Quantum Computers sets out why that future is closer than many realise.

It is no longer helpful to talk about quantum computing as a distant horizon. As highlighted in the State of Quantum 2025 report from IQM Quantum Computers, the sector has passed a critical threshold. According to Jan Goetz, Co-Founder and co-CEO of IQM, “When we published this report last year, people still talked about a ‘quantum winter’ as investments in technology were slowing down. However, since then, many things have changed. We saw large investment rounds, major announcements about tech milestones, and the sales and deployment of quantum computers globally.”

This acceleration is not limited to laboratories or venture capitalists. Industries are now actively shaping use cases and building systems to exploit quantum’s strengths. What is emerging is not a revolution deferred but one that is already unfolding in policy frameworks, enterprise pilots, and infrastructure investment. Goetz calls it “an inflexion point not only for researchers and developers but for entire industries poised to be reshaped by this transformative paradigm.”

Part of what makes this shift so urgent is that AI’s demands are outstripping the capabilities of conventional hardware. The training of foundation models, the scaling of simulation environments, and the growing appetite for real-time inference have all driven up energy and compute requirements. In parallel, classical architecture is being forced into increasingly inefficient compromises. Quantum does not replace classical systems, but it offers a new axis of scale, one that deals not just with volume but with complexity.

The quantum-AI connection

The relationship between quantum computing and artificial intelligence is no longer theoretical; it is becoming operational. The two disciplines are starting to accelerate one another, each enabling new capabilities in the other. “AI is catalysing quantum development,” Goetz explains, “auto-generating circuits, optimising pulse sequences, and even proposing novel error-correction techniques. On the other hand, future fault-tolerant quantum processors may become critical enablers for AI, offering efficient pathways for training large models and solving combinatorial problems.”
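
To give the first direction a concrete flavour, the sketch below uses a derivative-free search, a stand-in for the AI optimisers Goetz describes, to tune a piecewise-constant control pulse until it implements a target X gate. The pulse model, the fidelity measure, and the hill-climbing loop are illustrative assumptions, not IQM's calibration stack.

```python
import numpy as np

# Toy model of AI-assisted pulse optimisation: tune a piecewise-constant
# X-axis drive until it implements a target X gate. The pulse model, the
# fidelity measure, and the search loop are all illustrative assumptions,
# not any vendor's calibration pipeline.

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def pulse_unitary(amps, dt=0.1):
    """Compose the unitary generated by segment amplitudes `amps` under an X drive."""
    U = I2
    for a in amps:
        theta = a * dt  # rotation angle contributed by this segment
        U = (np.cos(theta) * I2 - 1j * np.sin(theta) * X) @ U
    return U

def fidelity(U, target):
    """Phase-insensitive single-qubit gate fidelity |Tr(target^dagger U)| / 2."""
    return abs(np.trace(target.conj().T @ U)) / 2

rng = np.random.default_rng(0)
amps = rng.normal(size=8)  # random initial 8-segment pulse
best = fidelity(pulse_unitary(amps), X)

# Derivative-free hill climbing stands in for the "AI" optimiser.
for _ in range(2000):
    trial = amps + rng.normal(scale=0.05, size=amps.shape)
    f = fidelity(pulse_unitary(trial), X)
    if f > best:
        amps, best = trial, f

print(f"pulse fidelity after search: {best:.4f}")
```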

This dynamic is most visible in hybrid systems, where quantum processors are embedded into high-performance computing (HPC) clusters. Institutions such as EuroHPC, RIKEN, and Oak Ridge National Laboratory are already integrating quantum systems into their broader computing stacks. The aim is not to replace classical computing but to extend its capabilities by incorporating quantum computing as an additional tool in the architecture. “Low-latency interconnects, job scheduling systems and unified APIs are under development to ensure both quantum and classical resources are used optimally,” Goetz notes.
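
The control flow of such a hybrid system is easier to grasp in miniature. The sketch below shows the common pattern: a classical optimiser in the HPC layer repeatedly dispatches a parameterised circuit to a quantum resource and consumes the measured expectation value. The "backend" here is a two-line statevector stub standing in for a real QPU call; the parameter-shift gradient rule is standard, but everything else is a toy assumption.

```python
import numpy as np

# Hedged sketch of the hybrid pattern described above: a classical optimiser
# dispatches a parameterised circuit to a quantum resource and consumes the
# measured expectation value. The "backend" is a statevector stub, not a
# real QPU API or job scheduler.

def quantum_backend(theta):
    """Stand-in for a QPU call: prepare Ry(theta)|0> and return <Z>."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2  # <Z> = cos(theta)

def classical_optimiser(objective, theta=0.1, lr=0.2, steps=50):
    """Gradient descent using the standard parameter-shift rule for the gradient."""
    for _ in range(steps):
        grad = (objective(theta + np.pi / 2) - objective(theta - np.pi / 2)) / 2
        theta -= lr * grad
    return theta

# Minimise <Z>; the optimum is theta = pi, i.e. the |1> state with <Z> = -1.
theta_opt = classical_optimiser(quantum_backend)
print(f"theta = {theta_opt:.3f}, <Z> = {quantum_backend(theta_opt):.3f}")
```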

These early examples point to a broader principle. The future of AI will not be built on any single technology. It will be built on layered, interoperable systems where classical infrastructure, accelerated hardware, and quantum resources work in concert. The risk is not that organisations adopt quantum too early. The real risk is that they fail to prepare for its integration until it is too late.

From qubits to capability

Technical benchmarks are also evolving. Until recently, progress in quantum computing was judged mainly by qubit count, on the assumption that a higher number meant a better system. But as the ecosystem matures, attention is shifting toward more meaningful metrics of computational utility. “A composite measure that accounts for qubit number, coherence, gate fidelity, and circuit depth is needed for benchmarking system performance,” Goetz says.
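
To make the idea concrete, here is one hypothetical way such a composite score could be assembled: usable circuit depth is bounded both by coherence time and by accumulated gate error, and capability is width times that depth. The formula and the two machine profiles are invented for illustration; this is not IQM's metric or any established benchmark.

```python
# Hypothetical composite score in the spirit of Goetz's description: usable
# circuit depth is limited both by coherence and by accumulated gate error,
# and capability is width times surviving depth. All numbers are invented.

def composite_score(n_qubits, t_coherence_us, gate_time_us, gate_fidelity):
    """Rough 'usable circuit volume': qubits x depth executable before noise dominates."""
    depth_coherence = t_coherence_us / gate_time_us   # layers before decoherence
    error_per_layer = n_qubits * (1.0 - gate_fidelity)
    depth_fidelity = 1.0 / error_per_layer if error_per_layer > 0 else float("inf")
    return n_qubits * min(depth_coherence, depth_fidelity)

# Many noisy qubits versus fewer high-quality ones (hypothetical machines):
print(composite_score(n_qubits=100, t_coherence_us=50, gate_time_us=0.5, gate_fidelity=0.995))   # 200.0
print(composite_score(n_qubits=20, t_coherence_us=200, gate_time_us=0.1, gate_fidelity=0.9999))  # 10000.0
```

On these invented numbers, the 20-qubit machine scores fifty times higher than the 100-qubit one, which is exactly why raw qubit count is losing its status as the headline metric.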

This redefinition of performance is significant for AI applications. In most enterprise use cases, raw speed is less valuable than reliable, scalable throughput across complex problems. Optimisation, simulation, and machine learning workflows demand systems that can perform predictably across varied conditions. As Goetz puts it, “Researchers and practitioners alike are asking for a metric that truly matters in usable computational power.”

In this context, quantum’s most significant near-term impact is likely to appear in “small-data, high-complexity” problems. These are problems where the volume of data is manageable but the solution space is vast: an ideal environment for quantum algorithms. “Chemistry, materials modelling, financial optimisation, and aerodynamic simulation are consistently ranked as high-priority domains,” Goetz notes, “not only for their practical relevance but for their alignment with quantum’s strengths.”
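
A toy example makes the "small-data, high-complexity" shape visible. The portfolio instance below fits in a dozen numbers, yet exact search over subsets grows combinatorially; the assets and the scoring rule are invented purely to illustrate the scaling argument.

```python
from itertools import combinations

# Why "small data, high complexity" matters: the input below is a dozen
# numbers, yet exact search grows combinatorially. Assets and scoring
# rule are invented purely to illustrate the scaling argument.

returns = [0.12, 0.07, 0.15, 0.09, 0.11, 0.05, 0.14, 0.08]  # 8 assets
risks   = [0.20, 0.10, 0.30, 0.15, 0.18, 0.05, 0.25, 0.12]
budget = 4  # choose exactly 4 assets

best_score, best_set, evaluated = float("-inf"), None, 0
for subset in combinations(range(len(returns)), budget):
    evaluated += 1
    score = sum(returns[i] for i in subset) - sum(risks[i] for i in subset) ** 2
    if score > best_score:
        best_score, best_set = score, subset

print(f"best subset {best_set}, score {best_score:.4f}, {evaluated} candidates")
# 8 assets give 70 candidates; choosing 32 of 64 assets gives ~1.8e18.
# Exhaustive classical search hits a wall long before the data does.
```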

The implication for AI is clear. Enterprises seeking to accelerate discovery, optimise logistics, or simulate biological processes cannot continue to rely solely on classical systems. Where the limits of Moore’s Law and classical physics begin to show, quantum offers a viable and increasingly necessary alternative.

Infrastructure, talent, and fragmentation

The road to that future, however, is uneven. Many of the current barriers to quantum-AI integration are not technological but structural. Talent remains a limiting factor. “The supply of quantum-literate engineers is far below projected demand,” Goetz warns. Without a broader pipeline of trained developers, integrators, and system architects, the expansion of quantum capability will remain constrained by the shortage of people able to build it.

Software fragmentation is another issue. At present, most quantum software development kits (SDKs), such as Qiskit, Cirq, PennyLane, and t|ket⟩, are tightly coupled to specific hardware platforms. This makes it challenging to develop applications that can port easily between systems. “Most SDKs are closely tied to specific hardware, limiting portability and creating friction in multi-vendor environments,” Goetz explains.

This lack of interoperability mirrors the early days of classical computing, before open standards and abstraction layers matured. Encouragingly, there are signs of progress. “Promising alternatives are emerging,” Goetz continues. “High-level languages like Qrisp are showing how hardware-agnostic, ‘write once, run anywhere’ quantum programming can reduce complexity and drive broader adoption.”
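
The "write once, run anywhere" idea is, at heart, an abstraction layer: one portable circuit description with per-backend translators. The sketch below is hypothetical and far simpler than real tooling such as Qrisp or OpenQASM, but it shows the shape of the separation.

```python
from dataclasses import dataclass

# Sketch of the abstraction-layer idea behind "write once, run anywhere":
# one portable circuit description, many backend-specific translations.
# The IR and both emitters are hypothetical and far simpler than real
# projects such as Qrisp, but the separation of concerns is the same.

@dataclass
class Gate:
    name: str      # portable gate name, e.g. "h" or "cx"
    qubits: tuple  # qubit indices the gate acts on

bell = [Gate("h", (0,)), Gate("cx", (0, 1))]  # portable Bell-pair circuit

def emit_qasm_like(circuit, n_qubits):
    """Translate the portable description to OpenQASM-style text."""
    lines = [f"qreg q[{n_qubits}];"]
    lines += [g.name + " q[" + "],q[".join(map(str, g.qubits)) + "];" for g in circuit]
    return "\n".join(lines)

def emit_pulse_schedule(circuit):
    """Translate the same description to a made-up pulse-level schedule."""
    return [(g.name.upper(), g.qubits, "default_pulse") for g in circuit]

print(emit_qasm_like(bell, 2))
print(emit_pulse_schedule(bell))
```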

Another often overlooked challenge lies in the problem definition itself. Organisations struggle not with data acquisition or model execution but with formulating problems in quantum terms. “Users consistently pointed to problem selection and circuit formulation, not execution or data analysis,” Goetz says. The tools and abstractions needed to make quantum programming accessible to non-specialists are still in development. Until they mature, many enterprises will remain on the sidelines.
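
What "formulating problems in quantum terms" often means in practice is an encoding step. The sketch below recasts Max-Cut on a triangle as a QUBO matrix, the form many quantum and quantum-inspired optimisers consume; the exhaustive check at the end is classical, and only the encoding is the point.

```python
import numpy as np
from itertools import product

# The formulation step the report flags as the real bottleneck: encoding
# Max-Cut on a triangle as a QUBO matrix. The brute-force check below is
# classical; the encoding is what a quantum optimiser would consume.

edges = [(0, 1), (1, 2), (0, 2)]  # triangle graph
n = 3

# Build Q so that x^T Q x = -(number of edges cut by partition x).
Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1
    Q[j, j] -= 1
    Q[i, j] += 1
    Q[j, i] += 1

best = min(product([0, 1], repeat=n),
           key=lambda x: np.array(x) @ Q @ np.array(x))
cut = -(np.array(best) @ Q @ np.array(best))
print(f"partition {best} cuts {cut:.0f} of {len(edges)} edges")
```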

Building readiness, not just hype

Governments and policy institutions are taking notice. National strategies are now being written not only for quantum capability but for its alignment with digital sovereignty and AI readiness. European initiatives in Finland and Germany, as well as programmes in the United States, Japan, and South Korea, all signal a new level of intent. Funding is no longer limited to academic research; it now extends to industrial ecosystems and infrastructure deployment.

But these advances bring their own risks. Without coordination, regional disparities in late-stage investment could lead to asymmetric progress and fragmented innovation. “Regional disparities in late-stage capital investment threaten to create lopsided progress,” Goetz warns. “Quantum is a global technology. Its success will require standards, skills, and infrastructure that span borders.”

That challenge of coordination applies equally to the private sector. Enterprises often find themselves in one of two camps: either locked into pilots with no clear roadmap for scaling, or over-reliant on external providers without developing internal competency. What is needed is not just experimentation but preparation. Quantum must be treated not as a curiosity but as a future platform, one that demands technical, strategic, and cultural readiness.

The shift has already begun

As Goetz reflects, “This study presents a detailed and grounded view of quantum computing as it exists today: no longer just a science experiment, but not yet a mature technology. It is a landscape defined by breakthroughs and bottlenecks, opportunity and uncertainty.”

For enterprise leaders in artificial intelligence, the message is unambiguous. AI systems are becoming more capable but also more dependent on infrastructure that can handle their scale and nuance. Quantum computing is not a replacement, but it is a reinforcement, an enabling layer that will allow AI to evolve beyond its current constraints.

“The quantum future is closer than it appears,” Goetz concludes. “But realising its full potential will require not only scientific ingenuity, but sustained collaboration across disciplines, sectors, and borders.”

In the years ahead, success will belong to those who prepare early, partner wisely, and invest in the messy, foundational work of integration. Quantum is not coming. It has already arrived. The question is whether the rest of the stack is ready for it.
