AI’s growing hunger for power is exposing a grid built for a different era

Artificial intelligence is accelerating at a speed the energy sector is struggling to match, putting pressure on grids that are slow to adapt and fragile in the face of concentrated demand. The challenge is not whether enough electricity can be produced but whether it can be delivered fast enough to the right places.

There is a stark difference between the electricity AI needs and the system that must supply it. Data centres may account for only around 1.5 per cent of global electricity consumption, but the local impact is where the problem begins. If poorly located or rushed into service, a single facility can overwhelm regional infrastructure built for a more balanced and distributed form of demand.

“Data centres can be built in two to three years,” says Laura Cozzi, Director of Sustainability, Technology and Outlooks at the International Energy Agency. “Grid infrastructure, on the other hand, takes six to ten years to plan, permit and build. This mismatch in timelines is at the heart of the problem.”

The pace of AI development does not wait for infrastructure. In some cases, delivery times for essential components such as transformers have extended to four years, with costs rising alongside. The result is not an energy shortage but an infrastructure bottleneck, a logistical drag that delays deployments, raises costs and distorts planning cycles.

Unlike electric vehicles or domestic heat pumps, which spread their load gradually across millions of consumers, AI arrives like a flash flood. Its demand is geographically concentrated, temporally compressed and structurally inflexible unless deliberate choices are made early in the design process.

These demands are also becoming more geographically selective. Many AI developers favour proximity to existing data ecosystems, abundant fibre connectivity, or regions with favourable tax incentives. Yet these locations are not always aligned with grid capacity. The result is intensifying tension between siting preference and infrastructure reality.

The silent power behind digital progress

Power, like data, underpins every AI innovation, yet its role is often relegated to the background. The IEA’s recent work highlights how this silence has left critical issues unexamined. When estimates for AI-related electricity consumption vary by a factor of two, it reveals how little shared understanding exists between sectors now inextricably linked.

“If we do not have accurate, transparent data on current energy use, chip efficiency, cooling requirements and deployment patterns, we cannot meaningfully project future needs,” Cozzi explains. “That is why collaboration between the energy and AI sectors is essential.” To support this collaboration, the IEA is launching an AI and Energy Observatory, a platform that will offer publicly available, region-specific data on energy usage patterns in data centres and across AI workloads. It will also serve as a library of case studies showing how AI is applied to improve energy systems, creating a feedback loop between consumption and optimisation.

More than just a data repository, the Observatory is a call to action. Without consistent measurement and benchmarking, policy responses will be misaligned, grid investment will lag demand, and AI deployments will be at the mercy of local political and infrastructure limitations.

The challenge is not confined to electricity. The raw materials that underpin AI and energy revolutions, particularly copper, face supply and demand imbalances. Coordinated planning around these dependencies is still rare. The resource scarcity that slows solar deployment can also delay chip production or infrastructure buildout.

Energy efficiency is not a luxury — it is a lifeline

While much of the AI discourse focuses on performance, accuracy and scale, energy efficiency has quietly become the sector’s most remarkable achievement. Cozzi notes that the energy system improves efficiency by around one per cent per year. In contrast, GPU efficiency increased by 50 per cent annually between 2016 and 2024.
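Compounding those two quoted growth rates over the 2016–2024 period shows just how stark the gap is. A quick sketch (using only the figures cited above):

```python
# Compound the growth rates quoted above over 2016-2024 (8 annual steps):
# ~50% per year for GPU efficiency versus ~1% per year for the energy system.

def compound(rate: float, years: int) -> float:
    """Total multiplier after compounding an annual growth rate."""
    return (1 + rate) ** years

gpu_gain = compound(0.50, 8)    # GPU efficiency multiplier since 2016
grid_gain = compound(0.01, 8)   # energy-system efficiency multiplier since 2016

print(f"GPU efficiency gain:  {gpu_gain:.1f}x")   # roughly a 25x improvement
print(f"Energy system gain:   {grid_gain:.2f}x")  # barely an 8% improvement
```

At those rates, GPUs became roughly twenty-five times more efficient while the wider energy system improved by under ten per cent.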

The result is not only a technological leap but an environmental one. The most advanced AI models require vast computational resources; one leading model needed 10^24 calculations to train. Had it been built using early-2000s hardware, its training phase alone would have consumed as much electricity as China uses in a year. Thanks to modern architecture, its energy use was instead roughly equivalent to what Paris consumes in a year.

These efficiency gains are not inevitable. They result from targeted hardware design, software optimisation and cooling innovation, each dependent on close coordination between industry, energy providers and regulators. And yet, while efficiency reduces intensity, it cannot offset the sheer velocity of growth.

Even under conservative scenarios, AI’s electricity demand will continue to rise. The issue is not just how much power AI will need but how fast those needs will change and whether planning systems can adapt in time.

There is also a question of equity. Regions with insufficient infrastructure may be excluded from AI-driven investment simply because they cannot offer the required power delivery guarantees. In these cases, energy policy becomes industrial policy, and the two must be treated as part of a shared planning framework.

Flexibility is an untapped asset

The grid is not simply a supply problem. It is increasingly a flexibility problem. As renewable energy grows, variability becomes harder to manage. This is where AI’s architecture may hold an overlooked advantage. If the underlying systems are designed accordingly, their workloads can be highly mobile and temporally agile.

“Electricity systems today require flexible loads just as much as additional supply,” Cozzi explains. “Newer, modern chips allow for more flexible operation, shifting where and when inference or training occurs. This growing flexibility can help stabilise the grid, particularly in congested systems, by dynamically matching supply and demand.”

It is a fundamental shift in thinking. If AI facilities are built with grid-awareness in mind, considering location, load balancing and time-of-day flexibility, they can become stabilising assets rather than volatile additions. That, however, requires foresight and coordination, not reactive approvals based on data centre land availability or speculative real estate plays.

AI’s flexibility also opens the door to policy reform. Both sides win if the energy sector can identify which regions are best equipped to host large-scale digital infrastructure and if developers can adapt their siting strategies accordingly. However, such coordination remains rare, and communication between utilities and tech firms is still nascent in many regions.

Flexibility is also relevant in scheduling. Training workloads, for instance, do not always need to occur in real time. With appropriate planning and incentive structures, they could be delayed or moved to off-peak hours or renewable-heavy periods, reducing pressure on peak grid usage and aligning computing with decarbonisation goals.
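The scheduling idea above can be sketched in a few lines. Given an hourly carbon-intensity forecast for the grid, a deferrable training job simply starts in the cleanest contiguous window. All the forecast values and the window length below are hypothetical, chosen only to illustrate the approach:

```python
# Minimal sketch of carbon-aware scheduling: find the contiguous window of
# `duration` hours with the lowest total grid carbon intensity (gCO2/kWh)
# for a deferrable training job. Forecast values below are made up.

def best_window(forecast: list[float], duration: int) -> int:
    """Return the start hour of the lowest-intensity window (sliding-window scan)."""
    best_start = 0
    best_sum = window_sum = sum(forecast[:duration])
    for start in range(1, len(forecast) - duration + 1):
        # Slide the window one hour: add the new hour, drop the old one.
        window_sum += forecast[start + duration - 1] - forecast[start - 1]
        if window_sum < best_sum:
            best_start, best_sum = start, window_sum
    return best_start

# Hypothetical 24-hour forecast: solar-heavy midday dip, evening peak.
forecast = [420, 410, 400, 390, 380, 370, 350, 300,
            240, 190, 160, 150, 150, 160, 200, 260,
            330, 400, 450, 470, 460, 440, 430, 425]

start = best_window(forecast, duration=4)
print(f"Schedule the 4-hour job starting at hour {start}")  # hour 10, the midday dip
```

In practice the same logic would draw on a real forecast feed and respect deadlines and interconnect constraints, but the core decision is exactly this window search.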

Mutual dependence is the new reality

There is a clear symmetry emerging. AI cannot grow without energy, but energy cannot modernise without AI. The relationship is no longer linear or transactional; it is cyclical, with each side shaping the future of the other.

The IEA has documented use cases where AI is already optimising grid performance. In one example, AI models incorporating real-time weather data and sensor networks reduced grid congestion by 60 per cent. These systems can unlock existing capacity, reduce costs and lower emissions, results that align closely with the broader goals of the energy transition.

AI is also accelerating material discovery in energy technologies. From new solar materials like perovskites to improved battery chemistries, machine learning can explore the vast space of chemical possibilities faster and more efficiently. Cozzi points out that only a fraction of the possible perovskite formulations have been studied, yet the right combinations could dramatically increase solar panel performance.

As AI systems become more deeply embedded in industrial operations, they will also drive efficiency across construction, transport and manufacturing sectors. Early modelling by the IEA suggests that AI has significant potential to lower energy consumption in buildings and industrial processes, even as it fuels growth elsewhere.

What is emerging is a feedback loop of innovation. AI creates new energy demands but also delivers new tools for meeting them. Managing that loop will require alignment between infrastructure timelines, regulatory frameworks and investment strategies. The technical capability exists, but the governance is still catching up.

From fragmented awareness to systemic planning

The most urgent need is not a new chip or a bigger server farm. It is a shared understanding of the energy-AI nexus and a move toward integrated planning. Too often, decisions about data centre expansion, site selection, or hardware design are made in silos. The result is a series of local tensions and global inefficiencies.

The IEA’s upcoming report avoids a single forecast, opting for a range of scenarios to reflect the many uncertainties around AI demand and deployment. However, one conclusion is consistent across all models: electricity demand from AI will grow rapidly, and without systemic planning, that growth will strain fragile networks.

Permitting reform, grid development, raw materials coordination, and transparent data sharing are not glamorous priorities, but they are foundational. Without them, AI’s expansion risks hitting physical, political, and environmental limits far sooner than anticipated.

The infrastructure challenge is not one of ambition but alignment. If the AI and energy sectors treat each other as partners rather than externalities, then the dual goals of digital innovation and sustainable development remain within reach. However, without shared timelines, common data, and coordinated action, the grid could become AI’s most intractable bottleneck and its first real failure point.
