AI infrastructure is fuelling explosive energy demand, pushing data centers into the spotlight. But without a radical rethink of how they consume, generate, and manage power, their role in a sustainable future remains uncertain. Data centers must evolve from passive consumers into intelligent energy participants. And fast.
The exponential rise of AI workloads has triggered a silent but significant surge in electricity demand. The energy profile of AI is unlike anything that came before it. Unlike traditional web searches, which consume fractions of a watt-hour, AI queries can be ten times, or even thousands of times, more power-intensive. A single ChatGPT request might burn through 2.9 watt-hours. Multiply that across billions of queries, and you are looking at a new kind of infrastructure burden, one that is largely invisible to end users but sharply felt by utility operators.
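The article's own figures make the scale easy to check. A back-of-the-envelope calculation, using the cited 2.9 Wh per request and a hypothetical volume of one billion queries a day (the daily volume is an assumption for illustration, not a reported figure):

```python
WH_PER_QUERY = 2.9                # per-request figure cited in the text
QUERIES_PER_DAY = 1_000_000_000   # hypothetical volume, for illustration only

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_mwh = daily_wh / 1_000_000        # 1 MWh = 1,000,000 Wh
annual_gwh = daily_mwh * 365 / 1_000    # 1 GWh = 1,000 MWh

print(f"Daily demand:  {daily_mwh:,.0f} MWh")
print(f"Annual demand: {annual_gwh:,.1f} GWh")
```

Under those assumptions, query-serving alone approaches 2,900 MWh a day, before cooling, distribution, and redundancy are counted, which is exactly the kind of load a utility operator feels even when end users do not.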
What we are witnessing is not simply an increase in energy consumption. It is a qualitative shift in how that energy is used, demanded, and distributed. “It will be sustainable if it is done right,” Rolf Bienert, Technical and Managing Director of the OpenADR Alliance, says. “I have strong confidence that if we put our minds to it, we can make enough energy available. The challenge is not just availability; it is timing, location, and system resilience.”
False signals and green illusions
Many of today’s so-called green data centers still rely heavily on power purchase agreements and carbon offsets to meet environmental targets. However, these mechanisms are increasingly under scrutiny, particularly when used to justify energy-intensive AI services. If the same renewable certificate is counted across multiple facilities, the numbers start to fall apart.
“There is not necessarily distrust,” Bienert explains. “It is simply that data centers are focused on one thing, processing data. Sustainability becomes a secondary consideration, often enforced by regulation or contracts rather than driven by strategy. Like air conditioning units that were never designed with flexibility in mind, these systems do what they were built to do unless compelled otherwise.”
The real risk is that we underestimate the strain these facilities place on the grid. Beneath the surface of every prompt, query, or generative interaction lies a vast electrical footprint, not just in processing but in cooling, distribution, and redundancy. And unlike traditional infrastructure, this demand profile is spiky, unpredictable, and global.
Smart grids are not science fiction
Microgrids, virtual power plants (VPPs), and distributed energy systems may once have sounded like prototypes, but they are operational today. These technologies enable data centers to manage their energy demands intelligently, integrate renewables, and, crucially, feed power back into the grid. “A solar system and a battery already form a basic microgrid,” Bienert explains. “If you add a management layer that can also handle EV chargers and appliances, you are effectively operating a VPP. Now imagine that at the scale of a data center.”
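The management layer Bienert describes can be sketched as a simple dispatch rule: decide each interval whether surplus generation is stored or exported, and whether a deficit is met from the battery or the grid. Everything here, the class, the thresholds, and the action names, is a hypothetical illustration, not any real VPP controller:

```python
from dataclasses import dataclass

@dataclass
class SiteState:
    solar_kw: float      # current on-site generation
    load_kw: float       # current site demand (IT load, cooling, EV chargers)
    battery_soc: float   # battery state of charge, 0.0 to 1.0

def dispatch(state: SiteState, grid_price: float,
             price_threshold: float = 0.20) -> str:
    """Choose an action for this interval (illustrative thresholds)."""
    surplus_kw = state.solar_kw - state.load_kw
    if surplus_kw > 0:
        # Generation exceeds demand: store it, or export once storage is full.
        if state.battery_soc < 0.9:
            return "charge_battery"
        return "export_to_grid"
    # Demand exceeds generation: discharge only when power is expensive
    # and the battery has headroom above its reserve floor.
    if grid_price > price_threshold and state.battery_soc > 0.2:
        return "discharge_battery"
    return "import_from_grid"
```

For example, `dispatch(SiteState(solar_kw=500, load_kw=300, battery_soc=0.5), grid_price=0.15)` stores the surplus, while the same site with a full battery would export it, which is the VPP behaviour in miniature.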
The model is not theoretical. In Amsterdam, the Ajax football stadium is powered by wind, solar panels, and second-life batteries from Nissan Leafs. Energy peaks during matches are managed intelligently, and the system dynamically offsets demand. The same model could be applied to data infrastructure, with far greater consequences.
Redefining the role of the data center
A facility built to host AI workloads should not only consume power; it should also generate, store, and trade it. This requires a fundamental mindset shift, particularly among executive leadership. “You are not just building on land and plugging into fibre,” Bienert adds. “You are creating an energy node. And that changes everything.”
A data center with renewable generation and storage capacity is not merely environmentally responsible; it becomes an active grid participant. During off-peak hours, it can export excess power. During high-load periods, it can stabilise demand. Bienert notes that while most current deployments are geared solely toward self-sufficiency, there is untapped opportunity in grid balancing and local generation.
“This is the same transformation that happened with thermostats,” he adds. “They were once dumb devices sold as hardware. Now, they are connected systems used by utilities to manage demand response. Someone saw the business model in that. Data centers can do the same.”
Regulatory drag and standards misalignment
This vision will remain out of reach, however, if power markets and regulations fail to catch up. Much of today’s energy policy was written in an era when utilities delivered power, and customers consumed it. In Germany, for example, energy providers are legally prohibited from influencing customer consumption, a law designed to protect fairness but one that now obstructs demand flexibility. “The idea that utilities should control customer equipment directly is flawed,” Bienert argues. “Instead, systems should motivate customers to act voluntarily. It is about flexibility, not control. Inform, incentivise, and let the systems respond.”
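The “inform, incentivise, and let the systems respond” model can be sketched as a voluntary responder on the customer's side: the utility only publishes a price, and the site decides how much deferrable load (batch AI training jobs, say) to postpone. This is a hedged illustration in the spirit of price-signal demand response, not the actual OpenADR API; the function, thresholds, and shed rule are assumptions:

```python
def respond_to_signal(price_per_kwh: float,
                      deferrable_kw: float,
                      baseline_kw: float,
                      comfort_price: float = 0.25) -> float:
    """Return the load (kW) the site voluntarily chooses to run.

    The utility informs (a price); the site responds on its own terms,
    shedding deferrable load proportionally as price rises. All numbers
    are illustrative.
    """
    if price_per_kwh <= comfort_price:
        return baseline_kw  # cheap power: run everything, defer nothing
    # Shed deferrable load proportionally to how far price exceeds comfort,
    # capped at shedding all of it.
    shed_fraction = min(1.0, (price_per_kwh - comfort_price) / comfort_price)
    return baseline_kw - deferrable_kw * shed_fraction
```

At a price of 0.50 per kWh, a site with 1,000 kW baseline and 200 kW of deferrable work drops to 800 kW. Nothing is controlled by the utility; the flexibility is entirely the customer's choice, which is the distinction Bienert draws.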
That requires open standards. However, the current standards landscape is fragmented. The industry suffers from what Bienert calls “standards creep,” as each new project introduces yet another specification because existing ones are not 100 per cent fit for purpose. “Developers look at five standards, see none of them meet their needs exactly, and decide to write a sixth,” he says. “But most of the time, they would only need to tweak their requirements slightly to use what is already available.”
What is missing is not technology but integration. The components exist, including standards for devices, interfaces, and protocols; however, without a systems-level definition of how they interact, interoperability becomes a bottleneck.
AI could be the glue
There is a paradox at the heart of the energy-AI relationship. AI infrastructure creates the energy demand, but AI itself might be the solution to managing it. Bienert reveals that work is underway to create an AI-powered operating system for buildings. This platform would sit above the device layer, enabling natural language commands and automated decision-making across diverse systems and standards.
“You tell the system what you want, and the AI manages the building accordingly,” he explains. “It becomes the orchestrator, integrating solar, battery, HVAC, EV chargers, all of it, without the need for humans to worry about which protocol connects to what.” That orchestration layer could be transformative for data centers, allowing AI to both justify and regulate its own energy appetite.
The urgent need for collaboration
Despite the scale of the challenge, a visible disconnect remains between data center operators and the utility sector. While smart grid experts and renewable advocates populate their own industry events, CIOs and CTOs responsible for hyperscale AI infrastructure are rarely in the room. “There is not enough cross-pollination,” Bienert says. “Even within utilities, different departments are siloed. We have begun conversations with one or two data center players who are looking at flexibility, but it is rare. There needs to be more shared vocabulary and open channels.”
If data centers are to become energy producers, grid stabilisers, and intelligent agents in a decentralised power system, the conversation must move beyond offsets and net-zero claims. It must become technical, collaborative, and aligned.
Bienert’s conclusion is clear. “This is not about adding one more responsibility to the CIO’s portfolio. It is about rethinking the entire mission of infrastructure. If AI is going to reshape the world, then the grid that powers it must become equally intelligent.”