For years, artificial intelligence has been framed as a software problem, defined by models, algorithms, and data. At re:Invent 2025 in Las Vegas, Matt Garman made it clear that AI has crossed a more consequential threshold: it has become an infrastructure system governed by energy, physics, and capital.
When Matt Garman took the stage in Las Vegas for AWS re:Invent 2025, the scale of the numbers alone hinted that this was no longer a conversation about software platforms or digital services. He spoke of millions of customers, hundreds of availability zones, global networks spanning millions of kilometres, and data centre capacity measured in gigawatts rather than racks. But what mattered far more than the statistics was the underlying shift in how he framed the nature of artificial intelligence itself. This was not a keynote about features or products. It was a keynote about physical systems, industrial scale, and the reality that AI has now moved beyond the conceptual world of software into the material world of infrastructure.
“The data centre campus is the new computer,” Garman said. “When you are training this next generation of models, it turns out the unit of computation is no longer a server or even a cluster. It is an entire campus acting as a single system.”
That single sentence quietly dismantles much of how AI is still discussed in boardrooms, policy documents, and even technology roadmaps. The prevailing assumption that progress in AI is primarily driven by better models, smarter algorithms, or more data no longer holds in isolation. What now defines the frontier is not code, but power. How much electricity can be delivered to silicon. How efficiently heat can be removed from physical space. How reliably thousands of interconnected components can operate as a single system without failure.
Garman returned to this theme repeatedly, often implicitly, sometimes explicitly, describing AI not as a digital capability but as an industrial system that behaves more like heavy machinery than software. “We added more than 3.8 gigawatts of data centre capacity in the last year alone,” he said. “That is more than anyone in the world. And it is because the demands of AI are fundamentally different from anything we have seen before.”
The campus becomes the computer
A gigawatt is not a cloud metric. It is the output of a nuclear reactor. It is a measure of national energy policy, land use, capital investment, and long-term planning. Once intelligence requires power at this scale, it becomes inseparable from the physical limits of civilisation itself. AI stops being something you deploy and starts being something you build, operate, and govern as part of national infrastructure.
For most of the past decade, the dominant metaphor of cloud computing was abstraction. Virtual machines, containers, serverless functions, and software-defined everything promised to hide the messy reality of hardware behind clean interfaces and infinite scalability. Garman’s keynote inverted that logic. Hardware is no longer invisible. It is the defining constraint. “We used to say the data centre was the new computer,” he said. “Now the data centre campus is the computer.”
This is not rhetorical flourish. It is an architectural reality. Modern AI systems no longer operate within discrete machines or even clusters. They operate across vast, tightly coupled physical environments, where performance depends on networking topology, memory bandwidth, power distribution, cooling efficiency, and fault tolerance across thousands of components. The language of software stops being sufficient once intelligence is produced by systems that behave more like industrial plants than digital platforms.
“No one else can deliver this without co-designing the entire system,” Garman said. “It requires custom silicon, scale-up and scale-out networking, integrated software stacks, and data centres built specifically for this purpose.”
Stripped of branding, this is a description of AI as industrial machinery. Not metaphorically, but literally. Intelligence is now manufactured through physical processes governed by the same forces that once defined steel production, electricity generation, and telecommunications networks. The frontier of AI no longer fits inside traditional innovation narratives. Startups can still build transformative applications, but the substrate beneath them is increasingly controlled by those capable of deploying infrastructure measured in gigawatts, not developer hours.
Intelligence measured in megawatts
One of the most revealing moments in the keynote was how frequently energy appeared as a first-order concern, even when discussing performance. “Five times more AI tokens per megawatt of power,” Garman said, describing efficiency gains. “This is now one of the most important metrics in AI.”
That framing is extraordinary. A few years ago, performance was measured in accuracy, latency, or throughput. Today it is measured in output per unit of energy. Intelligence has become an efficiency problem. Power is no longer a background cost. It is a limiting factor. Power grids, cooling systems, and land availability now directly shape the pace of AI development. The idea that intelligence can scale infinitely through software innovation alone is no longer plausible.
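The metric itself is simple arithmetic: throughput divided by electrical power draw. As a back-of-the-envelope sketch, with every figure below invented purely for illustration rather than drawn from AWS data, the "five times" efficiency gain looks like this:

```python
# Hypothetical illustration of "tokens per megawatt" as an efficiency metric.
# All numbers below are invented for the example, not AWS figures.

def tokens_per_megawatt(tokens_per_second: float, power_draw_mw: float) -> float:
    """Serving throughput normalised by electrical power draw."""
    return tokens_per_second / power_draw_mw

# Suppose a cluster serves 2 million tokens/s while drawing 40 MW...
baseline = tokens_per_megawatt(2_000_000, 40)   # 50,000 tokens/s per MW

# ...and a co-designed hardware/software stack serves the same load at 8 MW.
improved = tokens_per_megawatt(2_000_000, 8)    # 250,000 tokens/s per MW

print(improved / baseline)  # 5.0 — a fivefold gain in output per unit of energy
```

The point of the sketch is that the gain can come entirely from the denominator: the same intelligence delivered for a fifth of the power, which is exactly why energy, not throughput alone, has become the binding constraint.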
“There are no shortcuts,” Garman said. “To deliver this level of performance, you must optimise across every layer of hardware and software. And you must do it inside data centres designed for this purpose.”
This is where AI stops resembling a digital revolution and starts resembling an energy transition. The future of machine intelligence is inseparable from the future of electricity generation, transmission infrastructure, and environmental policy. It raises uncomfortable questions about sustainability. If AI systems require gigawatts of continuous power, what does that imply for carbon targets, energy markets, and national resilience? These are not technical questions. They are political, economic, and ultimately civilisational.
The rise of AI factories
Perhaps the most quietly radical concept introduced in the keynote was the idea of AI factories. Garman described them as dedicated environments where organisations operate their own large-scale AI infrastructure. “These operate like private regions,” he said. “Giving organisations exclusive access to massive AI systems while maintaining sovereignty and control.”
This is not a product category. It is a new industrial model. AI is no longer something you simply consume. It is something you manufacture, operate, and govern. That reframes enterprise AI entirely. Instead of selecting tools, organisations are being asked to build and manage intelligence as infrastructure. That requires capital investment, physical space, specialised engineering talent, and long-term strategic planning.
Garman framed this partly in terms of regulation and sovereignty. “Customers want security, privacy, and control over where their systems operate,” he said. “They want AI that lives inside their own infrastructure, governed by their own policies.”
But the deeper implication is geopolitical. AI is becoming part of national critical infrastructure. Like power stations or telecom networks, large-scale AI systems are now strategic assets. The barriers to entry are rising, not falling. While models may be open, the infrastructure required to operate them at meaningful scale is increasingly concentrated among those with access to land, energy, and capital.
Throughout the keynote, Garman returned to a simple claim that is far more disruptive than it appears. “Everything we do starts with infrastructure,” he said. “Without secure, scalable, resilient physical systems, none of this is possible.”
This is a fundamental shift in how intelligence is produced and governed. AI is no longer a tool that lives inside organisations. It is an industrial system that organisations must integrate into their physical and economic reality. The historical parallel is not the internet, but electrification. Just as electricity reshaped cities, industries, and geopolitics, AI is now embedding itself into the physical fabric of civilisation.
The difference is speed. Electrification took decades. AI infrastructure is being deployed in years. Entire regions are now competing to host data centres, attract energy investment, and position themselves as AI hubs. Garman did not frame this as a geopolitical contest, but the subtext was unmistakable. “We operate the world’s largest private network,” he said. “Millions of kilometres of terrestrial and subsea cable. Infrastructure at planetary scale.”
This is no longer about competing cloud platforms. It is about who controls the physical systems that will shape economic power over the next generation. AI has escaped the world of software and entered the world of concrete, steel, and electricity. It is no longer virtual. It is physical, capital intensive, and politically consequential. And once intelligence becomes infrastructure, it stops being a technical problem and becomes a societal one.