
Archived Article — The Daily Perspective is no longer active. This article was published on 3 March 2026 and is preserved as part of the archive.

Technology

AI Data Centres Can Act as Grid Shock Absorbers, UK Trial Shows

Nvidia-backed demonstrations in London and Phoenix show hyperscale facilities can cut power draw by up to 25 per cent during peak demand, with major implications for Australia's own grid crunch.

Image: Tom's Hardware
Key Points
  • A UK-first live trial at a London AI factory proved data centres can dynamically reduce power consumption in response to real-time grid signals without disrupting critical workloads.
  • An earlier Phoenix demonstration achieved a sustained 25 per cent power reduction over three hours during peak summer grid demand using Nvidia GPUs.
  • A Duke University study estimates this flexibility could unlock 100 gigawatts of existing US grid capacity, equivalent to more than $2 trillion in data centre investment.
  • Australian data centres consumed around 4 TWh in FY25 and AEMO forecasts that could triple to 12 TWh by FY30, making local grid flexibility an urgent priority.
  • The Emerald Conductor platform, backed by Nvidia's venture arm, orchestrates AI workloads in real time, prioritising time-sensitive jobs while throttling flexible ones during grid stress events.

For years, electricity grid operators have treated data centres as the immovable objects of the power system: turn them on, and they draw their contracted maximum, every hour of every day. A trial conducted in London is now challenging that assumption in a way that could reshape how billions of dollars of digital infrastructure connects to grids around the world, including in Australia.

Emerald AI, EPRI, National Grid, Nebius, and Nvidia recently conducted a live, UK-first demonstration at Nebius' new AI factory in London, aiming to prove that high-performance AI infrastructure can operate as a power-flexible, grid-responsive asset. Using Emerald AI's software platform, Emerald Conductor, to manage a cluster of Nvidia GPUs, the trial validated that AI data centres can dynamically adjust power consumption in response to real-time grid signals without disrupting mission-critical workloads.

The London demonstration built on a more detailed field test conducted earlier in 2025. A coalition of Nvidia, Oracle, Salt River Project, and the Electric Power Research Institute demonstrated a 25 per cent power reduction sustained over three hours during a Phoenix grid stress event, without compromising AI workload performance. The results, published in Nature Energy, mark the first peer-reviewed validation of data centre demand flexibility at commercial scale. On that occasion, the data centre reduced consumption through a 15-minute ramp down, maintained the 25 per cent reduction over three hours, then ramped back up without exceeding its original baseline consumption.
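The event profile described above lends itself to a back-of-envelope check. The sketch below uses the figures from the article (a 15-minute ramp down, three hours held at a 25 per cent reduction, then a ramp back to baseline); the minute-by-minute shape of the ramps is an assumption, not published data.

```python
# Illustrative load profile for the Phoenix grid stress event described above.
# The 15-minute ramps and the three-hour hold at 75% of baseline come from the
# article; the linear ramp shape is an assumption for illustration.

def load_fraction(minute: int) -> float:
    """Fraction of baseline power at a given minute into the event."""
    if minute < 15:                        # ramp down over 15 minutes
        return 1.0 - 0.25 * (minute / 15)
    if minute < 15 + 180:                  # hold the 25% reduction for 3 hours
        return 0.75
    if minute < 15 + 180 + 15:             # ramp back, never above baseline
        return 0.75 + 0.25 * ((minute - 195) / 15)
    return 1.0

# Energy curtailed over the event, expressed in baseline-hours:
saved_minutes = sum(1.0 - load_fraction(m) for m in range(0, 210))
print(round(saved_minutes / 60, 2))  # ≈ 0.81 baseline-hours curtailed
```

Under this assumed shape, the facility forgoes roughly 0.81 hours' worth of its baseline energy, most of it during the three-hour hold.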

The technology behind both trials is Emerald AI's Conductor platform. Emerald Conductor coordinates AI workloads across a network of data centres to meet power grid demands, ensuring full performance of time-sensitive workloads while dynamically reducing the throughput of flexible workloads within acceptable limits. The system operates across three tiers: Flex 1 allows up to 10 per cent average throughput reduction, Flex 2 up to 25 per cent, and Flex 3 up to 50 per cent over a six-hour period. In short, a training run that is not time-critical can be slowed; a live inference request cannot.
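The tiered logic above can be sketched in a few lines. This is a hypothetical illustration of the dispatch idea, not Emerald Conductor's actual API: the job names, the `flex_tier` field, and the `dispatch` function are all invented for the example, with only the tier caps (10, 25, and 50 per cent) taken from the article.

```python
# Hypothetical sketch of tiered demand-flexibility dispatch, loosely modelled
# on the Flex 1/2/3 tiers described above. All names are illustrative.
from dataclasses import dataclass
from typing import Optional

# Maximum average throughput reduction each tier permits (from the article)
FLEX_CAPS = {1: 0.10, 2: 0.25, 3: 0.50}

@dataclass
class Job:
    name: str
    flex_tier: Optional[int]  # None = time-sensitive, must run at full speed

def dispatch(jobs: list, grid_stress: bool) -> dict:
    """Return a target throughput fraction per job for the current interval."""
    targets = {}
    for job in jobs:
        if not grid_stress or job.flex_tier is None:
            targets[job.name] = 1.0             # full performance
        else:
            targets[job.name] = 1.0 - FLEX_CAPS[job.flex_tier]
    return targets

jobs = [
    Job("live-inference", flex_tier=None),  # latency-critical: never throttled
    Job("training-run", flex_tier=3),       # deferrable: up to 50% reduction
    Job("batch-eval", flex_tier=1),         # lightly flexible: up to 10%
]

print(dispatch(jobs, grid_stress=True))
# live-inference stays at 1.0; training-run drops to 0.5; batch-eval to 0.9
```

When `grid_stress` is false, every job runs at full throughput; the throttling only bites during a declared stress event, which mirrors the article's distinction between a slowable training run and an untouchable live inference request.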

The scale of the opportunity is striking. A recent Duke University study estimates that if new AI data centres could flex their electricity consumption by just 25 per cent for two hours at a time, fewer than 200 hours a year, the existing US grid could accommodate around 100 gigawatts of new data centre connections, equivalent to over $2 trillion in data centre investment. National Grid is currently undertaking a massive upgrade of the UK's transmission system, in part to meet the expected surge in demand from AI data centres, having unveiled plans to invest upwards of £35 billion in UK transmission over the next five years.

The International Energy Agency projects that electricity demand from data centres globally could more than double by 2030. For the sceptics who argue that tech companies are simply rebranding an energy problem as an energy solution, that concern is not without foundation. Flexibility programmes depend entirely on data centre operators honouring their commitments to curtail load when grid signals arrive. Enforcement mechanisms, contractual obligations, and independent verification are details that remain to be worked out at commercial scale. There is also a real risk that projections of unlocked grid capacity are used by regulators or governments as a justification to approve more data centre connections than the grid can reliably serve, with flexibility treated as a buffer that may not always materialise.

Proponents counter that the alternative, building new generation and transmission infrastructure to serve the maximum contracted demand of every data centre simultaneously, is both extraordinarily expensive and largely unnecessary. Electric grid capacity is typically underused except during peak events like hot summer days or cold winter storms. That means, in many cases, there is room on the existing grid for new data centres, as long as they can temporarily dial down energy usage during periods of peak demand. The flexibility argument is not about reducing the total amount of computing that gets done; it is about spreading it more intelligently across time.

What this means for Australia

The Australian context makes these results particularly relevant. In FY2025, data centres across the National Electricity Market consumed around 4 terawatt hours of electricity, approximately 2.2 per cent of total grid demand. Under the Step Change scenario, data centre consumption in Australia is forecast to grow at an average annual rate of 25.1 per cent, reaching 12.0 TWh by FY30. That trajectory is already registering in the corridors of energy regulation. In a first for AEMO's forecasting approach, data centres will now be forecast and reported as a standalone component, rather than being grouped with other commercial loads.
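The quoted figures are internally consistent, as a quick compound-growth check shows: roughly 4 TWh in FY25 growing at 25.1 per cent a year for five years lands close to the 12 TWh forecast for FY30.

```python
# Sanity check on the AEMO Step Change figures quoted above:
# ~4 TWh in FY25 compounding at 25.1% a year over five years to FY30.
base_twh = 4.0
growth_rate = 0.251
years = 5

fy30_twh = base_twh * (1 + growth_rate) ** years
print(round(fy30_twh, 1))  # ≈ 12.3 TWh, in line with the ~12 TWh forecast
```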

The connection queue tells its own story. The Australian Energy Market Operator received 44 gigawatts of data centre connection requests from network service providers as part of the 2025 Inputs, Assumptions and Scenarios Report. Oxford Economics estimates that six in every seven megawatts of those connection requests represent phantom demand, not expected to materialise on the grid under AEMO's Step Change scenario. Of the 44 GW of connection requests, only 6 GW of prospective project capacity is required to meet demand. That gap between stated ambition and likely reality is a planning challenge in its own right, but it also points to the kind of grid pressure that demand-flexibility programmes could help relieve.

The Australian Energy Market Commission is seeking stakeholder feedback on new rules, dubbed Package 2, which address the projected growth of large-scale electricity users, particularly data centres driven by AI development. AEMC Chair Anna Collyer has noted that "the rise of artificial intelligence is driving unprecedented demand for data centres in Australia, with some facilities potentially requiring as much electricity as small cities."

Strip away the buzz and the fundamentals point to a genuine engineering and policy opportunity. Whether the Emerald Conductor model, or something like it, finds a home in Australian regulatory frameworks will depend on how quickly AEMO, the AEMC, and data centre operators can agree on the technical standards, measurement protocols, and commercial incentives needed to make demand flexibility a reliable grid service rather than a voluntary aspiration. The UK trial has established the proof of concept. The harder work, translating that into binding, bankable commitments that grid operators can plan around, is still ahead.

Darren Ong

Darren Ong is an AI editorial persona created by The Daily Perspective, writing about fintech, property tech, ASX-listed tech companies, and the digital disruption of traditional industries. As an AI persona, articles are generated using artificial intelligence with editorial quality controls.