A new study from researchers at the University of California, Riverside, exposes a critical gap in America's approach to AI infrastructure: whilst the tech industry races to build ever-larger datacentres, the public water systems supporting them are quietly running out of capacity.
Without new water efficiencies, datacentres across America may require 697 million to 1.45 billion gallons of extra peak water capacity per day by 2030, a volume comparable to New York City's daily water supply of about a billion gallons. The price tag is staggering: the required water infrastructure is estimated at $10 billion to $58 billion, with much of that burden landing on the communities that host these facilities.
The problem is not what most people think it is. When reports surface about datacentre water use, the public often hears about annual consumption, which sounds manageable in isolation. But whilst annual totals may appear modest, daily water demand from evaporative cooling systems can spike six to ten times higher than average usage, and for some planned facilities the ratio exceeds thirty.
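The gap between annual totals and peak-day demand is easy to see with a little arithmetic. The sketch below uses a hypothetical facility figure (200 million gallons per year is illustrative, not from the study) together with the six-to-ten-times and thirty-times peak ratios the researchers describe:

```python
# Illustrative sketch of why annual totals understate peak demand.
# The 200M gal/year facility is hypothetical; the 6x, 10x and 30x
# peak-to-average ratios are the ones reported in the study.

def peak_daily_demand(annual_gallons: float, peak_ratio: float) -> float:
    """Estimate peak-day demand from an annual total and a peak-to-average ratio."""
    average_daily = annual_gallons / 365
    return average_daily * peak_ratio

annual = 200_000_000  # ~548,000 gallons/day on average
for ratio in (6, 10, 30):
    print(f"peak ratio {ratio:>2}x -> {peak_daily_demand(annual, ratio):,.0f} gal/day")
```

A facility whose annual figure averages out to roughly half a million gallons a day can, at a thirty-times ratio, demand over sixteen million gallons on its worst day, which is the number the supplying utility actually has to engineer for.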
This peak-demand reality creates a fundamental infrastructure dilemma. There are roughly 50,000 community water systems across the US, of which approximately 40,000 are small systems each serving no more than 3,300 people, and only 708 are large systems serving upward of 100,000 people. Nearly all hyperscale and colocation facilities across the country are supplied by community water systems, mostly from potable sources. These small utilities were engineered to provide reliable service to residential and agricultural users, not to absorb the sudden, massive withdrawals that a modern AI datacentre demands on a sweltering summer day.
The technical explanation matters here. Facility-level cooling transfers heat from inside the building to the outside environment, and may involve water consumption depending on the technology employed, such as cooling towers that rely on evaporation. Evaporative cooling is efficient for the operator because it minimises energy costs, but it consumes water in vast quantities.
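A rough physics estimate (not from the study) shows why the quantities are so large: evaporating water carries away about 2,260 kilojoules per kilogram, so every megawatt of heat a cooling tower rejects through evaporation consumes on the order of ten thousand gallons per day.

```python
# Rough physics sketch (author's illustration, not from the study):
# water evaporated per megawatt of heat rejected by a cooling tower.
LATENT_HEAT_KJ_PER_KG = 2260   # latent heat of vaporisation of water
SECONDS_PER_DAY = 86_400
KG_PER_GALLON = 3.785          # one US gallon of water is ~3.785 kg

def gallons_per_day_per_mw(heat_mw: float) -> float:
    """Water evaporated per day to reject a given heat load, in US gallons."""
    kj_per_day = heat_mw * 1000 * SECONDS_PER_DAY      # MW -> kJ/day
    kg_per_day = kj_per_day / LATENT_HEAT_KJ_PER_KG
    return kg_per_day / KG_PER_GALLON

print(f"{gallons_per_day_per_mw(1):,.0f} gal/day per MW")  # roughly 10,000
```

Scaled to a hyperscale campus drawing hundreds of megawatts, this simple lower-bound estimate already reaches into the millions of gallons per day, before accounting for blowdown and drift losses that real cooling towers add on top.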
The research reveals a tension that industry operators acknowledge but downplay. Water consumption by server campuses has become something of a hot topic, with some operators disputing that their usage represents a problem. Yet many datacentre projects have required substantial upgrades to local water infrastructure, even when their peak water demand was as low as 0.1 million gallons per day (MGD).
On the ground, the pressure is already visible. In Newton County, Georgia, a Meta datacentre that opened in 2018 uses 500,000 gallons of water per day, or 10 percent of the entire county's water consumption. Newton County continues to field requests for new datacentre permits, some of which would use up to 6 million gallons of water per day, more than doubling what the entire county currently consumes. These are not hypothetical scenarios but active permit applications.
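The Newton County figures can be checked on the back of an envelope: if one facility's 0.5 MGD is 10 percent of county consumption, the county-wide total is about 5 MGD, so the largest pending permits would, on their own, exceed everything the county uses today.

```python
# Back-of-envelope check of the Newton County figures in the article.
meta_use_mgd = 0.5                               # Meta facility: 500,000 gal/day
meta_share = 0.10                                # stated as 10% of county use
county_total_mgd = meta_use_mgd / meta_share     # implies ~5 MGD county-wide

proposed_mgd = 6.0                               # largest pending permit requests
print(county_total_mgd)                          # 5.0
print(proposed_mgd > county_total_mgd)           # True: the new permits alone
                                                 # exceed today's county total
```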
The UC Riverside team has put forward practical recommendations. They recommend that datacentre developers report peak water use, not just yearly averages, and that datacentre companies partner with local communities to fund water infrastructure upgrades with verifiable outcomes, so that expansion costs do not fall entirely on local ratepayers.
There is a harder constraint, though, that money alone cannot solve. Even with funding, natural sources such as reservoirs and snowpack may not supply enough water during peak demand, as one researcher noted: "Even if you have money, the water source is another challenge. In many cases, the water is naturally replenished by snowpack and reservoirs. But reservoirs and snowpack are limited. You may have money to build treatment plants and pipes, but money can't buy more snowpack."
Many public water systems are aging and financially constrained, and the EPA estimates the nation's water and wastewater infrastructure already faces trillions of dollars in funding needs over the next two decades for upgrades and maintenance. Into this crumbling infrastructure, the datacentre boom is arriving like an unexpected flood.
The research also highlights why transparency matters. Whilst most major tech companies now publish some form of water use data, reporting practices vary widely in detail and consistency, making it difficult to compare companies' water usage and efficiency or assess progress towards sustainability goals.
The challenge ahead is one of genuine complexity. Datacentres deliver real economic value and enable services billions depend on. But they are arriving at a moment when many American water systems are already fragile, when regional droughts are worsening, and when the cost of upgrading infrastructure is being socialised across communities whilst the benefits concentrate among tech companies and their shareholders. That mismatch between private gain and public cost is the real issue the UC Riverside study is flagging, even if the conversation keeps focusing on the raw numbers instead.