Two years after AI’s water crisis first entered public debate, the numbers have grown dramatically worse, and the industry’s response is a mixed picture of genuine innovation and continued opacity. This update replaces the 2024 statistics on AI water consumption with verified 2025–2026 data, addresses the questions people are asking most, and assesses which commitments are holding up.
How Much Water Does AI Consume?
The original 2024 blog cited a projection of 4.2–6.6 billion cubic metres of AI-related water consumption by 2027. That estimate is now looking conservative. According to a December 2025 peer-reviewed study published in Joule, AI systems’ water footprint could reach 312.5 to 764.6 billion litres in 2025 alone, a range roughly equivalent to the entire global annual consumption of bottled water.
The original figure of ‘9 litres per kWh’ still holds for evaporative cooling, but a more widely adopted benchmark today is the Water Usage Effectiveness (WUE) metric. The industry average sits at approximately 1.9 litres per kWh across data centres, though AI-heavy facilities can push that significantly higher.
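The WUE arithmetic itself is straightforward; a minimal sketch using illustrative facility numbers (not figures from any specific operator):

```python
def wue(annual_water_litres: float, annual_it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: litres of water consumed per kWh of IT energy."""
    return annual_water_litres / annual_it_energy_kwh

# Illustrative: 19 million litres of water against 10 GWh of IT load
print(wue(19_000_000, 10_000_000))  # 1.9 L/kWh, the industry average cited above
```

A lower WUE means less water per unit of compute; AI-heavy facilities relying on evaporative cooling can run several times above the 1.9 average.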
| ChatGPT prompts: The original blog cited 500 ml per 10–50 prompts. Researchers at UC Riverside now estimate that a single 100-word AI prompt consumes roughly 519 ml, about one standard water bottle. With billions of users sending prompts every minute globally, the aggregate volume is staggering. |
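To see how per-prompt sips become aggregate floods, a back-of-the-envelope sketch (the one-billion-prompts-per-day figure is a hypothetical round number for illustration, not a measured one):

```python
ML_PER_PROMPT = 519              # ml per 100-word prompt (UC Riverside estimate)
PROMPTS_PER_DAY = 1_000_000_000  # hypothetical round number, for illustration only

litres_per_day = PROMPTS_PER_DAY * ML_PER_PROMPT / 1000
print(f"{litres_per_day:,.0f} litres/day")  # 519,000,000 litres/day
```

Half a litre per prompt at that scale is over half a billion litres of water every day.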
KEY STATISTICS AT A GLANCE
| 764.6B L Upper estimate of AI water footprint in 2025 (Joule, Dec 2025) | 17B gallons consumed directly by US data centres in 2023 for cooling alone | 519 ml Estimated water per 100-word AI prompt (UC Riverside) | ×4 Potential increase in US data centre water use by 2028 vs. 2023 |
The 2025-2026 Big Picture
What changed most between 2024 and 2026 is scale. The AI infrastructure buildout has been extraordinary: the five largest tech companies are collectively projected to spend $600 billion on GPUs and data centres by 2026, with 73% of that driven by AI rollout. More data centres mean more cooling demand and more water.
A study released by the Houston Advanced Research Centre found that data centres in Texas alone will consume 49 billion gallons of water in 2025 for AI-related cooling, potentially rising to 399 billion gallons by 2030. That is equivalent to draining Lake Mead, the largest reservoir in the United States, by over 16 feet in a single year.
US data centres now account for approximately 4.4% of the country’s total electricity consumption, up from 1.9% in 2018. By 2028, that figure could climb to 12%. Since electricity generation is itself highly water-intensive, this indirect water cost compounds the direct cooling figure significantly.
| Indirect water use is the bigger problem. Lawrence Berkeley Lab estimated that in 2023, US data centres consumed 211 billion gallons of water indirectly through the electricity powering them, compared to 17 billion gallons used directly for cooling. That is a 12:1 ratio, and it is rarely reported in corporate sustainability disclosures. |
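The 12:1 figure follows directly from the Lawrence Berkeley Lab numbers quoted above:

```python
direct_gallons_b = 17     # direct cooling, US data centres, 2023 (billions of gallons)
indirect_gallons_b = 211  # via electricity generation, same year (billions of gallons)

ratio = indirect_gallons_b / direct_gallons_b
total = direct_gallons_b + indirect_gallons_b
print(f"indirect:direct = {ratio:.1f}:1, combined footprint = {total}B gallons")
```

In other words, the cooling figure that companies typically disclose is a small slice of the combined 228 billion gallons.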
Company-by-Company Breakdown
Unlike 2024, we now have detailed 2023 actuals from the major players’ published sustainability reports. Here is what the data shows:
| Company | 2023 Water Use | % Data Centres | Change Since 2020 | Notable Detail |
|---|---|---|---|---|
| Google | 6.4B gallons (24.2B litres) | ~95% | +69% | Single Iowa data centre used 1B gallons in 2024, enough for Iowa’s entire residential supply for 5 days |
| Microsoft | 7.8M cubic metres | High (undisclosed %) | +87% | Training GPT-3 evaporated ~700,000 litres of clean freshwater in US data centres |
| Meta | 813M gallons (3.1B litres) | ~95% | +40% | Newton County, GA data centre uses 500,000 gallons/day, 10% of the entire county’s supply |
| Apple | 6M cubic metres | Undisclosed | +25% | Slowest growth among the Big Four; invests heavily in on-site reclaimed water |
Meta, Microsoft, Apple, and Google combined consumed 132.3 million cubic metres of water between 2020 and 2023, equivalent to 52,938 Olympic-sized swimming pools, or roughly 36 pools of water every single day.
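The swimming-pool equivalence checks out with standard conversions (an Olympic pool is commonly taken as 2,500 m³; the article’s 52,938 figure implies a marginally smaller pool volume):

```python
TOTAL_M3 = 132_300_000  # combined 2020-2023 consumption, four companies (cubic metres)
POOL_M3 = 2_500         # nominal Olympic swimming pool volume
DAYS = 4 * 365.25       # 2020 through 2023

pools = TOTAL_M3 / POOL_M3
print(pools, pools / DAYS)  # ~52,920 pools in total, ~36 pools per day
```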
| A comparison that puts it in perspective: Google uses the equivalent of what 1,300 people consume across their entire lifetimes in a single day. |
The Hidden Water Footprint Nobody Reports

The original blog touched on indirect water use, but the full picture is now much clearer and more alarming. A data centre’s total water footprint is the sum of three categories:
On-site cooling water is the most commonly reported figure: water drawn to cool server rooms via evaporative towers. Approximately 80% of this water evaporates and is not recoverable.
Electricity generation water is the indirect cost rarely disclosed. Nearly half of US data centres are powered wholly or partly by water-intensive thermal plants in already water-stressed regions.
Hardware manufacturing water covers the water used to produce the processor chips themselves, a figure that is rarely tracked in corporate reports.
Only 51% of data centre operators even track their own water usage, according to a recent survey. Of those who do, only 10% monitor consumption across all their facilities.
| Geographic mismatch: Two-thirds of new data centres built since 2022 are located in regions already experiencing water stress, a stark disconnect between AI infrastructure growth and water availability. |
Environmental & Social Impacts in 2026

The growing water consumption of AI data centres has serious repercussions for both the environment and society.
Environmental Impact
The global water crisis has worsened since 2024. Over 2 billion people still lack access to safe drinking water, and the UN’s projection that 50% of the world’s population would live in water-stressed areas has now effectively come true. The growth of AI infrastructure in arid regions of the American Southwest, the Middle East, and parts of Asia is intensifying local drought conditions.
The AI sector’s carbon footprint compounds the water issue. The same December 2025 Joule study estimated AI’s carbon emissions at 32.6 to 79.7 million tonnes of CO₂ in 2025, roughly equivalent to a small European country. Climate change, in turn, accelerates drought, reducing the water supply that data centres depend on.
Social and Economic Impact
Community-level conflicts have escalated. In Newton County, Georgia, a Meta data centre already uses 10% of the county’s entire water supply, and new permits under consideration could see facilities consuming 6 million gallons per day, more than double what the whole county currently uses. In West Des Moines, Iowa, a data centre cluster used for OpenAI model training drew down local aquifer pressure, affecting residents’ quality of life and property values.
For water-dependent industries such as agriculture, food manufacturing, and municipal services, experts warn that rising data centre competition for water will translate directly into higher water costs across the board starting in 2026.
The New Cooling Technology Race

The most significant development since the original blog is that cooling technology has been transformed. In 2024, advanced cooling was a niche experiment. By 2026, it has become the dominant infrastructure battleground.
Liquid Cooling Goes Mainstream
2025 was widely described as the year liquid cooling ‘tipped from bleeding-edge to baseline.’ Traditional air cooling, which accounted for the majority of data centre cooling just two years ago, is now physically inadequate for high-density AI GPU workloads. NVIDIA’s latest chips (GB200, GB300) require rack densities of 30–120 kW, far beyond what air systems can handle.
Three liquid cooling approaches are now scaling rapidly:
- Direct-to-chip cooling: cold plates deliver coolant directly to GPU surfaces
- Immersion cooling: entire servers are submerged in dielectric fluid, capable of handling heat fluxes up to 1,500 W/cm²
- CDU-based systems: coolant distribution units manage heat at data centre scale
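A simplified way to see why rack density forces the technology choice; the thresholds below are rough rules of thumb drawn from the ranges above, not vendor specifications:

```python
def cooling_tier(rack_kw: float) -> str:
    """Rough rule-of-thumb mapping from rack power density to cooling approach."""
    if rack_kw <= 30:
        return "air"             # traditional air cooling tops out around here
    if rack_kw <= 120:
        return "direct-to-chip"  # cold plates cover the GB200/GB300 density range
    return "immersion"           # beyond that, full submersion in dielectric fluid

print(cooling_tier(20), cooling_tier(80), cooling_tier(200))
```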
| Market scale: The immersion cooling market is projected to reach $4.9 billion by 2033, growing at 27.1% annually. In 2026 alone, the broader data centre liquid cooling market is estimated at $6.6 billion. This technology shift is not incremental; it is a structural replacement of air-based systems. |
Water-Smart Innovations
Microsoft has deployed its ‘Sidekick’ direct-to-chip cooling systems across Azure data centres and is developing AI-designed microfluidic channels with Swiss startup Corintis that mimic leaf-vein patterns for more efficient heat distribution. AWS launched its proprietary In-Row Heat Exchanger (IRHX) in 2025, claiming up to 46% reduction in mechanical energy use.
Perhaps most promising for water conservation: AirJoule, backed by GE Vernova, has developed a device using metal-organic frameworks (subject of the 2025 Nobel Prize in Chemistry) that extracts pure water from waste heat, effectively recycling evaporated cooling water back into the system. The first deployment is planned for a 600 MW facility in Texas.
Closed-loop cooling systems, wastewater recycling, and rainwater harvesting are also being adopted more widely. When fully implemented, these approaches can reduce freshwater use by 50-70%.
Cooling Technology Timeline
| 2023 | Liquid cooling mostly experimental. Most data centres still rely on evaporative air cooling. |
| 2024 | Microsoft’s Azure Maia AI accelerator launches with direct-to-chip cooling. NVIDIA and Vertiv announce GB200 NVL72 reference architecture requiring liquid cooling. |
| 2025 | Liquid cooling goes mainstream. AWS IRHX commercial launch. Microsoft–Corintis microfluidics breakthrough. Schneider Electric acquires Motivair. Immersion vendors launch new systems. |
| 2026 | Liquid cooling is now standard for new AI data centres. Immersion cooling market reaches $931M. AirJoule water-recovery technology deploys in Texas. HP and NVIDIA are designing next-gen embedded cooling. |
Tech Giants’ Commitments: A 2026 Progress Report

The original blog reported that Microsoft, Google, and Meta had all pledged to become ‘water positive’ by 2030. Amazon Web Services has since made the same commitment. How are they tracking?
Amazon has pledged to replenish 3.9 billion litres annually through water restoration projects. Microsoft committed to reducing water use in evaporative-cooled data centres globally by 95% by 2024; progress has been partial, with liquid cooling deployment ongoing. Google’s fleet-wide Power Usage Effectiveness (PUE) now stands at 1.09, among the best in the industry, though absolute water use has continued to rise as capacity expands.
The honest assessment: absolute water consumption from all four companies has continued to climb despite efficiency improvements, because the scale of new data centre construction outpaces the efficiency gains. Being ‘water positive’ requires not just efficiency but active replenishment, and most companies are still building that side of their programmes.
Estimated Progress Toward Water Goals (2026 Assessment)
| Company | Commitment | Progress |
|---|---|---|
| Google | Water-positive by 2030; fleet PUE 1.09 | ~55%: strong efficiency, but absolute use still rising |
| Microsoft | Reduce evaporative cooling water by 95% (2024 target) | ~40%: liquid cooling ongoing, target partially missed |
| Amazon AWS | Replenish 3.9B litres/year via restoration projects | ~35%: projects active but not yet at scale |
| Meta | Water-positive by 2030; efficiency programmes | ~30%: community programmes early-stage |
Progress estimates are editorial assessments based on disclosed sustainability data, not official company figures.
The Transparency Problem Has Gotten Worse
The most consistent and troubling finding from 2025 research is that corporate disclosure has not kept pace with the urgency of the problem. No major tech company reports AI-specific environmental metrics. Sustainability reports remain voluntary, use different methodologies, and rarely include indirect water consumption from electricity generation, which Lawrence Berkeley Lab estimates is 12 times higher than direct cooling use.
A 2025 study across major tech companies found that while Google, Meta, and Microsoft all acknowledged AI as a key driver of increased energy consumption, none reported water figures broken down by AI versus non-AI workloads. Researchers described this as making meaningful assessment ‘significantly uncertain.’
| What good disclosure would look like: Facility-level WUE (Water Usage Effectiveness) scores, AI-specific workload water consumption, indirect water from electricity generation, and the location of water-stressed sites. Currently, almost none of this is publicly available in a standardised form. |
What Needs to Happen Next

Mandatory Disclosure Standards
The December 2025 Joule study and the Brookings Institution have both called for new policies requiring data centre operators to disclose facility-level Water Usage Effectiveness scores, the specific locations of their operations, and AI vs. non-AI workload breakdowns. Voluntary sustainability reports are no longer adequate given the scale of AI’s water consumption.
Accelerate Waterless Cooling Adoption
Direct-to-chip and immersion cooling are not just efficiency improvements; they are fundamentally different systems that can drastically reduce or eliminate the need for evaporative water. Microsoft’s adiabatic cooling systems already use outside air instead of water when temperatures fall below 29.4°C. Scaling these approaches, especially in cooler climates, should be a regulatory priority.
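The adiabatic logic described above is essentially a temperature threshold; a minimal sketch, where the 29.4 °C cutoff comes from the text but the function name is our own illustration:

```python
FREE_COOLING_LIMIT_C = 29.4  # outside-air cooling threshold cited for Microsoft's systems

def needs_evaporative_water(outside_temp_c: float) -> bool:
    """Below the limit, outside air suffices; above it, evaporative water is used."""
    return outside_temp_c > FREE_COOLING_LIMIT_C

print(needs_evaporative_water(25.0), needs_evaporative_water(35.0))  # False True
```

This is why siting matters: in a cool climate the threshold is rarely crossed and evaporative water use approaches zero, while in a hot, arid region it is crossed for much of the year.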
Smarter Siting Decisions
Two-thirds of new data centres built since 2022 are in water-stressed regions. Incentives, zoning policies, and water-impact assessments should redirect construction toward water-abundant areas with access to renewable energy. The often-cited trade-off between renewable energy (available in hot, arid areas) and water (scarce in those same areas) is real, but not insurmountable with the right policy frameworks.
Community Engagement and Water Rights
Local communities need meaningful consultation rights before hyperscale data centres are approved. In areas like Newton County, Georgia, or West Des Moines, Iowa, residents found themselves competing with multi-billion-dollar facilities for basic water access. Impact assessments must be community-facing, not just regulatory filings.
Responsible AI Use
Users and developers alike can make choices that reduce water consumption. These include:
- Choosing smaller, more efficient models for tasks that don’t require frontier capabilities
- Batching queries rather than sending many short individual prompts
- Advocating for energy and water labelling of AI services, similar to household appliance efficiency ratings
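The batching point can be made concrete with a hypothetical cost model; the fixed per-request overhead and per-word cost below are assumed illustrative values, not measured figures:

```python
OVERHEAD_ML = 100  # assumed fixed water cost per request (illustrative, not measured)
PER_WORD_ML = 4    # assumed marginal water cost per word (illustrative, not measured)

def water_ml(requests: int, words_total: int) -> float:
    """Hypothetical water cost: fixed overhead per request plus per-word cost."""
    return requests * OVERHEAD_ML + words_total * PER_WORD_ML

print(water_ml(10, 500))  # ten short prompts: 3000 ml
print(water_ml(1, 500))   # one batched prompt covering the same words: 2100 ml
```

Under any model with fixed per-request overhead, one batched query is cheaper than many fragments carrying the same content.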
Conclusion
Two years of data have confirmed what the 2024 blog feared: AI’s water crisis is real, growing, and not yet on track to resolve itself through voluntary commitments alone. The scale of new infrastructure being built, trillions of dollars of data centres over the coming decade, dwarfs the current pace of efficiency improvements.
The good news is that technological solutions exist. Liquid cooling, immersion systems, closed-loop water recovery, and smarter siting can together radically reduce the water footprint of AI. What is missing is the regulatory urgency, disclosure standards, and community-protective policies to ensure these solutions are deployed at the necessary speed and scale.
The window to act before AI’s thirst becomes irreversible in the most water-stressed regions is measured in years, not decades. Transparency, accountability, and policy action are the most important tools we have right now: in the short term, more important than any single cooling technology.